Hours of Opportunity

The sites undertook a set of activities to meet the initiative’s goals: increasing access, improving quality, developing better information to improve decisionmaking, and increasing sustainability. To answer research question 2, we tracked what each city did to achieve the four goals. We begin by describing what each of the sites achieved in terms of access, quality, information, and sustainability. We then describe the types of activities undertaken to achieve results for each goal.

Results of the OST Initiative

The sites were able to accomplish much under the Wallace grant. At the point of our final data collection (spring 2009), two of the sites had completed five years of implementation (Providence and New York City), while the other three were only two years into their implementation grant. Thus, as one would expect, we see more progress in Providence and New York City toward the goals than in the other sites (see Table 3.1).

In Providence, the AfterZones offered after-school opportunities to all middle school students, and approximately 34 percent of middle school students participated—an increase from about 500 to 1,700 students. PASA helped secure federal 21st Century Community Learning Center funds to support the AfterZones, and PASA, with direct help from the mayor, was successful in bringing in many private donations to support system building and programs. Data on participation were used in daily decisionmaking, informed planning, and helped garner additional funding for the efforts. Building on the AfterZone success, PASA began to support system-building efforts at the high school level.

In New York City, over the course of the initiative, DYCD moved programming to high-need areas, expanded the number of slots from 45,000 to more than 80,000, and set a uniform cost model. It required all providers to enter program and participation data into a management information (MI) system. Data from this system were used to hold providers accountable for participation, signal potential quality issues, and help garner additional funding for OST. In fact, New York City’s sustainment plan was to use participation and evaluation data to demonstrate the benefits of OST programming and thereby attract increased city funding in an increasingly competitive environment.

In Boston, the PSS demonstration was folded into the activities of the Triumph Collaborative, a group of schools with a similar model of OST provision. In addition, Boston was just starting its complementary CLI. Participation increased in the PSS schools, as five of these schools had no OST program prior to the demonstration. In 2008, 927 students were enrolled in after-school programs across the ten PSS sites. The MI system was in development and there were no changes in how OST was funded or sustained.

All the major public agencies in Chicago had functional MI systems, and, in spring 2009, data from all agencies had been merged into a single data set to allow the agencies to review data across the entire OST system. Chicago had established a quality pilot that was under way in 43 sites, and the Chicago Public Library had led an active campaign to improve teenage participation. There was no change in how OST was funded or sustained.

In spring 2009, Washington, D.C., had OST programming in all its public schools, and each school had an on-site OST coordinator, funded by the school system. The Trust continued to use its MI system to track participation, and the school system tracked OST program participation using its school MI system. The mayor called on the schools, the Trust, and other agencies to regularly report on programs and participation.

We cannot comment on whether quality improved, as our study did not track program quality outcomes. However, each of the sites had made efforts to improve the quality of OST providers, including adopting standards, observing program quality, and giving providers professional development.

Activities to Meet The Wallace Foundation’s Goals

Using proposal and interview data, we categorized the activities reported by the sites into the four goal areas. It is important to remember that New York City and Providence received their grants earlier; thus, one would expect to see more activities in the implementation phase in these sites. Greater detail on site activities can be found in McCombs, Bodilly, et al. (2010).

Goal 1: Increase Access and Participation

Across the sites, a first order of business was to increase access and participation—in specific locations or for specific populations. Efforts varied, but common activity areas, as shown in Table 3.2, were to address transportation issues, increase convenience for students, increase the number of locations and available slots in the programs, increase enrollment, and ensure affordability.

Address Transportation

Adequate transportation was identified as a key issue in all the sites except New York City and Chicago. In New York City, with its very dense population and heavy reliance on public transit, students walked or used public transportation to get to and from programs. In Chicago, the focus was on teens who already used the city’s public transportation independently. Thus, lack of transportation, while still possibly limiting access for some, was not seen as a key concern in those two cities.

In the other cities, which lacked convenient public transportation routes to schools, children had to travel from their schools to the programs or from a school-based program to home. This required running additional school district buses, especially for the trip home. In Providence, the only transportation costs PASA incurred were for transporting students from their home schools to programs held at off-site locations, such as local recreation centers, Boys and Girls Clubs, parks, and museums.

In Washington, D.C., the original focus under the Trust was on programming in neighborhood middle schools, later extended to programming in all DCPS schools. DCPS buses only special education students; all other students walk or rely on public transportation. Nonetheless, parents did express concerns about their children returning safely from OST programs, and these concerns were greatest during the winter, when students would have to walk home in the dark. This led some middle school OST programs to adopt winter hours so that programming ended earlier. Issues of access remained when DCPS began operating programs in all public schools. As more children switched to charter schools and more traditional schools closed, the pattern of children attending their neighborhood school began to disappear. Planners worried that more children would feel unsafe on the trip home if they had to cross unfamiliar neighborhoods, especially areas where gangs were present.

Boston interviewees noted that transportation was an unsettled issue that undercut efforts to increase access. Boston public schools use an open enrollment plan in three regions for grades K–8, with open districtwide enrollment for high school students. Every school day, children in grades K–8 commute within their region to their schools of choice using district-provided transportation, while high school students take public transit. The mayor opened the schools to after-school programs in the late 1990s, but transportation home was not provided. Thus, children found their own way home from programs or relied on parents to pick them up. Because Boston focused its OST efforts on a school-based rather than a neighborhood-based model, students coming from outside the neighborhood had to find their own way home, and providing transportation home appeared to be key to the initiative’s success. However, additional transportation was not provided for in the planning or implementation proposals. Instead, the initial goal of PSS was to create after-school opportunities in the students’ home neighborhoods through CBOs without adding more bus routes, which would have increased transportation costs.

Increase Convenience

In four of the five cities, the planners sought to increase access by making programs more convenient: moving them closer to the children and running them for more hours. Providence adopted the neighborhood campus concept, with programs offered in or near the schools, running until 5:00 p.m., and providing transportation home.

New York City increased convenience by moving the programs closer to underserved populations. When it put out requests for proposals (RFPs) to vendors for more programs, it specified geographic areas of the city that had to be served. Providers stepped forward to deliver programs in those underserved areas, thus increasing the convenience to the children.

In Chicago, because further planning depended on the development of MI systems, we did not uncover any coordinated efforts to increase convenience beyond what already existed. Community centers and parks throughout the city already offered programs, as did the schools, so the planners felt that programs were already conveniently available. In some areas, population shifts had made the locations of some parks and community centers less than ideal for providing youth programming to high-need populations, but moving a park or a center was considered prohibitively expensive.

In Boston, the initiative initially focused on ten low-performing schools (PSS schools) in the first two years of the grant. Five of those schools had no after-school programming prior to the grant. The plan established programs in these schools that were open until 6:00 p.m. This set-up was convenient for those who had transportation home but not for those who came from other parts of the city and did not have easy access to transportation.

Increase the Number of Locations and Available Slots

Three cities (Washington, D.C., New York City, and Providence) intended to significantly increase the number of children in after-school programs. These plans depended heavily on placing more quality providers in specific geographic areas and on obtaining additional funding. While each city worked to recruit higher-quality providers, each also aimed to recruit more providers or providers who could serve more students.

For example, leaders at the Trust concluded that it would be more effective to get small to midsized providers to agree to provide more slots than to get new providers to enter the field. This required a change in how the leaders of those small provider organizations thought about and managed their operations. The Washington, D.C., initiative, called Project My Time, established the Institute for New Leaders, New Communities, designed to train and coach leaders of small provider organizations to develop the managerial capacity to expand. The institute guided each CBO leader through developing a strategic plan and then implementing it. About 60 providers were targeted for this training over a two-year period.

New York City and Providence spent considerable effort obtaining additional funding to increase the number of slots available. Providence successfully pursued external funding, including grants and federal 21st Century Community Learning Center funds, for some of its AfterZones and provider organizations. The AfterZones increased middle school children’s access to OST programs. According to estimates provided by PASA, during the 2008–2009 school year, 34 percent of enrolled public middle school students in Providence participated in a PASA program—approximately 1,700 students. PASA estimated that only 500 middle school youth had participated in OST programming each year prior to the creation of the AfterZones. New York City planners used the data they had developed to demonstrate to the mayor and city council both the need for more slots and their success in placing more children. They were able to advocate successfully for greater funding allotments against competing programs because they had data to support their claims. They increased the budget available to DYCD for these purposes from $46.4 million in fiscal year (FY) 2006 to $116.6 million in FY 2009, thereby increasing the number of slots from approximately 45,000 to more than 80,000.

Boston also increased the number of children participating in OST programming at its PSS sites. There was no OST programming in five of the schools prior to PSS. In 2008, 927 students were enrolled in after-school programs across the ten PSS sites.

Increase Enrollment

Early planning surveys and other more general research indicated that many children and parents did not use after-school programs because they did not know about them. Thus, each of the sites undertook efforts to increase public awareness. Four sites (Boston, Chicago, Washington, D.C., and New York City) developed online “program locators” to encourage enrollment. On these websites, consumers could type in their address, zip code, or other location information and identify programs being offered in their area. In several instances, the program locator connected to the providers’ website so that consumers could read descriptions of the activities.

Others took additional steps. For example, New York City published a summer activities booklet and launched an advertising campaign. Providence used flyers, recruitment fairs, advertising, parent-teacher organization meetings, and open houses to get its message out. In Chicago, the public schools disseminated a guide to available programs, and the libraries led an active teen marketing campaign.

In Providence, Washington, D.C., and Boston, the site coordinator played a key role in working with principals and teachers to ensure that they understood and actively supported the programs and encouraged students to enroll and attend regularly.

Ensure Affordability

A final potential stumbling block to enrollment was cost. In most of the cases here, the programs were available free of charge to the students most in need, in part because of the strong efforts made by the agencies and intermediaries to obtain funding. For instance, in Providence, where city funding for OST is very limited, PASA has continuously sought federal, state, and philanthropic funding to support programming. In 2008, PASA’s board voted not to collect any fees for its OST programming, and PASA chose to focus more on securing 21st Century Community Learning Center grants (federal dollars managed by the Rhode Island Department of Education) to fund the AfterZones.

PASA has been successful in bringing in additional grants and support for Providence’s coordinated OST effort beyond The Wallace Foundation and 21st Century funding. Providence’s mayor has helped PASA secure federal Community Development Block Grant funding and introduced a line item in the city budget for after-school programming for the first time. PASA was particularly successful in raising private funding from multiple sources. However, braiding these funds together took a concerted effort, and interviewees in Providence noted that long-term sustainability remains a challenge.

Goal 2: Improve Quality

Leaders at the sites were aware that, prior to the initiative, some of the existing programming was not of high quality. Several sites concentrated significant effort on developing standards of provision, quality-assessment systems to monitor providers, and incentives and contractual mechanisms to ensure better provision, as well as on evaluating outcomes to drive improvement across the board (see Table 3.3). In addition, several sites invested in professional development for providers and for the coordinators placed in the neighborhood schools to manage the programs. However, even after several years of effort, none claimed that the programs being offered were of universally high quality, nor could they demonstrate quantitative improvements in quality. Thus, while much was accomplished, work remains in this particular area.

Create Standards and Assessment Tools

Three of the sites (Washington, D.C., New York City, and Providence) developed and implemented a new set of standards and tools to assess providers. For instance, in Providence, PASA leadership felt that it was vital to develop quality measures through a community effort and engaged various groups to accomplish this goal. Starting in November 2004, a workgroup was assembled to consider quality. A group of approximately 25 participants considered standards already established in other cities and adapted them to meet Providence’s needs. Interviewees told us that this workgroup generated buy-in from providers and created an identity for Providence’s after-school programming at a critical time prior to the formal launch of the AfterZones. The established standards are now used across the state of Rhode Island.

After standards were chosen, it became necessary to develop indicators and assessment tools. A smaller team met in late 2005 and early 2006 to develop these indicators and to consider an assessment tool. Participants included representatives from advocacy groups, staff from professional development nonprofits, and city officials, as well as representatives from some provider organizations. The discussion of indicators occurred in concert with the selection of an assessment tool. According to respondents, there was tension between advocates of a totally homegrown tool, reflective of the community planning effort to create quality standards and indicators, and advocates of a well-known tool that had more widespread recognition and credibility. Eventually, a hybrid tool, the Rhode Island Program Quality Assessment (RIPQA), was developed. The tool uses the HighScope Youth Program Quality Assessment’s Form A (a valid instrument designed to evaluate the quality of youth programs at the point of service), and the PASA-developed Form B, which assesses organizational capacity.

Boston also worked to develop standards and an assessment tool, but after merging the PSS schools into the Triumph Collaborative, it ended up relying primarily on existing standards and assessments already used by the DELTAS office.

Chicago began implementing a program improvement pilot initiative in September 2009 in 43 OST program sites: two Chicago Public School sites, four After School Matters sites, four library sites, eight Park District sites, and 25 Family and Support Services sites. The pilot consisted of peer coaching, a self-administered program assessment, and an external assessment. Based on these assessments, program staff and their coach developed and implemented a program improvement plan. The program assessment tool was a version of HighScope’s Youth Program Quality Assessment customized for Chicago. The Chicago Area Project, a private nonprofit focused on preventing delinquency and serving disadvantaged urban youth, provided technical assistance and training to pilot sites and oversaw the external evaluation process.

Monitor Quality and Vet Providers

Cities developed different mechanisms for monitoring quality. In Providence, PASA and outside evaluators from OST providers across the city and state used the assessment tool to conduct observations of the programs and to provide constructive feedback. Respondents there said that this process benefited the programs and raised the observers’ awareness. The entire process was viewed as assistance and was not used punitively to reduce funding or eliminate the provider from the effort. In fact, interviewees described the process as a professional development tool for the community of providers.

In New York City, DYCD program managers used a modified version of the New York State Afterschool Network (NYSAN) Program Quality Self-Assessment tool to measure program quality during two site visits per year as a way to monitor the progress of OST programs and to ensure that they received the support they needed. When a program was struggling, program managers referred it to PASE, the technical assistance provider, for additional assistance and follow-up.

The Trust began conducting regular quality assessments through its Project My Time site directors and staff in January 2008, and quality scores became a key criterion for future funding in September 2008. Meanwhile, DCPS put in place a formal vetting process for the providers with which it would contract, including a review of their basic health and safety certifications and curriculum.

Provide Professional Development and Performance Incentives

In Providence, professional development changed over the course of the implementation grant. Initially, professional development was not aligned with the program standards that had been developed; leaders therefore considered it less effective than the more current offerings, although it did build some goodwill with providers. There were monthly workshops on such topics as parent engagement and staff retention, along with a 32-hour youth development certificate program known as the BEST (Building Exemplary Systems for Training Youth Workers) youth worker program. More recent PASA professional development for after-school providers aligns with the various modules of the assessment tool (RIPQA). Programs not participating in RIPQA can still benefit from the training, which emphasizes practices to improve program quality that apply to all programs (e.g., providing a safe and supportive environment, ensuring positive interactions with youth, promoting youth engagement).

In New York City, DYCD made a substantial financial investment in improving the quality of staff in the OST programs it funds. As a result of an RFP process, DYCD awarded PASE a three-year contract and provided $500,000 annually for a variety of training, technical assistance, and capacity-building opportunities for programs. These services were provided free of charge to organizations receiving DYCD OST funding. PASE offered a variety of professional development workshops and conferences throughout the year. In 2008, it also offered on-site training in Staten Island and Far Rockaway—two locations where provider participation in centrally offered training had been low.

In New York City, interviewees noted that some programs were heavy users, or “frequent flyers,” while other programs took advantage of professional development opportunities to a lesser extent. Many of these offerings helped fulfill programs’ licensing requirements. PASE also solicited ideas for training from DYCD, OST program staff, and their consultants. In addition, PASE provided training and support for the use of MI systems.

For OST programs that failed to meet quality standards, PASE brokered targeted on-site technical assistance. After receiving a referral from a DYCD program manager, PASE would follow up with the program, conduct a needs assessment, and contract with one of its consultants to provide the needed technical assistance on site.

A new initiative in 2009 was to provide technical assistance in infrastructure and management to provider organizations operating a large number of programs (i.e., organizations that ran ten or more OST programs) to improve their internal operations and thus provide stronger services to students.

In addition to providing professional development to direct providers, as described earlier, the Trust offered training to leaders of OST nonprofits to get them to think about how to provide quality programs on a larger scale. This required a change in how the leaders of those organizations thought about and managed their operations.

In Boston, DELTAS employed coaches to assist the school site coordinator in a variety of capacities (e.g., parent engagement, leadership and supervision, curriculum, supporting English language learners). Each coach was in charge of between five and ten schools. One respondent described the coach as “extremely good at helping to professionalize what we do here. . . . He comes to partner meetings, [and] I meet [someone] at a networking event, and my coach says, ‘Let me draft the MOU or work plan so there is a paper trail’—or other things that a lot of times schools or community organizations tend to gloss over.” Universally, interviewees found the coaching extremely helpful.

Evaluate Progress

Finally, New York City and Providence hired outside evaluators to assess their efforts. Boston had also planned an outside evaluation but felt that it was too early, particularly given the high turnover among key staff; thus, it ended its evaluation after the first year.

In Providence, the Center for Resource Management took an initial look at AfterZone outcomes in 2007 and reported on AfterZone participant demographics as well as linkages between school outcomes and AfterZone participation. Most significantly, the report showed that students who participated in PASA programs tended to have slightly higher rates of school attendance than nonparticipants. The report also indicated that PASA was not, in the words of one source, “skimming the cream,” or attracting an atypical group of students as compared to the total Providence middle school population. At the time of our last site visit to Providence in spring 2009, Public/Private Ventures was in the midst of a three-year longitudinal study funded by The Wallace Foundation that included surveys of AfterZone participants and nonparticipants.

In New York City, DYCD contracted with Policy Studies Associates to conduct a three-year evaluation of the OST initiative. DYCD appeared to be an active user of information that emerged from the evaluation. For instance, after the evaluation found that parents particularly liked and needed summer programs, DYCD made summer programming a requirement in the next round of RFPs. Interviewees throughout the system—from all levels of DYCD and leaders in the field—mentioned and referred to the Policy Studies Associates study. In 2009, DYCD remained committed to continuing the evaluation even in the face of potential budget cuts. As one DYCD official noted, “It has been important to maintain the core mission and the component parts, and that is quality direct services and also evaluation. Very often you say, ‘Let’s throw out the evaluation, the capacity building.’ For us, that is not fluff; that is core.”

Goal 3: Develop Information Systems for Decisionmaking

A major thrust of the initiative was to encourage the development of an MI system to track children and enrollment patterns. From the point of view of The Wallace Foundation, this was essential to understanding whether the programs were attracting children and whether the children’s participation was frequent enough to affect their development. The cities made varying progress in the development of MI systems for student tracking purposes, but, as the systems were developed, the cities found important additional uses for the information. Data-based planning and communication strategies adopted to improve access and quality had multiplier effects and often generated greater coordination and communication. Additional details on this subject can be found in McCombs, Orr, et al. (2010).

All five cities devoted considerable energy to developing MI systems to track enrollment, participation, and student demographics. For instance, Chicago dedicated the majority of its effort in the early years to developing and implementing an MI system for the Park District, Chicago Public Schools, Family and Support Services, and After School Matters. Each organization had a customized system, but data from each could be easily merged to provide a comprehensive view of OST enrollment and attendance in Chicago.

During this period, four of the cities adopted and used an MI system that tracked student enrollment, attendance, and demographics. The exception was Boston, where an MI system that could be linked to the public schools data system was in development. The use of MI systems to track student enrollment, attendance, and demographics represented a major step forward for these four cities. For the first time, they knew across a large number of programs how many students were enrolled and attending on a regular basis as well as the characteristics of the students.

This simple step was particularly important for Providence, where surveys during the early planning period had shown that parents were reluctant to send their children to after-school programs unless the provider could ensure the child’s safety, including knowing where the child was at all times. PASA used the system to track children to and from the programs on a daily basis, including on the buses. In this way, PASA could immediately determine the location of a child at a parent’s request.

These same four cities also used these systems to collect information about providers, including the type of programming offered, and used these data to determine which programs were attracting the most students and where they were located. This was most advanced in Providence and New York City. Again, the centralized data system was a first for these cities.

Several sites then sought to go further with data collection. For example, Washington, D.C., hoped to merge information about students’ academic backgrounds with after-school attendance data to determine whether the children who attended had associated improvements in academic outcomes. Additionally, some hoped to merge the attendance data with information about each student’s involvement in the juvenile justice system or family services, believing that this information would allow providers to craft supports to meet the child’s particular needs.

However, practical and legal barriers prevented this from occurring, including the agencies’ need to protect student records as required under state and federal human subject protections. Other practical barriers had to be overcome to develop the systems to this point. Funding and expertise for data collection and analysis were in short supply across the sites. Interviewees reported institutional inertia and turf issues that led each agency to favor its own system and to be unwilling to share data with other agencies.

Compared with what sites reported had existed prior to the initiative, by spring 2009 the sites were developing and using information for a range of purposes. All the sites except Boston were using an MI system to track daily attendance in OST programs and to understand some basic characteristics of who enrolled, by program type and geographic location within the city.

Three cities took a further step to understand why children were attending different programs. Providence conducted surveys of the children as they proceeded through the programs. It used a combination of the survey and attendance data to identify problematic programs and work with them to improve, as well as to develop new programs that matched children’s interests. PASA provided its student survey information to its evaluator for use in assessing the impact of the programs on student motivation, aspiration, and engagement in school. New York City and Washington, D.C., used program attendance as a proxy for quality, assuming that children would vote with their feet and that poor-quality programs would reveal themselves through poor attendance. Analysts reviewed attendance records to determine which programs seemed to have the biggest draw and ensured that these program types were offered. This approach also focused attention on programs with poor attendance and helped staff understand why attendance was low. In New York City, program providers were held accountable for achieving specific attendance goals and were paid accordingly. Washington, D.C., was considering such action.

Interviewees in New York City and Chicago noted that the use of an MI system shifted the nature of contracting, enabling agency staff to monitor programs and provide assistance to them on an ongoing basis. Without an MI system, contract officers received attendance reports on a quarterly or annual basis, and often on paper. Thus, it was difficult to identify struggling programs and impossible to provide assistance to help programs improve in a timely way. However, an MI system allowed agency officials to flag potential program problems early and intervene with assistance. OST providers also recognized this shift.

Finally, the ability to plan and advocate was seen by many as an important unforeseen outcome of the MI system development effort. In Providence, New York City, and Washington, D.C., information collected from the attendance systems and the surveys was used to advocate effectively for stable or increased funding for after-school programs. Armed with evidence that funds were being spent more efficiently (i.e., that poor providers were being weeded out and programs were being located in the highest-need areas) but that demand remained, agency heads and intermediaries began to argue for increased funding and city support. When competing city agencies could not show similar progress toward accountability or proof of needed services, the after-school agencies won greater funding, especially in New York City and Providence. Seeing the data, the mayors could argue that they were fulfilling their campaign promises, and they began to demand these data.

In summary, the development and use of student tracking systems, student surveys, and provider information proved to be key parts of building a more coordinated effort to meet the initiative’s goals. Information was used to support improved access by offering programs of interest to students and ensuring that they were located where students could access them. In Providence, it was also used to ensure that students were safe and supervised. The information was also used to improve quality by identifying programs with little student support and by providing professional development or needed training and holding providers responsible for improved attendance, as in New York City. In at least a few instances, such systems were responsible for providing needed data that could be used to argue for increased funding, and work on the development of the system itself encouraged collaboration and coordination that had not occurred before. In short, the development and use of systemwide information that had been almost nonexistent prior to this effort added significantly to the initiative.

Goal 4: Plan for Financial Sustainability

Sustainability here refers both to sustaining the collaborative effort and to sustaining the programmatic funding levels needed to meet the initiative’s plans for expansion, although we heavily emphasize the latter. We reviewed sites’ plans for sustaining both the collaborative effort and the funding. The activities they described fell into four areas (see Table 3.4). To plan for and develop more stable funding or funding for growth, the plans called for finding new funding sources and undertaking activities designed to maintain general public support. To ensure that coordination was maintained, they pointed to clarifying roles across the organizations and activities and to embedding coordination in the system’s structures, such as MOUs or contractual relationships. The sites were struggling with issues of financial sustainment when the study ended. Several had sought new funding sources, such as local and national foundations or federal funds for 21st Century Community Learning Centers. However, all faced uncertain funding prospects in spring 2009.

The five cities used a combination of resources to support current programming but relied primarily on government contracts and foundation grants. PASA in Providence had moved to ensure stronger funding by helping several CBOs gain federal 21st Century Community Learning Center status through grant writing and by providing data to support the proposals. New York City had increased funding based on the strong support of the mayor and clear evidence of effectiveness. In Washington, D.C., discussions among ICSIC members pushed DCPS after-school program managers to reallocate some internal resources to increase funding.

At the time of our spring 2009 visit, the sites reported struggling to sustain program funding. Several of the cities were forecasting reduced budgets, and the various lead organizations were pursuing ways to at least hold funding steady, if not grow it, in the coming months.

Three of the five sites thought in similar terms. Downturns in city budgets had occurred before, and the agency leaders we interviewed thought that the best way to address them was to argue for the effectiveness of the programs in meeting important city goals, such as reducing crime and increasing graduation rates. Therefore, in rough times, they thought that data from the MI system and from any evaluation showing increased effectiveness could be used to argue for the programs’ value. Washington, D.C., New York City, and Providence, in particular, sought to generate information on both the effectiveness of the programs and the growing efficiency of their operations and to publicize these results. In addition, they sought to engage community leaders and parents as advocates for the programs with city hall. Mayors who were strongly supportive of the programs to begin with, armed with data showing their effectiveness, would see them through—or so these leaders hoped.

Chicago’s sustainability efforts focused on securing dedicated funding for after-school programs at the state level. Given the state’s budget crisis, this effort seemed unlikely to bear fruit in the short term, although sources hastened to point out that it was still necessary so that after-school funding would someday be “first in line” when economic and budgetary conditions improved. Boston’s efforts to establish a sustainability plan were delayed by the reorganization of the initiative.

In terms of maintaining collaboration across organizations in pursuit of the initiative’s goals, most interviewees in New York City and Chicago assumed that their programs would survive as long as strong outcomes persisted, because the programs had become embedded in the routines of government agencies. For example, New York City had established an MOU with its Department of Education, which provided school facilities free of charge to OST programs. The MOU helped ensure that this collaboration would continue into the future. In addition, New York City had embedded coordination in its contractual arrangements with providers, ensuring that providers were evaluated and received professional development to improve. Chicago was considering similar options and, with its new MI systems and its quality-standards pilot, was maintaining the interest of the various organizations.

Providence’s efforts, however, were led by an intermediary organization. PASA chose to use its success to increase its presence and cement further relations at the state level and to begin offering its professional development and quality-assurance services at other sites across the state by pooling resources. In addition, Providence was moving toward expanding programming into the high school arena, with strong support from the mayor. A new coordinating group had been established in his office that brought together the major city agencies that might have the resources to support after-school programs, such as facilities or buses, in an effort to identify efficiencies that could generate additional revenues for provision. The coordinating role of the intermediary, with support from the mayor and other agency heads, appeared to sustain and support growth.

The efforts in Boston and Washington, D.C., were also led by intermediaries, but these organizations had not been successful in leading the efforts for reasons discussed previously. In these two cities, the nature of further collaboration was unclear, as was the role that intermediaries would play. At the time of our visits in spring 2009, while work was under way in the public schools to improve coordinated services, the level of interorganizational coordination between city and noncity agencies was undergoing change. For example, in Boston, respondents were starting to focus on the CLI as the means to promote collaboration among schools, the libraries, and the parks and recreational centers. Respondents in both cities expressed uncertainty about how these types of coordinated efforts would be sustained.

In summary, we found all the sites struggling with issues of funding, several struggling with continued collaboration, and all preparing for a difficult year or two as budgets tightened.

Summary

In this chapter, we described what the sites did to address the initiative’s expectations regarding access, quality, use of information for decisionmaking, and sustained funding. We assessed the cities’ progress by comparing statements from early proposals and from interviewees aware of early efforts with similar sources from spring 2009.

Access. Sites addressed issues of convenience and lack of access by locating additional programs in neighborhood schools, attempting to provide transportation, developing online program locators, and marketing programs to target populations. The number of children served expanded in most of the cities. Further, the initiatives addressed parents’ transportation and convenience concerns, thereby increasing access in Washington, D.C., New York City, and Providence.

Quality. Several sites concentrated significant effort on developing standards of provision, quality-assessment systems for providers, and incentives and contractual mechanisms to ensure better provision. In addition, several sites invested in professional development for the providers and for coordinators placed in the neighborhood schools to manage the programs.

Information for Decisionmaking. A few cities invested in evaluations of their efforts, some of which included student outcomes, and all the cities devoted considerable energy to developing MI systems to track enrollment, participation, and demographics. Several developed systems to collect information about providers and to determine which programs were attracting students. While gathering program data of this type may seem commonplace, this was the first time these cities had such systems and could begin to plan more effectively to increase and improve provision. Data-based planning and communication strategies adopted to improve access and quality had multiplier effects and often generated greater coordination and communication.

Sustainability. The sites were struggling with issues of financial sustainment when the study ended. Several had sought new funding sources, such as local and national foundations or federal funds for 21st Century Community Learning Centers. Three of the sites used data to develop “success stories” to help maintain public support for programming. Sites attempted to maintain partnerships by delineating clear roles among organizations and by embedding coordination in MOUs, shared MI systems, contractual arrangements, and other structures.

However, all faced uncertain funding prospects in spring 2009 that might threaten further collaboration.
