AfterZones: Creating a Citywide System

Neighborhood scans of Providence’s OST programs prior to the initiative’s launch revealed not only a relative dearth of programs designed specifically for middle school youth but also considerable variation in the quality of the programs that did exist. And, as research has shown, if OST programs are not high quality, youth are unlikely to benefit from participating.41 High-quality programming is especially important for middle school youth, who face less parental pressure to attend after-school programs than elementary school students do. To successfully engage this age group, programs need to be challenging, interesting and fun and must offer youth opportunities to engage in positive interactions with adults and peers.

PASA understood this from the beginning. Improving the quality of after-school programs for the youth of Providence is an explicit part of its mission. Quality is emphasized in all AfterZone written materials and is articulated on PASA’s website, which identifies “high quality” as a core principle of the AfterZone model. Indeed, one of PASA’s three senior staff members works solely on developing and implementing systemic quality improvement strategies.

In this chapter, we describe the components of PASA’s quality improvement strategies and discuss their benefits and limitations. We then examine the quality of AfterZone programs from the perspective of adults who observed and rated program activities using a quality assessment tool, and from the perspective of youth who completed a survey about their experiences in the programs. Both perspectives measure quality in terms of the nature of relationships among and between peers and adults, perceived learning, opportunities to exercise choice and youth’s engagement and interest in participating in the program.

PASA’s Quality Improvement Strategy

PASA’s quality improvement strategy for AfterZone providers consists of the following components:

  • An agreed-upon set of standards by which programs are gauged;
  • An assessment tool and feedback mechanism; and
  • Professional development opportunities for activity instructors and their supervisors.

Quality Standards and Indicators

Developing consensus within the provider community around a set of standards for high-quality OST programs for middle school youth was the first step in PASA’s multi-pronged quality improvement strategy. Early in the AfterZone initiative, PASA organized a working group of providers, policymakers and youth advocates to gather standards developed in other cities and identify those that would be appropriate for Providence.42 The standards they ultimately adopted addressed the following five areas:

  • Health, safety and environment;
  • Relationships;
  • Programming and activities;
  • Staffing and professional development; and
  • Administration.

Sample quality standard for “relationships” and associated indicators:

Participants interact with one another in positive ways and feel they belong.

  • Children and youth demonstrate good social problem-solving skills and positive social behavior (e.g., can negotiate solutions, make compromises, work together toward a common goal, empathize with others’ feelings, cooperate and work well together).
  • Children and youth strongly identify with the program/organization (e.g., use ownership language such as “our program,” wear gear with the name of the program/organization on it and hold one another accountable for rules and guidelines).


Each of these areas had related practices—or indicators—that could be measured or rated.43 Therefore, PASA’s next step involved identifying an assessment tool that could measure the extent to which these practices were being implemented in AfterZone programs.

The Rhode Island Program Quality Assessment (RIPQA) Tool

The assessment tool chosen by PASA and the working group was the Youth Program Quality Assessment (YPQA, Form A) developed by the High/Scope Educational Research Foundation specifically for youth in grades 4–12. They selected this tool because it is a research-validated observational instrument that is well aligned with the quality standards adopted for AfterZones.

Using the tool, observers could rate programs along the following quality dimensions:

  • Safe Environment—addressing both the physical and emotional safety of the environment;
  • Supportive Environment—describing adult support for youth development and learning;
  • Interaction—capturing adult and peer interactions; and
  • Engagement—measuring opportunities for youth to plan, make choices and reflect.44

To assess the other quality standards the working group adopted, PASA developed its own form (referred to as Form B), in partnership with High/Scope. Form B focuses on program practices in the following domains: Family and Civic Engagement, Staffing and Professional Development, and Administration. This two-form measure—consisting of the YPQA’s Form A and the new High/Scope-PASA Form B—was named the Rhode Island Program Quality Assessment (RIPQA).45

Appendix A provides the list of scales and subscales included in the YPQA/RIPQA.

Assessment and Feedback Process

After establishing the assessment tool, PASA hired two consultants (referred to as quality advisors) who had been trained by High/Scope in the use of the YPQA; these quality advisors helped train AfterZone staff and began observing program sessions.

The assessment and feedback process proceeds in the following way: A team that typically consists of a quality advisor, an AfterZone manager and, when possible, supervisors from the CBO site management agency observes each session for 45 minutes.46 The team tries to observe the session from the start, as the instructor leading the activity generally sets the tone for the session and explains its purpose at the beginning. The observers take notes on what they see, focusing on the RIPQA scales and subscales. After the observation, each team member scores the session on every RIPQA dimension. The team members then compare their scores and discuss and resolve any discrepancies. During a follow-up meeting, the team works with the instructor(s) who led the activity to develop an action plan and offer additional coaching, if appropriate.

Programs offered in multiple AfterZones or over several sessions are typically observed once a year. Because PASA wants to concentrate its resources on programs that are likely to be offered in more than a single AfterZone program cycle, programs delivered by City Year AmeriCorps members, who usually stay for only a year, are not observed. Also, PASA does not observe new programs until they have run for at least a full year, to give providers a chance to correct any start-up “kinks.”

Initially, the team observed two sessions of each program. After the first year, however, PASA decided to observe each program only once due to the cost and time required for repeated observations.47 If a program scored low on any dimension of the observation tool, the team would observe it a second time to see whether it had improved. To our knowledge, only two or three programs scored low enough to warrant a follow-up visit.

As we discuss next, this assessment and feedback process has clear benefits as well as some limitations.

Benefits and Accomplishments

PASA has taken several steps to secure the providers’ cooperation and buy-in with regard to the quality improvement strategy. First, providers are not required to be observed; participation is completely voluntary. Second, PASA makes it clear that the purpose of the process is self-improvement rather than monitoring and that the results have no bearing on a program’s prospects for future funding. Third, to make the process less threatening, AfterZone managers, who work closely with providers, serve on the observation teams. PASA also hopes that having an independent consultant (the quality advisor) on the team adds an impartial expert opinion to the feedback given to the providers. Finally, PASA emphasizes continuous improvement by discussing the findings with providers after each observation and jointly developing a plan of action.

PASA’s strategy for getting providers to agree to be observed appears to have succeeded. PASA believes that keeping observations voluntary engenders providers’ willingness to participate. In fact, PASA reports that, to date, no provider it has approached for an observation has refused.

Our interviews indicate that the providers who went through the assessment and feedback process viewed it as useful and as an opportunity for growth. While three providers mentioned they received good scores on the RIPQA observations and thus were not asked to make any changes, two providers were given specific recommendations to improve their programs. One commented on how the advice he received had changed his interactions with the students:

Instead of raising my voice, [the quality advisor] told me to just stand there and tell [the students] that we can’t start the game until they are all quiet. And sometimes I blow my whistle [to get their attention]. Both things worked….It helped with these discipline issues and handling kids. In a sense, [the observation feedback] improved that part of the program.

The other provider described the steps his program took after the observation feedback:

[We made] sure the place was safe, with good lighting, with no debris, with the floor dusted and mopped, and we secured the equipment so no one would get hurt….[The observation feedback] was like a wake-up call. It made me realize that PASA was concerned about the safety of kids and whether [the students] were getting proper training.

Another strength of the assessment process became apparent during the second year of the study, when PASA began to use RIPQA scores to identify the training needs of program providers and tailor professional development to address these needs. Furthermore, in response to providers’ feedback, in the coming year (2009–10), PASA plans to make a quality advisor available to provide one-on-one technical assistance to providers so that follow-up training can be tailored to individual needs.

Another accomplishment is that PASA’s self-assessment process is now being used outside of Providence. In partnership with the Rhode Island After-School Plus Alliance (RIASPA), PASA has worked to disseminate quality standards and to promote statewide use of the RIPQA. As a result, both the RIPQA and the self-assessment process (utilizing PASA’s quality advisors) have been adopted statewide by the Rhode Island Department of Education for the self-assessments mandated by the 21st CCLC grant.

Limitations

We observed two possible drawbacks to the RIPQA process as it is currently implemented. First, although the process is thorough and comprehensive, it is also time-consuming and therefore costly. Because of these cost and time constraints, the team has not been able to observe and provide feedback to every program provider in a single year; it took two years to observe all of the providers PASA targeted for observation. And, as noted, programs that had not yet run for a full program cycle and programs run by City Year AmeriCorps members, college students or other transitional staff were not targeted.

Second, while one-time observations produce snapshots of programs’ quality and allow PASA to identify those that need improvement, observing only once limits the potential benefits of continuous program improvement. Although PASA did conduct follow-up observations of the few programs that received low scores on their first observation, follow-up observations have not been routinely carried out. Without them, it is impossible to know whether an action plan has produced the desired change.

Engaging providers in the RIPQA assessment process only once a year may not be frequent enough to effect desired improvements in program quality and foster a culture of self-improvement. In fact, studies have found that effective quality improvement processes incorporate frequent observations and opportunities for staff coaching.48 Aware of this limitation, during the 2009–10 school year PASA plans to focus its RIPQA process on a small group of 10 programs each program cycle.49 This approach will allow PASA to observe each program twice and give the providers more intensive technical assistance and systematic follow-up.

Professional Development

Professional development for providers is another vehicle PASA uses to help enhance the quality of AfterZone programs. Indeed, the quality of OST programming depends largely on the quality and capacity of the workforce that delivers it.50

In an effort to build the capacity of organizations across the city and state, PASA offers free professional development trainings to AfterZone providers as well as staff of any other youth-serving agency. The trainings generally last three to four hours and are offered in the morning on workdays.

Training Content

PASA offers at least two trainings every month, and its workshops cover a variety of topics. For the past three years, PASA has partnered with the Boston Medical Foundation’s Building Exemplary Systems for Training Youth Workers (BEST) Initiative to provide a 32-hour, 8-week training on youth development principles. BEST workshops focus on such topics as strategies for behavior management, competencies of youth workers and positive youth outcomes. PASA estimates that, over the three years, staff from at least 50 percent of AfterZone provider organizations (including independent providers) participated in the BEST training.51

Professional development activities are most effective when linked to the identified training needs of the audience. In 2008–09, PASA’s professional development workshops became more closely aligned with the quality standards and practices assessed by the RIPQA. For example, PASA arranged for two trainers from BEST to be trained by High/Scope in YPQA principles and then develop eight workshops to help boost providers’ competencies in the effective youth programming practices measured by the assessment tool.

In creating its professional development agenda, PASA used providers’ feedback and knowledge gained through the RIPQA observations to identify the practices that needed improvement. For example, RIPQA scores revealed that AfterZone providers of sports programs tended to offer fewer opportunities for youth to reflect, plan activities and make choices. They also needed to engage youth more frequently in small groups, allowing them to act as group facilitators and letting them partner with adults to run activities. Informed by the RIPQA scores, PASA designed two workshops for the sports program providers that focused on how to integrate youth development principles into sports programs.

Increasingly, PASA has designed professional development workshops to support its more concentrated focus on aligning AfterZone activities with school-day learning and offering activities that systematically incorporate middle-school academic standards. In order to strengthen providers’ skills in this area, in 2008–09 PASA collaborated with one of its longstanding programs to develop a six-workshop series on integrating academics with after-school curricula. The first workshop introduced the concept of a standards-based curriculum. Subsequent workshops focused on specific content areas: literacy, health, arts and sciences. A follow-up workshop was organized to offer additional assistance to providers who were beginning to include the standards in their programs.

Response to Professional Development

We learned about providers’ participation in and response to PASA’s professional development program through interviews with provider agency staff, P/PV surveys of 60 AfterZone program instructors and PASA’s own surveys of provider agency staff.

Consistent with its overall voluntary capacity-building orientation, PASA encourages and reminds providers to attend professional development opportunities but does not require their involvement. The Memorandum of Understanding between PASA and provider agencies states only that providers “shall participate in professional development whenever possible.” Participation in professional development activities is required only for programs that wish to become “endorsed,” a category of programs that have maintained certain youth attendance levels and, in return, receive expedited review for inclusion in the AfterZones. (The endorsement process is described in greater detail later in this chapter.)

While PASA’s voluntary approach to participation in professional development activities is consistent with experts’ current views about how intermediaries can work most effectively with program providers,52 it also has drawbacks. Specifically, getting AfterZone providers to attend has been challenging. Small provider organizations with only a few people, or even a single person, on staff found it difficult to participate due to lack of time. Some individuals who offer programs in local AfterZones had other jobs that conflicted with the trainings or did not participate because they were not being paid to attend. Agencies that sent staff to training reimbursed them for their time, an additional expense that was burdensome for agencies with limited budgets. Coordinating trainings and scheduling them at times when most people were available presented another challenge. To address these challenges, PASA is working to raise funds for stipends that would make attending trainings less burdensome for AfterZone providers.

The challenge of getting full participation in PASA’s professional development activities is reflected in the results of surveys P/PV administered to 60 providers who led or assisted in the AfterZone activities observed in 2008–09. Several questions in the survey asked about the training providers received through PASA. About one third (36 percent) reported that they participated in the BEST training offered by PASA. Aside from the BEST workshops, 30 percent of surveyed staff said they participated in other PASA-sponsored professional development activities during 2007–08, spending an average of 3.3 hours in these trainings. Slightly more—38 percent—reported participating in 2008–09, spending an average of 3.6 hours in training.

The majority of individuals who attended PASA’s professional development activities found them valuable, though a small proportion did not. When asked about their experiences in interviews with P/PV researchers, providers replied that the workshops and trainings provided practical information relevant to their work. They also appreciated the opportunities to learn from one another and to network. In a feedback survey PASA administered to AfterZone providers in 2008, 38 percent rated the professional development as “excellent/very effective,” 33 percent rated it as “good/effective,” 10 percent rated it as “fair/somewhat effective” and 19 percent rated it as “not effective.”

Additional Quality Improvement Mechanisms

In addition to the RIPQA self-assessment process and professional development workshops, PASA has created other quality improvement strategies. For example, PASA implements an endorsement incentive to improve the quality of AfterZone programs and retain those that achieve a certain level of excellence. Programs can become “endorsed” if they meet a number of criteria, such as filling at least 60 percent of their available youth participant slots, maintaining an average daily attendance of at least 60 percent, using a written curriculum, and having program staff attend 70 percent of monthly meetings and participate in the RIPQA observation process.

Providers are entitled to several advantages if they achieve endorsed status, including receiving an additional 5 percent of their total AfterZone grant award for administrative and operating costs, completing a shorter grant application than the one required of non-endorsed programs, receiving more intensive coaching and capacity-building assistance from PASA, and receiving consideration as a preferred program when Coordinating Councils53 make programming and funding decisions. In the two years it has offered endorsed program status, PASA has endorsed 32 programs offered by 24 providers, about one third of the provider pool.54

PASA also uses attendance and retention of youth participants as a broad indicator of program quality. Youthservices.net data are made available to local Coordinating Councils to consider when deciding which programs to offer in upcoming sessions. Because RIPQA scores are not shared with the Councils, attendance and retention data serve as a proxy for quality, even though PASA understands that a program’s popularity does not always correlate directly with its quality: Programs that are poorly attended or have difficulty retaining youth are not likely to be refunded.

The Quality of 2007–2009 AfterZone Activities

Studies have empirically linked the quality of OST programs with positive youth outcomes.55 If programs are high quality, youth are more engaged— emotionally and cognitively—and are thus more likely to reap benefits. To learn about the quality of the AfterZone programs, P/PV collected two kinds of data: scores from the RIPQA assessment tool and surveys of youth who participated in the activities.

RIPQA Scores

P/PV obtained RIPQA scores for 76 AfterZone programs observed from December 2007 to March 2009.56 The programs included in the sample were selected from all three local AfterZones and represent the four broad activity types the AfterZones offer: 27 are related to academic enrichment, 24 to art, 11 to life skills and 14 to sports.

To provide context for interpreting these RIPQA scores, we also looked at scores from High/Scope’s “Wave-2 sample,” which were collected as part of a YPQA validation study.57 That sample consisted of 116 observations of activities offered by 46 organizations in Michigan. The majority (53.4 percent) of the observed activities were after-school programs, 17.2 percent were summer programs, 16.4 percent were residential, 6.9 percent were offered during school, and 6 percent were other types of programs. The average age of youth who participated in the Wave-2 sample was 13. (The average age of the youth in the observed AfterZone activities was 12.)

Table 3 shows the aggregate scores of the AfterZone programs in the four areas measured by the assessment tool. AfterZone programs scored highest on Safe Environment and Supportive Environment, both of which had mean scores of 4 or higher (out of 5), indicating that most safety- and support-related staff practices were seen most of the time during observations. Engagement received the lowest mean score, 2.6 (out of 5), indicating that youth were given limited opportunities to set goals and make plans and choices.

Table 3 also indicates that the pattern of mean scores for AfterZone programs is similar to that in High/Scope’s Wave-2 sample. The pattern is also consistent with the findings from another P/PV study that used a comparable observational instrument to gauge the quality of after-school activities; this study found that scores on measures of positive adult support were higher than scores on measures of opportunities for youth to make decisions about their activities.58


RIPQA Scores Across Activity Types

As mentioned earlier in this chapter, the AfterZones offer activities in four broad categories: art, academic enrichment, life skills and sports. We compared the scores of the four types of activities to learn whether and how they differ on the dimensions measured by the RIPQA. As Figure 3 illustrates, art activities received the highest scores on all RIPQA scales, while sports activities received the lowest. Sports, academic enrichment and life skills activities scored similarly on Supportive Environment and Interaction, with all scoring significantly lower than art activities on these dimensions. Sports activities scored significantly lower than the other three types of activities on Engagement, indicating that they provided fewer opportunities for youth to reflect, make choices and plan than other activity types did. As noted earlier, PASA learned this about sports activities through the RIPQA assessment and consequently developed two workshops to help enhance sports providers’ skills in this area.


Youth’s Assessment of Their Experiences

In addition to measuring the quality of AfterZone programs through observation, we wanted to get a sense of how youth experienced the programs. In order to benefit from participating in after-school programs, youth have to be engaged and feel they are gaining new and valued skills and knowledge. Distracted, bored or frustrated youth will get little out of an activity. Youth should also feel supported and encouraged by the adults in the room and valued by their peers. For middle school youth, it is especially important that they feel they have a say in what they do. To assess the quality of youth’s experiences, we surveyed 318 youth who participated in 36 different programs, delivered in all three AfterZones. (Youth’s demographic characteristics are reported in Appendix B.)

Overall, the findings from the survey suggest that youth feel AfterZone programs create supportive and engaging environments in which youth are involved in decision-making and have positive interactions with their peers.

The survey collected youth’s opinions in five general areas:

  • Positive Adult Support—how much adults encourage youth and help them succeed in the program;
  • Engagement—how engaged youth are during the program and how much they enjoy program activities;
  • Perceived Learning—how much youth feel they learn in the program;
  • Voice and Choice—the extent to which youth have opportunities to plan and make choices about their activities; and
  • Peer Affiliation—how positively youth interact with and feel accepted by their peers in the program.

(See Appendix C for details about the survey’s constructs, items and reliabilities.)

Note that the youth survey and the RIPQA define Engagement differently. In the youth survey, Engagement measures how much youth liked the activity and were cognitively challenged by it. The youth survey’s construct Voice and Choice overlaps with what the RIPQA calls Engagement, namely, allowing youth to plan and make choices about their activities.

Table 4 displays the youth’s mean ratings of the five areas. A score of 1 indicates the activity is very weak in an area; a score of 4 indicates it is very strong. The surveyed youth reported high levels of Perceived Learning and Engagement (liking, being challenged) in AfterZone programs. According to the youth, instructors provided high levels of support and afforded the youth opportunities to help plan activities and give input about how they are carried out. Consistent with the profile of scores from the RIPQA observations, youth’s mean ratings of Positive Adult Support were higher than their ratings of Voice and Choice. That is, both the observers and the youth gave decision-making and choice relatively low ratings.


Youth’s Experience in Different Activities

As we did with the scores on the RIPQA, to better understand the experiences of AfterZone youth, we compared survey responses of youth participating in the four activity types. Across the activity types, the youth reported similar levels of Positive Adult Support, Engagement, Perceived Learning and Peer Affiliation. However, youth who participated in sports programming gave significantly lower ratings of their opportunities to plan and make decisions about activities than did the youth in art and life skills programs (see Figure 4 for average scores). This result echoes an earlier finding—sports activities also received lower scores on RIPQA’s Engagement scale, which captures the youth input dimension.

Summary

PASA’s quality improvement strategy involves two primary elements: observation and feedback using the RIPQA assessment tool and an agenda of professional development activities for providers. Over time, these two components have become more integrated and refined to better address the needs of providers and to complement ongoing developments in the AfterZone model (i.e., the move toward more alignment with school-day learning).

The observation and feedback process is thorough and comprehensive; it uses a team that includes independent, trained observers and a research-validated assessment tool that focuses on key youth development practices. At the same time, however, the process of observing, comparing scores among team members, writing up an action plan and providing feedback to the program’s instructor requires a great deal of staff time. Because of the large number of programs to observe and the limited number of staff to complete the observations and provide feedback, little time is left to return to the program to determine whether suggested improvements are being implemented or whether quality has been maintained. This lack of systematic follow-up limits the potential of the process to foster continuous program improvement.

The snapshot of the quality of AfterZone programs three years after the initiative’s launch is fairly similar to that of another set of programs in Michigan observed using the same assessment tool. AfterZone programs scored high on measures of support and emotional/physical safety. The relatively low scores on measures of youth choice suggest that, despite the AfterZone model’s consistent emphasis on the importance of choice, instructors were still not fully engaging youth in making plans and decisions during the time of our study.

In surveys, youth participants reported that they enjoyed AfterZone programs and found them to be supportive learning environments. Youth’s reports of relatively high levels of adult support and lower levels of youth choice mirror the differences observed in the RIPQA data. According to the RIPQA observation and youth survey data, sports programs provided fewer opportunities for youth to be involved in planning activities and making decisions than did other types of programming. PASA has responded by offering intensive training to sports providers in how to more effectively incorporate youth development practices.
