Engaging Older Youth: Program and City-Level Strategies


Using mixed-methods research strategies, the study design brought together both survey data from a large sample of programs and in-depth interview data. This design allowed for both breadth and depth in our understanding of critical issues related to access to and sustained participation in OST programs for older youth. We collected and integrated qualitative and quantitative data and used an iterative analytic process, weaving together findings from both sets of data to confirm, augment, and challenge our understanding of program characteristics—both program practices and structural features—and support from city initiatives. This chapter describes our mixed-methods approach, including city selection, data collection activities, program sample selection and characteristics, and analysis.

City Selection

To understand how program participation may be affected by city initiatives’ supports, we selected our six sites—Chicago, Cincinnati, New York, Providence, San Francisco, and Washington, DC—because they all have

  • An intermediary or government agency coordinating funding and providing services for OST programs
  • A management information system (MIS) or database to keep track of attendance and participation
  • Extensive programming aimed at middle and high school youth
  • A focus on low-income youth and distressed neighborhoods

All of the cities in this study contend with challenges common to urban areas, including high poverty rates. Providence’s child poverty rate (36.3 percent), for instance, is twice that of the United States as a whole, while Cincinnati is among the 10 U.S. cities with the lowest median household income.15 The cities’ diversity provides interesting points of comparison and contrast (see Table 1.1). Their population sizes range from fewer than 1 million to more than 8 million, while high school graduation rates vary from 46 percent to 68 percent. San Francisco has the lowest percentage of youth of any large city in the United States, which presents its own set of challenges for participation.16

The six selected initiatives all provide a set of supports to OST providers in the community, and they are making efforts to raise the profile and increase understanding of out-of-school time in their cities; these efforts will be discussed in Chapter 4. (See Appendix A for descriptions of cities and their OST initiatives.) The OST initiatives in each city are profiled in Table 1.2. They are all relatively new, having been founded between 2004 and 2007, and they are coordinated by different types of organizations—both nonprofit intermediaries and government agencies.

Data Collection

Five main sources of data were used to develop the findings in the report:

  1. MIS participation data. Each city selected for inclusion in this study provided, at a minimum, individual-level attendance data from its respective MIS to document, program by program within its initiative, participation rates over the 2007–2008 school year. In addition, each city provided demographic information on participants (most commonly, ethnicity/race, gender, and age or grade level) so that we could calculate approximate participation rates for middle and high school youth separately. There was variability in how these data were recorded by each city and transmitted to us. The similarities and differences and how we worked with each data set to calculate participation rates are described more fully in Appendix D.
  2. Online program survey. Selected programs within each city (see “Sample Selection” for selection procedures) were asked to complete an online program survey. The survey was designed to generate information about program activities and features, staffing, youth participants, family involvement, use of data, recruitment and orientation practices, practices for fostering and supporting engagement, and involvement with the OST initiative in the city.
  3. Site visits to each city. In-person interviews were conducted with OST program leaders at 28 selected programs and with 47 city-level respondents. The interviews with program leaders covered program activities and structure, the youth who participate, recruitment practices and challenges, attendance issues, retention practices and challenges, developmental issues for older youth, and experience in an OST initiative. The interviews with city-level respondents addressed their role in the city OST initiative, how the initiative supports recruitment and retention, partnerships to support OST programs, data and evaluation, and city contexts for OST.
  4. Document review. Documents provided to us during our site visits and gathered via online searches were reviewed to supplement our understanding of the city initiatives and of how programs were working to recruit and retain older youth.
  5. Community of Practice.iii The Community of Practice enabled us to vet and expand on the ideas coming from the survey and interviews. It comprised teams of three or four individuals from 12 cities—the 6 research sites and 6 other cities working on city-level support for OST—as well as consultants and representatives from national organizations. The group met six times over the course of the study to discuss themes related to participation and emerging findings (see Appendix B for more information about the Community of Practice and a list of members).

Additionally, a thorough literature review of OST participation for older youth as well as a review of the emerging literature on OST systems deepened our understanding of the developmental needs of middle and high school youth and helped us develop a theoretical lens to guide our instrument development, analysis, and interpretation of findings.

Sample Selection

We used a funnel approach to select our samples: After identifying the six cities for inclusion in the study, we identified a large number of programs in these cities with high participation rates among middle and high school youth, based on city-level MIS data, and administered a survey to program leaders. From the programs that returned a survey, we selected a smaller subset to interview in depth.iv Thus we have two program samples in this study: a survey sample and an interview sample. We also selected a group of city-level respondents to be interviewed for the study (see Appendix C). This section describes our sampling strategies; Appendix D provides more detailed information on how we developed our survey samples and on the characteristics of programs that responded to our survey.

Program survey sample

To generate the sample of programs to complete our online survey, we used data from each city’s OST management information system to calculate average participation rates for each program in the initiative. In general, we calculated average program participation rates as the proportion of program sessions youth attended, averaged across all youth attending the program.v For example, a youth who comes to half the sessions offered would have a participation rate of 50 percent; if a second youth has a 100 percent participation rate (attending all the sessions offered), the program’s average participation rate across both youth participants would be 75 percent. (See Appendix E for more detailed information on calculations.)
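To make the calculation concrete, the sketch below implements the averaging described above in Python. The column names (sessions_attended, sessions_offered) are illustrative stand-ins; as noted above and in Appendix D, each city’s MIS recorded attendance differently.

```python
import pandas as pd

def program_participation_rate(attendance: pd.DataFrame) -> float:
    """Average, across youth, of the share of offered sessions each youth attended."""
    per_youth = attendance["sessions_attended"] / attendance["sessions_offered"]
    return per_youth.mean()

# The example from the text: one youth attends half the sessions (50 percent),
# a second attends all of them (100 percent); the program average is 75 percent.
demo = pd.DataFrame({"sessions_attended": [10, 20], "sessions_offered": [20, 20]})
print(program_participation_rate(demo))  # 0.75
```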

After the MIS data were analyzed, programs with a participation rate of at least 44 percent were selected for inclusion in the survey.vi This cutoff allowed us to identify a large number of programs within each city with a range of success at engaging older youth. To detect differences between more successful and less successful programs, we wanted a sample with both strong and moderate older-youth participation. Because the literature already suggests a set of practices that seem associated with engaging older youth, we decided not to include programs with poor participation rates; we expected to learn less from these programs.vii Our goal was to select approximately 50 programs per city that met the minimum participation criterion of at least 44 percent. In some cities, this meant choosing all the programs that met the criterion; in cities with more than 50 qualifying programs, we sampled from among them. This process identified 346 programs from the MIS data for inclusion in the online survey portion of the study. We received 198 completed program surveys (57 percent of those surveyed), which constituted our survey sample for quantitative analysis.
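A minimal sketch of this selection rule follows, assuming a per-program table with hypothetical city and participation_rate columns; the actual within-city sampling procedure is described in Appendix D.

```python
import pandas as pd

def select_survey_sample(programs: pd.DataFrame,
                         cutoff: float = 0.44,
                         per_city: int = 50,
                         seed: int = 0) -> pd.DataFrame:
    """Keep programs at or above the participation cutoff, then sample up to
    per_city programs within each city (taking all of them where fewer qualify)."""
    eligible = programs[programs["participation_rate"] >= cutoff]
    return (eligible.groupby("city", group_keys=False)
            .apply(lambda g: g.sample(min(len(g), per_city), random_state=seed)))
```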

Interviewed program sample

Results from the survey data guided, in part, the selection of 28 programs across the six cities for more in-depth study (see Appendix F for descriptions of these programs). Criteria for the qualitative program sample included an MIS participation rate of 60 percent or higher, geographic distribution across the city, a mix of program activities and goals, and service primarily to low-income youth as defined by the percentage of participants eligible for free or reduced-price lunch. We also examined retention rates to ensure that we included some programs with high retention. Program lists for each city were vetted with leaders of the city OST initiatives, who suggested additional programs for the sample based on those programs’ reputation for participation and engaging activities.viii
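The quantitative screens in this step might look like the sketch below. The low-income threshold and column names are assumptions for illustration (the text does not specify a cutoff for "primarily low-income"), and the remaining criteria were applied as judgment calls rather than as filters.

```python
import pandas as pd

def interview_candidates(survey: pd.DataFrame) -> pd.DataFrame:
    """Quantitative screen only; geographic spread, activity mix, retention,
    and vetting by initiative leaders were layered on top of this filter."""
    return survey[
        (survey["participation_rate"] >= 0.60)          # MIS participation criterion
        & (survey["pct_free_reduced_lunch"] >= 0.50)    # "primarily low-income" (assumed threshold)
    ]
```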

The interviewed program sample includes 18 school-based and 10 community-based programs, 14 of which focus on middle school, 8 on high school, and 6 on a combination of the two. Examples of program content areas include jewelry making, music, theater, college prep, law education, and a soccer and writing program.

City-level respondent sample

To understand the role of city OST initiatives in middle and high school youth participation, we interviewed 47 city-level respondents who represented a range of city-level stakeholders, including lead agency representatives; MIS developers; people responsible for quality improvement and professional development efforts at the city level; heads of large community-based organizations; representatives from parks, recreational facilities, and libraries; and mayoral staff. These respondents were selected in consultation with the lead agencies of each city’s initiative.

Program Sample Descriptions

Table 1.3 displays the participation rates based on the overall combined MIS data,ix the full program sample (“Survey Sample”), and the subset of 28 programs that took part in our in-depth qualitative study (“Interview Sample”).

Table 1.3 indicates that the participation rates from the full MIS database are relatively high (65 percent). Because programs were selected to receive a survey only if they met the minimum criterion of 44 percent participation, the average for the survey sample is higher (70 percent). The interview sample was deliberately drawn from programs with even higher rates of participation, so its average rate is higher still (79 percent). Average participation rates for high school youth within programs are similar to those for middle school youth in both the survey and interview samples; in the full database, however, high school youth have a higher participation rate. More descriptive information on the participation rates for each sample is presented in Appendix D.

Table 1.4 describes other program characteristics of the survey sample and the interview sample. As the table indicates, the two samples are similar along most of the dimensions, including age of participants, when they operate, whether or not they have been operating 5 or more years, and their service area. One difference stands out: A greater proportion of the interviewed programs are school-based.

Youth Served

Through our interviews we learned that the youth attending the programs we studied in depth are in schools and neighborhoods with high rates of violence, crime, and gang activity, and with few resources for youth services and programs. Many of these youth must constantly navigate these issues in their neighborhoods, making OST a low priority for some and a much-needed refuge for others.

Table 1.5 provides a summary of demographic information on the youth from both samples. One common feature across the programs in this study (in both the survey sample and the interview sample) is that participating youth are struggling with poverty. Across surveyed programs, an average of 79 percent of participants were eligible for free or reduced-price lunch; the proportion for the interview sample was 87 percent.

On average, more than 90 percent of youth participants in the survey sample are non-White. Programs serve a mix of boys and girls; an average of 52 percent of total participants in the survey sample are female. Only 4 percent of programs serve girls exclusively; 2 percent serve only boys. The rates are similar for the interview sample.

In the survey sample, an average of 25 percent of youth participants were estimated to attend other OST programs, based on staff responses. On average, almost a quarter (24 percent) of youth have siblings attending the same program.

Programs serve as many as 6,400 youth annually, but only 10 programs serve 1,000 or more; the median number of youth served annually is 90.

Analysis

Calculating retention

Our quantitative analysis focused on the program practices and structural features associated with retention (duration of participation) of youth in programs. Retention was selected as the main outcome of interest, rather than intensity (number of hours per week) of participation, for both theoretical and pragmatic reasons. First, prior work on the effects of OST programs on older youth suggests that they reap benefits—particularly those associated with meaningful relationships with staff and peers—through participation over a longer period of time rather than through intense participation over a short period of time.17 Second, the data on intensity gathered via the cities’ MIS were not all comparable, whereas the survey questions were asked in the same way across all the programs (see Appendix D for more explanation).

Each program’s retention rate was calculated based on respondents’ answers to a series of questions on the survey. Respondents indicated the proportion of their participants who remained in the program for 3, 6, 12, 18, and 24 months or longer. Percentages for youth coming for 12, 18, or 24 or more months were summed to indicate the proportion of youth in the program who were retained for 12 months or longer.
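As a concrete illustration, the sketch below sums the three longest duration categories, assuming the survey responses arrive as mutually exclusive shares; the key names are hypothetical.

```python
def retained_12_months_or_more(duration_shares: dict[str, float]) -> float:
    """Sum the shares of youth retained 12, 18, and 24 or more months."""
    return sum(duration_shares[k] for k in ("12mo", "18mo", "24mo_plus"))

# e.g., 20% stayed 12-17 months, 10% stayed 18-23 months, 4% stayed 24+ months:
print(retained_12_months_or_more(
    {"3mo": 0.40, "6mo": 0.26, "12mo": 0.20, "18mo": 0.10, "24mo_plus": 0.04}
))  # 0.34
```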

Retention rates for both the survey and interview samples are presented in Table 1.6. As the table shows, in the survey sample, on average approximately a third of youth (34 percent) were retained for 12 months or more. By design, the average retention rate is higher for the interview sample (43 percent): retention rate was one of the variables we considered when selecting programs for the interview sample, because we wanted to ensure that our staff interviews captured practices used to increase retention. The table also shows the proportion of programs within each sample reporting that 50 percent or more of the older youth served were retained for 12 months or longer. Although Table 1.3 indicates that average participation rates for high school youth were similar to those for middle school youth across the two samples, Table 1.6 shows that average retention rates for high school youth are significantly and substantially higher than those for middle school youth in both samples.

The variation in retention rates reflects in part the nature of how city initiatives are set up. In at least two of the cities in the sample (Providence and Chicago), programming for older youth consists of sets of shorter, more intensive programs and activities that older youth would only be expected to attend over a short period of time (e.g., activities in the Providence AfterZones or a session of Afterschool Matters), but they might attend multiple sessions over the course of a year.

Quantitative analysis

To identify characteristics that were significantly associated with higher rates of retention among older youth participants, we used a two-step process.

First, we examined which of the numerous individual program practices and structural features from the survey data were significantly more common in high-retention programs than in lower-retention programs (see Appendix G for the usage rates).
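One way to implement such a screen is sketched below; the median split, the choice of a chi-square test, and the assumption of yes/no practice indicators are ours, not necessarily the report's exact procedure.

```python
import pandas as pd
from scipy import stats

def screen_practices(survey: pd.DataFrame, practice_cols: list[str],
                     alpha: float = 0.05) -> list[str]:
    """Return practices reported significantly more often by high-retention programs,
    assuming each practice column is a binary (used / not used) indicator."""
    high = survey["retention_12mo"] >= survey["retention_12mo"].median()
    significant = []
    for col in practice_cols:
        table = pd.crosstab(high, survey[col])           # 2x2: retention group x practice use
        _, p_value, _, _ = stats.chi2_contingency(table)
        if p_value < alpha:
            significant.append(col)
    return significant
```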

Next, we conducted a regression analysis of retention that included the practices and features identified in step one. Regression analysis allowed us to isolate which of the many competing practices and features are uniquely associated with variation in retention rates, even after taking other practices and features into account. Results of the regression analyses also provide information on the relative contribution of each factor, above and beyond the contribution of other factors, in explaining retention (see Appendix E for a fuller description of the regression analyses). Chapter 2 describes the findings of the regression analyses in detail.
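A minimal version of such a model using statsmodels, with placeholder variable names (the report's actual specification is described in Appendix E):

```python
import statsmodels.api as sm

def retention_regression(survey, screened_cols):
    """OLS of program retention rates on the screened practices and features;
    each coefficient estimates a factor's association net of the others."""
    X = sm.add_constant(survey[screened_cols].astype(float))
    return sm.OLS(survey["retention_12mo"], X).fit()

# Usage sketch:
# results = retention_regression(survey, screened)
# print(results.summary())
```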

Qualitative analysis

Analysis of our interviews and document review enabled us both to identify program practices that respondents cited as relating to greater retention and to create a picture of what it takes in programs and at the city level to keep youth engaged in programs over time, using a grounded-theory approach.18 We developed our codes and coding structure based on what our review of the literature and early findings indicated were important elements to include in a study of participation and retention and then refined our codes over time.x

For our analysis of program interviews, we focused on the major themes present across programs related to the successes and challenges of achieving high participation and retention rates and what program practices or features were linked to these efforts. We also analyzed program data to understand how programs participate in OST initiatives. For our analysis of city-level interviews, we created detailed city-level descriptions of the initiatives and identified their major efforts related to participation as well as the challenges they face in improving access and participation.

Throughout the analysis, we cross-walked findings from the interviews and the survey against each other to refine our understanding. In some cases the regression analysis and the qualitative analysis agreed, as with the importance of leadership opportunities for older youth. Some themes appeared in the qualitative data that could not appear in the quantitative data because there was no corresponding survey question. In other cases, the findings disagreed. For example, the regression analysis did not identify developmentally appropriate incentives as important to retention, whereas the bivariate analysis and interview data did. By digging deeper into the interview data, we discovered that incentives can be important and that different types of incentives matter in different ways to older youth in urban areas. Thus, the mixed-methods approach ultimately strengthened our understanding of participation among older youth and of what it takes for programs to keep youth engaged over time and for cities to support programs’ efforts.

Limitations of the Study

This study has several strengths. First, it is based on a much larger survey sample than many prior OST studies. Second, cross-referencing our qualitative and quantitative findings has greatly strengthened what is known about how programs can retain older youth. However, there are limitations to what we can conclude based on our methodology.

Our sample selection for both cities and programs was guided in large part by our interest in the contributions of OST city-level initiatives; programs that were not included in these initiatives were also not included in the study (with a few exceptions). Our conclusions thus apply most directly to the population of programs nested within our sample of six city-level OST initiatives. Generalization to other OST programs outside of our study should be done with some caution.

In addition, our study was bounded by its focus on participation and retention from program- and city-level perspectives. Interviews with youth would have given us a richer and more personal understanding of sustained participation, but they were beyond the scope of the study.

Also noteworthy is the program lens through which the study examined factors related to youth participation and retention. We know that older youth likely participate in a range of different OST programs over the course of a year, a month, and even a week, and there may be a separate set of factors that predicts sustained participation in OST experiences more generally. Nevertheless, the data presented here are informative about program practices and features related to sustained participation within a single program.

Finally, we do not have data on program-level participation before the launch of the city initiatives, so we cannot draw firm conclusions about how having the initiative has affected older youth participation in OST programs; rather, we rely on the interviews with program and city initiative-level staff to address questions about how the initiative supports participation.


References

iii. A Community of Practice is an intentional, focused, and voluntary group whose members come together around a common interest or problem to share knowledge, find solutions, improve performance, and discuss and test the transferability and scaling of solutions and innovations. The Community of Practice convened regularly to discuss topics important to the study and contributed to the overall framing of the study and to our understanding of specific recruitment and retention strategies.

iv. A few interviewed programs were chosen based on recommendations and reputation.

v. Some cities track enrollment and exit dates for individuals, allowing for more precise participation rates to be calculated; others do not track this information, and therefore we needed to make estimations differently. When a program served both middle and high school youth, we calculated the participation rates for the two groups separately. We then asked the program respondents to think about either the middle or the high school youth they serve when responding to the survey questions, depending on which group met the minimum participation level. When both age groups met the minimum level, we asked the respondent to focus on one or the other age group to get a similar representation of both middle and high school programs.

vi. Because Cincinnati’s initiative has fewer programs for older youth than the other initiatives in this study, we used this participation rate cutoff where data were available and developed a reputational sample for the rest of the survey and interview sample.

vii. Our goal in selecting programs was to include a sample with a great enough range in participation rates to allow us to explore staff practices and program features that correlate with higher retention rates. Given that programs with low participation rates may have a host of organizational and infrastructure issues that may be relevant to low participation generally (rather than to participation of older youth specifically), we wanted to be careful that the lessons we generated from the data collection and analysis would be particular to understanding programs’ effectiveness in attracting older youth and not limited by general program weaknesses such as poor quality or uneven programming. Thus the programs we selected did not include the very worst performers on participation rates, but rather the more average programs.

viii. When describing data from the interview sample, we refer in most cases to the programs’ high participation rates because we do not have retention information from all of them. Some of the cases are not high-retention programs but were selected because they had high participation and were interesting along another dimension, such as the use of stipends or interactions with families.

ix. The table does not include data from New York City or Cincinnati. The participation calculations in New York City were not comparable to those of the other cities; because the bulk of the programs surveyed from Cincinnati were selected based on nominations, participation data were often not available.

x. We used NVivo to organize the qualitative data.