Improving State Evaluation of Principal Preparation Programs

I. INTRODUCTION

Implementing a better system of evaluating principal preparation programs is complex work, and it requires that states have certain conditions and capabilities already in place. Before undertaking the work of designing and implementing a new evaluation system, we recommend that states assess their capacity to implement the recommendations in the guide. This tool is designed to help with that assessment of readiness. It has two parts: (1) a readiness assessment rubric and (2) process recommendations for completing the assessment.

The readiness assessment rubric includes information in two broad areas:

  1. Focus, alignment, and positioning of state leadership: The extent to which state leadership has prioritized school leadership, and specifically school leader preparation, in the state's educational improvement agenda, and the extent to which the state education agency (SEA) is positioned to be an effective resource for local education agencies and leadership preparation programs.
  2. Technical capabilities of the state education agency: The extent to which the SEA has crucial capabilities needed to support a new evaluation system, especially those related to data collection and the analysis and substantive review of programs.

The process recommendations outline how states might use information in these two areas to arrive at conclusions about their readiness to restructure or refine their assessment of leadership preparation programs. Completing this rubric will enable states to determine whether current conditions are ideal, workable, or underdeveloped for implementing the recommendations in the guide.

When conditions are ideal, states may move forward with confidence. When conditions are workable in most areas, states may decide to move forward and work on improving conditions at the same time. When conditions are underdeveloped, states would benefit from developing supportive conditions before adopting our relevant recommendations. To move forward when conditions are underdeveloped would invite low-quality implementation and could produce poor, even counterproductive, outcomes.

It is important to note that this is not a scientifically validated instrument; it does not lend itself to absolute determinations. Rather, it is a heuristic that allows states to make sensible judgments about where to start and how fast to proceed.

II. READINESS ASSESSMENT RUBRIC

A. Focus, Alignment, and Positioning of State Leadership
For each indicator below, conditions are rated as Underdeveloped, Workable, or Ideal.

A1. Commitment to improving school leadership

State leadership prioritization: Public commitment by state leaders and key stakeholders to improving school leadership

Underdeveloped: State political leaders (governor, state chief, state board, legislative leaders) rarely discuss school leadership as a way to improve schools. Stakeholders (e.g., associations, prominent local education agency [LEA] leaders, university leaders) have major disagreements on the importance of leadership.

Workable: State political leaders communicate about school leaders as one among many issues of concern. Stakeholders have a broad array of perspectives on the importance of leadership.

Investment in leadership: Visibility of school leadership in state strategic plan and in allocation of resources

Underdeveloped: The state's strategic plan says very little about strategies to improve school leadership. No discretionary dollars are allocated to improving school leadership, and no effort is made to encourage local investments in leadership.

Workable: School leadership is in the strategic plan but is a secondary priority or one on a long list of stated priorities. Investments in leadership are regularly communicated as allowable expenditures in state grant programs (as appropriate).

Ideal: School leadership is a major focus of the state's strategic plan; the state has a clear understanding of how improved school leadership will contribute to improved educational outcomes. The state has targeted funds (including public and privately sourced funds) to specific efforts to improve school leadership.

A2. Commitment to improving principal preparation

State leadership alignment: Unified stance of state leaders to improve principal preparation

Underdeveloped: Lines of authority for improving principal preparation programs are divided or ambiguous. Agencies with responsibility for principal preparation and licensure (e.g., SEA, professional licensing boards) have limited communication and differing priorities.

Workable: Lines of authority for improving principal preparation programs are clear. Agencies with responsibility for principal preparation and licensure have goals and strategies that do not conflict, and they communicate regularly.

Ideal: State political leaders and relevant agencies are unified in a commitment to improving principal preparation programs and agree on the need to rigorously assess the quality of programs, help programs improve, and take action to address underperformance. Agencies with responsibility for principal preparation and licensure have shared goals and are committed to collaboration with each other and with programs (especially when it comes to sharing data).

Policy framework: Policies in place to foster innovation

Underdeveloped: There is little effort by state political leaders to influence the practices of principal preparation programs.

Workable: The state's policy framework allows for innovative principal preparation program design.

Ideal: The state pursues new partners and encourages universities and other providers to create innovative principal preparation programs.

A3. Commitment to and capacity for continuous improvement [17]

Collaboration: Perceptions of the SEA as a collaborative partner

Underdeveloped: LEAs and preparation programs have little interaction with the SEA, viewing the agency as primarily concerned about compliance with statutes and regulations.

Ideal: LEAs and preparation programs view the SEA as a trusted partner committed to continuous improvement; compliance still matters, but the SEA works to make it as seamless as possible.

Communication: SEA systems for communication with partners

Underdeveloped: Information coming from the SEA to LEAs and preparation programs is either nonexistent or perceived by programs as excessive and disjointed, often sending mixed messages.

Workable: Information coming from the SEA to LEAs and preparation programs is perceived by programs as organized, predictable, and reasonably clear.

Ideal: The SEA convenes local partners in ways that foster two-way communication.

Innovation: Perceptions of the SEA as a source of ideas

Underdeveloped: LEAs and preparation programs do not look to the SEA for new ideas to improve schools and universities.

Workable: The SEA serves as an effective information clearinghouse, making innovations in the field visible to LEAs and preparation programs.

Ideal: The SEA shares data; engages LEAs and preparation programs in conversations about improvement; and offers new learning opportunities, including creative strategies for implementing federal and state policy.

Decision making: Use of evidence in SEA decisions

Underdeveloped: The SEA offers little explanation or unclear justification for policy changes.

Workable: The SEA reports on data used in the design of new policies and articulates the reasons for policy changes.

Ideal: The SEA transparently shares data, data analysis, and operating theories that underlie policy design and implementation decisions.

Expertise: Knowledge and skills to manage change process for leadership work

Underdeveloped: LEAs and preparation programs view SEA leadership as having limited understanding of core leadership issues and as being unresponsive or unhelpful in managing the process of large-scale change.

Workable: SEA leadership communicates a solid understanding of the connections between leadership and student outcomes, as well as the adaptive challenges associated with large-scale change.

Ideal: SEA leadership communicates a strong understanding of, and solutions for, the adaptive challenges associated with large-scale change. SEA leadership is deeply involved in national and statewide conversations about the practice and impact of school leaders.

B. Technical Capabilities of the State Education Agency
For each indicator below, conditions are rated as Underdeveloped, Workable, or Ideal.

B1. Data and data system requirements

Program data system: System that collects program data (e.g., number of applicants, clinical hours required, 100-word description) from preparation programs

Underdeveloped: Data are available in isolated locations without an overarching system for integrating the different sources or linking the data points.

Workable: A program data system is in place, but it may not include all data points needed for the SEA's annual report; some data may be missing, inaccurate, or lack comparability. Systems support might be needed to design new tools or interfaces to collect needed information from multiple sources and/or agencies. Substantial budgeting would be required for staff time to request missing data, monitor data completion, and build necessary data systems. Time is allocated to ensure data integrity.

Ideal: A program data system is in place and includes all fields/variables needed for the SEA's annual report. The system enables consistent reporting and data aggregation. Data are complete and accurate. Programs use common definitions of indicators, making the data comparable across programs. The system is not overly burdensome for programs, districts, or school partners.

Placement data systems: Systems that track individual educators and their annual placement role (teacher, principal, assistant principal, other school leader, district leader)

Underdeveloped: Data are available in isolated locations without an overarching system for integrating the different sources or linking the data points.

Workable: Placement data systems exist and are coordinated but have substantial inaccuracies and missing data. Budgeting would be required for staff time to request missing data and monitor data. Time is allocated to clean data.

Ideal: Placement data systems are complete and accurate.

Unique identifiers for program participants: Identifiers that link data from preparation programs, licensure status, placement data systems, and effectiveness ratings from the educator evaluation system

Underdeveloped: It is not possible to link individuals across data systems (for programs, licensure, placements, school outcomes).

Workable: Unique identifiers do not exist, but it is possible to link two or more data systems, and the SEA has the capacity to do this. Budgeting would be required for junior analyst time to link data systems.

Ideal: Unique state-level identifiers are in place to link individuals to all of the data required by the evaluation system.
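
To make concrete what such linkage enables, the sketch below joins hypothetical program, licensure, placement, and evaluation extracts on a shared identifier. It is a minimal illustration in Python; the file names and column names (educator_id, role, and so on) are assumptions for illustration, not any state's actual schema.

```python
import pandas as pd

# Hypothetical extracts; file and column names are illustrative assumptions.
programs = pd.read_csv("programs.csv")        # educator_id, program_name, completion_year
licensure = pd.read_csv("licensure.csv")      # educator_id, license_status
placements = pd.read_csv("placements.csv")    # educator_id, year, role, school_id
evaluations = pd.read_csv("evaluations.csv")  # educator_id, year, effectiveness_rating

# With one state-level identifier, the four systems join directly.
linked = (
    programs.merge(licensure, on="educator_id", how="left")
            .merge(placements, on="educator_id", how="left")
            .merge(evaluations, on=["educator_id", "year"], how="left")
)

# Example question the linked data can answer: how many completers from each
# program have been placed as principals?
principals = linked[linked["role"] == "principal"]
print(principals.groupby("program_name")["educator_id"].nunique())
```

Without a shared identifier, each of these joins would instead depend on error-prone matching on names and other personal details, which is why the Workable level assumes dedicated analyst time.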

Comparable survey data: Common survey administered to program graduates that gathers their perceptions of program process indicators

Underdeveloped: Graduates of most programs are not surveyed, or the response rates are too low to make results meaningful.

Workable: Surveys of program graduates exist, and response rates are reasonable, but the surveys differ, preventing comparison of data across programs.

Ideal: A common survey is administered to all graduates in the state with reasonable response rates, enabling comparison of data across programs.

Measures of teacher and leader effectiveness: Ratings of individual teachers and principals on the state performance evaluation system

Underdeveloped: Measures exist and have some variability but lack validity and reliability. The SEA does not share results publicly and does not provide caveats that caution users on interpretation.

Ideal: Measures exist, have variability, and have been found to be both reliable and valid. The SEA has the capacity to use measures in contextually appropriate ways. The SEA ensures that any public release of data meets federal and state privacy guidelines.

Measures of student learning gains: Student achievement scores across grade levels in core subject areas

Underdeveloped: Measures are not based on individual student growth from year to year.

Workable: Consistent and methodologically sound measures of individual student growth, including proper controls for student- and school-level variables, exist, but they are not comprehensive across grade levels and subject areas, and sample sizes are small (fewer than 10 individuals) for most programs.

Ideal: Consistent and methodologically sound measures of individual student growth, including proper controls for student- and school-level variables, exist. These measures allow for assessment of school leaders' influence on student learning after three years at a school site. Adequate consideration is given to bias against high-needs schools.
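
As an illustration of what a methodologically sound growth measure can involve, the sketch below fits a simple covariate-adjusted growth model in Python using statsmodels. It is a stylized example under assumed column names, not the methodology of any particular state; production models typically include richer controls and shrinkage of noisy estimates.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical student-level file; column names are assumptions for illustration.
df = pd.read_csv("student_scores.csv")  # score, prior_score, frl, ell, school_id

# Covariate-adjusted growth: this year's score conditioned on prior achievement,
# with student-level controls (free/reduced-price lunch, English learner status)
# and school fixed effects to separate school-level differences.
model = smf.ols("score ~ prior_score + frl + ell + C(school_id)", data=df).fit()

# Estimated school effects are one crude ingredient in leader-level measures.
# With fewer than 10 completers per program, aggregates of such estimates are noisy.
print(model.params.filter(like="school_id").head())
```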

B2. Data compilation and analysis capacity

Monitoring data reporting completion and accuracy: Requires staffing to ensure the submission and accuracy of data from preparation programs and other data sources

Underdeveloped: No staff or resources exist.

Workable: Staff assignments and/or resources could be prioritized for data monitoring.

Ideal: Staff and/or resources are already assigned to data monitoring.

Creating and publishing annual reports: Requires technical skill for website/report design and senior analytical skill to make methodological decisions

Underdeveloped: No staff or resources exist.

Workable: Staff assignments and/or resources could be prioritized for data reporting.

Ideal: Staff and/or resources are already assigned to data reporting.

Creating and implementing methodology for summative rating: Requires specialized assessment and statistical skill

Underdeveloped: No staff or resources exist.

Workable: Staff assignments and/or resources could be prioritized for data analysis/methodology.

Ideal: Staff and/or resources are already assigned to data analysis/methodology.
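
For illustration only, the sketch below shows one simple way a summative rating methodology might combine program-level indicators: standardize each indicator, then apply agreed-upon weights. The indicator names and weights are hypothetical; an actual methodology would be set by the state's analysts and stakeholders.

```python
import pandas as pd

# Hypothetical program-level indicators; names and weights are illustrative only.
ind = pd.read_csv("program_indicators.csv")  # program, placement_rate, survey_mean, growth_effect

weights = {"placement_rate": 0.3, "survey_mean": 0.3, "growth_effect": 0.4}

# Standardize each indicator (z-scores) so the weights operate on a common scale.
cols = list(weights)
z = (ind[cols] - ind[cols].mean()) / ind[cols].std()

# Weighted composite; a state would also need cut points to map scores to ratings.
ind["summative_rating"] = sum(w * z[c] for c, w in weights.items())
print(ind.sort_values("summative_rating", ascending=False)[["program", "summative_rating"]])
```

Decisions embedded here (which indicators, what weights, what cut points) are exactly why this row calls for specialized assessment and statistical skill.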

B3. Review process capabilities

Staffing: Requires specialized leadership experience and skills

Underdeveloped: No SEA staff are dedicated to leadership preparation, or those responsible have multiple other roles.

Workable: There are staff members at the SEA focused on school leadership, including preparation, but they have limited backgrounds in school leadership or adult leadership.

Ideal: There are staff members at the SEA focused on school leadership, including preparation, and they are deeply credible with leaders and preparation providers in the state.

Management and training of reviewers: Requires specialized review process capabilities

Underdeveloped: No staff or resources exist.

Workable: The state has a reasonably adequate pool of high-quality, credible reviewers but does not have a track record of systematically vetting them for leadership expertise or training them for inter-rater reliability. The state does not have a strong track record of outsourcing functions and maintaining quality.

Ideal: The state has a robust pool of high-quality, credible reviewers who have been (or could be) trained for inter-rater reliability and normed to provide useful feedback to programs. Alternatively, the state has a strong track record of outsourcing functions and maintaining quality, which allows for bringing in an established process (e.g., review by the Educational Leadership Constituent Council).
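
Training reviewers for inter-rater reliability implies actually measuring their agreement. One common check, sketched below in Python, is Cohen's kappa computed over two reviewers' independent ratings of the same programs; the ratings shown are invented for illustration.

```python
from sklearn.metrics import cohen_kappa_score

# Invented ratings from two trained reviewers scoring the same eight programs
# (1 = underdeveloped, 2 = workable, 3 = ideal); for illustration only.
reviewer_a = [3, 2, 2, 1, 3, 2, 1, 2]
reviewer_b = [3, 2, 1, 1, 3, 2, 2, 2]

# Cohen's kappa corrects raw percent agreement for agreement expected by chance;
# values around 0.6-0.8 are conventionally read as substantial agreement.
print(cohen_kappa_score(reviewer_a, reviewer_b))
```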

Implementation of reviews: Requires financial and human resources

Underdeveloped: No staff or resources exist.

Workable: A review process exists, but it is not sufficient for quality, in-depth review of all flagged programs.

Ideal: Sufficient resources exist to carry out in-depth reviews of all flagged programs and to conduct periodic reviews of all programs.

III. PROCESS RECOMMENDATIONS

The readiness assessment rubric can be used in more than one way. A state working to build political support for an evidence-based approach to assessing the quality of principal preparation programs may want a formal process to engage stakeholders in completing the rubric and agreeing on next steps for the work. By contrast, a state already committed to such an approach may simply want the SEA to undertake an internal diagnosis of conditions in order to surface critical gaps and needed resources.

For a more extensive process, these general steps are recommended:

  1. Create a vision for the work. In order to demonstrate executive-level commitment to an open and honest process of assessing the state's readiness for implementing a better system of evaluating principal preparation programs, it can be helpful to write a purpose statement outlining why the work is important and how it connects to the state's broader vision of leadership. The state's strategic plan for education is an important resource for this step.
  2. Create a project plan. In order to ensure that the right people will be engaged and will have access to authentic information, it can be helpful to craft a project plan that includes roles and responsibilities and to assemble available data to conduct the readiness assessment.
  3. Convene stakeholders. In order to build trust in and commitment to the process, it can be helpful to convene leaders from universities, preparation programs, administrator associations, districts, and schools. The purpose of such a convening is to share the goals and work plan, invite authentic feedback, and ask for a commitment to participating in the process.
  4. Conduct the assessment. This is the heart of the work: gathering data, making sense of it, surfacing and discussing important substantive issues, and agreeing on rubric ratings.
  5. Set action steps. With the assessment complete, state leaders and stakeholders need to make decisions about their readiness and identify areas of focus that are consistent with the conclusions from the readiness assessment. This is also an opportunity to establish a new work plan for the implementation phase, including strategies for addressing any areas of weakness that need to be remedied in the short term.

For a more targeted approach within an SEA, the critical steps are numbers 4 and 5 above, as well as some amount of stakeholder engagement (step 3). Note, however, that some categories of the rubric require information from sources outside of the SEA (e.g., perceptions held by LEA leaders and program leaders), so some level of external engagement is helpful regardless of the scope of the analysis.

References

17. Determined through anonymous surveys of program leaders.