Session Title: Walking the Tightrope: Developing Valid Tools to Measure Fidelity of Implementation That Meet Stakeholder Needs
Multipaper Session 934, to be held in Granite Room Section A on Saturday, Nov 8, 4:00 PM to 5:30 PM
Sponsored by the Pre-K - 12 Educational Evaluation TIG
Chair:
Tania Jarosewich, Censeo Group LLC, tania@censeogroup.com
Abstract: Evaluations examine student outcomes more often than they examine whether programs implement grant expectations. However, without measuring fidelity of implementation, the connection between grant activities and outcomes remains weak. Recognizing that multisite implementations provide unique opportunities for measuring fidelity of implementation, this panel will focus on methods and systems for collecting systematic implementation data in statewide grant-funded reading and math programs. The authors will describe the tensions inherent in developing valid instruments that meet client expectations and local grantees' needs, can be used by a variety of stakeholders, and may be applied to high-stakes decision making. Panelists will focus on the methods they used to develop and validate the instruments and to train users to collect valid data. The papers will also reflect on the convergences and divergences of the processes used in each evaluation.
Developing Classroom-Level and School-Level Measures of Implementation Fidelity
Catherine Callow-Heusser, EndVision Research and Evaluation, cheusser@endvision.net
The Bureau of Indian Education's Reading First program uses school-reported self-assessment data based on the Planning and Evaluation Tool for Effective Schoolwide Reading Programs - Revised (Kame'enui & Simmons, 2003) as a measure of implementation fidelity that contributes to decisions about continued funding. However, as the external evaluators, we expected an independent measure to align more closely with student outcomes. We developed a school-level measure of implementation fidelity composed of research-based indicators aligned with the four pillars of Reading First: instructional programs and strategies, valid and reliable assessments, professional development, and instructional leadership. We also developed a classroom-level measure of implementation fidelity aligned with the schools' reading programs and research-based reading instruction strategies. Both the classroom-level and school-level measures of implementation fidelity explain substantial portions of the variability in student outcomes. In this presentation, we will discuss the development of the instruments and the statistical results.
Measuring Program Implementation Using a Document-Based Program Monitoring System
James Salzman, Cleveland State University, j.salzman@csuohio.edu
The Reading First - Ohio (RFO) grant indicated that implementation would be measured through a 'rubric reflective of the state's accountability system' (ODE, 2003, p. 157). The RFO Center designed a rubric for use in a document review process that holds districts accountable for attaining and sustaining fidelity. School personnel gathered artifacts and wrote summary statements for each of the grant's indicators to provide evidence of implementation for the document review. Regional consultants, supervised by the Center, reviewed the documentation several times each year as both a formative and a summative process. Schools that did not show progress toward full implementation after their second year in the program could lose funding. This presentation will discuss the tightrope the Center walked in developing a tool that met the requirements of Ohio Department of Education staff and also demonstrated strong reliability and validity.
Measures of Implementation Fidelity for Use by External Evaluators and Program Staff
Tania Jarosewich, Censeo Group LLC, tania@censeogroup.com
The Oklahoma Reading First grant did not specify how the Oklahoma State Department of Education (SDE) would monitor program implementation or evaluate district and school adherence to grant requirements. The evaluation team and SDE staff therefore developed a statewide monitoring system to provide a clear understanding of the strengths and needs of implementation across the state and to allow state staff to make the high-stakes decision of which districts would continue in the grant. The state evaluation team and SDE staff used a school self-assessment, a reading observation form, and site visit protocols to collect information about fidelity of implementation during a site visit to each participating Reading First school. This presentation will describe the development, training, and use of the tools and discuss the challenges inherent in a system in which an external evaluation team and internal project staff both collect data about fidelity of implementation.
Syntheses of Local Data for Global Evaluation of Program Fidelity and Effectiveness
Elizabeth Oyer, Evaluation Solutions, eoyer@evalsolutions.com
Tania Jarosewich, Censeo Group LLC, tania@censeogroup.com
The Illinois Math and Science Partnership state evaluation framework includes five dimensions of program outcomes: the quality of professional development activities, the quality of partnerships, and changes in teachers' content knowledge, teachers' instructional practices, and students' achievement. The cornerstone of the state-level evaluation design is a cross-site meta-analysis of local evaluation results to assess program effectiveness. The meta-analytic approach is combined with hierarchical linear modeling to analyze local and global outcomes. Measures of program implementation serve as a key moderating variable in the analyses and are collected at the local level through a combination of logs, journals, classroom observations, and extant data. The quality of the partnerships is evaluated by triangulating a comprehensive interview protocol with artifact analyses and surveys of all key partners, including teachers, local education administrators, higher education faculty, and industry partners. The presentation will discuss issues related to balancing the needs of local evaluations with the need to provide global analyses of the statewide initiative.
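As a hedged illustration only (the abstract does not specify the model, so the notation and variables below are assumptions, not the authors' specification), combining hierarchical linear modeling with a cross-site meta-analytic synthesis in which implementation fidelity moderates program effects might be sketched as a two-level model:

\[
\text{Level 1 (within site } j\text{):} \quad y_{ij} = \beta_{0j} + \beta_{1j}\,\mathrm{Treat}_{ij} + \epsilon_{ij}, \qquad \epsilon_{ij} \sim N(0, \sigma^2)
\]
\[
\text{Level 2 (between sites):} \quad \beta_{1j} = \gamma_{10} + \gamma_{11}\,\mathrm{Fidelity}_{j} + u_{1j}, \qquad u_{1j} \sim N(0, \tau^2)
\]

Here \(y_{ij}\) is the outcome for participant \(i\) at site \(j\), \(\mathrm{Treat}_{ij}\) indicates program participation, \(\mathrm{Fidelity}_{j}\) is the locally measured implementation score, and \(\gamma_{11}\) estimates how fidelity moderates the program effect across sites. Pooling the site-level effects \(\beta_{1j}\) through the random effect \(u_{1j}\) corresponds to the random-effects meta-analytic component of such a design.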