Session Title: Implementation From the Ground Up: Defining, Promoting, and Sustaining Fidelity at All Levels of a State Program

Panel Session 203 to be held in Lone Star B on Thursday, Nov 11, 9:15 AM to 10:45 AM

Sponsored by the Program Theory and Theory-driven Evaluation TIG

Chair(s):
Elizabeth Oyer, EvalSolutions Inc, eoyer@evalsolutions.com

Abstract:
Fidelity is the extent to which the intervention, as realized, is “faithful” to the pre-stated model. Measuring implementation fidelity provides data for understanding the overall impact of a program. Presenters will discuss the issues involved in developing a state framework for evaluating the Illinois Mathematics and Science Partnership (IMSP) program and the policies and resources needed to sustain and scale up the initiative. Site evaluators will discuss tools and analyses for formative and summative evaluation of progress toward state goals for the IMSP, whose evaluation employs a comprehensive site visit protocol to create profiles of the local grants and develop themes across grants for understanding implementation of the program. Evaluators will discuss the site visit tools as well as results from the 2007-2008 and 2008-2009 program evaluations. Finally, the George Williams College of Aurora University MSP project evaluator will discuss the local evaluation of implementation for the IMSP.

Understanding State-level Program Impact: Leveraging State Policies and Resources for Effective Implementation

Elizabeth Oyer, EvalSolutions Inc, eoyer@evalsolutions.com
Marica Cullen, Illinois State Board of Education, mcullen@isbe.net
Gilbert Downey, Illinois State Board of Education, gdowney@isbe.net

At the state program level, measuring implementation fidelity provides data for understanding the overall impact of the program and for developing a framework for planning the policies and resources needed to sustain and scale up the initiative. Measuring implementation fidelity at the state level requires sensitivity to local evaluations as well as attention to the core elements of adherence to the broad guidelines of the state program. The Illinois Mathematics and Science Partnership (IMSP) Implementation Evaluation framework balances the needs of the state with local program implementation needs. Multiple data sources at both the local project and state levels provide rich material for understanding the influence of adherence to implementation on synthesized outcomes. Panelists will discuss tools and analyses for formative and summative evaluation of progress toward state goals for the IMSP, as well as how these analyses have shaped state policies for the program.

Understanding the Forest by Examining the Trees: Creating Profiles of Local Grantees to Develop Themes in Implementation of a State Mathematics and Science Partnership Program

Tania Jarosewich, Censeo Group, tania@censeogroup.com
Debra Greaney, Area V Learning Technology Center, dgreaney@lth5.k12.il.us

The Illinois Mathematics and Science Partnership program evaluation employs a comprehensive site visit protocol to collect data about the degree to which program components are delivered as prescribed; exposure, or the amount of program content received by participants; and the quality of program delivery in terms of the theoretical base of processes and content, participants’ responsiveness, and the unique features that distinguish the program from other programs. Based on the data collected during the visits, the team creates profiles of the local grants and describes themes across grants to contribute to understanding implementation of the program across the state. The two members of the evaluation team who visit the project sites to collect implementation data will discuss the site visit tools, the challenges of collecting cross-site data, and the benefits of in-depth site visits, and will summarize key results from the 2007-2008 and 2008-2009 program evaluations.

Measuring Implementation and Building Adherence: Assessing Fidelity and Improving Understanding in an Illinois Mathematics and Science Program

James Salzman, Ohio University, salzman@ohio.edu

At the local level, implementation fidelity must be measured comprehensively to align with local project objectives and to provide a foundation for measuring progress. Establishing a plan for monitoring and assessing fidelity involves many data collection considerations. The evaluator must identify measures of necessary preconditions and align all fidelity measures with existing evaluation data sources to improve efficiency and eliminate overlap. Finally, mixed methods drawing on multiple data sources are needed to triangulate evidence of implementation.
The George Williams College of Aurora University MSP project evaluator will discuss the local evaluation of implementation for the IMSP. The Framework for Teaching (Danielson, 1996) has been modified to identify specific classroom practices commensurate with the inquiry methods being taught. The protocol incorporates training for observers, including facilitated conversations about how teaching episodes align with the levels of a rubric; the rubric also serves as a self-reflection tool that allows participants to better understand their implementation of the specific strategies in their classrooms.