

Session Title: Assessing the Degree of Implementation When Evaluating a State-Wide Initiative
Panel Session 240 to be held in Wekiwa 10 on Thursday, Nov 12, 9:15 AM to 10:45 AM
Sponsored by the AEA Conference Committee
Chair(s):
Patricia Noonan, University of Kansas, pnoonan@ku.edu
Abstract: The purpose of this panel presentation is to open a dialogue about trends in assessing implementation within the context of high-quality evaluations of large-scale, state-wide initiatives. Contemporary trends in assessing implementation have advanced evaluation methods and practices from a primary reliance on compliance measures to a concern for depth of implementation, attention to implementation drivers, and the increasing use of web-based technology for real-time data collection. This presentation builds on the shared understanding that the technical adequacy of implementation measures must meet the same high standards of rigor as the technical adequacy of the outcome measures used in an evaluation. Practical guidelines for advancing the assessment of implementation will be offered by five evaluators with experience in state-wide initiatives in the unique contexts of Mississippi, Missouri, New Hampshire, and Ohio.
Implementation Analysis: Historical Context and Contemporary Thinking
Julie Morrison, University of Cincinnati, julie.morrison@uc.edu
This presentation explores the historical context for assessing the degree to which an evaluation object is implemented as planned. Key terms from the evaluation lexicon, such as implementation analysis, process evaluation, formative evaluation, and performance monitoring, will be defined, along with related terms from intervention-level work, such as treatment integrity, intervention fidelity, and procedural adherence. Trends in assessing implementation (i.e., the shift from compliance measures to a concern for depth of implementation and attention to implementation drivers) will be discussed. Current advances in conceptualizing implementation analysis and its role in the evaluation process and context will be examined.
Missouri Integrated Model: Using Evaluation Data in an Iterative Process
Amy Gaumer Erickson, University of Kansas, aerickson@ku.edu
Patricia Noonan, University of Kansas, pnoonan@ku.edu
Zach McCall, University of Kansas, zmccall@ku.edu
The Missouri Integrated Model (MIM) draws on the common core components of several research-based initiatives with the goal of creating collaborative and effective schools where parents, community members, and school staff work together in making data-driven decisions to ensure positive social and educational benefits for all students. This panel presentation will describe the MIM's approach to evaluation and how formative evaluation findings have been used to inform decision-making during the course of the project. Because the project is in its second year of a five-year development grant, formative evaluation activities have focused on iterative design. The evaluation team has captured and analyzed critical process data through interviews with Implementation Facilitators, focus groups with school and administrator teams, ongoing document review, and online surveys. In this iterative process, results for the intervention schools are compared to a matched sample of similar schools within the State of Missouri.
Mississippi's State Personnel Development Grant (SPDG): Using Evaluation Data to Inform Practice
Patricia Mueller, Evergreen Educational Consulting LLC, eec@gmavt.net
This panel presentation describes Mississippi's SPDG approach to evaluation and how formative evaluation findings have been used to inform decision-making during the course of the five-year project. The emphasis of Mississippi's SPDG (2005-2010) is to build a sustainable structure of systematic professional development by (a) planning, aligning, and implementing professional development activities with universities; (b) developing a cadre of personnel across the state to serve as mentors with enhanced skills and knowledge of positive behavioral support and scientifically based reading instruction strategies; and (c) identifying schools to serve as demonstration sites. Members of the project management team receive ongoing data reports on the frequency and type of professional development provided across model school sites. Frequent data scans have helped management identify high-end users as well as those who may not have met the required benchmarks for success, thus triggering more targeted intervention.
Practical Guidelines for Advancing the Assessment of Implementation
Julie Morrison, University of Cincinnati, julie.morrison@uc.edu
Patricia Mueller, Evergreen Educational Consulting LLC, eec@gmavt.net
Patricia Noonan, University of Kansas, pnoonan@ku.edu
Amy Gaumer Erickson, University of Kansas, aerickson@ku.edu
Zach McCall, University of Kansas, zmccall@ku.edu
This presentation builds on the shared understanding that the technical adequacy of implementation measures must meet the same high standards of rigor as the technical adequacy of the outcome measures used in an evaluation. In addition to adequate reliability and validity, these implementation measures must also meet requirements for feasibility, accuracy, utility, and propriety within the context of evaluation. Practical guidelines for advancing the assessment of implementation will be offered by five evaluators with experience in large-scale, state-wide initiatives.
