Evaluation 2011


Session Title: Evaluating Program Implementation: A Decision Framework and Perspectives From the What Works Clearinghouse
Multipaper Session 628 to be held in Laguna A on Friday, Nov 4, 10:45 AM to 11:30 AM
Sponsored by the Research on Evaluation TIG
Chair(s):
Christina Christie, University of California, Los Angeles, tina.christie@ucla.edu
Fundamental Issues for Evaluation Implementation: A Decision Framework
Presenter(s):
Robert Owens, Washington State University, rwowens@wsu.edu
Abstract: This conceptual paper explores fundamental issues in assessing implementation. These issues have received attention in the literature but have yet to be compiled into a decision framework. Assessing implementation provides important information regarding program feasibility, interpretation of program outcomes, and program theory. Implementation assessment is of particular importance in practical, real-world settings, which lack the control of clinical trials or laboratory settings. Five fundamental issues identified in the literature are presented in the form of decisions evaluators make when planning implementation assessment: (1) a theoretical approach of fidelity or adaptation; (2) a focus on delivery or receipt; (3) measurement of quantity, quality, or structure; (4) consideration of implementation globally or of specific program components; and (5) systematic manipulation of implementation or naturalistic observation. These decisions are explored using actual and hypothetical examples from education and prevention programs.
A Review of Fidelity of Implementation Measures in the What Works Clearinghouse
Presenter(s):
Timothy Ho, University of California, Los Angeles, timothyho@ucla.edu
Abstract: The past decade has seen an increased emphasis on randomized controlled trials in evaluating educational interventions, culminating in the What Works Clearinghouse (WWC), a repository of interventions shown to be effective through rigorous research designs. At the same time, calls have been raised to measure fidelity of implementation in order to illuminate the black box of randomized controlled trials. Accordingly, several frameworks have been offered for understanding fidelity of implementation (e.g., Dane & Schneider, 1998; Dusenbury et al., 2003; Mowbray et al., 2003; Century et al., 2010). These frameworks are addressed in this literature review. Additionally, the original research articles that led to the inclusion of specific interventions in the WWC are studied in the context of these frameworks. These articles, which describe the effectiveness of educational interventions, are examined for how fidelity of implementation was measured and the extent to which those measures have been shown to relate to outcomes.
