Evaluation 2008


Session Title: Fitting the Design to the Context: Examples of Innovative Evaluation Designs
Panel Session 691 to be held in Centennial Section B on Friday, Nov 7, 4:30 PM to 6:00 PM
Sponsored by the Quantitative Methods: Theory and Design TIG
Chair(s):
Debra Rog,  Westat,  debrarog@westat.com
Discussant(s):
Charles Reichardt,  University of Denver,  creichar@du.edu
Abstract: Evaluators are often faced with situations in which they must balance the need for rigor, attention to stakeholder needs, and the complexities of the evaluation context. Adequately balancing all three concerns often requires creative and innovative approaches that remain nimble and flexible in light of the dynamics that can arise within the study situation. Evaluations that effectively balance these multiple needs are those that produce results considered valid, credible, and unbiased, and that offer direction for action. This session will offer three examples of evaluations -- varying in scale, substantive area, and political attention -- that fit the design to the context while also attending to stakeholder concerns and to rigor.
Fitting the Design to the Context: Using a Collaborative Design Sensitivity Approach To Produce Actionable Evidence
Debra Rog,  Westat,  debrarog@westat.com
Robert Orwin,  Westat,  robertorwin@westat.com
This presentation will describe two studies underway -- a multi-site evaluation of supportive housing programs for homeless families and a multi-program evaluation of programs aimed at reducing poverty. Both studies use a combination of modified evaluability assessments and a design sensitivity approach in designing their outcome evaluations. The strategy in both situations is to understand the nature of the program context(s) in order to design outcome evaluations that are maximally sensitive to the features that can affect the ability to adequately detect and interpret the outcomes that result. Both studies also incorporate a high degree of collaboration with decision-makers in designing the evaluation. This presentation will describe each experience, highlighting the dimensions of the context that created design challenges and the need for creativity.
Fitting Design to Context: The Story of a High Profile National Evaluation
Susan Berkowitz,  Westat,  susanberkowitz@westat.com
This presentation will discuss the evolution of the complex design of a high profile longitudinal impact evaluation of a national youth anti-drug media campaign. Starting with proposal writing and the formulation of an alternative design, it will identify contextual factors contributing to shifts in the design, including budgetary constraints, Congressional concerns, and often conflicting expectations from the funding agency and the study sponsor. At the same time, it considers the challenges the evaluation team experienced as they modified the design in response to these exigencies, and how changing one component of the design necessarily affected the others. Indeed, the need to adapt to context while maintaining the basic integrity of the design and analysis persisted throughout the life of the evaluation, even after a clear, if highly challenging, design did emerge.
Fitting the Design to the Context: Examples from ITEST
Leslie Goodyear,  Education Development Center Inc,  lgoodyear@edc.org
In the past five years, the National Science Foundation has funded over 100 projects through its Innovative Technology Experiences for Students and Teachers (ITEST) program. Each of these projects offers exciting, hands-on Science, Technology, Engineering and Math (STEM) experiences for students and teachers with the goal of sparking student interest in pursuing STEM careers. This presentation will highlight some of the innovative approaches used to evaluate ITEST projects that respond to the complexities of context while striving for methodological rigor. Examples include innovative performance measures using games; rubrics for gauging situational learning; and innovative adaptations of nationally validated measures.
