Session Title: New Directions in Multisite Evaluation

Panel Session 775 to be held in Laguna A on Friday, Nov 4, 4:30 PM to 6:00 PM

Sponsored by the Theories of Evaluation TIG

Chair(s): Robert Blagg, EMT Associates Inc, rblagg@emt.org

Abstract:
Naturally occurring variation (e.g., in setting, implementation, outputs, and effects) within and across multisite evaluations can be measured to provide a basis for analyses that produce internally and externally valid knowledge supporting evidence-based practice. Capturing this variation, however, requires careful measurement of site environments, implementation (e.g., intervention or service), and outcomes across sites. An inductive (micro-to-macro) mixed-method measurement approach, flexible enough to adapt to program context, can accurately and efficiently capture this "real world" variation and support analysis that reveals how this variance constrains or contributes to outcomes. This session includes four presentations that detail the evaluation theory and technique behind the natural variation approach (i.e., the necessary steps in modeling, measurement, analysis, and interpretation) and highlight several current, successful applications of the approach. The implications and value of this approach for evaluation theory and practice will be discussed throughout.

Natural Variation Designs: Maximizing the Information Potential of Multi-site Evaluations

J Fred Springer, EMT Associates Inc, fred@emt.org

This presentation provides an overview of the logic and purpose of natural variation approaches as alternatives to experimental design when multiple contexts (e.g., programs, communities, classrooms) are the units of analysis. The presentation a) explicates the logic of the approach and its design requirements; b) identifies measurement approaches (most often mixed-method measurement) appropriate to capturing the important elements of variation in setting, design, and implementation; c) identifies design adaptations (e.g., in-site comparisons, comparisons over time, effect sizes, clustering, meta-analytic techniques, exploratory-confirmatory iterations) that provide a robust balance of internal and external validity in different resource and data environments; and d) identifies alternative analysis techniques that can be used in these studies (e.g., hierarchical analysis, cluster comparisons, meta-analytic regression). Specific examples from past and current evaluations are used throughout, and the benefits for developing useful evidence-based practice are emphasized.
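
To make the analysis options named above concrete, the following is a minimal sketch, not drawn from the presentation, of one such technique: a hierarchical (mixed-effects) model relating site-level and participant-level implementation variation to outcomes. The data, the variable names (fidelity, dosage, outcome), and the 48-site structure are illustrative assumptions only.

```python
# Illustrative sketch only: a random-intercept model of participants nested in sites,
# one of the analysis techniques named above (hierarchical analysis). All variables
# are simulated placeholders, not measures from the evaluations described here.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)

n_sites, n_per_site = 48, 30
site = np.repeat(np.arange(n_sites), n_per_site)
fidelity = np.repeat(rng.normal(0, 1, n_sites), n_per_site)      # site-level implementation fidelity
dosage = rng.normal(0, 1, site.size)                             # participant-level service dosage
site_effect = np.repeat(rng.normal(0, 0.5, n_sites), n_per_site) # unexplained between-site variation
outcome = 0.4 * fidelity + 0.3 * dosage + site_effect + rng.normal(0, 1, site.size)

df = pd.DataFrame({"site": site, "fidelity": fidelity, "dosage": dosage, "outcome": outcome})

# Fixed effects estimate how measured implementation variation relates to outcomes;
# the random intercept absorbs the remaining "natural" variation across sites.
model = smf.mixedlm("outcome ~ fidelity + dosage", data=df, groups=df["site"])
print(model.fit().summary())
```

Under the same assumptions, cluster comparisons or meta-analytic regression could be layered on the resulting site-level estimates.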

Natural Variation Designs: Diverse Multisite Applications

Robert Blagg, EMT Associates Inc, rblagg@emt.org

This presentation explicates the breadth of application of natural variation designs by presenting similarities and differences in four major multisite studies. Studies to be highlighted include the five-year, 48-site, CSAP-funded National Cross-Site Evaluation of High Risk Youth Programs, through which many aspects of natural variation design were explicated and refined; the current ONDCP-funded National Evaluation of the Drug Free Communities program, which is implementing a rigorous natural variation design across several hundred communities; the SAMHSA CSAT-funded, 20-site Adult Treatment Drug Court multisite evaluation; and the US Department of Education analysis of bullying laws and policies enacted in four states and twenty-four sites. These studies represent distinct multisite environments that support variations in multisite design. Benefits of the natural variation approach in producing actionable information useful to decision makers will be highlighted.

Evaluation of Substance Abuse and Mental Health Services Administration (SAMHSA) Center for Substance Abuse Treatment (CSAT)-funded Adult Treatment Drug Courts

Carrie Petrucci, EMT Associates Inc, cpetrucci@emt.org

This presentation describes a comprehensive measurement strategy supporting the natural variation design and analysis being used in the multisite evaluation of 20 SAMHSA CSAT-funded Adult Treatment Drug Courts. The measurement model is grounded in a realist evaluation framework, which asks what works best, for whom, and under what circumstances. The presentation includes the logic model underlying the measurement tools; the comprehensive data sets drawn on to document setting, design, implementation, and outcomes; and the site visit protocol that provides a practical solution to integrating observations, interviews, and program data gathered during site visits. Both quantitative and qualitative data are collected as needed to best understand processes and outcomes, including concept-mapping interviews, focus groups, and program records. The comprehensive measurement design was developed and applied by a collaborative team that includes Westat, EMT, and Carnevale Associates.

Application of the Natural Variation Design to California's Statewide Evaluation of the Mental Health Services Act

Elizabeth Harris, EMT Associates Inc, eharris@emt.org

UCLA's Center for Healthier Children, Families and Communities and subcontractor EMT Associates are conducting the Statewide Evaluation of California's Mental Health Services Act. The study uses a natural variation design that emphasizes a) identifying county environment characteristics associated with differences in the distribution, quality, and efficiency of mental health service implementation; b) identifying service configurations that maximize quality and efficiency criteria; and c) identifying county policy and administrative practices that produce accountability and coordinated service delivery among multiple providers. Natural variation approaches to measurement and analysis provided a sound perspective for a) modeling the analysis, b) designing data collection, and c) conducting exploratory analyses to identify latent structures in setting and process data relevant to the analysis questions. The presentation will provide detail on a) selection of optimal program and fiscal data, b) extraction of relevant data from program and fiscal records, c) creation of comparable data across counties, and d) preliminary results.
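
As a hedged illustration of the kind of exploratory, latent-structure analysis described above, the sketch below clusters counties on a few standardized service-configuration indicators. The indicator names and values are hypothetical placeholders, not measures or results from the statewide evaluation.

```python
# Illustrative sketch only: exploratory clustering of counties on standardized
# service-configuration indicators to surface latent structure. Indicators and
# values are hypothetical, not drawn from the Mental Health Services Act data.
import numpy as np
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)

# Hypothetical county-level indicators of how services are configured and funded.
counties = pd.DataFrame({
    "outreach_spending_share": rng.uniform(0.0, 0.4, 58),
    "fte_per_1000_clients": rng.uniform(1.0, 12.0, 58),
    "contracted_provider_share": rng.uniform(0.0, 1.0, 58),
})

# Standardize so each indicator contributes comparably, then cluster counties.
X = StandardScaler().fit_transform(counties)
counties["configuration_cluster"] = KMeans(n_clusters=3, n_init=10, random_state=1).fit_predict(X)

# Profile each latent configuration by its mean indicator values.
print(counties.groupby("configuration_cluster").mean())
```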