Session Title: Evaluating the Federal Safe Schools/Healthy Students Initiative: Design and Analysis Approaches to Address Context Challenges
Panel Session 257 to be held in Sebastian Section I3 on Thursday, Nov 12, 10:55 AM to 12:25 PM
Sponsored by the Cluster, Multi-site and Multi-level Evaluation TIG
Chair(s):
Steve Murray, RMC Research Corporation, smurray@rmccorp.com
Discussant(s):
Danyelle Mannix, United States Department of Health and Human Services, danyelle.mannix@samhsa.hhs.gov
Abstract: The Safe Schools/Healthy Students Initiative is a collaboration among three federal agencies to support implementation of comprehensive community-wide plans to create safe and drug-free school environments. The program currently includes 175 grantees that target 4,204 schools and nearly 3.8 million students across the U.S. The program's scale creates numerous design challenges, such as accounting for variation in locally determined programs and data collection, as well as federal context challenges such as performance reporting requirements. The panel is composed of federal project officers and members of the national evaluation team who will discuss the large-scale, multi-site evaluation, specifically (1) providing an overview of the federal program environment, (2) describing local grant context, implementation, and the development of a program theory model to guide the evaluation, (3) illustrating the integration of qualitative data in the evaluation's mixed method design, and (4) discussing the innovative use of meta-analysis to analyze outcome data collected by local grant sites.
The Federal Context of the Safe Schools/Healthy Students Initiative
Michael Wells, United States Department of Education, michael.wells@ed.gov
Danyelle Mannix, United States Department of Health and Human Services, danyelle.mannix@samhsa.hhs.gov
Patrick Weld, United States Department of Health and Human Services, patrick.weld@samhsa.hhs.gov
The SS/HS Initiative offers a unique opportunity to examine large-scale program evaluation in the context of a federal environment that places many requirements and constraints on how the grants are conducted and managed. Federal programs stress performance-based outcomes (e.g., Government Performance and Results Act, Program Assessment Rating Tool), valid and reliable data, attention to important problems, efficiency and fiscal responsibility, reduced burden on federal staff and grantees, and the development and dissemination of useful solutions and recommendations. This evaluation involves the coordinated efforts of Federal Project Officers, local educational agencies, technical assistance providers, and national and local evaluators across a diverse set of socioeconomic contexts. It also involves coordination and integration of findings among several contractors.
Safe Schools/Healthy Students Local Grant Context and Program Theory
Andrew Rose, MANILA Consulting Group Inc, arose@manilaconsulting.net
G A Hill, MANILA Consulting Group Inc, ghill@manilaconsulting.net
Jennifer Keyser-Smith, MANILA Consulting Group Inc, jkeyser-smith@manilaconsulting.net
Shauna Harps, MANILA Consulting Group Inc, sharps@manilaconsulting.net
Kathleen Kopiec, MANILA Consulting Group Inc, kkopiec@manilaconsulting.net
Julia Rollison, MANILA Consulting Group Inc, jrollison@manilaconsulting.net
The national evaluation is further complicated because the 146 local grant sites adopt different approaches, activities, and programs to address problems specific to their communities. Local approaches are developed by partnerships, which at a minimum must include representatives from the local education, mental health, juvenile justice, and law enforcement agencies. Further, although local grant sites are required to address common elements (e.g., reduced violence and alcohol and drug use, improved mental health services and early childhood social/emotional development), they are not required to use common process or outcome measures. Consequently, the evaluation entails analyzing and synthesizing data from a variety of sources, including project directors, state agencies, schools, students, and required grant partners, using surveys, site visits, interviews, and focus groups. This presentation will stress the role of program theory and logic models in guiding the evaluation.
Integration of Qualitative and Quantitative Data in the Safe Schools/Healthy Students Evaluation
Alison J Martin, RMC Research Corporation, amartin@rmccorp.com
Marina L Merrill, RMC Research Corporation, mmerrill@rmccorp.com
Ryan D'Ambrosio, RMC Research Corporation, rdambros@rmccorp.com
Nicole Taylor, RMC Research Corporation, ntaylor@rmccorp.com
Lauren A Maxim, RMC Research Corporation, lmaxim@rmccorp.com
Roy M Gabriel, RMC Research Corporation, rgabriel@rmccorp.com
To respond to the federal program context and evaluation purpose, the National Evaluation Team developed a mixed methods approach: a concurrent nested design (Creswell et al., 2003) in which quantitative methods predominate and qualitative methods enhance them by describing partnership process and context. Quantitative data are collected through multiple surveys on grant activities, perceptions of partner contributions, and partnership functioning; in addition, sites' annual performance reports contain required Government Performance and Results Act indicators that measure grant outcomes. Qualitative data are collected concurrently through annual group interviews on topics such as grant planning, implementation barriers, and partnership history and organization. This presentation will discuss data integration, which occurs at the analysis and interpretation stages: qualitative data have enhanced quantitative data through thematic analysis and the transformation of qualitative data into numerical codes. Challenges inherent in integrating qualitative and quantitative data will also be addressed.
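
As a rough illustration of the "transformation of qualitative data into numerical codes" mentioned above, the sketch below converts hypothetical coded interview themes into binary site-level indicators and merges them with hypothetical survey scores. All variable names, theme labels, and values are illustrative assumptions, not the evaluation's actual data.

import pandas as pd

# Hypothetical qualitative coding output: one row per theme assigned to a site.
themes = pd.DataFrame({
    "site_id": [101, 101, 102, 103],
    "theme": ["implementation_barrier", "strong_partnership",
              "implementation_barrier", "strong_partnership"],
})

# "Quantitize" the themes into one 0/1 presence indicator per theme and site.
indicators = (
    pd.crosstab(themes["site_id"], themes["theme"])
      .clip(upper=1)       # presence/absence rather than mention counts
      .reset_index()
)

# Hypothetical quantitative survey data, one row per site.
survey = pd.DataFrame({
    "site_id": [101, 102, 103],
    "partnership_functioning_score": [3.8, 2.9, 4.1],
})

# Merge so qualitative indicators and survey scores can be analyzed together.
merged = survey.merge(indicators, on="site_id", how="left").fillna(0)
print(merged)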
Innovative Use of Meta-Analysis to Evaluate Large-Scale Multi-Site Federal Initiatives
Bruce Ellis, Battelle Memorial Institute, ellis@battelle.org
James Derzon, Battelle Memorial Institute, derzonj@battelle.org
Ping Yu, Battelle Memorial Institute, yup@battelle.org
Sharon Xiong, Battelle Memorial Institute, xiongx@battelle.org
A common challenge in conducting large-scale, multi-site evaluations of school safety and substance abuse prevention efforts has been the inability to measure and analyze changes in implementation and outcomes over time, because the types of data and the timeframes of data collection vary across sites. The proposed paper discusses an innovative approach to addressing this challenge through the use of meta-analysis. Specifically, the paper describes how outcome data are collected from different sites; how outcome data gathered with different instruments are processed and prepared for meta-analysis; and how the processed data are analyzed to assess changes over time in outcomes relating to school safety and student health. The presentation will benefit evaluators in operationalizing data collection efforts, developing meta-analytic databases, and applying specific meta-analytic techniques to analyze diverse data.
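
Although the abstract does not specify which estimators the evaluation uses, the sketch below shows one standard way to pool outcomes measured on different instruments: convert each site's change to a standardized mean difference (Cohen's d) and combine the effects with inverse-variance weights plus a DerSimonian-Laird random-effects adjustment for between-site heterogeneity. The site data are hypothetical; only the formulas are standard.

import math

def cohens_d(m1, s1, n1, m2, s2, n2):
    """Standardized mean difference and its approximate variance."""
    sp = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp
    var = (n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2))
    return d, var

# Hypothetical sites: (post mean, post SD, post n, pre mean, pre SD, pre n),
# each measured on its own instrument's scale.
sites = [
    (3.4, 1.1, 120, 3.0, 1.2, 118),
    (52.0, 9.5, 210, 49.5, 10.1, 205),
    (0.71, 0.24, 95, 0.66, 0.25, 97),
]

effects = [cohens_d(*s) for s in sites]
w = [1 / v for _, v in effects]                       # fixed-effect weights
d_fe = sum(wi * di for wi, (di, _) in zip(w, effects)) / sum(w)

# DerSimonian-Laird estimate of between-site heterogeneity (tau^2).
q = sum(wi * (di - d_fe)**2 for wi, (di, _) in zip(w, effects))
c = sum(w) - sum(wi**2 for wi in w) / sum(w)
tau2 = max(0.0, (q - (len(effects) - 1)) / c)

# Random-effects pooled estimate incorporating tau^2.
w_re = [1 / (v + tau2) for _, v in effects]
d_re = sum(wi * di for wi, (di, _) in zip(w_re, effects)) / sum(w_re)
se_re = math.sqrt(1 / sum(w_re))
print(f"pooled d = {d_re:.3f} (SE {se_re:.3f}), tau^2 = {tau2:.3f}")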
