Session Title: Management and Analysis of National Multisite Program Evaluation Data: Center for Substance Abuse Prevention's Data Analysis Coordination and Consolidation Center
Multipaper Session 757 to be held in Sebastian Section I3 on Saturday, Nov 14, 10:55 AM to 11:40 AM
Sponsored by the Cluster, Multi-site and Multi-level Evaluation TIG
Chair(s):
Beverlie Fallik, United States Department of Health and Human Services, beverlie.fallik@samhsa.hhs.gov
Discussant(s):
Beverlie Fallik, United States Department of Health and Human Services, beverlie.fallik@samhsa.hhs.gov
Abstract: Multisite and cross-site program evaluation poses specific data management challenges. This session will describe the data assessment, validation, cleaning, management, and analysis processes at the Center for Substance Abuse Prevention's Data Analysis Coordination and Consolidation Center (DACCC). After a brief introduction to the DACCC and its activities, two papers will demonstrate the Center's two major functions: data management and data analysis. The first paper, presented by the DACCC's Data Management Team Lead, will demonstrate the Center's data quality assurance procedures, covering topics such as procedures for quality assessment, standard data cleaning rules, and statistics on frequently encountered threats to data quality. The second paper, delivered by the DACCC's Data Analysis Team Lead, will demonstrate how the cleaned data are analyzed by presenting the results of an analysis of program outcomes within the context of site-specific factors.
Data Quality Assessment and Data Management Practices: An Example From the Center for Substance Abuse Prevention's Program Evaluation Data
P Allison Minugh, Datacorp, aminugh@mjdatacorp.com
Nicolletta A Lomuto, Datacorp, nlomuto@mjdatacorp.com
Susan L Janke, Datacorp, sjanke@mjdatacorp.com
Meeting performance goals in the context of "real world" program evaluation is a critical evaluation task. To demonstrate whether programs are effective, data must be trustworthy. This presentation focuses on common data quality issues encountered in evaluating the Center for Substance Abuse Prevention's (CSAP's) grant programs. Focusing on the data quality assessment procedures used by CSAP's Data Analysis Coordination and Consolidation Center (DACCC), the presentation describes common threats to data quality, the DACCC's procedures for evaluating and compiling information on data quality, the feedback loop established between the DACCC and CSAP's grantees to improve data integrity, and the Center's data cleaning and management processes, which are designed to respond to these assessments and to the dialog with grantees. Data quality statistics will be presented for a variety of CSAP's prevention programs, and the impact of data quality on key outcome and mediating variables will be discussed.
The Impact of Program Dosage and Intervention Strategy on Program Outcomes: An Analysis of Data Submitted to the Center for Substance Abuse Prevention
Nilufer Isvan, Human Services Research Institute, nisvan@hsri.org
Lavonia Smith LeBeau, Human Services Research Institute, llebeau@hsri.org
This presentation demonstrates the data analysis activities of the Center for Substance Abuse Prevention's Data Analysis Coordination and Consolidation Center, assessing individual-level baseline and exit data within the context of specific program characteristics. The focus will be on the role of program dosage, service delivery format, and intervention type in evaluating program outcomes for individual participants. Preliminary findings based on approximately 8,000 program participants from multiple sites suggest that program dosage affects outcomes only when considered in the context of service type, delivery format (group vs. individual), and the specific combination of intervention strategies implemented at the grantee site. Additional results from a detailed multivariate analysis will be presented, further investigating the interaction of these site-specific contextual factors with participant characteristics in predicting participant- and site-level program outcomes.