Evaluation 2008


Session Title: Closing the Loop From Data to Action: How Evaluation Feeds Program Improvement
Panel Session 690 to be held in Centennial Section A on Friday, Nov 7, 4:30 PM to 6:00 PM
Sponsored by the Organizational Learning and Evaluation Capacity Building TIG
Chair(s):
Thomas Chapel,  Centers for Disease Control and Prevention,  tchapel@cdc.gov
Abstract: In large organizations, evaluation and strategic planning often exist in hermetically sealed boxes. Planning and evaluation should instead become an iterative cycle: 'What do we do?' - 'How are we doing?' - 'What should we do differently?' - 'What do we do?' This requires attention to extracting and interpreting evaluation results and defining their implications for the next program action. This panel presents the experience of program staff who have worked through program description, created mechanisms for feedback, and used those mechanisms to turn evaluation results into program change. The presentation provides a fuller description and definition of feedback mechanisms, how they work, and why they are important. It also details the evaluation approaches of the presenters' programs, their processes for ensuring feedback, and the ways in which evaluation results have been turned into immediate program change.
Walking the Talk: Evaluation Is as Evaluation Does
Matt Gladden,  Centers for Disease Control and Prevention,  mgladden@cdc.gov
Michael Schooley,  Centers for Disease Control and Prevention,  mschooley@cdc.gov
Rashon Lane,  Centers for Disease Control and Prevention,  rlane@cdc.gov
In addition to supporting evaluation requirements for funded programs, the Centers for Disease Control and Prevention's (CDC) Division for Heart Disease and Stroke Prevention (DHDSP) has turned its attention inward to institute a portfolio of evaluation activities to enhance evidence-based decision making and assess DHDSP as an enterprise. In contrast to a discrete program evaluation, DHDSP evaluators are working to build an organizational culture that values and routinely uses evaluation techniques to enhance the effectiveness of DHDSP initiatives. This requires simultaneously conducting program evaluations, fostering the use of findings, supporting evaluative thinking, and building evaluation capacity. We discuss lessons learned about building a culture of evaluation in a complex organization, including establishing systems that support evaluation, defining boundaries and priorities for evaluation work, developing and maintaining credibility, and fostering both evaluative thinking and rigorous program evaluation.
Development of a Strategic Planning Process to Complement an Existing Evaluation System
Tessa Crume,  Rocky Mountain Center for Health Promotion and Education,  tessac@rmc.org
Jill Elnicki,  Rocky Mountain Center for Health Promotion and Education,  jille@rmc.org
Pat Lauer,  Rocky Mountain Center for Health Promotion and Education,  patl@rmc.org
Karen Debrot,  Centers for Disease Control and Prevention,  kdebrot@cdc.gov
Sound program planning is critical to useful program evaluation, but too often these processes are carried out independently. CDC's Division of Adolescent and School Health (DASH) developed a strategic planning process that serves as a central link among program planning, implementation, and evaluation, connecting planning and evaluation as parts of the program improvement process. This presentation will describe how DASH incorporated planning tools, such as logic models and SMART objectives, and evaluation tools, such as process evaluation measures, into a strategic planning process that generates strategies to achieve long-term program goals.
Making Good on the End Game: Putting Evaluation Findings to Work for School-Based Asthma Programs
Marian Huhman,  Centers for Disease Control and Prevention,  mhuhman@cdc.gov
Cynthia Greenberg,  Centers for Disease Control and Prevention,  cgreenberg@cdc.gov
Laura Burkhard,  Centers for Disease Control and Prevention,  lburkhard@cdc.gov
Pam Luna,  Centers for Disease Control and Prevention,  pluna@cdc.gov
The prevalence of asthma among school-aged youth (10.1%) has led many schools to implement asthma management programs for their students. The Division of Adolescent and School Health (DASH) of the Centers for Disease Control and Prevention (CDC) has provided evaluation technical assistance to selected asthma management programs to help them assess the fidelity of intervention implementation and determine the short-term impact on students' asthma management. Programs are now using the evaluation findings in various ways, including arguing for the need to expand the program, implementing changes in the intervention, and influencing adjustments in policy and practice. This presentation will describe how the Albuquerque Public School District generated evaluation findings about its asthma programs and how those findings were leveraged for program improvements.
