Evaluation 2009



Session Title: Frameworks for Evaluation
Multipaper Session 589 to be held in Sebastian Section I2 on Friday, Nov 13, 3:35 PM to 4:20 PM
Sponsored by the Cluster, Multi-site and Multi-level Evaluation TIG
Chair(s):
Pamela Bishop, University of Tennessee at Knoxville, pbaird@utk.edu
Development of a Visual Program Theory Framework for Multilevel Evaluation of the National Institute for Mathematical and Biological Synthesis
Presenter(s):
Pamela Bishop, University of Tennessee at Knoxville, pbaird@utk.edu
Abstract: Evaluators of multilevel programs face the daunting challenge of organizing and aligning evaluation plans that respond to the unique needs of each project while remaining focused on the overall objectives of both the program and its funding agency. Additionally, evaluators must communicate a complex evaluation framework to stakeholders who may be unfamiliar with the processes and terminology of program evaluation. The current multilevel evaluation framework covers a mathematical biology institute offering many levels of both research- and education/outreach-oriented projects. The proposed paper outlines the presenter's method for developing a visual program theory model that aligns resources and outcomes at all program levels and serves as a utilization-focused, collaborative communication tool for developing the program theory and evaluation process with program stakeholders.
Reach, Effectiveness, and Implementation: A Reporting Framework for Multisite Evaluation in Public Health
Presenter(s):
Douglas Fernald, University of Colorado Denver, doug.fernald@ucdenver.edu
Mya Martin-Glenn, University of Colorado Denver, mya.martin-glenn@uchsc.edu
Abigail Harris, University of Colorado Denver, abigail.harris@ucdenver.edu
Stephanie Phibbs, University of Colorado Denver, stephanie.phibbs@ucdenver.edu
Vicki Weister, University of Colorado Denver, vicki.weister@udenver.edu
Elizabeth Ann Deaton, University of Colorado Denver, elizabeth.deaton@ucdenver.edu
Nicole Tuitt, University of Colorado Denver, nicole.tuitt@ucdenver.edu
Arnold Levinson, University of Colorado Denver, arnold.levinson@ucdenver.edu
Abstract: A voter-approved tobacco tax in Colorado supports a variety of public health projects across several tobacco-related disease areas. To evaluate a complex portfolio of funding that covers a range of project designs, target populations, and diseases, we sought an evaluation framework that could: 1) guide our assessment of current programming, and 2) guide the development of a standard set of reporting tools for individual projects that would fit within their existing budgets. Because projects had to demonstrate an evidence base for their work, our evaluation emphasized explaining program reach and implementation over effectiveness. Drawing on existing work that emphasizes evaluating external validity in individual interventions, we developed a reporting toolkit to capture reach and implementation data in a standardized format. This paper describes the development and implementation of this reporting framework and toolkit for projects.

