Evaluation 2009



Session Title: Contextualizing the Evaluand: Planning and Implementing an Evaluation of the Injury Control Research Center (ICRC) Program
Multipaper Session 486 to be held in Wekiwa 7 on Friday, Nov 13, 9:15 AM to 10:45 AM
Sponsored by the Research, Technology, and Development Evaluation TIG
Chair(s):
Sue Lin Yee, Centers for Disease Control and Prevention, sby9@cdc.gov
Discussant(s):
Thomas Bartenfeld, Centers for Disease Control and Prevention, tbartenfeld@cdc.gov
Abstract: The context of a program and the perspectives of key stakeholders introduce a set of assumptions that influence the planning and implementation of an evaluation and the ultimate utility of its findings. In evaluations of research and technology programs, systematic attention to these assumptions yields an evaluation that thoughtfully addresses the competing realities of different contexts. In 2008, CDC's National Center for Injury Prevention and Control (NCIPC) conducted a portfolio evaluation of 12 Injury Control Research Centers (ICRCs) using the CDC Framework for Program Evaluation, a utilization-focused planning tool. The evaluation team will discuss the dilemmas, and the resulting solutions, that arose from addressing the myriad contexts in stakeholder engagement, clarifying the evaluation focus, data collection and analysis, and communicating findings. In closing, we will offer lessons learned that will be insightful for any evaluator of research and technology seeking to maximize the utility of their evaluation.
Negotiating Diverse Contexts and Expectations in Stakeholder Engagement
Sue Lin Yee, Centers for Disease Control and Prevention, sby9@cdc.gov
In most evaluations, the contexts and perspectives of key stakeholders overlap and often compete with one another. Conducting an evaluation that is meaningful and useful to all stakeholders requires an understanding of each stakeholder's expectations from the beginning and a willingness to revisit them throughout the evaluation. In the ICRC Portfolio Evaluation, the primary stakeholders are the university grantees conducting research, training, and coordination of injury activities, and the funder, CDC's National Center for Injury Prevention and Control (NCIPC). Secondary stakeholders also play an important role in assessing the program and providing recommendations. To negotiate these diverse contexts and expectations, the ICRC Portfolio Evaluation Workgroup was established to guide the planning, implementation, and use of evaluation findings. This presentation will describe the perspectives of the major stakeholders and the contexts in which they operate, and offer strategies for sustained interaction despite the reality of varied priorities and power differentials.
Clarifying the Evaluation Focus in a Complex Program Context
Howard Kress, Centers for Disease Control and Prevention, hak6@cdc.gov
This presentation describes the iterative and dynamic processes undertaken to focus the purpose of the ICRC Portfolio Evaluation. Specifically, we developed the following tools to guide our understanding of the program context and that of the key stakeholders: (1) a hierarchical tree of the evaluation questions, (2) a program conceptual model, and (3) two logic models that describe the ICRC program. We will describe the iterative process of developing, vetting, and validating the evaluation questions against the logic models, and discuss the manner in which these tools laid the groundwork for subsequent phases of the evaluation. The presentation will close with lessons learned that should be helpful for other research and technology evaluations seeking to clarify their evaluation focus as well as negotiate a complex program context.
Considering Context in Data Collection and Analysis
Jamie Weinstein, MayaTech Corporation, jweinstein@mayatech.com
The ICRC Portfolio Evaluation Team addressed the context and perspectives of the stakeholders in identifying the most appropriate data collection and analysis methods. The complex nature of the program context seemed best explored through qualitative data collection methods. The evaluation employed a four-phase data collection approach, in which each phase was designed to meet the specific needs of the evaluation and maximize the utility of the findings. Qualitative data were collected through site visits to two centers, teleconference interviews with each of the twelve participating centers, and teleconference interviews with past and current CDC staff. At every stage of data collection and analysis, iterative analyses ensured that the evaluation questions and purpose linked back to the evaluation goals and the needs of the key stakeholders. Challenges faced during the data collection process will be discussed and lessons learned will be shared.
Contextual Influences and Constraints on Communicating Findings
Kristianna Pettibone, MayaTech Corporation, kpettibone@mayatech.com
A critical component of any evaluation is sharing findings with stakeholders. The ICRC Portfolio Evaluation involved multiple stakeholders who brought an array of perspectives on how the findings should be shared. As the funder, CDC's National Center for Injury Prevention and Control provided the primary context for determining the utility of the evaluation, which initially was to produce an internal report documenting accountability and identifying areas for program improvement. The involvement of stakeholders such as the ICRC directors and CDC staff introduced another set of contextual assumptions that influenced decisions about sharing findings. Finally, we discuss other potential uses of the evaluation findings that emerged in conducting the evaluation and identifying recommendations for program improvement. This presentation examines the influences and constraints that key stakeholders introduced on sharing findings and proposes strategies by which evaluators of research and technology can manage these competing contexts.
