Evaluation 2009



Session Title: Evaluation Capacity Building: Taking Stock of and Advancing Frameworks, Strategies, Outcomes and Measurement
Panel Session 734 to be held in Wekiwa 5 on Saturday, Nov 14, 9:15 AM to 10:45 AM
Sponsored by the Collaborative, Participatory & Empowerment Evaluation TIG
Chair(s):
Yolanda Suarez-Balcazar, University of Illinois at Chicago, ysuarez@uic.edu
Abstract: Although the literature on evaluation capacity building (ECB) has grown, there is little synthesis of what we have learned thus far about ECB conceptualizations, strategies, and outcomes. This panel will address these issues. The first paper, by Labin et al., describes the method and findings from a synthesis of 100 ECB empirical studies using a framework developed by Duffy et al. The second paper, by Duffy et al., describes the framework as it was initially conceptualized and discusses how it has been revised based on findings from the synthesis presented by Labin et al. The third and fourth papers describe the validation of a contextual model of ECB using two complementary approaches: in the third, Suarez-Balcazar et al. describe a multi-method multiple case study approach, and in the fourth, Taylor-Ritzler et al. describe a large quantitative validation approach. All presenters will discuss implications for ECB theory, research, and practice.
Synthesis of Evaluation Capacity Building Literature
Susan N Labin, Independent Consultant, slabin@pobox.com
Jennifer L Duffy, University of South Carolina, jenniferlouiseduffy@gmail.com
Duncan Meyers, University of South Carolina, meyersd@mailbox.sc.edu
Abraham Wandersman, University of South Carolina, wandersman@sc.edu
Evaluation capacity building (ECB) is a growing area, reflecting an ever-increasing expectation of evidence that programs are achieving their goals. Community-based organizations must often demonstrate their ability to engage in program evaluation in order to obtain funding. ECB also reflects the shift toward participatory evaluation methods; for example, capacity building is one of the basic precepts of empowerment evaluation. This presentation focuses on the method and findings from a synthesis based on detailed coding of approximately 100 ECB empirical studies. The conceptual framework used is described in the Duffy et al. presentation. The broad-based synthesis method derives from the evaluation syntheses developed at the Government Accountability Office and includes qualitative and quantitative data from a variety of evaluation designs. The research questions focus on ECB strategies, evaluations, and outcomes, and include some contextual and implementation variables. Reporting will also cover challenges encountered and lessons learned.
A Conceptual Framework for Assessing the Empirical Literature on Evaluation Capacity Building
Jennifer L Duffy, University of South Carolina, jenniferlouiseduffy@gmail.com
Duncan Meyers, University of South Carolina, meyersd@mailbox.sc.edu
Susan N Labin, Independent Consultant, slabin@pobox.com
Abraham Wandersman, University of South Carolina, wandersman@sc.edu
As the practice and research of evaluation capacity building (ECB) have grown, multiple useful frameworks have been developed to highlight important aspects of the ECB process. In preparation for a synthesis of the empirical research on ECB, key frameworks and theories were reviewed to guide coding and analysis. We then constructed a unified framework by integrating concepts from theory and existing frameworks into a general logic model format, including needs leading to ECB, activities characterizing ECB, and outcomes attributed to ECB. This presentation will describe the framework as it was initially conceptualized and discuss how it has been revised based on findings from the synthesis presented by Labin et al. Elements of the framework that lack empirical research will be highlighted, and implications for ECB research and practice will be discussed.
Evaluation Capacity: Model Validation Using a Mixed Method Multiple Case Study Approach
Yolanda Suarez-Balcazar, University of Illinois at Chicago, ysuarez@uic.edu
Tina Taylor-Ritzler, University of Illinois at Chicago, tritzler@uic.edu
Edurne Garcia-Iriarte, University of Illinois at Chicago, edurne21@yahoo.com
Within the ECB literature, there are currently no published examples of validated instruments or systems for assessing evaluation capacity. Together with the fourth presentation in this panel, this presentation introduces the Evaluation Capacity Building (ECB) Contextual Model and shares data on its validation, using an in-depth mixed method multiple case study approach. The model was developed based on work with diverse human service organizations and on reviews of the evaluation, cultural competence, and capacity building literatures. To begin to address the lack of validated models of evaluation capacity in the field, our team used a multiple case study approach to validate the ECB Contextual Model, drawing on survey, interview, and document review data collected from multiple stakeholders within nine organizations. In this presentation we will share the measures, analytic methods, and results of the case study approach to validation.
Evaluation Capacity: Model Validation Using Quantitative Methods
Tina Taylor-Ritzler, University of Illinois at Chicago, tritzler@uic.edu
Yolanda Suarez-Balcazar, University of Illinois at Chicago, ysuarez@uic.edu
Edurne Garcia-Iriarte, University of Illinois at Chicago, edurne21@yahoo.com
In this presentation we will present the methods and results of a large quantitative validation study of our ECB Contextual Model using a survey developed by our team. We will describe the survey methods and analytic results, and share the final model and measure with audience members. We will present quantitative analyses of the relative importance of individual factors (i.e., awareness of the benefits of evaluation, motivation to conduct evaluation, and knowledge and skills related to conducting evaluation) and organizational factors (i.e., leadership, learning climate, resources, and funders' demands) for evaluation capacity outcomes (i.e., use and mainstreaming of evaluation practices). We will conclude by comparing and contrasting the results of the case study and quantitative approaches to validation and by discussing the implications for future research and practice related to conceptualizing and measuring ECB.

