Evaluation 2011


Session Title: From Theory to Practice: Potential Avenues via Evaluation Capacity Building and Research on Evaluation
Panel Session 578 to be held in Laguna A on Friday, Nov 4, 8:00 AM to 9:30 AM
Sponsored by the Research on Evaluation TIG
Chair(s):
Leslie Fierro, Claremont Graduate University, Leslie.Fierro@cgu.edu
Discussant(s):
Christina Christie, University of California, Los Angeles, tina.christie@ucla.edu
Abstract: Evaluation scholars have repeatedly called for research on evaluation, in particular applied research. Since evaluation is largely a professional discipline, it is important for research on evaluation to move into the applied realm by generating information and developing methodologies that practitioners can widely adopt. Panelists in this session will demonstrate how the processes and products of evaluation research and practice can inform each other. The first set of presenters will focus on evaluation capacity building (ECB), describing a model generated from the existing empirical and theoretical knowledge base on ECB and organizational learning and connecting it to practical ECB approaches. The second set of presenters will turn the focus toward the use of evaluation techniques for explicating evaluation and program theory, discussing how tools used in researching evaluation can translate into improved evaluation practice and potentially accelerate the sharing of important insights between evaluation scholars and practitioners.
Making Sense of "Capacity" in the Evaluation Process by Leveraging Existing Theories and Frameworks
Leslie Fierro, Claremont Graduate University, Leslie.Fierro@cgu.edu
Evaluation capacity has become a topic of great interest among members of the professional evaluation community. Multiple frameworks depicting the components of evaluation capacity within organizations have been published, as have multiple examples of "interventions" designed to build evaluation capacity. What is missing from the existing literature is a clear connection between these elements of capacity and how they support continuous evaluative inquiry processes within organizations. This presentation will use an existing model of organizational learning to connect evaluation capacity to doing evaluation, collectively making sense of evaluative findings, and acting on those findings. Implications for using this model to stimulate plans for evaluation capacity building efforts will be explored.
A Framework for Understanding Contextual Motivating Factors of Evaluation Capacity Building
Anne Vo, University of California, Los Angeles, annevo@ucla.edu
Evaluation capacity building (ECB), as an activity and area of research, has continued to pique the interest of evaluation practitioners, scholars, and, more recently, policymakers. This is evidenced by the policy initiatives and federal calls for research on evaluation practice and methodology enacted under the Obama Administration. However, the ECB literature suggests that there are discordant views about what "evaluation capacity building" means and where, when, how, why, and with whom it occurs. To begin addressing these issues, a deeper understanding of the contextual conditions that enable ECB is needed. This presentation will discuss a framework for understanding the potential motivating factors that lead up to and foster ECB, grounded in examples of ECB efforts taking place within a coalition of university-based educational outreach programs. Directions for future research on ECB will also be suggested.
Mixed Model Theory Development: Building a Theory-informed and Practice-informed Model of Evaluation
Michael Harnar, Claremont Graduate University, michaelharnar@gmail.com
Understanding how evaluation approaches are applied helps inform our understanding of evaluation theories, and exploring this interaction of practice and theory is at the heart of research on evaluation. Because evaluations are much like the programs we evaluate, in that our activities are expected to lead to outcomes, applying a program theory-driven evaluation technique, that is, modeling evaluation practice itself, should increase our understanding of our approaches. This presentation extends previous evaluation theory modeling by describing a process that engages evaluators in modeling their practice with theoretically derived variables. In this method, evaluators model their preferred practice in online modeling software, and the resulting models are combined into one representative model that evaluators then review and comment on, improving the model's fidelity to practice. The final product is a theory- and practice-informed picture that can then be analyzed and tested for comprehensiveness and consistency in practice.
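The combination step can be pictured concretely. The sketch below shows one hypothetical way to merge evaluators' individual practice models, represented as directed graphs over theory-derived variables, into a single representative model; the function, the majority threshold, and all variable names are illustrative assumptions, not the presenter's actual procedure.

```python
# A minimal sketch of one way the model-combination step could work.
# All names (variables, threshold) are hypothetical.
from collections import Counter
import networkx as nx

def combine_models(models, threshold=0.5):
    """Merge evaluators' directed practice models into one representative
    model, keeping links drawn by at least `threshold` of the evaluators."""
    edge_counts = Counter()
    for g in models:
        edge_counts.update(g.edges())
    combined = nx.DiGraph()
    for (u, v), n in edge_counts.items():
        if n / len(models) >= threshold:
            combined.add_edge(u, v, support=n / len(models))
    return combined

# Three evaluators' hypothetical models over theory-derived variables.
m1 = nx.DiGraph([("stakeholder_engagement", "use"), ("rigor", "credibility")])
m2 = nx.DiGraph([("stakeholder_engagement", "use"), ("credibility", "use")])
m3 = nx.DiGraph([("stakeholder_engagement", "use"), ("rigor", "credibility")])

representative = combine_models([m1, m2, m3])
print(sorted(representative.edges(data=True)))
```

Links endorsed by a majority of evaluators survive into the representative model, which the group can then review and revise, as the abstract describes.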
Testing Program Theory Using Structural Equation Modeling
Patricia Quiñones, University of California, Los Angeles, pquinones@ucla.edu
Anne Vo, University of California, Los Angeles, annevo@ucla.edu
Developing methods to empirically test a program's underlying theory, that is, the linkages that connect program inputs and activities to outputs and outcomes on a logic model, has presented enduring challenges for evaluators. In practice, due to limited resources, measurement difficulties, and a lack of quality data, programs tend to be evaluated primarily in terms of their component parts; as a result, only individual linkages are tested. While examining a program and its theory as a whole may be challenging, we maintain that holistic evaluation remains a worthwhile and feasible endeavor. In this presentation, we propose a framework for using structural equation modeling (SEM) to evaluate a program and its theory in their entirety. We address ways in which one can transform a program theory into structural and measurement models, and we discuss practical and methodological issues to consider as one engages in this process.
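To make the translation from logic model to SEM concrete, here is a minimal sketch in Python using the semopy package. The presenters do not specify any particular software, and every construct and indicator name below is an illustrative assumption.

```python
# A minimal sketch, not the presenters' actual analysis. The package
# choice (semopy) and all variable names are illustrative assumptions.
import pandas as pd
from semopy import Model

# Measurement model: each latent construct (=~) is measured by
# observed indicators. Structural model: the program theory's
# hypothesized links become directed regression paths (~).
desc = """
ProgramDose =~ sessions_attended + hours_mentored
Engagement =~ engage1 + engage2 + engage3
Outcomes =~ test_score + grad_intent
Engagement ~ ProgramDose
Outcomes ~ Engagement
"""

data = pd.read_csv("program_data.csv")  # hypothetical dataset
model = Model(desc)
model.fit(data)
print(model.inspect())  # loadings, path coefficients, p-values
```

The "=~" lines form the measurement model (constructs and their indicators), while the "~" lines encode the logic model's input-to-outcome chain as the structural model, so the whole theory is estimated jointly rather than one linkage at a time.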
