Evaluation 2008


Contact emails are provided for one-to-one contact only and may not be used for mass emailing or group solicitations.

Session Title: Using Technology to Push the Boundaries of Theory-Driven Evaluation Science: Implications for Policy and Practice
Panel Session 829 to be held in Centennial Section E on Saturday, Nov 8, 9:50 AM to 10:35 AM
Sponsored by the Program Theory and Theory-driven Evaluation TIG
Chair(s):
Stewart I Donaldson, Claremont Graduate University, stewart.donaldson@cgu.edu
Discussant(s):
John Gargani, Gargani and Company Inc, john@gcoinc.com
Amanda Bueno, First 5 LA, abueno@first5la.org
Abstract: Theory-driven evaluation science is now being used across the globe to develop and evaluate a wide range of programs, practices, large-scale initiatives, policies, and organizations. One of the great challenges faced while using this approach is addressing high levels of complexity. In this session, we will illustrate new breakthroughs in the development of conceptual frameworks to guide evaluation practice and policy. A range of practical examples from evaluation practice will be presented to illustrate the value of these new tools and processes. The implications for evaluation policy and practice will be explored.
Using Interactive Software to Develop Complex Conceptual Frameworks to Guide Evaluations
Christina Christie, Claremont Graduate University, tina.christie@cgu.edu
Stewart I Donaldson, Claremont Graduate University, stewart.donaldson@cgu.edu
Tarek Azzam, Claremont Graduate University, tarek.azzam@cgu.edu
This paper will focus on how new advances in software design and application can be used to develop and apply program theory to improve evaluations. Based on the principles of program theory-driven evaluation science (Donaldson, 2007), a step-by-step approach will be reviewed to illustrate the value of using conceptual frameworks to engage stakeholders in a process that leads to accurate, useful, and cost-effective program evaluations. New interactive software will be demonstrated, showing how it can be used to engage stakeholders, facilitate needs assessments, develop program theory, formulate and prioritize evaluation questions, help answer key evaluation questions, and communicate evaluation findings in ways that increase use and influence. The implications for improving evaluation policy and practice will be discussed.
Examples of Complex, Interactive Conceptual Frameworks to Guide Evaluation Planning, Enhance Data Analysis, and Communicate Findings
Tarek Azzam, Claremont Graduate University, tarek.azzam@cgu.edu
Stewart I Donaldson, Claremont Graduate University, stewart.donaldson@cgu.edu
Christina Christie, Claremont Graduate University, tina.christie@cgu.edu
This paper will explore several examples from evaluation practice of technology-enhanced logic models, program theories, and more complex conceptual frameworks used to guide evaluation practice and policy. These technology-enhanced evaluations span education, health, and child development programs. Furthermore, an in-depth analysis of a technology-enhanced evaluation of a full portfolio of early childhood initiatives totaling over $800 million will be presented to illustrate how these new tools can be used to address problems of complexity, to guide strategic management processes, and to improve evaluation policy and practice.

