

Session Title: The Impact of Evaluation Capacity Building Activities
Multipaper Session 223 to be held in Suwannee 12 on Thursday, Nov 12, 9:15 AM to 10:45 AM
Sponsored by the Organizational Learning and Evaluation Capacity Building TIG and the Health Evaluation TIG
Chair(s):
Andrea Fuhrel-Forbis, University of Michigan, andreafuhrel@yahoo.com
Discussant(s):
Molly Engle, Oregon State University, molly.engle@oregonstate.edu
A Multidisciplinary Organizational Capacity-Building Partnership Working in Kenya: Developing, Implementing and Evaluating HIV Prevention for Kenyan Youth
Presenter(s):
Leah Neubauer, DePaul University, lneubaue@depaul.edu
Gary Harper, DePaul University, gharper@depaul.edu
Alexandra Murphy, DePaul University, amurphy1@depaul.edu
Andrew Riplinger, DePaul University, aripling@depaul.edu
Theresa Abuya, Kenyan Episcopal Conference-Catholic Secretariat, abuya@catholicchurch.or.ke
Amanda Gibbons, American International Health Alliance, agibbons@aiha.com
Abstract: Through a partnership managed by the American International Health Alliance, with United States President's Emergency Plan for AIDS Relief (PEPFAR) funding, the Kenyan Episcopal Conference-Catholic Secretariat in Nairobi and DePaul University in Chicago have formed a global capacity-building partnership that provides an integrated, multi-pronged approach to developing, delivering, and evaluating HIV prevention for Kenyan youth. The multi-layered capacity-building approach is designed to address the complex context of international HIV prevention. Partners bring diverse perspectives on multi-ethnic post-colonial Kenyan culture, Kenya's free primary education system, interpersonal communication, the contributing factors associated with the HIV/AIDS epidemic, and monitoring and evaluation. Collaboration across this wide range of disciplines contributes to the creation of sustainable, culturally sensitive programs tailored to the unique needs and conditions of individual communities in Kenya. The collaborations require continual program monitoring, feedback, and evaluation to ensure optimal inputs, outputs, and overall functioning.
Providing Evaluation Technical Assistance to Community Organizations: Context Matters
Presenter(s):
Jessi Erickson, Saint Louis University, ericksjl@slu.edu
Cheryl Kelly, Saint Louis University, kellycm@slu.edu
Web Brown, Missouri Foundation for Health, wbrown@mffh.org
Nancy Mueller, Washington University, nmueller@gwbmail.wustl.edu
Virginia Houmes, Washington University, vhoumes@gwbmail.wustl.edu
Darcell Scharff, Saint Louis University, scharffd@slu.edu
Abstract: To address increasing rates of obesity in Missouri, the Missouri Foundation for Health (MFH) created the Model Practice Building (MPB) funding program under its Healthy & Active Communities Initiative to identify successful practices that promote healthy and active lifestyles. This funding supports 20 grantees working with different populations (e.g., children) and in different settings (e.g., rural communities, schools). MFH funded an external evaluation team to provide intensive evaluation technical assistance (TA) and training to increase grantees' evaluation capacity and skills. TA activities were developed to address the unique needs and varying contexts in which the grantees operate. TA methods include: (1) identification of TA needs; (2) development of individualized TA plans; (3) ongoing support through site visits, conference calls, and email; (4) assistance with data collection, management, and analysis (e.g., developing surveys); and (5) development of evaluation training opportunities (e.g., workshops). Methods, lessons learned, and implications for the field of evaluation will be described.
Evaluation of Technical Assistance Influence Using Project Activity Networks
Presenter(s):
Peter Kreiner, Brandeis University, pkreiner@brandeis.edu
Abstract: This study developed a new approach to evaluating the influence of technical assistance on project progress in five communities funded to address youth substance abuse. The evaluation developed a 'citation' network of project activities, capturing, for each project activity, which prior activities gave rise to it, including technical assistance activities. We then applied the quantitative tools of citation network analysis to derive a measure of each activity's influence on downstream activities, yielding, for each community project: (1) the average influence of technical assistance activities; (2) the average influence of non-technical-assistance activities; and (3) the relative influence of technical assistance activities (the ratio of the two). Using these measures, we explored the relative influence of technical assistance activities over time, across projects, and across project activity categories. Project evolution, as displayed in these networks, and patterns of activity influence corresponded well with project coordinators' understanding.
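The abstract does not specify the exact influence metric, but the core computation it describes, scoring each activity by its downstream descendants in an acyclic citation network and comparing mean scores for technical-assistance versus other activities, can be sketched briefly. The Python sketch below is illustrative only: the reachable-descendant count is one common citation-network influence measure, not necessarily the paper's, and the activity names, function names, and example network are hypothetical.

```python
def downstream_counts(edges):
    """For each activity, count how many later activities it (transitively)
    gave rise to. `edges` maps an activity to the activities that cite it as
    a direct antecedent; the network is assumed to be acyclic."""
    memo = {}

    def reach(node):
        # Memoized set of all activities downstream of `node`.
        if node not in memo:
            descendants = set()
            for child in edges.get(node, ()):
                descendants.add(child)
                descendants |= reach(child)
            memo[node] = descendants
        return memo[node]

    nodes = set(edges) | {c for children in edges.values() for c in children}
    return {n: len(reach(n)) for n in nodes}


def relative_ta_influence(edges, ta_activities):
    """Ratio of the mean influence of technical-assistance (TA) activities
    to the mean influence of all other activities: measure (3) above."""
    counts = downstream_counts(edges)
    ta = [v for n, v in counts.items() if n in ta_activities]
    other = [v for n, v in counts.items() if n not in ta_activities]
    mean = lambda xs: sum(xs) / len(xs) if xs else 0.0
    return mean(ta) / mean(other) if mean(other) else float("inf")


# Hypothetical project network: a TA workshop gives rise to a planning
# meeting and a staff training, each of which spawns a later activity.
edges = {
    "ta_workshop": ["planning_meeting", "staff_training"],
    "planning_meeting": ["community_survey"],
    "staff_training": ["outreach_event"],
}
print(relative_ta_influence(edges, ta_activities={"ta_workshop"}))  # 8.0
```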
Evaluating Impact of Development Training and Capacity Building: Evaluation Approaches, Their Strengths and Limitations
Presenter(s):
Eric Abdullateef, Directed Study Services, eric.abdullateef@mac.com
Rahel Kahlert, University of Texas at Austin, kahlert@mail.utexas.edu
Abstract: This paper presentation analyses evaluation approaches and methods that are typically used to assess the impact of development evaluation training at several levels: impact on individuals' evaluation knowledge, on professional practice, and on institutional change. The authors discuss different types of evaluation approaches that use direct measures (e.g., tests), indirect measures (e.g., graduates' perceptions), and documentation of training usage (e.g., training provision to colleagues and engagement with the evaluation community). The authors point out the strengths and limitations of each approach and suggest ways to mitigate these limitations. Examples are drawn from evaluations of the International Program for Development Evaluation Training (IPDET.org), organized by the World Bank and Carleton University in Ottawa. Each year, approximately 30 evaluation specialists train 200 professionals from more than 50 countries in evaluation methods and applications for assessing social programs in developing nations.
