Session Title: Lessons From the Field: Zambian Case Study
Panel Session 421 to be held in Peale Room on Thursday, November 8, 3:35 PM to 5:05 PM
Sponsored by the International and Cross-cultural Evaluation TIG

Chair(s): Alice Willard, Independent Consultant, willardbaker@verizon.net

Abstract:
Learning is an essential component of the development cycle: at its core, learning means changing the way one does business or implements a particular project or program, adapting through reflection. Throughout the life cycle of a project or program, the overall orientation of any M&E system should therefore be geared toward learning, with an emphasis on continuous improvement: identifying best practices, highlighting opportunities, and enhancing the competencies of all implementing partners. To maximize the benefits of organizational learning, it is important that all stakeholders, including beneficiaries, be fully integrated into the process. With a strategic focus on the need for learning and the resources to accomplish it, Catholic Relief Services (CRS) will be able to share its best practices and successful models more widely while finding innovative ways to keep staff members learning and sharing what they learn. As an agency, CRS stands to learn much from the Zambian experience.

Step by Step: Key Points and Audiences for Learning in a Field Monitoring and Evaluation (M&E) System

Alice Willard, Independent Consultant, willardbaker@verizon.net

The basic cycle of assessment, design, implementation, monitoring, and evaluation (ADIME) has become axiomatic in development projects. International evaluators must be adept at each aspect of the cycle, whether they are part of the institution and can follow a project throughout the cycle, or whether they are more transient consultants within it. No matter the evaluator's locus, there are often missed opportunities for learning at each moment in the cycle. The type of learning and the opportunities for it vary with the particular audience the evaluator encounters: project staff, clients, local partners, and donors. This paper identifies key points for learning and discusses the types of learning possible for different audiences. The author will present options and opportunities for learning at those different points in, and for those different audiences of, the basic M&E system.
Catholic Relief Services (CRS)/Zambia: Impact of Lessons From Rigorous Evaluation

James Campbell, Catholic Relief Services, jcampbell@crszam.org.zm
Shannon Senefeld, Catholic Relief Services, ssenefeld@crs.org

Stringent evaluation criteria are necessary in any evaluation that seeks to attribute impact or outcome to a program intervention. These evaluations enable international development organizations to appropriately plan and implement programs with targeted interventions. However, the very process of conducting a stringent evaluation in many rural areas can lead to unexpected outcomes and learning.
In 2005, Catholic Relief Services (CRS) began a six-month evaluation to determine the impact of nutritional supplements on the anthropometric status and quality of life of HIV-positive clients in rural Zambia, using a three-armed approach (two experimental groups and one control). The evaluation relied on existing local partner networks for data collection and verification. Throughout the evaluation, CRS and its partners learned valuable lessons about implementing such a rigorous exercise in a resource-poor rural environment. This presentation highlights the key lessons and opportunities for learning for staff and participants.
Learning by Design: SUCCESS II and Partner Participation in the Monitoring and Evaluation Design

Thomas Moyo, Catholic Relief Services, tmoyo@crszam.org.zm
Alice Willard, Independent Consultant, willardbaker@verizon.net

In 2007, CRS/Zambia contracted an overhaul of the M&E system for one of its HIV/AIDS projects. The project had been renewed and expanded after an initial implementation cycle. At that time, both the donor and CRS wanted more outcome-level indicators, yet the implementing partners were quite resistant to any addition to, or complication of, already contentious reporting requirements. As part of identifying the moments of learning in the implementation cycle, the consultant worked closely with staff and partners to identify capacity and training requirements. One key element of the design was to establish mechanisms for all participants in the 'information food chain' to learn what the data meant, and what the data could mean for their own management decisions. This presentation describes the process of setting up an embedded learning function as a specific part of the M&E system, rather than an afterthought to the reporting cycle.