|
Monitoring and Evaluation (M&E) at Room to Read: Concepts and Practice in an International Non-Profit Educational Organization
|
| Presenter(s):
|
| Michael Wallace, Room to Read, michael.wallace@roomtoread.org
|
| Rebecca Dorman, Room to Read, rebecca.dorman@roomtoread.org
|
| Peter Cooper, Room to Read, peter.cooper@roomtoread.org
|
| Abstract:
Room to Read is a nonprofit organization committed to transforming the lives of children in developing countries by focusing on literacy and gender equality in education. Our M&E system is based on program-specific logical frameworks that include goals, objectives, and indicators. M&E is part of a performance management system that helps us improve planning and report results. This paper describes the challenges of matching practical activities with conceptual frameworks, including:
• Monitoring: choosing indicators (outputs and outcomes); developing a system for collecting and analyzing data; determining how much data is enough (census or sample); data collection frequency; and ensuring buy-in from stakeholders in data collection and analysis.
• Evaluation: choosing the type (outcome/summative or process/formative); developing questions and hypotheses; choosing an evaluator (internal or external); design decisions (number of countries, level of significance for program decisions); and working with the evaluator.
The paper concludes with lessons learned and challenges ahead.
|
|
Approach to Performance Measurement and Effectiveness of the Children’s Investment Fund Foundation
|
| Presenter(s):
|
| Nalini Tarakeshwar, Children's Investment Fund Foundation, nalini@ciff.org
|
| Peter McDermott, Children's Investment Fund Foundation, pmcdermott@ciff.org
|
| Anna Hakobyan, Children's Investment Fund Foundation, ahokobyan@ciff.org
|
| Tomohiro Hamakawa, Children's Investment Fund Foundation, thamakawa@ciff.org
|
| Abstract:
The Children’s Investment Fund Foundation (CIFF) invests in programmes that demonstrably improve the lives of children living in poverty in developing countries by achieving large-scale, sustainable impact in the areas of children’s health, nutrition, and education. Robust Performance Measurement and Effectiveness (PME) systems ensure ongoing course correction and independent evaluation throughout the life of each programme. Evaluations adopt a fit-for-purpose approach in which the methodology is shaped by feasibility and programmatic needs. Emphasis is placed on making relevant, timely, and high-quality data available for course correction and on leveraging findings for potential policy and practice change. CIFF attempts to take evidence-based, evaluation-centred programming to a new level in international development. This paper discusses the challenges and benefits of such an approach by presenting three cases: an HIV/AIDS initiative in India, a rural healthcare initiative in Ethiopia, and a last-mile health delivery model in Uganda.
|
|
A Strategic Planning Case Study: Implementing a Data Dashboard for a Nonprofit Board’s Self-Evaluation, Monitoring, and Evidence-based Decision Making
|
| Presenter(s):
|
| Veronica Smith, data2insight, veronicasmith@data2insight.com
|
| Abstract:
As strategic planning committee co-chair, I led an effort to more systematically monitor and evaluate a public radio station’s progress against its strategic plan, resulting in the design and implementation of a strategic planning dashboard. This paper serves as a practicum project report, summarizing the initiative and offering lessons learned. The process began with a facilitated board discussion, which produced a list of committee action items, including creating and organizing metrics for more frequent board review. We partnered with management and the board to identify metrics for the organization’s strategic priorities. We also designed the dashboard deliberately to ensure effective and efficient communication. Finally, we created practices and protocols to sustain data quality, use, and accuracy. The dashboard rollout represented a successful strategic planning initiative that 1) improved the partnership between management and the board and 2) increased the quality of both parties’ self-evaluation, monitoring, and evidence-based decision making.
|
|
Building and Implementing a Performance Management System to Inform Evaluation: Lessons Learned
|
| Presenter(s):
|
| Eric Barela, Partners in School Innovation, ebarela@partnersinschools.org
|
| Abstract:
This paper documents an education nonprofit’s efforts to build and implement a performance management system designed to provide data on both individual and organizational accountability and to inform internal and external evaluation efforts. In 2009, experts from the corporate world were hired to design the system’s architecture and roll out a prototype. Around this time, the nonprofit also experienced substantial changes in its leadership structure and began to consider scaling up. Consequently, the system also needed to inform evaluation efforts so the nonprofit could provide evidence of its effectiveness to attract potential new funders. This paper explores lessons learned from designing the system, ensuring that the generated data could inform both internal and external evaluation efforts, and changing the organization’s culture to recognize the need to link individual performance to programmatic effectiveness.
|