
Session Title: Valuing Innovation in Democracy Assistance Evaluation
Panel Session 918 to be held in El Capitan B on Saturday, Nov 5, 12:35 PM to 2:05 PM
Sponsored by the International and Cross-cultural Evaluation TIG
Chair(s):
Georges Fauriol, National Endowment for Democracy, georgesf@ned.org
Abstract: Democracy assistance presents a particular set of challenges to the field of evaluation. By their very nature, these projects and programs are difficult to evaluate: traditional methods are not always feasible given the conditions under which democracy assistance work takes place. This has led democracy assistance organizations to explore innovative methods to monitor and evaluate their work. This panel will explore the evaluation innovations of a grantmaking organization and its four core institutes working in the field of democracy assistance around the world.
Cumulative Assessments: Innovative Evaluation of Long-term Projects and Grantees
Rebekah Usatin, National Endowment for Democracy, rebekahu@ned.org
The National Endowment for Democracy (NED) is a private, nonprofit organization created in 1983 and funded through an annual congressional appropriation to strengthen democratic institutions around the world through nongovernmental efforts. NED's grants program provides support to grassroots organizations in more than 80 countries to conduct projects of their own design. The varied political and cultural contexts of NED grantees, coupled with the difficulty of attributing programmatic success to a single small grant, have led the Endowment to look for innovative methods for measuring the short- and long-term success of its grantees. This presentation will discuss the conception and implementation of a pioneering grantee self-assessment process launched in 2010.
Innovating Intuition: Documenting Program Development and Adaptation in the Evaluative Process
Liz Ruedy, International Republican Institute, eruedy@iri.org
With more than 25 years of experience in the democracy and governance sector, the International Republican Institute (IRI) has accumulated a wealth of institutional knowledge about how to design and implement effective programs. Efforts to monitor and evaluate these programs, however, depend on developing an in-depth understanding of what is often an intuitive process: identifying a specific set of needs, determining how and why proposed interventions will address those needs, and recognizing when and how a program must adapt as needs or circumstances change. In addition to developing "nuts and bolts" monitoring and evaluation tools, such as quantitative indicators, IRI is therefore applying innovative tools such as program theory framework templates, process journals, and outcome and system mapping to capture the logic and decisions that constitute the foundation of successful programs. This presentation will discuss the utility of these tools and their applicability to larger monitoring and evaluation processes.
Evidence-based Advocacy: An Innovative Approach to Evaluating Policy Advocacy
Joel Scanlon, Center for International Private Enterprise, jscanlon@cipe.org
The Center for International Private Enterprise (CIPE) works in partnership with local private sector organizations to strengthen democracy through market-oriented reform. CIPE's programs aim, in the short run, to build local private sector capacity and, in the longer term, to achieve institutional reforms through policy advocacy. As a democracy assistance organization, CIPE is interested in evaluating not only the policy outcomes of advocacy campaigns but also, equally importantly, the quality of partners' engagement in the political process. Policy advocacy is a relatively new focus area for evaluation, and advocacy evaluations have primarily been conducted for campaigns in the U.S. political environment. This presentation will discuss CIPE's ongoing development and implementation of tools, for both CIPE and its partner organizations, for monitoring and evaluating advocacy in its international programs.
Using Innovative Technologies to Bridge the DC-Field Divide in Evaluation Capacity Building
Linda Stern, National Democratic Institute, lstern@ndi.org
How does a democracy assistance organization build M&E capacity across its global programming? In 2008, the National Democratic Institute for International Affairs (NDI) set about answering this question, starting with a thorough self-assessment of its capacity to monitor, evaluate and learn from its programming. The assessment revealed a number of strengths as well as weaknesses, not least of which was a critical gap in the capacity of DC- and field-based staff to fully integrate evaluative practice into their project cycles. With support from the National Endowment for Democracy, NDI has developed an online learning portal designed to extend M&E capacity building processes, tools and resources to its staff throughout the NDI world. This presentation will briefly highlight organizational assessment methodologies and findings, and then share NDI's experience in using innovative technologies to bridge the DC-field divide.
Congruence of Principles and Practice: Innovative Evaluation of Workers' Rights Programs
Dona Dobosz, Solidarity Center, ddobosz@solidaritycenter.org
How does a democracy assistance organization reconcile its mission, principles and priorities with the purpose and objectives of its donor organizations? The Solidarity Center has sought to reconcile its programmatic priorities (building the capacity of trade unions and their worker rights allies around the world to advance labor rights and standards and improve living and working conditions, and challenging and reforming laws and practices that repress worker and human rights) with the objectives of its funders by developing evaluation that addresses both. This exercise carves out shared space for donor missions, principles and practices that are congruent with the Solidarity Center's own. This presentation will highlight how the Solidarity Center assesses its performance by identifying and structuring internal evaluation points and by selecting from a cross-section of external donor-driven factors, structuring evaluation that reflects and responds to both the internal truths of the organization and the external realities of its donors.
