
Session Title: Before It's Too Late: Lessons Learned From Strategic Learning Evaluations of Advocacy Efforts
Panel Session 937 to be held in San Clemente on Saturday, Nov 5, 12:35 PM to 2:05 PM
Sponsored by the Advocacy and Policy Change TIG
Chair(s):
Sarah Stachowiak, Organizational Research Services, sarahs@organizationalresearch.com
Discussant(s):
Sarah Stachowiak, Organizational Research Services, sarahs@organizationalresearch.com
Huilan Krenn, WK Kellogg Foundation, hyk@wkkf.org
Abstract: Advocates for policy change, in this country and around the world, are impassioned and focused. They may resist advice about strategy from "outsiders" like evaluators. Effective formative evaluation tools and approaches can help them see the value of evaluation and improve their strategies. This session will share methods, tools, and approaches for working with clients to use evaluative data and lessons learned to adjust advocacy strategy. We will also examine how we as evaluators, in turn, adjust our approach in response to our clients' changes in strategy and to changes in the political or social environment. Speakers will draw on experiences in the United States, Europe, and Africa, helping clients and partners make mid-course corrections to their policy change efforts.
Learning as We Go, but Are We Going Far Enough?
David Devlin-Foltz, The Aspen Institute, david.devlin-foltz@aspeninst.org
"Learning as we go" is APEP's mantra for our clients, and sometimes clients go along easily. Our work with the Advocacy Progress Planner in the United States, and in modified form in Tanzania and France, has yielded encouraging examples of shared commitment to learning among funders, advocates, and evaluators. We have collaborated with other clients in the U.S. to define more precisely what kinds of changes in behavior they want to see from key policymakers and influential former officials, and we have worked with clients to track their contribution to those changes in policymaker behavior over time. This too has helped clients adjust their focus and target their advocacy more precisely. Of course, even the best benchmarks tell us nothing unless we have reliable data about progress. We will discuss the challenge of identifying and using benchmarks that are both meaningful and measurable.
Strategic Learning in the Long-term: Issues and Challenges
Julia Coffman, Center for Evaluation Innovation, jcoffman@evaluationexchange.org
For almost nine years, Harvard Family Research Project has been using a strategic learning approach to evaluate the David and Lucile Packard Foundation's Preschool for California's Children grantmaking program. This year, a teaching case was developed on the evaluation of this long-term advocacy effort to highlight the issues that arise from using a strategic learning approach over the long term, identifying key points at which the evaluation switched course because methods were not working or because the Foundation's strategy shifted. This presentation will highlight responses to key questions relevant to all strategic learning approaches: how to evolve the evaluation in response to changing strategy; how to "embed" the evaluator while maintaining role boundaries; how to manage often-competing learning and accountability needs; and how to time data collection so it is rapid and "just in time" but also reliable and credible.
Evaluating for Strategic Learning Within an Education Reform Effort: Lessons Learned
Anne Gienapp, Organizational Research Services, agienapp@organizationalresearch.com
In 2009-10, with the support of the W.K. Kellogg Foundation, ORS worked with The Chalkboard Project, a funding collaborative engaged in K-12 education reform efforts in Oregon, to conduct a prospective evaluation of Chalkboard's advocacy efforts. During this period, significant changes in the policy environment required the organization to adjust its direction for policy change and to revisit its recently developed strategic plan. This dynamic environment also required the prospective evaluation design and data collection plan to become an ongoing, iterative process. The Chalkboard experience may characterize the very nature of prospective evaluation of advocacy efforts, which is most usefully conducted when flexibility and rapid feedback are needed to inform strategic decision-making. This presentation will share insights and lessons learned about approaches to data collection, sharing results, and reporting findings, and how these processes strengthened the client's advocacy efforts.
