Session Title: The Intersection of Strategy and Evaluation: What Are We Learning?
Panel Session 124 to be held in BONHAM B on Wednesday, Nov 10, 4:30 PM to 6:00 PM
Sponsored by the Non-profit and Foundations Evaluation TIG
Chair(s):
Sarah Stachowiak, Organizational Research Services, sarahs@organizationalresearch.com
Discussant(s):
Julia Coffman, Center for Evaluation Innovation, jcoffman@evaluationexchange.org
Abstract: Increasingly, philanthropic organizations want to look beyond individual grant-level evaluations to see what they can learn across their portfolios to make strategic decisions. For evaluators, this raises important questions: What is the intersection between strategy development and evaluation? What levels of evidence are necessary? Which approaches to data collection are a good fit and align with strategy and budget cycles? What kinds of products are created? How do the needs of philanthropic organizations differ from those of public sector or non-profit organizations? What unique dynamics or contexts arise in collecting information from grantees (e.g., sun-setting funding, balancing expectations among grantees)? This session will explore these questions from the perspectives of evaluation consultants with experience working on strategy-level evaluation and foundation evaluation staff. Using a “fishbowl” approach, panelists will engage in a dialogue with each other, providing real-world examples of how they wrestle with the unique opportunities and challenges associated with this work.
Perspectives From Hallie Preskill, Foundation Strategy Group (FSG) Social Impact Advisors
Hallie Preskill, Strategic Learning & Evaluation Center, hallie.preskill@fsg-impact.org
Increasingly, strategy and evaluation are being seen as two sides of the same coin. For example, questions about the progress and impact of a particular strategy can help frame an evaluation, while an evaluation’s findings can inform the refinement of a strategy. This session will explore the following questions: What is the evaluator’s responsibility when, at the start of an evaluation, he or she realizes that the initiative is based on a faulty strategy, or worse, no strategy at all? What competencies do evaluators need to bridge the strategy-evaluation continuum? How does the developmental evaluation approach reflect the intersection of strategy and evaluation, and how does it challenge conventional evaluation wisdom, roles, and approaches? While some in the evaluation profession may bristle at the notion of evaluators engaging in strategy work, it may be time for evaluators to acknowledge that we leave a lot on the table when our work stops where strategy begins.
Perspectives From Lance Potter, Bill & Melinda Gates Foundation
Lance Potter, Bill & Melinda Gates Foundation, lance.potter@gatesfoundation.org
The Bill & Melinda Gates Foundation aspires to make measurement, learning, and evaluation (MLE) central to strategy development and grantmaking. We seek to ensure that our MLE practices are of consistently high quality and utility, while allowing programs to design measurement that is responsive to a range of needs. As a relatively young organization, we are actively evolving our measurement culture and practices. To that end, we are currently piloting foundation-wide measurement guidance. The Actionable Measurement Guidelines offer a perspective on why we measure and guidance on designing strategically aligned measurement plans that will provide results with programmatic and strategic utility. The Guidelines help program teams select the appropriate type of measurement (e.g., monitoring versus evaluation) for a given situation, and suggest the nature of evaluation questions and methods for various categories of measurement. Through this effort, we hope to achieve rigorous, actionable, and parsimonious measurement throughout the organization.
Strategy, Evaluation and Strategic Learning at the Packard Foundation, Perspectives From Gale Berkowitz
Gale Berkowitz, David and Lucile Packard Foundation, gberkowitz@packard.org
The Packard Foundation’s commitment to effectiveness plays out directly in our grantmaking culture and practices. No strategic effort can succeed without continuous feedback and good data. We have tried to create a culture of improvement and collaboration in which staff and grantees are allowed to make mistakes, go in a different direction, and take corrective action. As a Foundation, we also recognize that the burden of evaluation can be large, so we consciously work to minimize that burden and maximize the value of evaluation for our grantees. We have been steadily shifting from evaluation for accountability to evaluation for program improvement, or “real-time” evaluation. For us, real-time means an appropriate mix of monitoring, evaluation, and learning. Balance and feedback are essential as we try to obtain the highest-quality, most timely information to make the best program strategy decisions possible.
Perspectives From Mayur Patel, Knight Foundation
Mayur Patel, John S. and James L. Knight Foundation, patel@knightfoundation.org
Are a program’s goals realistic? Are the critical assumptions on which a strategy was built valid? Are the methods used to implement a set of objectives the most effective means of achieving the desired impact? These questions are at the heart of strategy evaluation. Whereas project evaluations regularly use case study methodologies, strategy evaluations often resemble exercises in market research and comparative analysis. At the Knight Foundation, this kind of assessment has helped us address how the design and format of a grantmaking contest could be improved; what operating assumptions about innovation were embedded in a seed funding initiative, and whether those assumptions were aligned with how projects were actually developed and adopted; and what range of financing sources (private and public) was available to grantees in a particular sector, and what this revealed about the role of philanthropic funding.
Perspectives From Organizational Research Services
Sarah Stachowiak, Organizational Research Services, sarahs@organizationalresearch.com
In all our evaluation consulting, we strive to work as a thought partner with our clients—outsiders with expertise in evaluation and measurement who can integrate varied points of view, offer a fresh perspective, and provide useful, actionable observations and recommendations. This orientation has become increasingly important when we work with philanthropic clients to look across initiatives to inform future strategy development. At the same time, unique issues have arisen: How much and what kind of evidence is necessary for decision making? What methods provide meaningful data on a useful timeline without overly burdening grantees? What products or processes are most useful to support internal planning? Based on a variety of recent experiences with strategy evaluation, such as looking across learning questions for a grant program finalizing its strategy refresh, we will share our approaches and lessons learned.
Perspectives From Tom Kelly, Annie E. Casey Foundation
Tom Kelly, Annie E. Casey Foundation, tkelly@aecf.org
The Annie E. Casey Foundation focuses its grantmaking on initiatives in the United States that have significant potential to demonstrate innovative policy, service delivery, and community supports for disadvantaged children and families. As Associate Director of Evaluation, Tom brings to this discussion a vantage point of thinking about strategy evaluation across a range of programs, including communications, advocacy, and policy evaluation for the KIDS COUNT Network; the ten-year, place-based comprehensive community change initiative Making Connections; and more. He will also speak to lessons learned and implications from a recent foundation-wide reorganization, the role of past evaluations and theory of change development, and the foundation’s results framework orientation.
