Date: Friday, November 7, 2025
Hi fellow evaluators! We are Cecilia Borges Farfan, Muniba Ahmad, and Joel Gutierrez from ORS Impact, a consultancy founded in Seattle, WA, that specializes in measuring complex systems change efforts. Our policy advocacy evaluation work ranges from strategy development and capacity building to outcome measurement and learning. In evaluating policy advocacy efforts, we've surfaced several lessons that remind us of the unique challenges and opportunities of this kind of work.
Evaluating multi-year policy advocacy efforts is less like following a paved road and more like hiking through a forest: uneven, winding, and at times unpredictable. Strategies shift, priorities change, and new opportunities emerge. Evaluation must be flexible enough to keep pace.
Depth matters more than breadth. We've learned that fewer, more focused learning questions yield richer insights and allow for deeper reflection.
Reflection needs space. Policy advocacy work can move fast, and competing priorities can crowd out time for learning. Carving out moments for synthesis and learning, after data collection and before reporting, strengthens insights and improves strategy.
A theory of change benefits greatly from the perspectives of folks closest to the work. Their input grounds the theory and its assumptions in real-world experience, elevates gaps and barriers that can impede progress, and surfaces opportunities to refine strategies to achieve the desired impact.
Metrics alone don't tell the full story. While quantitative indicators, such as the number of policy wins or dollars mobilized, are important, they rarely capture the nuance of advocacy work. Meaningful outcomes often lie in building advocacy coalitions across the ideological spectrum, shifting narratives, and creating infrastructure (e.g., funder tables, aligned giving) for long-term change. Elevating qualitative data helps tell a richer story.
Policy advocacy evaluation continues to evolve as funders and advocates grapple with how to measure progress in complex, shifting landscapes. The lessons we’ve shared here reflect just our experience, and we’re curious about how others are navigating similar challenges. What approaches have worked in your context?
The American Evaluation Association is hosting APC TIG Week with our colleagues in the Advocacy and Policy Change Topical Interest Group. All contributions this week to AEA365 come from our APC TIG members. Do you have questions, concerns, kudos, or content to extend this AEA365 contribution? Please add them in the comments section for this post on the AEA365 webpage so that we may enrich our community of practice. Would you like to submit an AEA365 Tip? Please send a note of interest to AEA365@eval.org. AEA365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators. The views and opinions expressed on the AEA365 blog are solely those of the original authors and other contributors; they do not necessarily represent those of the American Evaluation Association or any other contributors to this site.