Date: Tuesday, December 30, 2025
Hello! Ian Goldman here, writing with colleagues about new approaches to evaluation that matter for public management in challenging times. Our paper in the Journal of Multidisciplinary Evaluation will be published in early 2026 – here is a sneak preview and some emerging lessons.
We are facing a polycrisis – climate breakdown, ecological collapse, wars, resurgent poverty, inequality, and democratic erosion. Most public management systems still operate as if the world were stable, relying on rigid plans, siloed structures, and compliance-driven performance regimes poorly suited to fast-moving and systemic risks. We argue that evaluation can help drive adaptive management – if positioned as a learning tool and embedded throughout real government decision-making, not just as an audit after the fact.
Embedding evaluative practice throughout the policy cycle – not just at the end – creates space for learning, feedback, and adaptation. South Africa’s evaluation system and the Climate Investment Funds’ approach illustrate the importance of building evidence and stakeholder engagement into diagnosis, design, and implementation.
Public managers need a flexible toolbox of evaluative practices that can provide quick, good-enough insights for crisis response as well as deeper learning over the medium and long term. This includes evaluative thinking in monitoring, rapid evaluations, synthesis of multiple studies, developmental evaluation, and reflective workshops.
Evaluation is most powerful when done with managers and stakeholders, not to them. In Brazil, co-produced evaluation with policy teams and civil society builds ownership, trust, and actionable recommendations. This collaborative stance – evaluators as facilitators and knowledge brokers rather than neutral technicians – increases trust, ownership, and the likelihood that evidence will be used in difficult trade-offs. South Africa’s rapid evaluations pair government staff with professional evaluators, shortening response times and boosting relevance.
To address today’s polycrisis and inform systemic change, evaluators need to embrace systems thinking and complexity-sensitive approaches. South Africa has adopted evaluation criteria on transformative equity and on climate and ecosystem health to help focus attention on the polycrisis.
Cape Town shows the value of central capability: its Future Planning and Resilience Directorate unites foresight, data science, risk, performance, and social research. By integrating these, the city uses evidence to iterate annual plans and budgets, achieve long-term goals, and respond dynamically—even without a formal evaluation unit.
Managing polycrisis requires governments to respond to shocks while also looking ahead. Foresight explores possible futures and vulnerabilities, while evaluation tests what actually happens in real contexts. Used together, they enable adaptive government – a combination Cape Town is beginning to model, though too often foresight and evaluation remain siloed.
Evaluative practice still faces barriers—siloed departments, punitive accountability, slow evaluation cycles—but practical shifts are emerging. Incentives need to move away from compliance and toward shared learning, systems understanding, and honest reflection on underlying paradigms. Promoting diverse evaluative practices, user engagement, and real-time feedback helps managers better respond to both crises and deep, systemic change.
Thanks for reading, and do read the full article when it’s out – and please share how you’re embedding evaluative thinking in your context!
Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org . aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators. The views and opinions expressed on the AEA365 blog are solely those of the original authors and other contributors. These views and opinions do not necessarily represent those of the American Evaluation Association, and/or any/all contributors to this site.