Date: Monday, March 23, 2026
I am Cassange Bitere, a development practitioner and evaluation consultant specializing in global and public health, mixed-methods research, and implementation-focused evaluation. My work has taken me across the United States, Sub-Saharan Africa, and the Caribbean, supporting organizations in strengthening learning systems and translating evidence into practice. Working in complex environments has shaped how I understand the role of evaluation during periods of instability.
When contexts become chaotic due to political transitions, humanitarian crises, funding disruptions, or public health emergencies, a familiar refrain emerges: “Now is not the time to conduct evaluation or research.” Yet these are precisely the moments when learning becomes most critical.
Programs are not designed for perfect conditions; they operate in the real world, where uncertainty and disruption are constant. Evaluating only under stable circumstances provides an incomplete picture of program performance. What often matters more is understanding how teams adapt, how systems respond under pressure, and where innovation emerges when conditions are far from ideal.
Programs and policies continuously evolve in response to external forces such as leadership transitions, supply chain interruptions, community distrust, or natural disasters. Embedding evaluation during unstable moments allows for real-time insights that support course correction, inform strategic investments, and strengthen decision-making when it is needed most.
Delaying evaluation carries significant consequences. Timely evidence is lost at the very moment programs are being tested. Opportunities to understand whether interventions are resilient or fragile disappear. Most importantly, the perspectives of those most affected risk being overlooked, even though their experiences are often most visible during disruption. When learning is postponed, it can unintentionally signal that knowledge generation is optional rather than foundational, weakening both accountability and innovation.
Evaluating in challenging environments does not require compromising rigor. Instead, it invites a broader understanding of rigor, one that values relevance, responsiveness, and utility alongside methodological strength.
Consider a reproductive health program operating in a conflict-affected setting. A conventional evaluation might wait until infrastructure stabilizes. An adaptive approach asks immediate questions: How are people still accessing care? What informal solutions are emerging? What new barriers have appeared? Capturing these realities in real time is essential for improving services and informing future interventions.
Similarly, evaluating a youth program during a pandemic can reveal how engagement shifts, what alternative spaces for connection develop, and how young people reinterpret risk. Insights generated during disruption are often difficult, if not impossible, to reconstruct retrospectively.
The world does not pause during crises, and neither should the commitment to learning. Evaluation is not a luxury reserved for stable moments; it is a critical tool for navigating uncertainty. Prioritizing adaptive approaches positions evaluation to do what it does best: help programs remain responsive, accountable, and effective even in the most challenging environments.
The UNFPA Independent Evaluation Office Guidance on Adaptive Evaluation offers practical direction on designing and conducting evaluations in complex and rapidly changing environments. It outlines tools and approaches that help teams generate timely, actionable insights while maintaining rigor, making it especially relevant for crisis and instability settings.
The American Evaluation Association is hosting Health Evaluation TIG Week with our colleagues in the Health Evaluation Topical Interest Group. All contributions this week to AEA365 come from members of the Health Evaluation TIG. Do you have questions, concerns, kudos, or content to extend this AEA365 contribution? Please add them in the comments section for this post on the AEA365 webpage so that we may enrich our community of practice. Would you like to submit an AEA365 Tip? Please send a note of interest to AEA365@eval.org. AEA365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators. The views and opinions expressed on the AEA365 blog are solely those of the original authors and other contributors. These views and opinions do not necessarily represent those of the American Evaluation Association or any other contributors to this site.