Date: Wednesday, October 22, 2025
I am Sara Dada, an incoming Ad Astra Professor of Global Health at University College Dublin, where I also recently completed my PhD in global health. As a PhD student, I encountered the world of scientific realism – conducting my first realist synthesis and realist evaluation – and found myself consistently reflecting on the methodology and processes along the way.
As I embarked on my own realist evaluation, I turned to published examples for guidance. While many articles presented some detail on methods, few described the early thought processes and steps – how to make decisions around data or participants, and the logistics of collecting, analysing, and presenting evidence. This inspired a recent article published in the American Journal of Evaluation: "Reflections from Planning and Conducting Data Collection for a Realist Evaluation in Zambia." In this blog, I’d like to share some of the key lessons I learned along the way.
Realist evaluation is a theory-driven approach to understanding how and why interventions work, for whom, and in what contexts. For my research in community engagement, this was the perfect methodology to address unanswered questions in the field. Yet in practice, conducting a realist evaluation can be just as complex and nuanced as the interventions it seeks to examine. I quickly learned that beyond developing, refining, and testing theories, the real challenges lay in navigating practical decisions: what data to collect, how to meaningfully engage with participants and partners, and how to balance methodological rigour with flexibility. These challenges pushed me to reflect deeply on my approaches and led to several lessons that I carry forward.
By writing this article (and blog post), I hoped to make visible the often “hidden” decisions that evaluators face. These choices can feel messy and even overwhelming, but they are central to both conducting and reporting realist evaluations transparently and in ways that can inform future policy, practice, and research. My hope is that our reflections will provide reassurance and practical guidance for others who are planning similar work, especially in low- and middle-income settings.
The American Evaluation Association is hosting American Journal of Evaluation (AJE) week. All posts this week are contributed by evaluators who work with AJE. Do you have questions, concerns, kudos, or content to extend this AEA365 contribution? Please add them in the comments section for this post on the AEA365 webpage so that we may enrich our community of practice. Would you like to submit an AEA365 Tip? Please send a note of interest to AEA365@eval.org. AEA365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators. The views and opinions expressed on the AEA365 blog are solely those of the original authors and other contributors. These views and opinions do not necessarily represent those of the American Evaluation Association, and/or any/all contributors to this site.