Date: Sunday, March 8, 2026
Hello, AEA365 community! Liz DiLuzio here, Lead Curator of the blog. This week is Individuals Week, which means we take a break from our themed weeks and spotlight the Hot Tips, Cool Tricks, Rad Resources and Lessons Learned from any evaluator interested in sharing. Would you like to contribute to future individuals weeks? Email me at AEA365@eval.org with an idea or a draft and we will make it happen.
Hello Evaluators! I’m Bich-Hang Duong, an independent evaluation consultant and a methodologist at Franklin University. As a mixed-methods researcher working on multi-year, multi-partner initiatives, I’m passionate about using qualitative insights to support continuous learning and to tell nuanced stories of impact across education, sustainability, and youth development.
Across my evaluation work, I’ve seen the field shift from simply asking whether a program worked to exploring how, why, for whom, and under what conditions change occurs. These deeper questions demand stronger engagement with qualitative evidence, yet our reports too often reduce rich narratives to a few boxed quotes. Coming from an academic background, I initially fell into the trap of “theme dumping.” Only with experience did I realize that presenting qualitative evidence is not an exercise in listing themes; it’s an act of sparking curiosity and supporting sense-making while guiding action.
Below are four lessons drawn from successes and missteps across two large education initiatives I worked on: a global system reform program and a digital learning project, both supporting education in low- and middle-income countries.
Lesson 1: Frame themes as questions, not answers. Qualitative findings are most powerful when they prompt stakeholders to ask better questions. Instead of treating themes as final answers, reframing them as “questions-in-motion” encourages exploration.
For instance, shifting “Instructors might not be ready for digital teaching” to “How have instructors adapted to digital teaching, and what shapes that variation?” sparks deeper thought. Even a brief quote like “COVID was an accelerator for us” opened new lines of inquiry when we presented our evaluation findings for the digital learning project.
Lesson 2: Use qualitative insights as connective tissue. Qualitative insights often act as the glue that connects disparate data points into a meaningful impact story. Quantitative data can show patterns, but it is qualitative evidence that helps explain why those patterns occur and what drives variation across contexts.
In the global system reform program, for example, survey results showed uneven improvements in teaching practice and student outcomes. It was the qualitative interviews and reflections that revealed the underlying factors shaping these outcomes. These narratives transformed a set of fragmented quantitative results into a coherent account of how and why teacher practice and mindset changed over time.
Lesson 3: Link insights to action. Insights gain influence when they are clearly linked to action. Qualitative evidence in evaluation reporting should move from “what participants said” to “what this means for program choices.” Effective framing helps stakeholders understand the consequences of findings and move forward with greater clarity and confidence.
Lesson 4: Make complexity visible. Narratives, vignettes, journey maps, thematic diagrams, and personas help stakeholders see dynamics that remain invisible in tables. These tools make complexity more memorable and give life to the lived experiences behind the data. In my evaluations, visual thematic maps, Venn diagrams, and short vignettes helped partners more easily grasp participants’ experiences, especially the challenges they faced.
In brief, presenting qualitative evidence well is not simply about aesthetics; it is central to effective evaluation. When we treat qualitative data as a catalyst for curiosity and action, we strengthen both the credibility and the usefulness of our work.
There is still limited guidance on how to present qualitative evaluation insights in ways that spark curiosity, support sense-making, and guide action. That said, a number of excellent resources do exist, particularly around qualitative dashboards and the visualization of qualitative data.
Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators. The views and opinions expressed on the AEA365 blog are solely those of the original authors and other contributors. These views and opinions do not necessarily represent those of the American Evaluation Association, and/or any/all contributors to this site.