Date: Friday, May 8, 2026
How do you ensure your evaluation data is used? From project design, to stakeholder engagement, to report writing and data sharing, evaluators have many opportunities to help ensure the data we collect are actually used for decision-making. In this series of blog posts, Washington Evaluators draws on our experiences in Washington, DC and beyond to share best practices for making evaluations useful.
I am Ashima Singh, a consultant. I began my career in evaluation, moved into higher education (HE) assessment and accreditation (both specialized forms of evaluation), and then hung out my own shingle at Ashima Singh Consulting (ASC). I learned key lessons in HE that have informed my work at ASC.
Accreditation is a high-stakes quality-assurance evaluation necessary for HE institutions' sustainability and fiscal viability. Assessment – a significant part of demonstrating compliance with accreditation standards – is the practice of faculty, staff, and administrators routinely and systematically reflecting on how their work affects student learning and unit and institutional goals, and making adjustments as they go.
HE personnel often perceive assessment as a threat to their job security and academic freedom. Increasing public and political scrutiny of HE adds tension to an already fraught environment, where successful accreditation is necessary and yet crucial parts of it fundamentally alienate key stakeholders. At first, selling a used car seemed easier than getting HE personnel to embrace assessment. They needed a functional relationship with assessment, one that restored their agency and left room for creativity. I learned that facilitating this shift was easiest with a heart-first strategy.
If English were the only language a friend and I shared, I would not speak to them in Hindi. Similarly, using evaluation-specific jargon with stakeholders unfamiliar with it put hurdles between me, them, and their engagement with assessment. So, instead of relying on them to understand me, I worked to be understood, using stories, metaphors, and similes to remove barriers.
An education program, for example, had set a 70% pass rate as its benchmark for success. So, I turned that into a visual of 100 people walking into a hospital and only 70 walking out alive. The image quickly revealed that the benchmark would neither be informative nor give them any bragging rights. They giggled and eagerly discussed what benchmarks would make sense instead. Humor worked.
Unpleasant tasks rightly call for justification. So, I walked stakeholders through "why" questions until they saw that assessment extended beyond measuring progress toward goals: it was a strategic step toward program sustainability, funding, development, meeting stakeholder needs, determining resource needs, and more. People began to see assessment as a useful tool that met their needs. As their apprehension ebbed, curiosity flowed in, and their shoulders relaxed. What had seemed threatening started to look like "what if" possibilities. One group realized that their assessment data filled a crucial gap in a grant application. Relaxed shoulders became my desired outcome for every meeting.
Stakeholders may not realize that assessment and evaluation tell their program's story across time. I invited people to think of their family album as a living history of personal growth and time with loved ones. That image helped them see evaluation data as snapshots of their work. I then designed a matrix to help stakeholders document data-motivated decisions and their outcomes – their program's history. This returned agency to the people responsible for the program.
Research shows that emotional (heart) thinking has rapid automaticity which, when paired with analytic (head) thinking, leads to better decision-making. The cool tricks I shared are not tricky at all, yet they are powerful in helping stakeholders build a functional relationship with evaluation. The easiest way to people's heads is through their hearts.
The American Evaluation Association is hosting Washington Evaluators Week with our colleagues in the Washington Evaluators local affiliate group. The contributions all this week to AEA365 come from Washington Evaluators members. Do you have questions, concerns, kudos, or content to extend this AEA365 contribution? Please add them in the comments section for this post on the AEA365 webpage so that we may enrich our community of practice. Would you like to submit an AEA365 Tip? Please send a note of interest to AEA365@eval.org. AEA365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators. The views and opinions expressed on the AEA365 blog are solely those of the original authors and other contributors. These views and opinions do not necessarily represent those of the American Evaluation Association, and/or any/all contributors to this site.