Date: Tuesday, May 13, 2025
Hello, AEA365 community! Liz DiLuzio here, Lead Curator of the blog. This week is Individuals Week, which means we take a break from our themed weeks and spotlight the Hot Tips, Cool Tricks, Rad Resources and Lessons Learned from any evaluator interested in sharing. Would you like to contribute to future Individuals Weeks? Email me at AEA365@eval.org with an idea or a draft and we will make it happen.
Hello AEA community, we are Nate Mandel (strategy leader) and Krystin Roehl (researcher) with Stand Together Foundation. At Stand Together, we believe in elevating community voices and using data to foster bottom-up solutions that empower individuals. Listening deeply to “customers” – people who receive nonprofit services – is fundamental to this approach and part of a core strategy that we call Customer First Measurement (CFM). Today, we’d like to share how Stand Together is advancing the field of nonprofit listening and measurement by refining perceptual feedback questions to ensure they genuinely capture the voices of those we serve.
While surveys are a key tool for listening, not all questions are created equal. Some questions inadvertently fail to resonate, while others measure satisfaction but fall short of exploring whether customers report tangible benefits from the services they received. Recognizing this, we conducted a cognitive interviewing study to improve how we measure customer voices.
We used cognitive interviewing to test four survey questions, which allowed us to pinpoint where respondents struggled to understand or interpret certain questions. While this method is familiar to evaluators, we focused on tailoring it to nonprofit survey contexts and outcomes like customer success, where subtle word choice can make a significant impact. We interviewed 33 nonprofit service recipients from three different human service nonprofits as part of our IRB-approved study, and asked about constructs like the Net Promoter Score (NPS), as well as various derivatives of that question targeted at concepts like empowerment, transformation, and organizational responsiveness.
The study revealed that 48% of participants found at least some of our questions too complex or difficult to understand, highlighting the need to simplify language and adjust phrasing to reflect lived experiences. Across each question, respondents provided feedback on word choice and phrasing. The commonly used NPS question achieved a 100% understandability rate, reinforcing its suitability for assessing nonprofit customer satisfaction. Other questions – especially those aimed at capturing deeper outcomes like life improvement – were refined based on participant feedback, with up to 79% of participants aligning on suggested changes to improve clarity and relevance.
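For readers less familiar with the mechanics behind the NPS question ("How likely are you to recommend…" on a 0–10 scale), the score is the percentage of promoters (ratings of 9–10) minus the percentage of detractors (0–6). A minimal sketch of that calculation (the function name and sample data are our own illustration, not from the study):

```python
def net_promoter_score(ratings):
    """Compute NPS from 0-10 'likelihood to recommend' ratings.

    Promoters rate 9-10, detractors 0-6; passives (7-8) count
    only in the denominator. Returns a score from -100 to 100.
    """
    if not ratings:
        raise ValueError("need at least one rating")
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100 * (promoters - detractors) / len(ratings)

# Example: 5 promoters, 3 passives, 2 detractors out of 10 responses
print(net_promoter_score([10, 9, 9, 10, 9, 7, 8, 7, 4, 6]))  # -> 30.0
```

Because passives dilute the score without shifting it, two organizations with the same average rating can have very different NPS values – one reason understandability of the underlying question matters so much.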
These adjustments not only improved data accuracy, they also inspired us to think more deeply about what it means to empower individuals through feedback and to be truly customer-centric in designing and evaluating nonprofit work. Small changes, like shifting from complex concepts to plain language, give customers greater confidence in expressing their true perspectives. Our partners now use these questions to benchmark their feedback data over time, while disaggregating results to explore discrepancies in experiences. By involving beneficiaries as well as staff from our partner organizations in the design and interpretation of this study, we hope to have fostered a truly participatory approach for learning and iteration. This effort brings diverse perspectives into each stage of our work and deepens our commitment to advancing measurement practices that prioritize the dignity and voice of individuals.
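The disaggregation step described above – splitting feedback results by subgroup to surface discrepancies in experience – can be sketched in a few lines. This is a hypothetical illustration (the `site` and `rating` field names are ours, not Stand Together's actual schema):

```python
from collections import defaultdict

def disaggregate_mean(responses, group_key, value_key):
    """Average a numeric survey item separately for each subgroup,
    so differences in experience across groups become visible."""
    totals = defaultdict(lambda: [0.0, 0])  # group -> [sum, count]
    for row in responses:
        entry = totals[row[group_key]]
        entry[0] += row[value_key]
        entry[1] += 1
    return {group: s / n for group, (s, n) in totals.items()}

# Hypothetical feedback data: a 'site' field and a 1-5 rating item
responses = [
    {"site": "A", "rating": 5}, {"site": "A", "rating": 4},
    {"site": "B", "rating": 3}, {"site": "B", "rating": 2},
]
print(disaggregate_mean(responses, "site", "rating"))  # -> {'A': 4.5, 'B': 2.5}
```

Comparing these subgroup averages against the overall benchmark is what lets partners spot, for example, a site or demographic whose experience lags the aggregate picture.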
Our next phase of research focuses on youth surveys, leveraging psychometrically validated instruments, and synthesizing lessons from existing measures of empowerment across various fields. We continue to engage our partner organizations to ensure these approaches are actionable and relevant across diverse service areas. As we continue on this journey, we’re reminded how powerful listening can be – both as a tool for evaluation and as a critical step toward positive transformation. If you or your organization has explored similar efforts, we’d love to hear your insights! Please feel free to explore Stand Together Foundation’s updated survey templates and customer listening resources. Let’s keep the conversation going on how to bring customer voice to the forefront of evaluation.
Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators. The views and opinions expressed on the AEA365 blog are solely those of the original authors and other contributors. These views and opinions do not necessarily represent those of the American Evaluation Association, and/or any/all contributors to this site.