Date: Sunday, December 7, 2025
Hello, AEA365 community! Liz DiLuzio here, Lead Curator of the blog. This week is Individuals Week, which means we take a break from our themed weeks and spotlight the Hot Tips, Cool Tricks, Rad Resources and Lessons Learned from any evaluator interested in sharing. Would you like to contribute to future individuals weeks? Email me at AEA365@eval.org with an idea or a draft and we will make it happen.
We’re pleased to contribute to AEA365 as program evaluators at the American Medical Association (AMA). My name is Melissa Ritter, and I am a program manager of evaluation and strategy; my co-author, Faith Washington, is a senior market research analyst. We are members of Strategic Insights, an internal research and evaluation team at the AMA.
Program evaluation isn’t new at the AMA, which turned 178 years old this year. We’re proud of the breadth and depth of expertise across the organization, and the rigor with which it’s applied to advance our mission. While it might seem that a new, more disciplined framework for program evaluation would be easily embraced, we were surprised by some of the lessons we learned, lessons we believe could help anyone working to implement these kinds of changes. Sometimes, it really comes down to the basics.
A common language: First, we found it’s not simply about new jargon, but establishing a common language. We needed consensus on a set of definitions, which was harder in practice than in theory. How do we define a “program”, and what differentiates it from a “project” or “product”? Establishing or revisiting specific descriptions of some core concepts was new, and essential, to apply the framework appropriately. The process opened needed conversations among teams to clarify the scope of various efforts and recommit to a shared and updated roadmap.
We’re not here to judge: Second, the word “evaluation” is scary! It can often be read as “judgment”, no matter what font we use. We are socializing the framework we developed, which has five stages and provides guidance on the tasks or critical steps teams should take with their program at each stage. We also offer an Evaluation Capacity Tool to help teams identify where they need to build capacity for a more robust evaluation. These and other tools are not mandatory steps or an audit, but ways to empower teams in their own efforts. We brand ourselves as “your friendly neighborhood evaluators”, providing case studies of what we can do to help them use what they already have (e.g., data, processes, documentation). A book that I have found very helpful with these approaches is The Great Nonprofit Evaluation Reboot: A New Approach Every Staff Member Can Understand by Elena Harman, PhD (which I happened to win in a raffle at an Indiana Evaluation Association conference).
Grassroots support: Finally, buy-in and support from leadership is necessary, but like having keys to a car with an empty tank, it won’t get you far! We needed bottom-up buy-in. The teams, managers, and participants need to understand why evaluation is worth supporting and how it can help them achieve their goals. We did this by establishing a common language and definitions; explaining in detail the background and rigor that went into the framework’s development; and offering optional tools and resources, with a feedback loop, so we are responsive to their needs rather than making assumptions. And, importantly, we meet people where they are. As our field and the language used to describe our work grow, they can become inaccessible to “outsiders”. Our team works intentionally to decode this language for our colleagues. Folks need to know the “why”, and it can’t just be because we (or our leadership!) said so.
There’s so much we’ve learned from the AEA community, and we hope to have more to share as we continue implementing our new framework. Teams and organizations have unique challenges with program evaluation, and it’s exciting to see the ways our field is exploring how to solve these. We’ve found that giving enough time and attention to the fundamentals can help accelerate turning those ideas into action, and we think it may help you, too!
Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators. The views and opinions expressed on the AEA365 blog are solely those of the original authors and other contributors. These views and opinions do not necessarily represent those of the American Evaluation Association, and/or any/all contributors to this site.