Date: Monday, September 1, 2025
Hello, AEA365 community! Liz DiLuzio here, Lead Curator of the blog. This week is Individuals Week, which means we take a break from our themed weeks and spotlight the Hot Tips, Cool Tricks, Rad Resources and Lessons Learned from any evaluator interested in sharing. Would you like to contribute to future Individuals Weeks? Email me at AEA365@eval.org with an idea or a draft and we will make it happen.
Hello, colleagues! I’m Elizabeth DiLuzio, a consultant and trainer in evaluation and data analytics.
We often think of databases as tools for storage: somewhere to house participant information, track outcomes, or generate reports. But in practice, building a program database does something more powerful: it forces clarity.
In my work at the Behavioral Health Improvement Institute at Keene State College, I’ve seen this firsthand. When we sit down with a partner to learn about what needs to be captured in a database, it quickly becomes apparent which aspects of their program are clearly defined and which need sharpening. Workflows that felt intuitive on paper get murky when translated into fields and forms. Program models that seemed solid start showing cracks. And suddenly, the database becomes not just a technological tool, but a mirror.
That mirror can be uncomfortable. But it’s also transformative.
Here are a few lessons I’ve learned along the way.
Workflows reveal themselves in the details
When staff describe their work, the big steps usually sound aligned: outreach, intake, follow-up. But when we ask about the details, like how many times they attempt outreach before calling someone unreachable, differences emerge. One staff member might stop after three calls, while another stops after two weeks of no contact, regardless of how many attempts they’ve made. A database can’t accommodate two different standards. Building one requires the team to make decisions together, clarifying practice in the process.
Databases enforce alignment
Once workflows are standardized, the database encodes them. Drop-down menus narrow choices to agreed-upon categories, prompts remind staff of next steps, and automations keep cadence consistent. Over time, this reduces drift. Staff aren’t just told what the standard is. The system reinforces it every day.
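To make this concrete, here is a minimal sketch of how a database encodes agreed-upon categories so that the system, not a memo, enforces the standard. The category names and the three-attempt rule are hypothetical illustrations, not the author’s actual field definitions:

```python
from enum import Enum


class OutreachOutcome(Enum):
    """Hypothetical agreed-upon categories a drop-down menu might offer."""
    REACHED = "reached"
    NO_ANSWER = "no answer"
    UNREACHABLE = "unreachable"  # e.g., only after 3 attempts, per team agreement


def record_outreach(outcome: str) -> OutreachOutcome:
    """Accept an entry only if it matches an agreed-upon category."""
    try:
        return OutreachOutcome(outcome)
    except ValueError:
        # An off-menu entry is rejected, prompting a team conversation
        # rather than quietly creating a new, unshared standard.
        allowed = [o.value for o in OutreachOutcome]
        raise ValueError(f"'{outcome}' is not an agreed-upon category: {allowed}")


record_outreach("reached")          # accepted
# record_outreach("left message")   # rejected until the team agrees to add it
```

In a real system this same constraint usually lives in the database itself (a lookup table or drop-down list), but the effect is identical: staff can only record what the team has collectively defined.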
Continuity depends on a single source of truth
Turnover is inevitable, and onboarding new staff while providing services can be challenging. The good news is that a well-built database functions as a scaffold, showing new staff not just what to do but how the program defines success at each step. Instead of relying solely on oral handoffs or shadowing, they step into a workflow that reflects the team’s collective agreements. The database becomes the single source of truth about how the program is implemented.
Adaptation prompts reflection
Practice will inevitably change as services expand, funders request new metrics, and community needs shift. Each time the database requires adaptation, the team revisits its model: What is essential? What can change? What does fidelity look like now? This process doesn’t just update the technology. It reinforces organizational learning, creating a living record of how the program has evolved.
Whether or not you’re designing a database soon, you can simulate its benefits by taking one workflow from your program and pressure-testing it. Try mapping it as if it were going into a database: step by step, with all the decision points spelled out. You may be surprised by what surfaces, and the clarity you gain will strengthen both implementation and evaluation.
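The pressure-test above can be sketched in plain code: map each step of one workflow to a field, and spell out every decision point as an explicit rule. The workflow, field names, and the three-attempt threshold below are illustrative assumptions, not a prescribed standard:

```python
# A hypothetical outreach workflow, mapped as if it were going into a database.
# Writing each step as a field with an explicit decision rule surfaces the
# questions a database build would force the team to answer together.
outreach_workflow = [
    {"step": "initial call", "field": "call_1_date", "required": True},
    {"step": "second call", "field": "call_2_date",
     "decision": "only if call 1 had no answer"},
    {"step": "third call", "field": "call_3_date",
     "decision": "only if call 2 had no answer"},
    {"step": "close as unreachable", "field": "unreachable_date",
     "decision": "after 3 attempts with no contact"},  # does everyone agree on 3?
]

# Listing only the decision points shows where practice may vary across staff.
for row in outreach_workflow:
    if "decision" in row:
        print(f"{row['step']}: {row['decision']}")
```

Even without building anything, walking a team through a map like this tends to reveal the spots where “intuitive” practice is actually several different practices.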
I’ll be diving deeper into this topic at the American Evaluation Association’s annual conference in my session Designing Databases, Shaping Practice: How Evaluation Technology Supports Program Clarity and Continuity. Together, we’ll explore real-world examples of how databases both challenge and strengthen programs, and how evaluators can lead this work with purpose.
Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators. The views and opinions expressed on the AEA365 blog are solely those of the original authors and other contributors. These views and opinions do not necessarily represent those of the American Evaluation Association, and/or any/all contributors to this site.