Date: Saturday, December 6, 2025
My name is David Larwin, and I am a senior lecturer at Kent State University at Salem. I have been teaching psychology courses to undergraduate students for nearly 40 years, guiding learners through foundational concepts in human behavior, cognition, and research methods.
An Evaluator and a Statistician walk into a bar… Well, that’s not exactly how it began. But there is a new collaborative effort in the works involving members of the American Evaluation Association’s Quantitative Methods: Theory and Design TIG and the American Statistical Association. It began about a year ago as an effort to bridge the fields of evaluation and statistics. The goal is to foster collaboration between the two fields and to establish a shared platform for evidence that is not only stronger but also more actionable and more meaningful.
Historically, our two groups have worked in parallel lanes. Evaluators focus on translating data into stories of value, asking what works, for whom, and at what cost; statisticians provide precision, modeling power, and rigorous quantification of uncertainty. Working separately, each side risks producing evidence that is either rigorous but hard to act on, or compelling but hard to defend. This new collaboration, the Evidence to Impact Network, seeks to close that gap by integrating evaluative reasoning with statistical modeling.
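To make that integration concrete, here is a minimal sketch in Python, with entirely hypothetical numbers, of how the two mindsets can sit side by side in one analysis: a confidence interval quantifies the uncertainty in a program effect, and a cost-per-outcome translation turns the same estimate into something a funder can act on. Nothing below comes from the Network itself; it is only an illustration.

```python
# A minimal, hypothetical sketch (all numbers invented for illustration):
# the statistical side estimates an effect with a confidence interval;
# the evaluative side translates that estimate into cost-per-outcome terms.
import math
import statistics

# Hypothetical outcome scores for a program group and a comparison group
program = [72, 68, 75, 80, 71, 77, 69, 74]
comparison = [65, 70, 62, 68, 66, 64, 69, 63]
cost_per_participant = 450.00  # hypothetical program cost

# Statistical thinking: mean difference with a 95% interval,
# using a simple normal approximation (a sketch, not real inference)
diff = statistics.mean(program) - statistics.mean(comparison)
se = math.sqrt(statistics.variance(program) / len(program)
               + statistics.variance(comparison) / len(comparison))
low, high = diff - 1.96 * se, diff + 1.96 * se

# Evaluative thinking: what did each point of improvement cost?
cost_per_point = cost_per_participant / diff

print(f"Estimated gain: {diff:.1f} points (95% CI {low:.1f} to {high:.1f})")
print(f"Cost per point gained: about ${cost_per_point:.0f} per participant")
```

The point of the sketch is the pairing, not the particular method: a real analysis would bring the study design, assumption checks, and stakeholder framing that this collaboration is meant to strengthen.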
A practical tip for any cross-disciplinary effort like this one: whether you’re working on a group project or a research collaboration, clarify roles in writing at the start. Decide who owns which tasks, how decisions will be made, and when you’ll check in. It feels formal, but it prevents misunderstandings, protects relationships, and keeps the focus on the work instead of the drama.
A June 16, 2025, webinar, “The Evidence Partnership: Bridging Statistical Thinking and Evaluation Practice,” showcased the cross-disciplinary synergy this collaboration can offer. The webinar featured leaders from both AEA and ASA and highlighted real-world examples from education, health science, and community programs where collaboration improved evidence quality and decision confidence. Here is the link to the recording of this partnership event.
The Evidence to Impact Network is creating a space for both disciplines to adapt to new rules—ones that emphasize not only methodological strength but also usability and fiscal clarity. Together, evaluators and statisticians are showing that when evidence is built collaboratively, it becomes more actionable, defensible, and valuable to those who fund and implement programs.
The Evidence to Impact Network opens new professional opportunities, including hybrid roles that require statistical expertise, evaluative insight, and strong communication. It has the potential to increase demand for professionals who can translate complex analyses into accessible, actionable findings for stakeholders, and it represents a growing career pathway in fields such as data analytics, evaluation consulting, public policy, and applied research. At its core, the Evidence to Impact Network emphasizes connection and community, not just technical skill. It is intended to foster a culture that values both precision and purpose, turning evidence into real-world impact.
As evaluation practice evolves, these new rules remind us that the greatest impact occurs where rigor meets relevance—and where data not only informs but also inspires action.
Another webinar is being planned for December 2025. When planning is complete, information about the event will be shared with the membership of AEA and ASA. Please join us then to learn more about the Evidence to Impact Network and how you can get involved.
The American Evaluation Association is hosting Quantitative Methods: Theory and Design TIG Week. The contributions all this week to AEA365 come from evaluators who use quantitative methods in evaluation. Do you have questions, concerns, kudos, or content to extend this AEA365 contribution? Please add them in the comments section for this post on the AEA365 webpage so that we may enrich our community of practice. Would you like to submit an AEA365 Tip? Please send a note of interest to AEA365@eval.org. AEA365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators. The views and opinions expressed on the AEA365 blog are solely those of the original authors and other contributors. These views and opinions do not necessarily represent those of the American Evaluation Association, and/or any/all contributors to this site.