Date: Tuesday, September 2, 2025
Hello, AEA365 community! Liz DiLuzio here, Lead Curator of the blog. This week is Individuals Week, which means we take a break from our themed weeks and spotlight the Hot Tips, Cool Tricks, Rad Resources and Lessons Learned from any evaluator interested in sharing. Would you like to contribute to future individuals weeks? Email me at AEA365@eval.org with an idea or a draft and we will make it happen.
Hello colleagues, and greetings from the Center for Justice Research and Innovation (JRI) at the CNA Corporation. I'm Vivian Elliott, Director of JRI, where we help justice agencies transform research into real-world programs and strategies. Here, we raise several key issues we encounter in our work evaluating the impact of training and technical assistance (TTA) programs. In this context, TTA refers to government-funded initiatives that help agencies and communities implement the programs they have received funding for, giving funded entities better chances at successful implementation and successful outcomes. In our view, the daunting challenges encountered in this work should not deter us from making our best effort to determine effects and impacts, for the benefit of our funders and our local clients.
For the past 15 years, JRI has implemented and, in various ways, self-evaluated the local effects and impacts of our TTA work on justice agencies and their communities across the country. We have worked with over 700 agencies and communities, primarily in law enforcement settings but also in prosecution and corrections settings. JRI has implemented several national TTA programs for the US Department of Justice (DOJ), including the Smart Policing Initiative, National Public Safety Partnership, Project Safe Neighborhoods, Collaborative Reform, Crime Analysts in Residence, Justice Reinvestment Initiative, Body Worn Cameras, and, most recently, the Jails and Justice Support Center.
The TTA products and services we develop and deliver resemble those found in similar programs within and outside the justice arena: expert advice, site visits to exemplary and evidence-based programs, self-assessment protocols, peer-to-peer learning opportunities, communities of practice, checklists and other brief learning aids, formal training, webinars, podcasts, and more.
In line with DOJ requirements and our own commitment to self-assessment and continuous improvement, we track and evaluate the effectiveness and impacts (process and outcome) of our TTA programs using several different approaches and methods. We present some of our findings and suggestions below.
We note here that our TTA evaluation work follows the rubric and recommendations of Kirkpatrick's four-level approach to training evaluation (reaction, learning, behavior, results), with some innovations for the technical assistance aspects of our work. Although the Kirkpatrick model does not directly address how to evaluate technical assistance (as opposed to training), it remains, in our opinion, a comprehensive, instructive, and helpful foundation for the robust assessment of TTA.
Several methods and approaches we employ in our TTA evaluations include:
Several other activities and approaches to TTA evaluation that we employ include:
To be sure, our TTA evaluation work relies heavily on participatory and action-oriented research models and approaches (including empowerment and utilization-focused evaluation), an orientation with its own set of pluses and minuses. As a science-driven, nonprofit, public service organization, we work closely with our clients on all aspects of the evaluation research that affect their operations, and we welcome their participation in the work.
For more information about JRI's TTA evaluation work, see our evaluation resource page and our one-pager on TTA. Questions? Contact Vivian Elliott at CNA (elliottv@cna.org).
Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators. The views and opinions expressed on the AEA365 blog are solely those of the original authors and other contributors. These views and opinions do not necessarily represent those of the American Evaluation Association, and/or any/all contributors to this site.