Date: Thursday, March 26, 2026
Hi! We’re Amy Kerr and Heather Zook, evaluators at Professional Data Analysts (PDA), where we have spent over a decade as colleagues working on public health evaluation projects. At PDA, we focus on providing client-centered, utilization-focused evaluations, so we tailor our approach and methods to client needs. Many public health clients ask for evaluation plans, but some already have robust evaluation practices. When PDA was brought in as a new external evaluator for a state public health department, we recognized that they needed more than a traditional evaluation plan: they needed an assessment of whether their current evaluation activities were comprehensive and aligned with program strategies. We initiated a multi-phase approach to assess the client’s existing evaluation design and developed tools and recommendations to better align evaluation activities with their strategic goals and learning objectives.
We used three key steps to better understand the program and develop strategic evaluation recommendations.
The first step in the process was to review program documents to understand and organize program activities within their four strategic plan priority areas. In addition to document review, PDA facilitated two conversations with program staff and leadership using an online collaboration tool to gather more context on program implementation by priority area. We then created a map of program activities and their corresponding evaluation data sources (see example below), which the client reviewed.
In the next phase, we facilitated a discussion about evaluation questions and priorities using the online collaboration tool. The first part of the conversation focused on the client’s perceptions of the strengths and opportunities in their evaluation work. We asked what they wanted to continue, expand, or change about their current evaluation activities and how they used and shared evaluation data.
In the second part of the conversation, we assessed priorities for new or modified evaluation activities and the specific needs or goals for these activities. Staff identified their own individual priorities prior to the session, and during the session, the team discussed and collectively chose three activities to prioritize first. We asked them to articulate any changes or additions to each priority evaluation activity and identify what they wanted to learn through the evaluation. Finally, we asked them about big picture evaluation questions that might not correspond to a single program activity.
In the last phase, we analyzed the information we’d gathered to identify evaluation strengths and areas for growth. We compared the client’s current monitoring and evaluation work with activities in other states, as well as with public health evaluation best practices. We then compiled documentation of strengths, opportunities, and recommendations using the template and icons below.
Lastly, we conducted a post-report survey with the client to promote discussion around the use of the recommendations. After program leadership read the report, they completed a brief survey rating each evaluation recommendation on a scale of 1-5 (1=low, 5=high) for both importance and feasibility.
Sometimes clients need an assessment of their current evaluation activities more than they need a new evaluation plan. This multi-phase approach allowed us to document and align the client’s understanding of their public health program with their evolving needs. The tools helped guide the discussions and surface unspoken assumptions. We also identified barriers to this approach, including the significant time and commitment required from the client and the need for adequate time for our own team to fully understand this complex program.
The American Evaluation Association is hosting Health Evaluation TIG Week with our colleagues in the Health Evaluation Topical Interest Group. The contributions all this week to AEA365 come from our Health Evaluation TIG members. Do you have questions, concerns, kudos, or content to extend this AEA365 contribution? Please add them in the comments section for this post on the AEA365 webpage so that we may enrich our community of practice. Would you like to submit an AEA365 Tip? Please send a note of interest to AEA365@eval.org. AEA365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators. The views and opinions expressed on the AEA365 blog are solely those of the original authors and other contributors. These views and opinions do not necessarily represent those of the American Evaluation Association, and/or any/all contributors to this site.