Workshops

AEA is excited to announce the return of the Summer Evaluation Institute, taking place June 26-28, 2024, in Washington, DC.
These interactive, immersive workshops will give you the opportunity to learn from your peers, engage in meaningful discussions, and work through challenges or barriers in your evaluation practice alongside the evaluation community.

Each workshop is capped at a designated capacity to create an intimate learning environment, so we recommend registering early to secure your spot!

Learn more about the workshops below.

Wednesday Workshop:

All times are listed in Eastern Time.

It’s Not the Plan, It’s the Planning: Strategies for Evaluation Plans and Planning

Wednesday, June 26, 9:00 AM - 4:30 PM 

Presenters: Sheila B. Robinson, Custom Professional Learning, LLC; Elizabeth Grim, Elizabeth Grim Consulting

"If you don’t know where you’re going, you’ll end up somewhere else" (Yogi Berra). Few evaluation texts explicitly address the act of evaluation planning as independent from evaluation design or evaluation reporting. This interactive session will introduce you to an array of evaluation activities that comprise evaluation planning and preparing a comprehensive evaluation plan. You will leave with an understanding of how to identify the primary intended users of evaluation, the extent to which they need to understand and be able to describe the the program, tips for conducting literature reviews, strategies for developing evaluation questions, considerations for evaluation designs, and using the Program Evaluation Standards and AEA’s Guiding Principles for Evaluators in evaluation planning. Instructors will also briefly demonstrate how AI can be used to assist in generating evaluation plans. You will be introduced to a broad range of evaluation planning resources including templates, books, articles, and websites.

The pre-event workshop requires separate registration and is capped at 100 registrants.

Thursday Workshops:

All times are listed in Eastern Time.

Workshop 1: Creating Data Dashboards Using the Google Suite

Thursday, June 27, 9:00 AM - 12:30 PM; 2:00 PM - 5:30 PM

Presenters: Emily Peterson Johnson, Texas Health Institute; Jessica Cargill, Texas Health Institute

This interactive workshop will equip attendees to develop powerful web-based dashboards using simple and free tools within the Google Suite.

Dashboards offer a compelling way to track evaluation progress or share evaluation findings with technical and non-technical audiences. While sophisticated dashboard visualizations can be built with software such as Tableau or ArcGIS Pro, these tools are often cost-prohibitive and require advanced user knowledge. The Google Suite offers an alternative set of tools, including Looker Studio, Google Sheets, and Google Maps, all of which are free and simple to use. Developing a working knowledge of free software and tools, such as those from Google, is essential for advancing equity in the evaluation workforce, and it can also empower the communities engaged in our program evaluations.

Workshop 3: The Basics of Using Theory to Improve Evaluation Practice

Thursday, June 27, 9:00 AM - 12:30 PM; 2:00 PM - 5:30 PM

Presenters: John LaVelle, University of Minnesota; Stewart Donaldson, Claremont Graduate University

This workshop is designed to give practicing evaluators an opportunity to deepen their understanding of how to use theory to improve evaluation practice. Lecture, exercises, and discussions will help participants learn to apply evaluation theories, social science theories, and stakeholder theories of change to improve the accuracy and usefulness of their evaluations. A range of examples from evaluation practice will illustrate the main points and take-home messages.

Workshop 5: The CREATE Framework: 6 Steps for Building Your Own Workshop or Course

Thursday, June 27, 9:00 AM - 12:30 PM; 2:00 PM - 5:30 PM

Presenter: Sheila B. Robinson, Custom Professional Learning, LLC

Are you eager to share your evaluation knowledge and expertise in a workshop or course? Do you aspire to create something that not only imparts valuable information but also transforms the lives of your participants? Do your evaluation clients require you to provide training? This fast-paced, interactive workshop will equip you with the knowledge and skills you need to craft a high-quality, impactful workshop or course of your own, using a step-by-step framework that takes course-building from daunting to doable. CREATE stands for six actionable steps: Create the vision, Ready the content, Engage participants, Tie it up in a package, and Evaluate, revise, and refine.

You’ll learn research-based strategies for teaching and learning that will make your course effective. Whether it is on-site or online, self-paced, synchronous, or hybrid, your workshop will be built on a foundation of content relevant to your learners, layered with strong, effective instruction, and designed with learner engagement strategies that lead them on a transformational journey. You’ll receive a comprehensive workbook that encourages you to plan out the critical steps for success and provides checklists for what you’ll need to pull together to make it happen.

If you recognize the importance of strong, effective instructional design and care deeply about whether your participants learn, retain what they learn, and are able to apply what they learned from you, this workshop is for you!

Workshop 6: Consulting Skills for Evaluators: An Introductory Workshop to Learn Basic Consulting Skills and How to Start an Independent Practice

Thursday, June 27, 9:00 AM - 12:30 PM; 2:00 PM - 5:30 PM

Presenters: Matt Feldmann, Goshen Consulting; Laura Keene, Keene Insights

As an evaluator, are you thinking about going out on your own? For many, this is an exciting but intimidating prospect. This workshop covers the simple but essential design and start-up skills needed for success. Matt Feldmann and Laura Keene will lead this introductory workshop, which has been foundational to the development of many consulting practices and small independent evaluation shops. You will leave with a plan for initiating your own independent consulting practice.

Workshop 7: How to Equitably Communicate Data Findings

Thursday, June 27, 9:00 AM - 12:30 PM; 2:00 PM - 5:30 PM

Presenters: Alissa Marchant, Innovation Network; Elizabeth Grim, Elizabeth Grim Consulting, LLC

Evaluators hold substantive power in disseminating findings and influencing high-level decision-making. To ensure that evaluative findings support community needs, reduce harm, and reflect the diversity of those communities, evaluators must be skilled at communicating equitably. This requires careful consideration of how written words and visual presentations of data are received by readers, users, and audiences. This skill-building workshop will explore questions such as: How does equity extend beyond your evaluation report? Does your reporting truly communicate what you intended? How can analysts, researchers, and developers apply an equity-conscious lens to their data analysis and visualization work? The workshop is facilitated by two thought leaders in equitable communications. They have developed guides on principles and strategies for communicating equitably and adopting inclusive and non-violent language, and they will share insight from other leaders on collecting, analyzing, and reporting gender and sexual orientation data. The workshop will blend brief presentations with interactive activities so that attendees can practice equitable communication skills while reflecting on a sampling of reports, visualizations, and other dissemination products and scenarios.

Workshop 13: Community- and Culture-Driven Theories of Change

Thursday, June 27, 9:00 AM - 12:30 PM; 2:00 PM - 5:30 PM

Presenters: Katie Rose Dailey, FHI 360; Saori Iwamoto, FHI 360; Pam Carter, FHI 360; Amy Detgen, FHI 360

This workshop will demonstrate how the incorporation of community voice in evaluation planning can enhance and strengthen efforts, to the benefit of all stakeholders. We will discuss the concepts of community voice and culture and why evaluators should consider elevating culture in evaluation design. We will then introduce the Community-Driven Theory of Change tool, walking participants through the key steps in its implementation. Participants will have time to work in facilitated small groups to practice applying this tool to their own projects. A mix of self-reflection, sharing with peers, and facilitated discussions will enable participants to engage with the Community-Driven Theory of Change tool and concepts from multiple perspectives and hear examples of its application to further their learning.

Workshop 14: Evaluative Thinking: Principles and Practices to Enhance Evaluation Capacity and Quality

Thursday, June 27, 9:00 AM - 12:30 PM; 2:00 PM - 5:30 PM

Presenter: Thomas Archibald, Virginia Tech

How does one “think like an evaluator”? How can program implementers learn to think like evaluators? Recent years have witnessed increased use of the term “evaluative thinking,” yet this particular way of thinking, reflecting, and reasoning is not always well understood. Patton warns that as attention to evaluative thinking has increased, we face the danger that the term “will become vacuous through sheer repetition and lip service” (2010, p. 162). This workshop can help avoid that pitfall. Drawing from our research and practice in evaluation capacity building, in this workshop we use discussion and hands-on activities to address: (1) what evaluative thinking (ET) is and how it pertains to your context; (2) how to promote and strengthen ET among the individuals and organizations with whom you work; and (3) how to use ET to identify assumptions, articulate program theory, and conduct evaluation with an emphasis on learning and adaptive management.

Workshop 15: The Survey Design Studio

Thursday, June 27, 9:00 AM - 12:30 PM; 2:00 PM - 5:30 PM

Presenter: JoAnna Hillman, Hillman Associates, LLC

Welcome to the Survey Design Studio, where we create ONLY GREAT SURVEYS. This workshop is a deep dive into the artistic and creative world of outstanding survey design for program evaluation. Explore the art of survey design by composing evaluation purpose statements, crafting key evaluation questions, immersing yourself in the rich palette of question types, and sculpting surveys that capture the essence of program evaluations with flair and finesse. We’ll use logic and intuition to create surveys that resonate deeply with our audiences, illustrate creative approaches for elevating our survey artistic ability, and discuss considerations for special contexts. Prepare to unleash your survey design creativity as you dive into hands-on activities, transforming theoretical concepts into magnificent works of survey art. You’ll engage in collaborative critique sessions, where you'll refine your craft alongside fellow survey artists.

Step into a vibrant world where we’ll transform surveys into masterpieces that unlock quality data for decision making. Get ready to unleash your survey masterpiece!

Workshop 18: Evaluators as Practitioners for Racial Justice Transformation and Change Management

Thursday, June 27, 9:00 AM - 12:30 PM; 2:00 PM - 5:30 PM

Presenters: Horace Duffy, Keecha Harris and Associates, Inc.; Sierra Fernandez, Keecha Harris and Associates, Inc.; Blanca Guillen-Woods, Strategic Learning Partners for Innovation (SLP4i)

In this workshop, the strategy, evaluation, and learning team from Keecha Harris and Associates, Inc. (KHA) will actively engage peer evaluators and attendees in learning about best practices and evaluative methods for racial justice transformation with philanthropic organizations, their partners, and governmental entities. Workshop facilitators will present a framework composed of five phases used by the team when working with clients across a diverse portfolio of evaluation projects in environmental justice, reproductive justice, organizational development, and economic mobility.

The key phases of the framework are 1) project discovery through a racial equity lens, 2) engaging community effectively, 3) co-creation and ownership of process and products, 4) racial equity capacity building, and 5) sustainability and longevity of change management. Each phase will be introduced and expanded on in detail during the workshop, and resources visually depicting the framework will be provided. During the session, the KHA team will discuss real examples from their practice to illustrate each phase, citing impactful stories and evaluation methodologies that have shaped systems and institutional changes in the consultancy’s work. Lessons learned about risk mitigation across client relationships, the authenticity concerns of equity-focused work, and dynamic, cross-skilled teamwork will be uplifted. Customizable design is integrated into the framework to address the distinctive needs and desired impact of projects and clients, with intentional consideration of the culture of the specific organization and sector.

Evaluative techniques discussed across the five phases include facilitating in-depth interviews, listening sessions, staff/board retreats, workshop series, and town-hall-style conversations, as well as amplifying those narratives by gathering and reporting quantitative data through surveys, in-session polls, and desk research.

The session will be designed to offer attendees opportunities for engagement in rich discussion about the applicability of the framework across their varied fields and topical areas of interest. Interactive elements will be included throughout and attendees will leave the workshop having developed an evaluation approach embedding the phases from the framework in collaboration with facilitators and fellow participants.

Workshop 20: Leveraging an Innovative and Interactive Tool, The Eval Matrix©, to Design Culturally Responsive and Equity-Focused Evaluations

Thursday, June 27, 9:00 AM - 12:30 PM; 2:00 PM - 5:30 PM

Presenters: Katrina L. Bledsoe, Ph.D., Strategic Learning Partners for Innovation/Abt Global, Inc.; Leon D. Caldwell, Ph.D., Strategic Learning Partners for Innovation; Blanca Guillen-Woods, M.A., Strategic Learning Partners for Innovation; Christy Peterson, Strategic Learning Partners for Innovation

Many funders are focused on working with a racial equity and social justice lens and are asking for specific evaluation approaches that bring partners, communities, and those most impacted together to identify important questions and outcomes. Likewise, evaluators often say they are going to use a specific culturally responsive and/or equity-focused evaluation approach, but it is often unclear what that really means in practice or how any one approach differs from the others.

During this workshop, funders, researchers, and practitioners will gain a better understanding of seven culturally responsive and equity-focused evaluation approaches, along with seven key principles shared across them. Participants will engage in meaningful discussions that will help them begin to design more equitable and inclusive evaluations that contribute to social change. Practitioners and program staff will learn how best to build considerations for equitable evaluation into their programs up front.

Friday Workshops:

All times are listed in Eastern Time.

Workshop 2: Making ROI Part of Your Story

Friday, June 28, 9:00 AM - 12:30 PM; 2:00 PM - 5:30 PM

Presenters: Patti P. Phillips, Ph.D., ROI Institute; Jack J. Phillips, Ph.D., ROI Institute

In the best-selling book, The Cost-Benefit Revolution, Cass Sunstein, former administrator of the White House Office of Information and Regulatory Affairs under the Obama administration, wrote that for public sector organizations to thrive, they must conduct a cost-benefit analysis of major projects, programs, initiatives, and regulations.

The concept of cost-benefit analysis is not new; governments have used it for centuries to weigh the benefits of proposed regulations against their costs. ROI is more of a business term, dating back at least 200 years. The two use the same measures but present them in different ways (a worked example follows the list below). The key to a credible cost-benefit analysis is the process, rules, standards, and assumptions used to capture the benefits side of the equation, and the consistency, credibility, and conservative approach applied on the cost side. The ROI Methodology, adopted by 30 central governments and many NGOs worldwide, including the United Nations, provides a framework for producing five levels of outcome data in a logic model format. It measures:

1. The reaction to the program or project

2. How well participants learn what they need to make the project successful

3. The application of that learning to make the project successful

4. The impact of the application (the principal definition of the project's success)

5. The monetary benefits of the impact compared to the total project costs, direct and indirect (the ROI)
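
To make "the same measures, presented in different ways" concrete, here is a standard worked illustration; the figures are hypothetical and not drawn from the workshop materials:

$$
\text{BCR} = \frac{\text{monetary benefits}}{\text{total costs}}, \qquad
\text{ROI}(\%) = \frac{\text{monetary benefits} - \text{total costs}}{\text{total costs}} \times 100
$$

A project yielding $750,000 in monetary benefits at a total cost of $500,000 thus has a benefit-cost ratio of 1.5 and an ROI of 50%.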

This workshop will show how this model helps program evaluators add new dimensions to their current systems. It is not meant to replace a particular evaluation model but to enhance current models by providing the ultimate accountability: showing the economic benefits compared to the total cost of the initiative.

Workshop 4: Appreciative Evaluation 2.0

Friday, June 28, 9:00 AM - 12:30 PM; 2:00 PM - 5:30 PM

Presenters: Tessie Catsambas, EnCompass LLC; Stewart Donaldson, Claremont Graduate University

Appreciative Evaluation is known for its intentionality in crafting compelling questions about successful experiences, inviting affirming multi-stakeholder engagement, generating insightful stories about lived experience, and grounding the evaluation in a compelling vision of the future. Regardless of the evaluation design and methods selected, Appreciative Evaluation is an excellent addition that will enhance the effectiveness, cultural competence, ethics, and equity of an evaluation. A growing body of research in positive psychology helps us understand the impact of embedding Appreciative Evaluation into any design and method.

This workshop will cover what Appreciative Evaluation is, the growing evidence behind the methodology, and three ways to apply it in evaluation practice: interviewing, framing, and reporting. Participants will understand how the appreciative stance and the appreciative method combine to strengthen any evaluation, and why every evaluator would benefit from adding it to their professional practice.

Workshop 8: Introduction to Evaluation Capacity Building

Friday, June 28, 9:00 AM - 12:30 PM; 2:00 PM - 5:30 PM

Presenters: Ann Marie Castleman, MPH, MA, University of Cincinnati Evaluation Services Center; Heather D. Codd, PhD, MPA, Pyxis Management, Inc.; Leslie A. Fierro, PhD, MPH, Max Bell School of Public Policy, McGill University; Fierro Consulting, Inc.

Over the past two decades, there has been consistent interest in Evaluation Capacity Building (ECB) and a pressing need to respond to calls for building evaluation capacity in individuals and organizations. This interactive workshop introduces ECB as a practice similar to but distinct from evaluation and offers attendees opportunities to explore ECB practice. Attendees will be introduced to the fundamentals of ECB, including the intended outcomes of ECB interventions (Fierro & Christie, 2017) and tools and techniques to measure, strengthen, and sustain evaluation capacity. Participants will also engage in a series of reflective exercises to examine which evaluation capacity builder competencies they are strong in and where they need further development, and they will put in place a professional development plan for enhancing their skill set over the next three years.

Workshop 9: Letting the Data Speak: Effectively Integrating and Synthesizing Your Data to Tell the Story

Friday, June 28, 9:00 AM - 12:30 PM; 2:00 PM - 5:30 PM

Presenters: Anais Henriquez, EnCompass LLC; Michael Moses, EnCompass LLC

Over the years, we have developed an approach that efficiently and rigorously integrates and interprets data. We call it the Data Analysis, Integration, and Synthesis (DAIS) process. This participatory approach allows us to integrate complex data streams, often collected and analyzed by multiple evaluators. The DAIS process is interactive and iterative. During a DAIS process, we put analyzed data visually in front of the evaluation team (using cards or sticky notes), and we work together to integrate it into initial findings. We have found that this process can overcome the challenge of moving from analysis to report writing and can help lead to a well-structured report with strong findings, a clear narrative reflected in the conclusions, and recommendations grounded in data.

In this workshop, we present the DAIS process and invite participants to practice each step, experiencing participatory strategies that ensure data are effectively synthesized. We first introduce the approach through a mini lecture and a discussion around structuring a DAIS workshop: What is it? What are the benefits? Who needs to be there? How will it be formatted? We will introduce the process of creating data analysis summaries, which comprise early emerging theme statements with supporting evidence and data sources; these are the key inputs to a DAIS. We will present a mock study with four key evaluation questions and mock supporting data, and participants will work in small groups to practice developing emerging theme statements for each question. We will then discuss how to prepare for a DAIS workshop and practice the workshop steps, focusing mostly on how an evaluation team can affinity map emerging themes into higher-level findings during a DAIS workshop. We will then discuss the difference between findings, conclusions, and recommendations and practice developing strong conclusions and recommendations from the emerging findings. Finally, we will discuss good practices and strategies for moving from the DAIS to drafting a cohesive narrative in an evaluation report. Throughout, participants will experience how DAIS processes produce traceable, contextualized, and supportable evidence-based findings and user-appropriate conclusions and recommendations, with a confident team standing behind the story the data tell.

Workshop 10: Learning to Love Your Logic Model: Better Planning, Implementation, and Evaluation Through Program Roadmaps

Friday, June 28, 9:00 AM - 12:30 PM; 2:00 PM - 5:30 PM

Presenter: Thomas Chapel, MA, MBA, Centers for Disease Control and Prevention (semi-retired)

The bad rap on logic models in some quarters is well-deserved. What should be a flexible and practical tool often deteriorates into overly bureaucratic mandatory templates mired in terminology that puzzles both users and all but the most experienced evaluators. This course aims to recapture the original spirit and utility of logic modeling by emphasizing function over form. While we will cover the "usual suspect" components of the traditional logic model (activities and outcomes, inputs and outputs, mediators and moderators), we’ll introduce concepts step by step and, at each point, show how insights from that step contribute (or not) to a more thorough understanding of your program. More importantly, we’ll show how logic models, customarily a tool in program evaluation, are even more useful in setting, assessing, and course-correcting strategy and implementation, even before any data are collected. These “process use” applications, while not denying the importance of logic models in setting an evaluation focus, excite planners and implementers and make the evaluator a welcome participant even at the earliest stages of program formation.

Workshop 11: Data Quality Management

Friday, June 28, 9:00 AM - 12:30 PM; 2:00 PM - 5:30 PM

Presenter: Anne Coghlan, PhD, Michigan State University

A major purpose of many program evaluations is to generate data for decision-making. But how can we be sure that our data are of good enough quality to support well-informed decisions? While evaluators may receive training in aspects of data quality, overarching ways to enhance and manage data quality are rarely addressed. In this workshop, evaluators will be introduced to a comprehensive data quality management system for quantitative data, first developed by the Global Fund and several international development agencies, that consists of specific data quality assessment criteria and standard operating procedures. Through large and small group discussions, participants will first identify their own data quality issues. They will then review and relate their own experiences to the assessment criteria and identify procedures for strengthening the quality of their data. Lastly, participants will review the basic components of a Data Quality Management Plan.
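
To give a concrete flavor of the kinds of assessment criteria such a system formalizes, here is a minimal sketch of automated completeness, validity, and consistency checks in Python with pandas; the column names, rules, and thresholds are hypothetical illustrations, not procedures from the Global Fund system.

```python
# Minimal sketch of quantitative data quality checks (hypothetical rules).
import pandas as pd

def assess_quality(df: pd.DataFrame) -> dict:
    """Return simple completeness, validity, and consistency scores."""
    report = {}
    # Completeness: share of non-missing values per column.
    report["completeness"] = (1 - df.isna().mean()).round(3).to_dict()
    # Validity: ages must fall within a plausible range.
    report["valid_age"] = float(df["age"].between(0, 120).mean())
    # Consistency: follow-up dates should not precede baseline dates.
    report["dates_ordered"] = float(
        (df["followup_date"] >= df["baseline_date"]).mean())
    return report

# Toy dataset: one missing age, one implausible age, one out-of-order date.
df = pd.DataFrame({
    "age": [34, 51, None, 200],
    "baseline_date": pd.to_datetime(["2024-01-05"] * 4),
    "followup_date": pd.to_datetime(
        ["2024-03-01", "2024-02-10", "2023-12-30", "2024-04-15"]),
})
print(assess_quality(df))
```

Checks like these can be written into standard operating procedures and rerun on every data delivery, turning abstract quality criteria into routine, auditable steps.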

Workshop 12: Evaluation Report Makeover

Friday, June 28, 9:00 AM - 12:30 PM; 2:00 PM - 5:30 PM

Presenters: Maggie Pustinger, Emory Centers for Public Health Training and Technical Assistance; Kristin Giordano, Emory Centers for Public Health Training and Technical Assistance

Presenting evaluation data tailored to each audience helps drive program improvement, engage program partners, and enable data-driven decision-making. But with so many different types of audiences, how can you communicate with them efficiently and effectively? How can evaluators ensure that data are used to make decisions? How can evaluators make data not only accessible but also useful for program leadership? A traditional evaluation report will not reach all audiences, which means that developing and disseminating other types of evaluation products is an essential part of evaluation work.

Workshop 16: AI-enabled Evaluation Basics

Friday, June 28, 9:00 AM - 12:30 PM; 2:00 PM - 5:30 PM

Presenter: Zach Tilton, The MERL Tech Initiative

The workshop will consist of five mini-modules introducing participants to methods for integrating generative AI into their evaluation practice. Modules include 1) a primer on generative AI and its use in evaluation; 2) ethical and responsible principles for GenAI-enabled evaluation practice; 3) prompt engineering basics; 4) chatbots for Theory of Change work; and 5) using chatbots for natural language coding (no prior coding experience required). Sessions will combine lectures, demonstrations, large group discussions, and small group work. The workshop brings together AI-enabled evaluation training sessions that have previously been offered as individual units for the ITCILO, UNFPA, YLabs, and the MERL Tech Natural Language Processing Community of Practice.
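
As a taste of what module 5 might involve, here is a minimal, hypothetical sketch of chatbot-assisted qualitative coding using the OpenAI Python client; the model choice, codebook, and prompt are illustrative assumptions, not materials from the workshop.

```python
# Hypothetical sketch: asking a chatbot to apply a qualitative codebook
# to an open-ended survey response. Requires the `openai` package and an
# OPENAI_API_KEY in the environment; codebook and model are examples only.
from openai import OpenAI

client = OpenAI()

CODEBOOK = ["program satisfaction", "barriers to access", "staff support"]

def code_response(text: str) -> str:
    """Return the codebook labels the model judges applicable."""
    prompt = (
        "You are assisting with qualitative coding for a program evaluation.\n"
        f"Codebook: {', '.join(CODEBOOK)}\n"
        "List every applicable code for the response below, or 'none'.\n\n"
        f"Response: {text}"
    )
    completion = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}],
    )
    return completion.choices[0].message.content

print(code_response("The staff were wonderful, but the bus ride took two hours."))
```

In practice, evaluators would validate a sample of machine-applied codes against human coding before relying on such output, a caution consistent with the module on ethical and responsible GenAI use.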

Workshop 17: (Re)Imagining Evaluation Practice to Be More Equitable, Transformative, and Full of Soul

Friday, June 28, 9:00 AM - 12:30 PM; 2:00 PM - 5:30 PM

Presenters: Min Ma, MXM Research Group; Chantal Hoff, MXM Research Group

The field of evaluation is undergoing important shifts as it examines equity within the practice and its role in the social change ecosystem. These shifts are being guided by numerous approaches and methods that challenge evaluators to examine WHAT aspects of traditional research and evaluation practice need to change to be more culturally responsive and in better service of equity. This workshop focuses on HOW. Participants will examine their current evaluation practice through self-reflection, small group discussions, and engaging with real-world examples. We will consider shifts within our own evaluation practice that enable us to be in better service of more equitable, transformative relationships with 1) communities and people most impacted by the work, 2) clients and primary evaluation audiences, and 3) our own evaluation teams. This workshop is grounded in foundational principles of data equity and draws from transformative evaluation, culturally responsive and equitable evaluation approaches, arts-based methods, and appreciative inquiry.

Workshop 19: Trauma-Informed and Equitable Evaluation: Tools and Practices Towards Healing and Liberation

Friday, June 28, 9:00 AM - 12:30 PM; 2:00 PM - 5:30 PM

Presenters: Carolyn Fisher, Institute for Community Health; Laura McElherne, Institute for Community Health; Ariela Braverman Bronstein, Institute for Community Health

The body and brain's normal responses to traumatic stress can result in changes to cognition, emotion, affect, and behavior that vary widely in terms of duration and severity. Beyond these biological and psychological effects, trauma can also impact our relationships with each other, and our relationships with our institutions, cultures, histories, and social structures. This informs how people who have experienced trauma relate to and participate in the fields of research and evaluation as both subject and investigator. Further, the historical and contemporary maldistribution of wealth, resources, and protection from violence for people with excluded and minoritized identities means that this is a phenomenon with urgent social justice implications as well. As evaluators, we need tools and practices that will permit us to both see and account for the multilevel impacts of trauma and to work towards healing our communities and our practices. In this workshop, the facilitators will bring an interdisciplinary and equity-focused understanding of trauma-informed evaluation practices, informed by extensive experience in the field as well as their backgrounds in social work, anthropology, epidemiology, and medicine. Workshop participants will leave with a set of practical tools and practices for trauma-informed, equitable evaluation.

The workshop will begin with an introduction to trauma, trauma-informed principles and practices, an explanation of what we mean by a multi-level trauma-informed approach, and the relationship with Equitable Evaluation. We will then facilitate a discussion around the ultimate goals of evaluation and its potential for facilitating healing and liberation. Finally, we will walk workshop participants through the different stages of evaluation (planning, data collection, data analysis, and dissemination) and collaboratively identify specific strategies for trauma-informed and equitable practices at each stage. Throughout the day, participants will work together in small groups to creatively apply the concepts to case studies, teaching and learning from one another in the process.
