New this spring, grow your skills with AEA’s Spring 2026 Workshop Series—featuring 11 engaging virtual professional development workshops. Registrants will participate in 3- to 6-hour sessions designed to deepen knowledge of timely evaluation topics, learning directly from seasoned professionals with real-world expertise.
February 23 and 26 | Noon - 3:00 p.m. ET | Two 3-Hour Sessions Speakers: Thomas Archibald and Jane Buckley
How does one “think like an evaluator”? How can program implementers learn to think like evaluators? Recent years have witnessed an increased use of the term “evaluative thinking,” yet this particular way of thinking, reflecting, and reasoning is not always well understood. Patton warns that as attention to evaluative thinking has increased, we face the danger that the term “will become vacuous through sheer repetition and lip service” (2010, p. 162). This workshop can help avoid that pitfall. Drawing from our research and practice in evaluation capacity building, in this workshop we use discussion and hands-on activities to address: (1) What evaluative thinking (ET) is and how it pertains to your context; (2) How to promote and strengthen ET among individuals and organizations with whom you work; and (3) How to use ET to identify assumptions, articulate program theory, and conduct evaluation with an emphasis on learning and adaptive management. We base this workshop on 15 years of research and practice focused on ET, through which we have developed the following definition of ET: “Evaluative thinking is a cognitive and relational process, motivated by an attitude of inquisitiveness and a belief in the value of evidence, that involves identifying assumptions, posing thoughtful questions, marshaling evidence to make judgments, pursuing deeper understanding, and making logically aligned, contextualized decisions in preparation for action.”
March 6, 13, 20 | Noon - 2:00 p.m. ET | Three Friday Workshops Speakers: Breanne Burton and Kristin Cowart
Are you excited to explore new, creative ways to visualize your data and share meaningful results? Join this session to explore the data collection, analysis, and intuitive dashboarding capabilities of Airtable, a dynamic cloud-based platform. Airtable is trusted by industry leaders like LinkedIn and Netflix, as well as individuals, small businesses, universities, nonprofits, and more. A previous workshop participant shared, "The fact that it is affordable, doesn't require coding, and can plug-and-play with a dataset is super beneficial. I left excited to try it out in our evaluation work and share what I learned with my team." Bring your laptop and get ready to learn through real-world examples and hands-on activities, including live data collection, survey building, and dashboard creation, and see how Airtable can be a valuable addition to your evaluation toolbox (see an interactive dashboard example here - signing up for Airtable is quick, easy, and free).
March 24 | 9:30 a.m. - 4:30 p.m. ET | Full-Day, 6-Hour Workshop Speakers: Jennifer Billman, Nicole Ernst, Alyssa Hokky, and Lana Rucks
Program evaluation is inherently place-based—where programs operate, who they serve, and local conditions shape a program’s effectiveness. Geospatial tools offer evaluators a powerful way to analyze and visualize data through a spatial lens, uncovering patterns that traditional evaluation methods may overlook. When applied to evaluation, these tools provide deeper insights into data, enhancing decision-making and fostering community engagement. Whether mapping resource distribution, analyzing social determinants of health, or engaging communities through interactive story maps, geospatial technologies provide a means to integrate local knowledge and lived experience into evaluation practice. In this full-day, hands-on professional development workshop, participants will explore the potential of geospatial technologies to transform evaluation by visualizing data in meaningful ways. Through interactive demonstrations, case studies, and applied exercises, attendees will gain practical skills in using geospatial tools to support evidence-based decision-making and storytelling.
In the morning session, participants will be introduced to various geospatial products, including maps, apps, dashboards, story maps, Business Analyst Online, and Survey123. Presenters will showcase real-world applications demonstrating how evaluators can integrate these tools to enhance data accessibility, equity-focused insights, and community engagement. Participants will learn about the key functionalities, strengths, and potential limitations of each tool, with a focus on how geospatial analysis can uncover patterns and trends often hidden in traditional evaluation approaches.
The afternoon session will transition into an interactive, hands-on experience. Participants will create a geospatial survey, collect data, and use mapping software to analyze and visualize results. Through guided exercises, they will build interactive dashboards and story maps, learning how to use ArcGIS tools to communicate findings to diverse audiences effectively. Emphasizing community engagement and participatory evaluation, the session will highlight ethical considerations and best practices for using geospatial tools to address inequities and amplify marginalized voices in evaluation work.
By the end of the workshop, attendees will have gained tangible skills in leveraging geospatial technology for evaluation and decision-making. They will leave with practical knowledge for integrating geospatial approaches into their work, ensuring their evaluation efforts are more inclusive, insightful, and action-oriented.
March 12 | Noon - 3:00 p.m. ET | Half-Day, 3-Hour Workshop Speakers: Michael Trevisan and Tamara Walser
We have facilitated several workshops on evaluability assessment both nationally and internationally. Examples include evaluability assessment workshops for the Claremont Evaluation Center Professional Development Workshop Series, Canadian Evaluation Society Annual Conference, United Nations Population Fund (3-day training), American Evaluation Association Annual Conference, European Evaluation Society Biennial Conference, and the Frontiers in Education Annual Conference. We have also published on the topic of evaluability assessment, including journal articles, a book chapter in the new Handbook on Health Services Evaluation, and a 2015 book with Sage Publications, Evaluability Assessment: Improving Evaluation Quality and Use. More recently, we have developed and piloted a micro-credential on evaluability assessment with the World Food Programme.
The workshop feedback we have received has been positive overall, with many participants noting the relevance and usefulness of the learning to their professional practice. Based on that feedback and our ongoing work with evaluability assessment and program evaluation, we continue to revise our materials and learning strategies, updating content for relevance and incorporating strategies that increase engagement and application. For example, participants have often noted the meaningfulness of the small group discussions and associated activities, and some have suggested adding more tasks and activities that are similarly engaging. Thus, we have shifted our format so that participant-centered activities outweigh direct presentation, for example, using a KWL (Know, Want to Know, Learned) activity to generate insight about the use of evaluability assessment as a collaborative and transformative methodological approach.
April 1-2 | Noon - 3:00 p.m. ET | Two Half-Day, 3-Hour Workshops Speaker: Donna Mertens
AEA has made a commitment to diversity, equity, inclusion, and belonging. This workshop acts on that commitment through an exploration of the transformative framework, which evaluators can use to design evaluations that address inequities in the world. Transformative mixed methods designs are explicitly constructed to serve this purpose. The transformative epistemological assumption directly focuses on challenging existing power structures and engaging with communities in ways that are culturally responsive. This workshop is designed to teach the use of mixed methods for transformative purposes to better address the needs of members of marginalized communities, such as women, people with disabilities, people living in poverty, racial/ethnic minorities, sexual minorities, and religious minorities. Participants will learn how to use a transformative lens to identify those aspects of culture and societal structures that support continued oppression and how to apply mixed methods designs to contribute to social transformation. Interactive learning strategies will be used, including whole group discussion and working in small groups to apply the design of a transformative mixed methods evaluation to a case study. Given the political climate, the workshop will provide evaluators with a space to consider the consequences of the policy changes that have brought issues of racism, sexism, ableism, and homophobia into the spotlight. There are other opportunities for evaluators to learn about generic mixed methods approaches, but my passion and expertise are in transformative mixed methods. Mixed methods design has become more sophisticated and has developed far beyond the idea of combining focus groups with surveys. The transformative approach to mixed methods is recognized as one of the major frameworks for conducting mixed methods studies.
It is the only framework that starts with the ethical assumption that our evaluation work needs to explicitly address the issues of discrimination and oppression and that we can provide a basis for transformative actions that increase justice in the world.
April 8 | Noon - 3:00 p.m. ET | Half-Day, 3-Hour Workshop Speakers: Cheralynn Corsack and Ann Price
Authentic community engagement goes beyond interviews and focus groups—it requires centering community voices and ensuring they lead the design, implementation, and evaluation of solutions that strengthen communities. This interactive workshop equips evaluators with practical tools to foster shared leadership and deepen collaboration with community members.
April 10 and 17 | 11:00 a.m. - 2:00 p.m. ET | Full-Day, 6-Hour Workshop Speaker: Dana Benjamin
How can evaluators make creating and operationalizing logic models fun, inclusive, interactive, and equitable? In this workshop, participants will learn a new, participatory way to develop logic models, and the techniques learned can be applied across the entire evaluation cycle. They will be encouraged to think creatively to ensure that logic models are built in ways that mitigate power dynamics, that all voices are heard, and that final products are used. Using the Lego® Serious Play® (LSP) method, participants will build outcomes and activities using Lego® bricks and connect them in an "if, then" framework that results in a three-dimensional relational diagram. Participants will learn new techniques to engage stakeholders in meaningful dialogue, stimulate creativity and innovation, foster reflection and sensemaking, build consensus, and enhance the validity and reliability of evaluation data. By incorporating LSP into their toolkit, evaluators can unlock new possibilities for conducting participatory, inclusive, and impactful evaluations.
April 20 | Noon - 3:00 p.m. ET | Half-Day, 3-Hour Workshop Speaker: David Fetterman
Empowerment evaluation is an interest-holder involvement approach to evaluation aimed at learning and improvement. Empowerment evaluations help people learn how to help themselves and become more self-determined by learning how to evaluate their own programs and initiatives. Key concepts include a critical friend (an evaluator helping to guide community evaluations), cycles of reflection and action, and a community of learners. Principles guiding empowerment evaluation range from improvement to capacity building and accountability. The basic steps of empowerment evaluation include: 1) mission: establishing a unifying purpose; 2) taking stock: measuring growth and improvement; and 3) planning for the future: establishing goals and strategies to achieve objectives, as well as credible evidence to monitor change. An evaluation dashboard is used to compare actual performance with quarterly milestones and annual goals. The role of the evaluator in an empowerment evaluation is that of a coach or facilitator, since the group is in charge of the evaluation itself. The workshop is open to colleagues new to evaluation as well as seasoned evaluators. It highlights how empowerment evaluation produces measurable outcomes, with social justice-oriented case examples ranging from eliminating tuberculosis in India to fighting for food justice throughout the United States. Additional examples include empowerment evaluations conducted with high-tech companies such as Google and Hewlett-Packard, as well as work conducted in rural Arkansas and squatter settlements in South Africa. Employing lectures, activities, demonstrations, and discussions, the workshop will introduce the theory, concepts, principles, and steps of empowerment evaluation, as well as technological tools of the trade. (See TED Talk about empowerment evaluation for more details.)
April 28-29 | 11:00 a.m. - 2:00 p.m. ET | Full-Day, 6-Hour Workshop Speaker: Lisa Aponte-Soto
The Latine community in the U.S. currently accounts for nearly 19% of the total population and is projected to comprise about one-third of Americans by 2050 (U.S. Census, 2021). Latines encompass a variety of cultural groups, sociopolitical identities, ethnicities, and national origins (Rumbaut and Portes, 2001). Given this growth, it is critical to understand the complexity and diversity of Latine culture. Enacting culturally responsive and equitable evaluation (CREE) frameworks with diverse Latine and BIPOC communities calls for evaluators to honor culture and context by centering community voices using liberatory and participatory practices. Implementing CREE also means engaging an interdisciplinary community of scholars in discourse around actionable advocacy, democratic principles, courageous leadership, and social justice agency, advancing knowledge of how to attend to culture and context.
Recognizing the added value of co-learning with cross-cultural evaluators working in partnership with Latine- and BIPOC-serving programs, this workshop is structured in three main components that allow participants to reflect on and apply CRE. Part I will provide an overview of social justice evaluation theories and foundational principles of CRE, with an emphasis on LatCrit and contemporary indigenous praxis-oriented paradigms for working with Latine and BIPOC communities. Part II will focus on self-reflection exercises to assess the evaluators’ positionality as CREE agents. Part III will guide participants through applied case study exercises in small groups.
In Part I, participants will reflect on the foundational social justice evaluation theories and principles of CRE, with an emphasis on LatCrit and contemporary indigenous praxis-oriented paradigms for working with Latine communities. Building on the 9-step CRE framework (Hood et al., 2015), this component will discuss the unique cultural values and identity relevant to Latine culture, along with inclusive, liberatory, and participatory approaches. Applied CREE examples will be shared that illustrate how to avoid common misconceptions and missteps. Part II will guide participants through self-reflection, self-critique, and cultural humility exercises to assess their positionality as evaluators and agents of CREE (Symonette, 2008). Part III will lead participants in small-group discussions through applied case study exercises using CREE. The facilitator will also draw from a series of evaluation projects and case studies conducted in partnership with Latine and BIPOC communities.
The facilitator is a Latine evaluator with experience conducting Latine culturally responsive and equitable evaluation (LatCREE; Aponte-Soto et al., 2024). The facilitator will share unique perspectives and experiences practicing LatCREE and reflect on where the field has been and what directions it needs to take when practicing evaluation for and with Latine and BIPOC communities. Practical applications explore tactics for attending to the unique heterogeneity of Latine culture across multiple contexts, settings, communities, and geographic regions. Participants will have an opportunity to reflect on their experiences with Latine- and BIPOC-focused evaluation planning and practice.
May 7-8 | 9:30 a.m. - 4:30 p.m. ET | Full-Day, 6-Hour Workshop Speaker: Mari Gasiorowicz
This session brings to life Mari Gasiorowicz’s Systems Evaluation Topical Interest Group blog post, “Why Systems Thinking Should Drive Every Evaluation,” posted on December 3, 2024. This highly interactive full-day workshop introduces systems thinking concepts and tools and fosters their use in evaluation practice. Participants will 1) practice using the Habits of a Systems Thinker; 2) identify characteristics of complex systems (purpose, elements, boundaries, dynamics, interconnections), including creating Behavior-Over-Time Graphs (BOTGs) and a system map; 3) ask evaluation questions using the graphs and maps their groups developed; and 4) consider examples of novel tools for evaluating initiatives explicitly designed to create systems change. This is a workshop that you will not forget!
June 8 | 9:30 a.m. - 4:30 p.m. ET | Full-Day, 6-Hour Workshop Speakers: Thomas Archibald and Leah Neubauer
This interactive skill-building workshop will introduce Communities of Practice (CoPs) and demonstrate their application as a methodology to interrogate one’s evaluation practice, advance thinking and learning about the evaluation profession, and consider diverse, complex evaluator roles in society. Increasingly, evaluators are called to evaluate and participate in CoPs in in-person and virtual global settings. Grounded in critical adult education and transformative learning, this session will focus on CoPs that engage learners in a process of knowledge construction and unlearning/relearning around common interests, ideas, passions, and goals. Participants will develop a CoP framework that includes the three core CoP elements (domain, community, practice) and processes for generating a shared, accessible repertoire of knowledge and resources. The three core elements and framework will provide a foundation to discuss monitoring, evaluation, and learning (MEL) and evaluative thinking. Co-facilitators will highlight examples of CoP implementation in MEL from across the globe in development, education, and community health through the lenses of transformation. Participants will engage in a series of hands-on, inquiry-oriented techniques, analyzing how CoPs can be operationalized in their evaluation practice.
As part of the Spring 2026 Workshop Series, members will participate in 3- to 6-hour virtual workshops focused on current hot topics in evaluation, led by experienced practitioners who are shaping the field today.
Designed for evaluation professionals at every stage of their career, these workshops welcome participants of all experience levels. Whether you’re planting the seeds of your evaluation journey or cultivating advanced expertise, the Spring 2026 Workshop Series provides space to learn, grow, and engage with peers across the field.
Pricing for each workshop varies by AEA membership status. Check our pricing table for member, non-member, and student rates.