In-depth eStudy courses offer a deep dive into top-of-mind evaluation themes and topics. Open to members and nonmembers alike, eStudies provide a diverse learning experience in which collaboration is encouraged.
Key things to know before registering:
- eStudy courses are recorded and made available for 14 days to all paid registrants
- Your registration for an eStudy course covers all sessions for that course
- There will be a short homework assignment, requiring an hour or less of your time, between sessions
- Registration will close five business days in advance of the first session meeting
- Sessions run on GoToWebinar; check GoToWebinar's system requirements before the first session
eStudy 095: Using the Cultural Consensus Method to Evaluate Program Impacts
Presenters: Peggy Ochandarena, Chief of Party, Enhancing Palestinian Justice Program, Chemonics International Inc., and Co-Director, Global Impact Collaboratory; and Roseanne Schuster, Assistant Research Scientist, School of Human Evolution and Social Change, Arizona State University, and Director of MEL, Global Impact Collaboratory.
Dates: November 27, 12:00-1:30pm (EST); November 29, 12:00-1:30pm (EST)
Culture is the set of shared beliefs of a particular group of people; it strongly shapes what is socially acceptable and thus shapes action. Understanding the culture of an intervention's beneficiaries is critically important to designing effective interventions. In this eStudy, leaders of the Global Impact Collaboratory, a partnership between Arizona State University and Chemonics International, give learners hands-on experience with the theory, instrument design, and analysis of the Cultural Consensus Method (CCM), demonstrate its use in international development projects, and apply it to a case study.
eStudy 096: The M&E Map: How to systematically describe and measure any project
Presenter: Errol Goetsch, Founder, 4Sight Prediction Solutions Pty. Ltd.
Dates: December 4, 12:00-1:30pm (EST); December 6, 12:00-1:30pm (EST); December 18, 12:00-1:30pm (EST); December 20, 12:00-1:30pm (EST)
This eStudy will introduce newcomers to the world of M&E and give existing professionals new tools and a common language for monitoring and evaluating any project in the international development aid sector. The course is for anyone who interacts with, and wants to influence, donors, sponsors, regulators, lobbyists, media, beneficiaries, government departments, aid agencies, service providers, program, project, or partner managers and staff, and project designers or auditors. It offers a language and system that elegantly captures the key features of any project using words, numbers, and pictures.
eStudy 097: More than two options: How to collect LGBTQ inclusive data
Presenter: Ash Philliber, PhD, Senior Evaluator, Philliber Research & Evaluation
Dates: December 11, 12:00-1:30pm (EST); December 13, 12:00-1:30pm (EST)
This eStudy trains evaluators to ask questions in ways that are inclusive of LGBTQ communities. This is an ever-changing field, and no forms have yet been developed that can be used nationwide without issue. We will discuss common terms, pitfalls to avoid, and how to develop forms that will be appropriate in different places and with different groups.
eStudy 098: Working with Assumptions to Unravel the Tangle of Complexity, Values, and Cultural Responsiveness
Presenters: Jonathan Morell, Ph.D., Principal, 4.669 Evaluation and Planning, and Editor, Evaluation and Program Planning; Apollo M. Nkwake, CE, Ph.D., International Technical Advisor, Monitoring and Evaluation, Education Development Center; Katrina L. Bledsoe, Ph.D., Research Scientist, Education Development Center, and Principal Consultant, Katrina Bledsoe Consulting
Dates: January 15, 12:00-1:30pm (EST); January 29, 12:00-1:30pm (EST); February 12, 12:00-1:30pm (EST)
It is impossible for evaluators not to make assumptions that simplify the world in which programs, initiatives, and "wicked problems" exist. Simplification, including parsing out key values, is necessary because without it no evaluation can reveal the relationships that matter. We always need a model that provides a simple, straightforward guide for constructing evaluation designs and interpreting data. The model may be formal or informal, elaborate or sparse, explicitly constructed or implicit. But there is always a model, and to be useful, the model must always provide a parsimonious explanation of the phenomena, and the world, at hand.