SUMMER EVALUATION INSTITUTE 2019


Workshop Descriptions

Workshop 1: INTRODUCTION TO EVALUATION

Offered: Sunday, June 9, 9:00 AM – 4:00 PM

Level: Advanced

Speaker: Tom Chapel

This workshop will provide an overview of program evaluation for Institute participants with some, but not extensive, prior background in program evaluation. The foundations of this workshop will be organized around the Centers for Disease Control and Prevention’s (CDC) six-step Framework for Program Evaluation in Public Health, as well as the four sets of evaluation standards from the Joint Committee on Standards for Educational Evaluation. The six steps constitute a comprehensive approach to evaluation. While its origins are in the public health sector, the Framework approach can guide any evaluation. The workshop will place particular emphasis on the early steps, including identification and engagement of stakeholders, creation of logic models, and selection and focusing of evaluation questions. Through case studies, participants will have the opportunity to apply the content and work through some of the trade-offs and challenges inherent in program evaluation in public health and human services.

What you will learn:

  • A six-step framework for program evaluation
  • How to identify stakeholders, build a logic model, and select evaluation questions
  • The basics of evaluation planning

 

Workshop 3: ADDING COSTS TO HELP YOUR EVALUATION GET USED: COST-EFFECTIVENESS AND COST-BENEFIT ANALYSES FOR HEALTH AND HUMAN SERVICES

Offered: Tuesday, June 11, 1:30 PM – 4:30 PM; Wednesday, June 12, 9:00 AM – 12:00 PM

Level: Intermediate 

Speaker: Brian T. Yates, PhD

Including the costs of programs—to consumers as well as providers and taxpayers—can help your evaluation get funded, read, and used. Evaluating the monetary outcomes (or “benefits”) of programs as well, such as reduced client use of health services and increased client productivity and income, can further influence decision-makers. Incorporating consumer, provider, and funder costs into cost-effectiveness, cost-benefit, and cost-utility analyses provides the foundation for calculating cost per Quality-Adjusted Life Year (QALY) gained as well as Social Return on Investment (SROI) estimates. Participants will finish this workshop knowing what "cost studies" all too often are, and what cost-inclusive evaluation can be. Examples from real evaluations of substance abuse prevention and treatment programs, and of health as well as mental health services, are used throughout.
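
As a rough, hypothetical illustration of the kinds of ratios described above (not material from the workshop itself), the sketch below computes cost-effectiveness, cost-benefit, cost-per-QALY, and SROI figures from invented program numbers; every value, and the particular SROI formulation shown, is an assumption for demonstration only.

```python
# Hypothetical illustration of common cost-inclusive evaluation ratios.
# All figures are invented for demonstration; they come from no real program.

program_cost = 250_000.0        # total cost to providers, funders, and consumers ($)
clients_served = 400            # clients who completed the program
outcome_gain = 120              # e.g., additional clients abstinent at follow-up
monetized_benefits = 410_000.0  # e.g., reduced health service use + increased income ($)
qalys_gained = 35.0             # quality-adjusted life years gained across all clients

cost_per_client = program_cost / clients_served
cost_effectiveness = program_cost / outcome_gain           # $ per additional successful outcome
cost_per_qaly = program_cost / qalys_gained                # $ per QALY gained
net_benefit = monetized_benefits - program_cost            # cost-benefit difference ($)
benefit_cost_ratio = monetized_benefits / program_cost     # $ returned per $ spent
sroi = (monetized_benefits - program_cost) / program_cost  # one common SROI formulation

print(f"Cost per client:      ${cost_per_client:,.0f}")
print(f"Cost per outcome:     ${cost_effectiveness:,.0f}")
print(f"Cost per QALY gained: ${cost_per_qaly:,.0f}")
print(f"Net benefit:          ${net_benefit:,.0f}")
print(f"Benefit-cost ratio:   {benefit_cost_ratio:.2f}")
print(f"SROI:                 {sroi:.2f}")
```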

What you will learn:

  • How to recognize, interpret, and use findings from basic analyses of cost, cost-effectiveness, and cost-benefit
  • How to design and conduct evaluations that include costs of programs as well as the monetary and other universal outcomes resulting from programs
  • How to communicate findings from cost-inclusive evaluations in simple graphs
  • How to avoid or recover from ethical pitfalls common in cost-inclusive evaluations
  • How to anticipate, understand, and work with resistance to cost-inclusive evaluation

 

Workshop 4: ASSESS BEFORE YOU INVEST: SYSTEMATIC SCREENING AND EVALUABILITY ASSESSMENTS

Offered: Monday, June 10, 9:00 AM – 12:00 PM; Monday, June 10, 1:30 PM – 4:30 PM

Level: Intermediate

Speaker: Aisha Tucker-Brown; Kincaid Lowe Beasley

Evaluation resources are always scarce, and deciding where to invest in evaluation implementation is not an easy task. Systematic screening and assessment (SSA) can be used as a strategy to help make those decisions. Rather than risk the entire evaluation budget on an assumption that a program is being implemented with fidelity and is ready for evaluation, implementing a system that includes expert panel review and evaluability assessments (EA) to assess and sift through programs may help in the search for evidence of evaluation readiness. This workshop will present strategies for discerning when to invest in rigorous evaluation for evidence building. It will discuss the practical implementation of SSAs and EAs to mitigate investment risk when building evidence for programs operating in the field, and how adaptations of the method have been used to identify readiness for evaluation when time and/or resources are limited. Participants will have the opportunity to apply many of the steps through case examples.

What you will learn:

  • The utility of pre-evaluation
  • Approaches for assessing evaluability
  • How to develop criteria for selection and implement an SSA/EA
  • Approaches for sharing preliminary evidence
  • Adaptations of the SSA/EA

 

Workshop 5: BREAKING BARRIERS IN EVALUATION CAPACITY BUILDING: THE POWER OF COACHING

Offered: Monday, June 10, 1:30 PM – 4:30 PM; Tuesday, June 11, 1:30 PM – 4:30 PM

Level: All Levels

Speaker: Betsy Block

Evaluators tend to hold power in their relationships with clients, which can undermine the client’s voice and prevent them from taking deeper, more meaningful ownership of their own evaluation work and data. We can be stronger evaluators if we prize the value our clients bring over our own expertise, so that they are equal partners in the evaluation process and participate in building their own evaluative capacity. Prioritizing client leadership and expertise requires us to incorporate coaching into our engagements. While evaluators have many tools to address technical evaluation barriers, we are rarely trained on dealing with the common perceptual or emotional barriers that get in the way. Coaching offers an impactful tool to break through these barriers and help people connect their questions to the data they need to answer them and to apply what they have learned. This workshop will introduce participants to the evidence-based principles and practices of evaluation coaching. Participants will experience coaching firsthand and discover how to use it in their own evaluation practice. 

What you will learn:

  • The distinctions between consulting, mentoring and coaching – and why these distinctions are critical in evaluation capacity building
  • The core competencies of coaching
  • Examples of how evaluation coaching has been used
  • Coaching approaches that can be effectively used in evaluation capacity building, and when to use them
  • How and where to get training to expand coaching competencies

 

Workshop 6: BUILDING INTERACTIVE DASHBOARDS IN EXCEL

Offered: Monday, June 10, 9:00 AM – 12:00 PM; Monday, June 10, 1:30 PM – 4:30 PM

Level: Advanced

Speaker: Jennifer R. Lyons; Sarah Beu; Natalya Wawrin

Today, data is everywhere. Our clients and partners have access to massive amounts of data about finances, clients, and program effectiveness. Extracting useful takeaway messages and next steps from all that data can be very challenging. This is where a dashboard comes in handy! Dashboards can help an organization synthesize and make meaning from a large amount of data. Like your car’s dashboard, they summarize your organization's data in a visually engaging way that is simple and easy to understand. They can be shared with boards, staff, donors, and grantmakers. The best part is that they can be interactive and auto-update when new data is added. When we can engage our audiences with intentional reporting, they will be more equipped to make data-driven decisions. In this training, we will make an interactive dashboard in Excel from scratch. Jennifer will teach best practices – grounded in the literature – that will make your data come to life in a meaningful and time-efficient manner. Participants will learn how to use pivot tables, pivot charts, graphics, and strategic text in designing the dashboard.
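
The workshop itself is built entirely in Excel; purely as an illustrative analog of what the pivot-table step behind a dashboard does, here is a hedged Python/pandas sketch. The data, column names, and aggregation are invented for demonstration and are not workshop materials.

```python
# Hypothetical sketch of the pivot-table step behind a dashboard.
# The workshop uses Excel PivotTables; pandas is shown here only as an analog.
import pandas as pd

# Invented program records; in practice these might be read from a workbook,
# e.g. pd.read_excel("program_data.xlsx") (file name is hypothetical).
records = pd.DataFrame({
    "site":    ["North", "North", "South", "South", "South", "East"],
    "quarter": ["Q1", "Q2", "Q1", "Q2", "Q2", "Q1"],
    "clients": [120, 135, 80, 95, 15, 60],
    "cost":    [24000, 26000, 18000, 20000, 4000, 13000],
})

# Summarize clients served and cost by site and quarter, as an Excel PivotTable would.
summary = records.pivot_table(
    index="site", columns="quarter",
    values=["clients", "cost"], aggfunc="sum", fill_value=0,
)
print(summary)

# A dashboard chart is typically drawn from a summary table like this;
# appending new rows to `records` and rebuilding the pivot refreshes it.
```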

What you will learn:

  • The steps needed to build a dashboard in Excel
  • Graphic design best practices to enhance data visualizations with simple, implementable steps
  • How to be better equipped to present data that tells a story, leading to increased audience engagement and data-driven decision making

 

Workshop 7: COLLABORATIVE MAPPING: INTEGRATING STAKEHOLDER KNOWLEDGE FOR BETTER PROGRAM MODELS

Offered: Monday, June 10, 1:30 PM – 4:30 PM; Tuesday, June 11, 1:30 PM – 4:30 PM

Level: Advanced

Speaker: Bernadette Wright; Steven E. Wallis

Evaluators have long recognized that an effective evaluation centers on a good understanding of the realities of how the program or activity being evaluated functions (or is expected to function). The American Evaluation Association’s Guiding Principles include understanding the range of perspectives and interests of people and groups. Yet, many evaluations are based on a simple understanding – often represented as a logic model or theory of change – that may not fit how people see the program.

In this session, participants will learn and practice “collaborative mapping,” a technique to surface, share, and integrate stakeholders’ understandings of their situations. Importantly, this new approach to mapping helps you create program models that are more useful for tracking and program evaluation. Collaborative mapping also lets you design new programs with a greater chance of successful implementation. Participants will also learn how to use the resulting map as a framework for decision-making, communication, and data tracking. This presentation is based on our upcoming SAGE Publications book, scheduled for publication in June 2019.

What you will learn:

  • How to facilitate a collaborative mapping process with stakeholders, to surface and integrate their perspectives on their situation
  • How to integrate multiple collaborative maps, to find areas of similarity and difference across stakeholder groups
  • Techniques for communicating large “messy” maps in a user-friendly way
  • How to guide conversations around a map for collaborative action-planning
  • How to select key performance indicators on your map, to focus data collection on what's most important

 

Workshop 8: CONSULTING SKILLS FOR EVALUATORS: AN INTRODUCTORY WORKSHOP TO LEARN THE BASICS FOR CONSULTING AND HOW TO START AN INDEPENDENT PRACTICE

Offered: Monday, June 10, 1:30 PM – 4:30 PM; Wednesday, June 12, 9:00 AM – 12:00 PM

Level: Beginner

Speaker: Matt Feldmann

As a program evaluator, are you thinking about how to promote your "soft skills," or even considering starting your own independent practice? For many, the consulting aspects are uncomfortable and independent consulting is overwhelming. This workshop will reveal the simple but essential design and start-up skills needed for success. Matt Feldmann will lead this important introductory workshop, which has been foundational to the development of many consulting practices and small internal independent evaluation shops. In addition to providing rich information on a variety of topics including marketing, financial planning, contracts, and business structures, the workshop also will include valuable samples, worksheets, and insider tips. Through this workshop, participants will identify their clients, services, and competitive edge, and prepare an action plan.

What you will learn:

  • Determine if consulting is an appropriate career choice
  • Define your competitive edge
  • Develop a plan for generating new business
  • Explore what’s needed to set up shop and manage the day-to-day
  • Find out how to set fees, project expenses, and track time



Workshop 9: CREATING ONE-PAGE REPORTS: HOW TO ENGAGE BUSY READERS

Offered: Tuesday, June 11, 1:30 PM – 4:30 PM; Wednesday, June 12, 9:00 AM – 12:00 PM

Level: All Levels

Speaker: Emma Perk; Lyssa Wilson Becho

One-page reports are a great way to provide a snapshot of evaluative findings to clients and stakeholders. Summarizing key outcomes, conclusions, or recommendations in a format that is easily and quickly digestible engages busy readers and draws attention to the important details of an evaluation. Although traditional, long-form evaluation reports are still an excellent way to distribute evaluation results, one-page reports increase readers’ engagement with evaluation findings, as well as their understanding and use of them. In this workshop, we will discuss what a one-page report is, why you might want to create one, and how one-page reports differ from other types of reports. Then we will walk through 10 steps for creating a one-page report. These steps will guide you through the creation process, helping you determine what information should be included, how to lay out the page, and what visual elements to add. Make sure to bring your computer with Microsoft PowerPoint or a similar program, an open mindset, and a fun attitude.

What you will learn:

  • What a one-page report is and how it can be used in your practice
  • How to identify the audience, purpose, and key information to include in a one-page report
  • Visual strategies to intentionally design and communicate data
  • Tips and tricks for creating one-page reports in Microsoft PowerPoint

 

Workshop 10: DESIGN THINKING FOR USER-CENTRIC STAKEHOLDER ENGAGEMENT IN EVALUATION

Offered: Monday, June 10, 9:00 AM – 12:00 PM; Monday, June 10, 1:30 PM – 4:30 PM

Level: All Levels

Speaker: Asma M. Ali, PhD; Isabel Cuervo, PhD

Despite the growing popularity of user-centered frameworks that facilitate the integration of stakeholders’ needs, life experiences, and understandings in evaluation processes, few instructional guides exist to support the engagement of underrepresented communities or other stakeholder groups. Often utilized in instructional and product design, Design Thinking (DT) techniques provide a guided process that can be time-limited and utilize participatory strategies that are transferable for various evaluation settings. The facilitators will share brief examples from their own work and guide participants in utilizing the five hands-on tools and techniques of DT. Participants will have the opportunity to work in small groups to apply these techniques to a sample project. This skill-building session will provide an opportunity for participants to more effectively integrate stakeholder experiences in the evaluation.  

What you will learn:

  • Examine real-life opportunities for DT stakeholder engagement solutions based on presenters’ and participants’ experiences
  • The five basic components of the Design Thinking process in stakeholder engagement for evaluations
  • How to apply Design Thinking theory, concepts, and tools in a case study exercise to support stakeholder engagement

 

Workshop 11: EVALUABILITY ASSESSMENT BASICS AND BENEFITS

Offered: Monday, June 10, 9:00 AM – 12:00 PM; Tuesday, June 11, 9:00 AM – 12:00 PM

Level: Beginner

Speaker: Tamara M. Walser; Michael S. Trevisan

Do you want to know which evaluation approaches and designs are most appropriate and feasible given program culture and context? Do you want to know if the theory of how a program is intended to work aligns with program reality? Do you want to increase program plausibility and effectiveness? Evaluability assessment is an approach to evaluation that addresses these questions. This workshop will include a brief overview of current evaluability assessment theory and practice, including its resurgence across disciplines and globally. The focus of the workshop will be the basics of implementing evaluability assessment using our four-component model as a guiding framework. Using evaluability assessment to support culturally responsive evaluation, address program complexity, and build evaluation capacity will be a thread for reflection and discussion throughout the workshop. Examples, case scenarios, small group activities, and discussion will allow participants to engage with the content and gain insight into how they can incorporate evaluability assessment in their work.

What you will learn:

  • Current theory and uses of evaluability assessment
  • How to implement an evaluability assessment
  • How evaluability assessment can support culturally responsive evaluation, address program complexity, and build evaluation capacity

 

Workshop 12: EVALUATING COALITIONS AND COLLABORATIVES

Offered: Tuesday, June 11, 1:30 PM – 4:30 PM; Wednesday, June 12, 9:00 AM – 12:00 PM

Level: All Levels

Speaker: Susan M. Wolfe; Ann Webb Price

This skill-development workshop is designed for evaluators who evaluate coalitions and community collaboratives. Through lecture, discussion, and exercises, this hands-on, interactive workshop will provide the foundations and tools needed to conduct evaluations of coalitions. Workshop presenters will review topics such as frameworks for evaluating coalitions, measures, and tools. Participants will take part in role-play exercises that highlight some of the challenges in evaluating community coalitions and collaboratives. The presenters will share their experiences and provide case studies as a basis for discussion and to facilitate participants’ ability to apply the material to the types of situations and settings they will encounter.

What you will learn:

  • The theoretical and methodological frameworks that can be useful to frame the coalition evaluation
  • The stages of coalition development and their implications for evaluation
  • A variety of measures and tools available for coalition evaluation
  • Challenges to evaluating coalitions and how they can be overcome
  • Best practices that can be applied to ensure success

 

Workshop 13: EVALUATING THE TRIPLE BOTTOM LINE IMPACTS: ASSESSING PRIVATE, PUBLIC, AND NON-PROFIT SECTOR PROGRAMS

Offered: Monday, June 10, 9:00 AM – 12:00 PM; Tuesday, June 11, 9:00 AM – 12:00 PM

Level: All Levels

Speaker: Carter Garber; David Pritchard

This workshop introduces the topics and challenges evaluators face when asked to evaluate social enterprises and public and non-profit programs that seek triple bottom line outputs (economic, social, and environmental). Participants will share information about their level of experience with the topics and what they seek to learn. The facilitators will draw on their extensive experience to illustrate key points with North American and international examples, provide opportunities for participants to share the lessons they have learned as well as what else they would like to learn in the field, and point participants to resources and further training opportunities.

What you will learn:

  • How evaluators are measuring endeavors that seek to provide triple bottom line impacts (economic, social and environmental)
  • How evaluators can adapt and improve the methods used in the private sector to assess social and environmental impacts in public and non-profit programs
  • What investors, investment managers, foundations, governments, and CDC & health entities want measured to determine their further funding of triple bottom line entities in the US and internationally
  • How the rapidly growing international fields of impact investing and social enterprise are providing new opportunities for evaluators who develop the skills required to be better consultants or staff

 

Workshop 14: EVALUATION FOR SOCIAL TRANSFORMATION - CANCELED

Offered: Monday, June 10, 9:00 AM – 12:00 PM; Monday, June 10, 1:30 PM – 4:30 PM 

Level: All Levels

Speaker: Mandy Pippin Whitaker

Evaluators of programs for social transformation must grapple with systems of supremacy and intersecting oppressions. This workshop helps evaluators become aware of how supremacy structures manifest in their evaluation practice and to develop creative ways to transform those practices into ways of conducting an evaluation that embodies the transformation sought. This workshop will be especially helpful to evaluators and teams seeking to help organizations transform their policies and programs for social justice work.

What you will learn:

  • Critical awareness of ways oppression and supremacy manifest in evaluation practice
  • Creative strategies to interrupt counter-productive practices and interactions
  • Skills for embodying transformative evaluator-client relationships
  • Self and organizational assessment of the best strategic approach for each participant’s transformative practice

 

Workshop 15: EVALUATION SURVEY DESIGN AND IMPLEMENTATION: BUILDING SKILLS TO BALANCE BEST PRACTICES AND REALITY

Offered: Monday, June 10, 1:30 PM – 4:30 PM; Tuesday, June 11, 1:30 PM – 4:30 PM

Level: All Levels

Speaker: Jeanette Y. Ziegenfuss, PhD; Jennifer Renner 

Surveys are an invaluable way to gather feedback from participants in the world of program evaluation, but designing and implementing a survey that measures your intended outcomes can be daunting. How do you know if you’re asking the right questions in the right way? How do you maximize participant response? How do you balance best practices and competing forces introduced through tight timelines, constrained resources, and client preferences? Using real world examples from a busy survey center, this session will answer these key questions and more, giving participants the tips and tricks needed to successfully design and implement their own evaluation surveys.

What you will learn:

  • The basics of survey and question design (and why it matters), including question order and questionnaire length to maximize response, question writing to minimize response bias, how to think through types of response sets, and how to match the question root to the response options
  • Best practices in survey methodology: the pros and cons of different survey modes, how to maximize participant response, and when surveys may not be the best option (gasp!)
  • How to identify and avoid common mistakes in survey implementation

 

Workshop 16: EVALUATIVE THINKING: PRINCIPLES AND PRACTICES TO ENHANCE EVALUATION CAPACITY AND QUALITY

Offered: Tuesday, June 11, 1:30 PM – 4:30 PM; Wednesday, June 12, 9:00 AM – 12:00 PM

Level: All Levels

Speaker: Thomas Archibald; Jane Buckley 

How does one “think like an evaluator?” How can program implementers learn to think like evaluators? Recent years have witnessed increased use of the term “evaluative thinking,” yet this particular way of thinking, reflecting, and reasoning is not always well understood. Patton warns that as attention to evaluative thinking has increased, we face the danger that the term “will become vacuous through sheer repetition and lip service.” This workshop can help avoid that pitfall. Drawing from our research and practice in evaluation capacity building, in this workshop we use discussion and hands-on activities to address evaluative thinking and how to use it.

What you will learn:

  • What evaluative thinking (ET) is and how it pertains to their context
  • How to promote and strengthen ET among individuals and organizations with whom they work
  • How to use ET to identify assumptions, articulate program theory, and conduct evaluation with an emphasis on learning and adaptive management

 

Workshop 17: FOSTERING CULTURAL COMPETENCY AND LGBT DIVERSITY IN EVALUATION PRACTICE

Offered: Tuesday, June 11, 9:00 AM – 12:00 PM; Wednesday, June 12, 9:00 AM – 12:00 PM

Level: All Levels

Speaker: Gregory Phillips, II; Dylan Felt

Recent discourse within the field of evaluation has come to emphasize the importance of cultural responsiveness across all aspects of evaluation work. However, these efforts to date have primarily centered on race and ethnicity, in addition to extensive discussion on the inclusion of women and girls in evaluation work. While both of these discussions are timely and merited, there are other dimensions of culture and identity within individuals and community groups that must be considered. In her 2017 Culturally Responsive Evaluation and Assessment Conference keynote presentation, Dr. Robin Miller called for the field to do better when it came to including sexual orientation and gender identity (SOGI) assessments within evaluation practice. To do so, it is crucial that evaluators have the language, understanding, and strategies to be inclusive of the LGBT community in their work. This workshop will highlight the importance of LGBT cultural competency and community inclusion both within evaluations tailored to this population, as well as in more general evaluation work. After laying the groundwork of why this matters, the facilitators will use activities and discussion sections to clearly define what inclusive and culturally responsive language entails when working with the LGBT community. The remainder of the workshop will focus on an interactive, collaborative approach to teaching participants how to apply this mindset and information to evaluation design, and will provide guidance to help participants strategize how they will implement this knowledge in their work moving forward.

What you will learn:

  • The importance of collecting SOGI data, and of exhibiting LGBT cultural competence and inclusion in their evaluation work
  • Common identity categories, terminology, and language that is inclusive and culturally responsive to the LGBT community
  • Best practices for collecting demographic data on sex, sexual orientation, and gender identity
  • Actionable strategies for overcoming anticipated barriers and integrating lessons learned into evaluation design and implementation

 

Workshop 18: INTRODUCTION TO COMMUNITIES OF PRACTICE (COP) FOR EVALUATORS AND EVALUATION

Offered: Monday, June 10, 9:00 AM – 12:00 PM; Tuesday, June 11, 9:00 AM – 12:00 PM

Level: Beginner

Speaker: Leah Christina Neubauer; Thomas Archibald

This interactive skill-building workshop will introduce Communities of Practice (CoPs) and demonstrate their application for monitoring, evaluation, learning, and professional development for the evaluation community. CoPs are designed to engage learners in a process of knowledge construction around common interests, ideas, passions, and goals. Through identifying the three core CoP elements (domain, community, and practice), members work to generate a shared repertoire of knowledge and resources. These three core elements often provide the foundation for monitoring, evaluation, and evaluative thinking. Increasingly, evaluators are called to evaluate and participate in CoPs in their settings. CoPs can be found in many global arenas: monitoring and evaluation teams, corporations, schools, and non-profit settings. This workshop will explore CoP development and implementation for evaluators focused on understanding experience, increasing knowledge, and ultimately improving practice. Workshop facilitators will highlight examples from the fields of evaluation, health promotion, and extension education. Participants will engage in a series of hands-on, inquiry-oriented techniques and analyze how CoPs can be operationalized in relation to their respective areas of practice. CoP methodologies to be discussed include storytelling, arts-based methods, collaborative inquiry, evaluative thinking, and critical self-reflection.

What you will learn:

  • How to describe key theories and models guiding CoPs
  • How to apply the CoP core elements
  • How to analyze tensions in CoP implementation
  • How to identify questions to guide the planning and evaluation of CoPs
  • How to discuss essential fundamentals of developing, evaluating, and sustaining a CoP

 

Workshop 19: INTRODUCTION TO SOCIAL NETWORK ANALYSIS FOR PROGRAM EVALUATION

Offered: Tuesday, June 11, 9:00 AM – 12:00 PM; Wednesday, June 12, 9:00 AM – 12:00 PM

Level: Beginner

Speaker: Rebecca Woodland

Social Network Analysis (SNA) is a methodological approach that enables the mathematical and visual examination of relationships between people, organizations, and/or other actors. A program’s capacity to diffuse innovations (i.e., create, transfer, and implement services) and to promote access to social capital (i.e., increase the potential and actual set of intellectual, social, and material resources made available through direct and indirect relationships with others) is mediated by the structure of social networks and the embedded ties between actors within them. SNA is predicated on a relational way of thinking in which individuals and groups are seen as structured, embedded, and active social networks. It has emerged as a powerful mechanism for examining how networks may impede or promote organizational learning and resource exchanges between individuals and groups within them. In this workshop participants will be introduced to SNA, explore how SNA can be used for program evaluation, and experiment with SNA data collection, visual analysis, and reporting using Excel, Ucinet, and NetDraw.
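
The workshop’s hands-on tools are Excel, Ucinet, and NetDraw; purely as a rough analog, the hypothetical sketch below uses Python’s networkx to show what a basic 1-mode analysis and a simple network map involve. All actors, ties, and the particular measures chosen are illustrative assumptions, not workshop content.

```python
# Hypothetical 1-mode SNA sketch using networkx as a stand-in for Ucinet/NetDraw.
# Actors and ties are invented; the workshop itself uses Excel, Ucinet, and NetDraw.
import networkx as nx

# 1-mode data: ties between actors (e.g., "seeks advice from").
ties = [
    ("Ana", "Ben"), ("Ana", "Cam"), ("Ben", "Cam"),
    ("Cam", "Dee"), ("Dee", "Eli"),
]
g = nx.Graph(ties)

# Basic measures often used to describe a program's network structure.
print("Density:", nx.density(g))
print("Degree centrality:", nx.degree_centrality(g))
print("Betweenness centrality:", nx.betweenness_centrality(g))

# A simple network map, analogous to a NetDraw sociogram
# (requires matplotlib; uncomment to draw).
# import matplotlib.pyplot as plt
# nx.draw_networkx(g, with_labels=True)
# plt.show()
```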

What you will learn:

  • Ways to incorporate SNA into their own evaluation practice
  • How to recognize key differences between SNA and quantitative and qualitative methods of data collection, analysis, and reporting
  • How to appreciate the principles of social capital theory (how networks constrain and/or support access to programmatic resources)
  • How to appreciate the principles of diffusion of innovation theory (how program networks constrain and support the pace, direction, and speed of desired change)
  • How SNA overlaps with organizational network analysis (ONA)
  • How to dispel common myths related to SNA
  • How to collect and analyze 1-mode and 2-mode SNA data with Excel and Ucinet
  • How to visualize networks and make basic network maps using NetDraw

Workshop 20: IT’S NOT THE PLAN, IT’S THE PLANNING: STRATEGIES FOR EVALUATION PLANS AND PLANNING

Offered: Tuesday, June 11, 1:30 PM – 4:30 PM; Wednesday, June 12, 9:00 AM – 12:00 PM

Level: Beginner

Speaker: Sheila B. Robinson, EdD

If you don’t know where you are going, you will end up somewhere else (Yogi Berra). Few evaluation texts explicitly address the act of evaluation planning as independent from evaluation design or evaluation reporting. This interactive workshop will introduce learners to an array of evaluation activities that includes evaluation planning and preparing a comprehensive evaluation plan. Participants will leave with an understanding of how to identify stakeholders and primary intended users of evaluation, the extent to which they need to understand and be able to describe the evaluand (the program), strategies for conducting literature reviews, strategies for developing broad evaluation questions, considerations for evaluation designs, and using the Program Evaluation Standards and AEA’s Guiding Principles for Evaluators in evaluation planning. Participants will be introduced to a broad range of evaluation planning resources including templates, books, articles, and websites.

What you will learn:

  • How to identify types of evaluation activities (e.g. questions to ask of potential clients) that comprise evaluation planning
  • Potential components of a comprehensive evaluation plan
  • How to identify key considerations for evaluation planning (e.g. client needs, collaboration, procedures, agreements, etc.)

 

Workshop 21: MIXED METHODS DESIGN IN EVALUATION

Offered: Monday, June 10, 9:00 AM – 12:00 PM; Tuesday, June 11, 9:00 AM – 12:00 PM

Level: Intermediate

Speaker: Donna Mertens

Developments in the use of mixed methods have extended beyond the practice of combining surveys and focus groups. This workshop combines theory and practice, based on the evaluation branches identified by Alkin (2013) and Mertens and Wilson (2019). The sophistication of mixed methods designs in evaluation based on different theoretical frameworks will be explained and demonstrated through illustrative examples taken from diverse sectors and geographical regions. Examples of mixed methods designs will illustrate application of different theoretical frameworks for projects that focus on determining the effectiveness of interventions. Participants will have the opportunity to create mixed method designs using evaluation vignettes for each approach to evaluation.

What you will learn:

  • The definition of mixed methods and terms used in describing mixed methods designs
  • The major theoretical frameworks that are guiding mixed methods design decisions
  • How to apply the theoretical frameworks to the design of a mixed methods evaluation using a case study

 

Workshop 22: PLACING POLICIES UNDER THE MICROSCOPE: THE USE OF LEGAL EPIDEMIOLOGY AS A FUNDAMENTAL TOOL FOR POLICY EVALUATION

Offered: Tuesday, June 11, 1:30 PM – 4:30 PM; Wednesday, June 12, 9:00 AM – 12:00 PM

Level: Advanced

Speaker: Erika Fulmer; Tara Ramanathan

The 50 United States are often referred to as “laboratories of democracy.” Whenever a state implements a new law or regulation, it experiments in governance in a way that can impact the population. The same can be said of counties, cities, and towns. We as evaluators can tap into this legal world to better understand when government actions help good ideas stick. The purpose of this workshop is to teach participants how to use a practical, staged policy evaluation approach that utilizes legal epidemiology (the study of how law impacts population wellness) to link evidence to policy and population data. Participants will learn basic principles of legal epidemiology for evaluation projects; have the opportunity to research, code, and analyze state and local laws and regulations in groups; and build feasible evaluation plans. Participants do not need to have a legal background to benefit from this interactive workshop.

What you will learn:

  • How to apply a practical, staged approach to guide comprehensive policy evaluation
  • When and how to use legal epidemiology for assessing the public health impacts of law
  • How to incorporate legal epidemiology into mixed methods evaluation planning
  • What practical strategies can be applied to address the challenges of policy evaluation

 

Workshop 23: QUALITATIVE EVALUATION METHODS: BALANCING RIGOR, TIME, AND RESOURCES

Offered: Monday, June 10, 1:30 PM – 4:30 PM; Tuesday, June 11, 9:00 AM – 12:00 PM

Level: Beginner

Speaker: Christine Lagana Riordan

This workshop will provide a variety of hands-on strategies to help evaluators conduct successful qualitative evaluation studies. It is designed for evaluators who want to learn more about qualitative evaluation methods including design and planning, interview and focus group facilitation, and basic analytic techniques using the constant comparison method. Participants will learn skills that can be applied to “real world” program evaluations in government, educational, and non-profit settings where time and resource constraints may hinder qualitative strategies. Practical exercises, case studies, small group activities, and discussion will equip participants with tools and resources that they can begin using in their evaluation work.

What you will learn:

  • Tactics for conducting qualitative evaluation studies with “real-world” time and resource constraints
  • Approaches for designing successful qualitative data collection efforts
  • Effective facilitation techniques for use in interviews and focus groups
  • Hands-on strategies for qualitative data analysis with and without a framework

 

Workshop 24: REFRAMING EVALUATION PRACTICE THROUGH APPRECIATIVE INQUIRY AND POSITIVE PSYCHOLOGY

Offered: Monday, June 10, 1:30 PM – 4:30 PM; Tuesday, June 11, 9:00 AM – 12:00 PM

Level: Beginner

Speaker: Stewart I. Donaldson; Tessie Tzavaras Catsambas

This workshop will use mini-lecture, exercises, and discussions to help participants learn how to apply positive and appreciative perspectives and approaches to improve the comprehensiveness, accuracy, and usefulness of their evaluations.  A range of examples from evaluation practice will be provided to illustrate main points and key messages.

What you will learn:

  • How to define and describe positive and appreciative evaluation perspectives and approaches
  • How positive perspectives and approaches can improve evaluation practice
  • How appreciative perspectives and approaches can improve evaluation practice
  • How positive and appreciative perspectives can be used to guide evaluation decisions
  • How to use positive and appreciative perspectives to improve the implementation of the CDC Evaluation Framework

 

Workshop 25: SYSTEMS THINKING: BASIC CONCEPTS FOR EVALUATION PRACTICE

Offered: Monday, June 10, 9:00 AM – 12:00 PM; Tuesday, June 11, 9:00 AM – 12:00 PM

Level: Beginner

Speaker: Janice Noga

The discussion and use of systems concepts in evaluation practice have increased dramatically in the last few years. As much as we might intuitively understand that we live in an interconnected world, it can be challenging for evaluators to know how to skillfully apply these concepts in an evaluation. This workshop provides an introduction to the basic concepts of systems thinking. Through small group work, participants will build a solid foundation for continued practice. Participants will find the techniques and tools used during the workshop useful in working with evaluation clients and stakeholders to help them understand and appreciate a systems approach.

What you will learn:

  • The core systems concepts, including boundaries, perspectives, and interrelationships, that are foundational to systems approaches, and how they relate to evaluation practice
  • How systems thinking can be used in program evaluation to deepen contextual understanding of the program; clarify program theory and program logic; and expand stakeholder/client understanding of programs and their potential outcomes

 

Workshop 26: THE BASICS OF USING THEORY TO IMPROVE EVALUATION PRACTICE

Offered: Tuesday, June 11, 1:30 PM – 4:30 PM; Wednesday, June 12, 9:00 AM – 12:00 PM

Level: Beginner

Speaker: John LaVelle; Stewart Donaldson

This workshop is designed to provide practicing evaluators with an opportunity to improve their understanding of how to use theory to improve evaluation practice.  Lecture, exercises, and discussions will help participants learn how to apply evaluation theories, social science theories, and stakeholder theories of change to improve the accuracy and usefulness of their evaluations.  A range of examples from evaluation practice will be provided to illustrate main points and key messages.

What you will learn:

  • How to define and describe evaluation theory, social science theory, and program theory
  • How theory can be used to improve evaluation practice
  • How implicit and explicit social science theories can be used to guide evaluation decisions
  • How to describe the components and processes of several commonly used social science theories that have been used to develop and evaluate programs
  • How developing stakeholders’ theories of change can be used to improve evaluation practice

Workshop 27: THE INTERACTIVE SPORT OF QUALITATIVE DATA ANALYSES

Offered: Monday, June 10, 9:00 AM – 12:00 PM; Tuesday, June 11, 9:00 AM – 12:00 PM

Level: Intermediate

Speaker: Ayana Perkins

Qualitative methods are recognized as important strategies for evaluation. In this workshop, participants will explore the use of cost-effective qualitative analyses and visualization strategies to improve evaluation effectiveness in supporting positive change among stakeholders, communities, and policies. The workshop is divided into two parts. The first part describes qualitative standards of evidence, data preparation, the strengths and limitations of different qualitative software, and the benefits of thematic analysis and constructivist grounded theory. At the end of the first session, participants will work in groups to develop a hierarchical coding structure based on content provided by the instructor. Each group will present its findings and explain the rationale for its coding structure. During the second part of the course, participants will perform an independent exercise in data visualization using the free 30-day trial of NVivo 11 software. This workshop is fast paced, so materials and supplementary videos will be provided one month before the course starts.

What you will learn:

  • How to differentiate between quantitative and qualitative standards of evidence
  • The strengths and weaknesses of popular qualitative software
  • How to identify the steps in independent constructivist grounded theory
  • How to identify the steps in group thematic analyses
  • How to create preconfigured nodes, In vivo nodes, and emerging nodes
  • How to develop a word map from your nodes created under “Open Codes”
  • How to generate a “hierarchy chart of nodes coded at nodes” from preconfigured nodes
  • How to create a mind map to organize codes in a hierarchical structure
  • How to create nodes based on a mind map

 

Workshop 28: TOOLS FOR MAPPING POWER AND PRIVILEGE: ADVANCING EQUITY IN EVALUATION

Offered: Monday, June 10, 9:00 AM – 12:00 PM; Monday, June 10, 1:30 PM – 4:30 PM

Level: Beginner

Speaker: Cristina Magana; Gaby Magana

This workshop will offer practical tools that culturally-responsive evaluators can use to identify and navigate issues of power and privilege in order to promote diversity, equity and inclusion in settings that may not be receptive or attuned to those values. The presenters will frame the timeliness of this issue, drawing from more than 30 years of experience conducting evaluations with diverse communities in a variety of sociopolitical contexts. They will describe how their consulting firm has evolved beyond addressing diversity to integrating practices that incorporate equity and inclusion in their work. The presenters will also share tools for navigating issues of power and privilege and guide participants through application of the tools through interactive activities. Participants will learn how to foster individual and organizational reflection around issues of power and privilege, and how to identify and address power dynamics among diverse stakeholders, clients, and teams.

What you will learn:

  • The importance of understanding how power and privilege structure barriers and opportunities in order to advance equity in evaluation
  • How to use and apply tools for identifying, analyzing, and addressing issues related to power and privilege
  • How to tailor tools to use within their work settings and/or at their organizations

 

Workshop 29: USING THE AEA EVALUATOR COMPETENCIES TO ENHANCE PROFESSIONAL PRACTICE

Offered: Monday, June 10, 9:00 AM – 12:00 PM; Tuesday, June 11, 9:00 AM – 12:00 PM

Level: All Levels

Speaker: Laurie Stevahn

The recently adopted AEA Evaluator Competencies join three previously adopted professional foundations for effective evaluation practice, namely the Program Evaluation Standards, the AEA Guiding Principles, and the AEA Public Statement on Cultural Competence. In this interactive workshop participants will learn how the AEA Evaluator Competencies were developed; examine specific competencies within each of the five broad competency domains; consider their usefulness for grounding and guiding practice in one’s own professional context; use the competency self-assessment tool to reflect on one’s own strengths and areas for growth; and discuss implications for further professionalizing the field of evaluation.

What you will learn:

  • How the AEA Evaluator Competencies were developed and recurring complex issues that surfaced throughout
  • The AEA Evaluator Competencies taxonomy, comprising five broad domains, each containing specific relevant competencies
  • How the AEA Evaluator Competencies apply to one’s own professional context, such as education, health, social services, business, government, international, nonprofit, etc.
  • Personal strengths and opportunities for growth based on a competency self-assessment tool for reflection on one’s own evaluation practice
  • Implications for further professionalizing the field of evaluation

 

Workshop 30: USING THE ESSENTIAL ELEMENTS FRAMEWORK TO GUIDE STRATEGY SELECTION, ADAPTATION, AND EVALUATION

Offered: Tuesday, June 11, 9:00 AM – 12:00 PM; Wednesday, June 12, 9:00 AM – 12:00 PM

Level: All Levels

Speaker: Kimberley Freire; Linda Vo, MPH

Practitioners often balance delivering evidence-based strategies as originally designed (i.e., with fidelity) and making adaptations in response to real-world circumstances. One approach to balancing fidelity with flexibility is to identify or estimate the essential elements of a strategy and to assess how potential adaptations maintain or compromise these elements. Essential elements – sometimes called core elements or core components – are the active ingredients of a prevention strategy assumed to be responsible for achieving intended outcomes. They include what strategy content and principles should be delivered, how they should be delivered, and characteristics of who should ideally deliver or lead a strategy. The Essential Elements Framework and its guidance were developed within violence prevention and draw on implementation and adaptation models applied to various public health topics. The Framework was originally developed for use with evidence-based strategies, and has since been applied to strategies developed in practice settings to facilitate program evaluation. This workshop brings together implementation science and program evaluation methods as complementary parts of public health and social change strategies. Facilitators will focus on using the Framework to select, adapt, and evaluate prevention strategies, including programs, policies, and practices. During this workshop, presenters will define essential elements and related concepts and guide participants through a series of activities that build on each other to meet the learning outcomes.

What you will learn:

  • How to assess fit of a strategy to a local context
  • How to apply essential elements to develop implementation evaluations
  • Methods to track and evaluate adaptations for different types of prevention strategies
  • A process and methods to engage stakeholders in defining the essential elements of practice-based strategies to facilitate evaluation

 

Workshop 31: UTILIZATION OF A RACIAL EQUITY LENS TO HELP GUIDE STRATEGIC ENGAGEMENT AND EVALUATION

Offered: Tuesday, June 11, 1:30 PM – 4:30 PM; Wednesday, June 12, 9:00 AM – 12:00 PM

Level: All Levels

Speaker: LaShaune Johnson; Mindelyn Anderson

The field of evaluation is being challenged to move from the traditional role of evaluation, and its perceived objectivity, to a process that considers who is being evaluated and who is conducting the evaluation. Over four years ago, Public Policy Associates, Inc. (PPA) worked to develop useful frameworks, tools, and approaches for evaluators to include in their toolkits to focus on the ways that race and culture influence an evaluation process. This practice resulted in the development of a framework for conducting evaluation using a racial equity lens. This workshop focuses on the practical use of a racial equity lens when conducting evaluation. The framework argues that culture and race are important considerations when conducting an evaluation because critical and substantive nuances are missed, ignored, and/or misinterpreted when an evaluator is not aware of the culture of those being evaluated and does not adopt a racial equity approach to the work. Participants will be provided with a Template for Analyzing Programs through a Culturally Responsive and Racial Equity Lens, designed to focus deliberately on an evaluation process that takes race, culture, equity, and community context into consideration. Presenters will also share a “How-to Process” to identify the cultural competencies of individuals conducting evaluations and strategies for how these competencies can be improved. This “How-to Process” is the result of developing a self-assessment instrument for evaluators, is based primarily on the cultural-proficiencies literature, and relates specifically to components of the racial equity template.

What you will learn:

  • A framework for conducting evaluation using a racial equity lens
  • Key elements of the framework that can improve the quality of evaluation in diverse settings
  • How to assess their own individual cultural background and explore how it influences the design choices they make in their evaluation work



Workshop 32: YOU BUILT IT. WHY WON’T THEY COME? EVALUATION CAPACITY BUILDING

Offered: Monday, June 10, 1:30 PM – 4:30 PM; Tuesday, June 11, 1:30 PM – 4:30 PM

Level: Intermediate

Speaker: Jasmine Williams-Washington; Michelle Revels

Many philanthropic organizations have made it their mission to improve the lives of children and communities of color. To create and support the systemic and comprehensive change needed for this improvement, many of these foundations have embraced place-based grantmaking. Place-based grantmaking enables foundations to focus on improving whole communities rather than a single issue (Murdoch, 2007). Many foundations have also made a significant investment in increasing the evaluation capacity of the non-profit and community-based organizations that are their grantees in their chosen places. Evaluation Capacity Building (ECB), also known as evaluation technical assistance, is critical to the success of grantmaking because it equips community-based organizations with the skills and tools needed to assess and monitor their progress, make changes as needed, and, ultimately, tell their story. ECB is more than increasing community-based organizations’ capacity to collect, analyze, and interpret data (Chavis, 2015). However, grantees often do not take full advantage of the wide array of resources and assistance offered. This workshop will discuss the pivotal, and somewhat unique, role of evaluation capacity building; common reasons for non-participation; and the role of the evaluation TA provider, who is often an outsider to the community. It will also outline strategies to encourage active and frequent participation in evaluation capacity building activities. It is important to know that the evaluation capacity building approach highlighted during this workshop is grounded in an understanding of and respect for place, culture, and history.

What you will learn:

  • What evaluation capacity building is
  • The role of the evaluation capacity builder
  • Challenges associated with doing evaluation capacity building “in place”
  • Proven effective strategies for evaluation capacity building
  • How to build effective relationships that encourage organizations to use the resources
  • How to create a narrative around evaluation that will encourage people to take advantage of evaluation capacity building activities



Workshop 33: STRATEGIC CHECK-IN: USING LOGIC MODELS AND EVALUATION TOOLS FOR PRACTICAL AND RAPID STRATEGIZING AND STRATEGIC PLANNING

Offered: Monday, June 10, 9:00 AM – 12:00 PM

Speaker: Tom Chapel

“Process use” is one of the most valuable terms in evaluation. It refers to cases where the evaluator’s added value is NOT the findings of the data collection, but the systematic examination of the program, stakeholders, and key evaluation questions at the start of the evaluation. While these steps will lead to strong evaluations in time, at the outset of the evaluation programs identify logical gaps or areas of weakness and strength that they can address in real time. This added value makes evaluators welcome at the table from the start. In this highly interactive session, Tom Chapel will use real-life cases and exercises to show how evaluation tools, especially simple logic models, can help programs identify challenges, gaps, and actionable insights that, once acted on, will increase program effectiveness. Better yet, the output of this process can easily be used to test the program’s existing vision, mission, and goals and objectives, and/or generate these customary components of a strategic plan.

What you will learn: 

  • How to convert program narrative into simple program logic models/roadmaps
  • How to describe how key terms in a logic model yield insights that can improve program planning and implementation
  • How to convert the output of these roadmap discussions into a traditional strategic plan



Workshop 34: REFLECTIVE PRACTICE FOR EVALUATORS: WHAT IT IS. WHY IT MATTERS. TIPS AND TOOLS TO DO IT.

Offered: Monday, June 10, 1:30 PM – 4:30 PM

Speaker: Sarah Gill

Good evaluation has always been about reflection. Do our interventions work? If not, why not, and what can we do differently? AEA’s recently adopted evaluator competencies call on evaluators to reflect on their practice as well, because reflection is one of the primary ways professionals learn.

In this highly interactive session, Sarah Gill will integrate approaches from other fields with a recently published framework for reflection by evaluators. While that framework arose from evaluation for social justice, the approach is transferable to any issue area. Sarah will guide participants through several reflective activities in small groups and help them use these activities to identify opportunities for a “reflective pause” in their day-to-day work. The article describing the framework (Archibald, Neubauer, and Brookfield, “The Critically Reflective Evaluator: Education’s Contributions to Evaluation for Social Justice”) will be sent in advance. While it is not required reading, participants will benefit from skimming the approach in advance.

What you will learn:

  • How professional reflection can support both improved professional practice and social transformation
  • How to incorporate a “reflective pause” in your day-to-day work
  • How to avoid some of the stumbling blocks on the way to consistent professional reflection
  • The important role reflection partners play in “assumption hunting,” a key aspect of critical reflection
  • How to identify useful reflection tools and partners