SUMMER EVALUATION INSTITUTE 2018


Workshop Descriptions

Workshop 1: AN INTERACTIVE AND CASE-CENTERED PRIMER ON EVALUATION APPROACHES

Offered: Sunday, June 17, 9:00 am – 4:00 pm

Level: Beginner

Speaker: Bianca Montrosse-Moorhead, University of Connecticut

All evaluation rests on an evaluation approach (i.e., theory or model). Sometimes this approach is explicit and sometimes it is implicit. Either way, evaluation approaches guide the reasoning for doing an evaluation, how it will be done, who will be involved and how, and what will be done with results and by whom. This interactive, case-centered workshop covers historical and contemporary evaluation approaches in diverse national and international contexts. Led by a speaker with real-world experience applying and teaching different approaches in practice, this workshop is targeted toward early career evaluators and graduate students with limited or no prior knowledge of evaluation approaches. Senior evaluators and evaluation educators who wish to expand their knowledge and use of contemporary theories and approaches may also benefit. The purpose of this workshop is to use real-life cases to learn about historical and contemporary approaches to designing and implementing evaluation, and practice putting this knowledge to use. Participants completing the workshop will gain insight into how their own backgrounds, training, and contexts may influence their choice or preference for particular approaches. Attendees will be asked to bring a computer and to read a case prior to the workshop, which will be emailed approximately seven days in advance.

What you will learn:

  • How to make sense of terms and concepts used in the evaluation approach literature
  • Why knowledge of different evaluation approaches is important for evaluation practice
  • Different methods-, use-, values-, and social justice-oriented evaluation approaches, including approaches that are adopted, adapted, or indigenous
  • How to apply different evaluation approaches in practice

 

Workshop 2: INTRODUCTION TO EVALUATION

Offered: Sunday, June 17, 9:00 am – 4:00 pm

Level: Advanced Beginner

Speaker: Thomas Chapel, Centers for Disease Control and Prevention

This workshop will provide an overview of program evaluation for Institute participants with some, but not extensive, prior background in program evaluation. The workshop will be organized around the Centers for Disease Control and Prevention’s (CDC) six-step Framework for Program Evaluation in Public Health as well as the four sets of evaluation standards from the Joint Committee on Standards for Educational Evaluation. The six steps constitute a comprehensive approach to evaluation. While its origins are in the public health sector, the Framework can guide any evaluation. The workshop will place particular emphasis on the early steps, including identification and engagement of stakeholders, creation of logic models, and selecting and focusing evaluation questions. Through case studies, participants will have the opportunity to apply the content and work through some of the trade-offs and challenges inherent in program evaluation in public health and human services.

What you will learn:

  • A six-step framework for program evaluation
  • How to identify stakeholders, build a logic model, and select evaluation questions
  • The basics of evaluation planning

 

Workshop 3: ADDING COSTS TO MAKE YOUR EVALUATION MORE IMPACTFUL (AND BETTER USED): COST-EFFECTIVENESS, COST-BENEFIT, COST-UTILITY ANALYSES FOR HEALTH AND HUMAN SERVICES

Offered: Tuesday, June 19, 9:00 am – 12:00 pm; Wednesday, June 20, 9:00 am – 12:00 pm

Level: Beginner

Speaker: Brian Yates, American University

Can evaluating the costs of programs – to consumers as well as providers and taxpayers – be the missing link between doing an evaluation and doing an evaluation that gets funded, used, and changes policy? Evaluating the monetary outcomes (aka “benefits”) of programs, such as reduced client use of health services and increased client productivity and income, calculating the cost per Quality-Adjusted Life Year (QALY) added by a program, and incorporating consumer, provider, and funder costs into cost-effectiveness, cost-benefit, and cost-utility analyses are all steps that can help influence funders. Through this workshop, you will better understand what a “cost study” is and how cost-inclusiveness can make your evaluation more impactful.
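
To make the core arithmetic concrete, here is a minimal sketch in Python with purely hypothetical numbers: it computes the incremental cost per QALY gained by a program over a comparator, the central statistic of a cost-utility analysis. The function name and figures are illustrative, not workshop materials.

    # Minimal cost-utility sketch; all figures are hypothetical.
    def cost_per_qaly(program_cost, comparator_cost, program_qalys, comparator_qalys):
        """Incremental cost-effectiveness ratio: extra dollars per extra QALY."""
        delta_cost = program_cost - comparator_cost        # added cost of the program
        delta_qalys = program_qalys - comparator_qalys     # added QALYs it produces
        return delta_cost / delta_qalys

    # Example: $12,000 more per client and 0.3 more QALYs -> $40,000 per QALY
    icer = cost_per_qaly(52_000, 40_000, 2.1, 1.8)
    print(f"Cost per QALY gained: ${icer:,.0f}")

Funders often weigh such a ratio against a willingness-to-pay threshold when deciding among competing programs.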

What you will learn:

  • How to recognize, interpret, and use findings from basic analyses of cost, cost-effectiveness, cost-benefit, and cost-utility
  • How to design and conduct basic evaluations that include the costs of programs as well as the monetary and other outcomes resulting from them, and how to communicate findings from cost-inclusive evaluations in simple graphs
  • How to recognize and avoid or recover from pitfalls common in cost-inclusive evaluations, including political and ethical problems
  • How to anticipate, understand, and work with resistance to cost-inclusive evaluation

 

Workshop 4: AUDIENCE ENGAGEMENT STRATEGIES FOR POTENT PRESENTATIONS

Offered: Monday, June 18, 1:30 pm – 4:30 pm; Tuesday, June 19, 9:00 am – 12:00 pm

Level: Beginner

Speaker: Sheila Robinson, Greece Central School District

Presentations are ultimately about audience learning. Successful presenters work in service to their participants, whether they are presenting an evaluation report, giving a keynote speech, facilitating a professional development or training workshop, or even running a meeting with stakeholders. Participants in this highly interactive workshop will learn fundamental principles of audience engagement by experiencing each activity as they consider how it can inform their own presentation planning. They will experience more than a dozen ways to engage adult audiences with activities and mini-lessons that flow together coherently, while allowing time for processing and reflection throughout the course. Participants will come to understand the why, the what, and the how of audience engagement, explore purpose distinctions for audience engagement strategies, and receive instruction on how to effectively integrate these strategies into presentation planning. During demonstrations the speaker will also share how each principle and strategy has been used in an evaluation-specific context. The guiding principle of this workshop will be, as marketing guru Seth Godin claims, “Every presentation worth doing has just one purpose: To make change happen.”

What you will learn:

  • Fundamental principles of audience engagement that can inform presentation and professional development planning
  • To articulate several purpose distinctions for audience engagement strategies
  • How to effectively integrate audience engagement strategies in their presentation and professional development planning
  • How to use a variety of interactive strategies to engage audiences for maximum participant satisfaction and learning

 

Workshop 5: CAUSAL KNOWLEDGE MAPPING FOR PRACTICAL PROBLEM-SOLVING

Offered: Monday, June 18, 1:30 pm – 4:30 pm; Tuesday, June 19, 9:00 am – 12:00 pm

Level: Intermediate

Speaker: Bernadette M. Wright, Meaningful Evidence, LLC; Steven E. Wallis, ASK MATT Solutions

A growing number of researchers in diverse disciplines are using “causal knowledge mapping” and related techniques to integrate and present research results in a way that is useful for solving complex problems. Based on our upcoming book with SAGE Publishing, this course will walk participants through the creation, presentation, and improvement of causal knowledge maps for practical problem-solving. With lots of hands-on work and very little jargon, the workshop will show how this new tool can make evaluation easier and more effective.

What you will learn:

  • How to create causal knowledge maps to show insights gained from evaluation and research
  • How to quantitatively and qualitatively assess their maps to develop recommendations for action
  • How to create an online, interactive map to support communication and use of research results, using the KUMU platform

 

Workshop 6: COGNITIVE INTERVIEWING METHODOLOGY

Offered: Tuesday, June 19, 1:30 – 4:30 pm; Wednesday, June 20, 9:00 am – 12:00 pm

Level: All Levels

Speaker: Kristen Miller, Centers for Disease Control and Prevention; Meredith Massey, Collaborating Center for Question Design and Evaluation Research/ Centers for Disease Control and Prevention

This course will cover: the importance of data quality and question evaluation for survey research; question response theory and the influence of socio-cultural context on question response; an overview of the method, specifically how to conduct a cognitive interviewing study; and software applications and tools to assist in cognitive interviewing studies.

What you will learn:

  • A better appreciation for question evaluation, validity, and comparability
  • The steps to conduct a cognitive interviewing study
  • Difficulties particular to multicultural, multi-lingual cognitive interviewing studies
  • How to set priorities and troubleshoot problems
  • To explore tools that facilitate analysis and improve data quality

 

Workshop 7: COMPUTER MAPPING (GIS) APPLICATIONS IN EVALUATION

Offered: Tuesday, June 19, 9:00 – 12:00 pm; Tuesday, June 19, 1:30 – 4:30 pm

Level: Beginner

Speaker: Stephen Maack, REAP Change Consultants; Arlene Hopkins, Arlene Hopkins and Associates; Aaron Wilson Kates, Western Michigan University

This workshop provides an introduction to geographic information systems (GIS) and how computer mapping and analyses can inform a variety of evaluation stages. It presents basic GIS mapping concepts and basic and intermediate spatial analysis methodology using actual applications in evaluation. The course will discuss a variety of available software – paid and free – with demonstrations of ArcGIS (paid) and open-source QGIS (free). It will also present quality map standards and the limits of GIS and spatial analysis methodology. Participants will engage in exercises to deepen their understanding of GIS. However, due to time limits, software complexity, and technology limitations, the course will NOT include hands-on training in GIS software.
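
As a flavor of what GIS analysis can look like outside the workshop (which, as noted, does not include hands-on software training), here is a minimal hypothetical sketch in Python using the open-source geopandas library: a spatial join that counts program sites per census tract. The file and column names are illustrative assumptions.

    # Hypothetical sketch: count program sites per census tract (geopandas).
    import geopandas as gpd

    tracts = gpd.read_file("census_tracts.shp")                    # polygon layer (assumed file)
    sites = gpd.read_file("program_sites.shp").to_crs(tracts.crs)  # points, reprojected to match

    # Spatial join: attach each site to the tract that contains it
    joined = gpd.sjoin(sites, tracts, how="inner", predicate="within")

    # Sites per tract, ready to map or merge with census indicators
    counts = joined.groupby("index_right").size()
    print(counts.head())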

What you will learn:

  • Basics of computer mapping (GIS) methodology
  • Information about free and paid computer mapping software
  • Information about available secondary data (e.g., census data) and its limitations
  • Sources for GIS base maps
  • Applications of GIS analyses in evaluation
  • Limitations of GIS
  • How to do GIS analyses
  • How to evaluate computer maps and their use in program and policy evaluations
  • Synthesis of GIS concepts and analyses in an evaluation-related exercise

 

Workshop 8: DESIGNING EFFECTIVE PILOT PROGRAM EVALUATIONS IN PARTNERSHIP WITH COMMUNITIES

Offered: Tuesday, June 19, 9:00 – 12:00 pm; Tuesday, June 19, 1:30 – 4:30 pm

Level: All Levels

Speaker: Erin McDonald, Feeding America; Shana Alford, Feeding America

This interactive, scenario-based workshop will focus on designing and evaluating community pilot programs. When evaluators are tasked with determining whether a pilot program has been effective, there are several factors to consider, such as community need and context, program design and delivery, engagement with stakeholders, costs and benefits, and sustainability. Through universal examples of lessons learned from national and international pilot experiences, with emphasis on avoiding common pitfalls, participants will examine core principles of program design theory, community engagement, and navigating implementation and data collection in pilot environments; understand how to set clear objectives to frame pilots as opportunities for learning and assessment with community partners; and discuss ways to design monitoring and evaluation processes to detect early indicators of setbacks or challenges. Upon completion of this workshop, participants will better understand strategies, tools, and critical thinking frameworks for incorporating effective evaluation methodology into the design and execution of community pilot programs, and will benefit from peer learning exercises and discussions.

What you will learn:

  • Set clear objectives that will frame pilots as learning opportunities
  • Develop evaluation planning techniques for pilot programs
  • Identify evaluation methods that will help you to navigate implementation and data collection in pilot environments
  • Examine community engagement and partnership strategies appropriate for pilot programs
  • Discuss risks and common pitfalls associated with pilot programs



Workshop 9: EMPOWERING COMMUNITY ORGANIZATIONS TO COLLECT, ENTER, AND USE EVALUATION DATA

Offered: Monday, June 18, 1:30 pm – 4:30 pm; Wednesday, June 20, 9:00 am – 12:00 pm

Level: Beginner

Speaker: Peter Lindeman, Northwestern University; Emily Bettin, Institute for Sexual and Gender Minority Health and Wellbeing; Gregory Phillips II

The Center for the Evaluation of HIV Prevention Programs in Chicago (Evaluation Center) is a collaborative effort between the Evaluation, Data Integration, and Technical Assistance (EDIT) Program, the Center for Prevention Implementation Methodology at Northwestern University, and the AIDS Foundation of Chicago. The Evaluation Center oversees the evaluation efforts of, and serves as a capacity-building entity for, 20 HIV prevention demonstration projects funded by the Chicago Department of Public Health at 15 sites throughout the city. Using an Empowerment Evaluation approach, the Evaluation Center has engaged with staff from the delegate agencies to plan and conduct 20 site-specific evaluations. During this project, the Evaluation Center has placed an emphasis on helping the partner agencies learn to write their own data collection questions, collect and enter the data, and then use the data to improve or scale their programming. During this workshop, the Evaluation Center staff will share best practices about how to foster community buy-in and effectively increase capacity at these organizations to take the lead on collecting and using data.

What you will learn:

  • How to engage staff at community organizations to participate in the creation of quality data collection tools
  • How to create customized data entry tools to empower community organizations to enter quality data
  • How to use simple data visualization to identify potential data collection/entry errors and build community evaluation capacity
  • How to engage community organizations in the use and/or dissemination of evaluation findings

 

Workshop 10: ESSENTIALS FOR SURVEY RESEARCH METHODOLOGY

Offered: Tuesday, June 19, 1:30 – 4:30 pm; Wednesday, June 20, 9:00 am – 12:00 pm

Level: All Levels

Speaker: John Barner, University of Georgia Carl Vinson Institute of Government

The workshop provides specific detail on methods, tips, and strategies for survey success. It covers the essentials of survey research methodology without requiring extensive knowledge of statistics (though statistical terms and analyses are covered in the material), and it provides a wealth of resources for people conducting survey research at any level, from those designing their first survey question to those already experienced with fielding surveys.

What you will learn:

  • Reasons for utilizing survey methods in evaluation
  • Basics of question design, survey mode, and sampling
  • Basic principles of implementation, sampling, measurement of reliability and validity, and reporting of results
  • Several models of data collection and types of measurement error controlled by varying methods
  • Common issues in survey research as well as tips for avoiding common problems

 

Workshop 11: ETHNOGRAPHIC EVALUATION: MOBILIZING EVIDENCE IN CULTURAL CONTEXT

Offered: Tuesday, June 19, 1:30 – 4:30 pm; Wednesday, June 20, 9:00 am – 12:00 pm

Level: All Levels

Speaker: Mary Odell Butler, University of Maryland; Eve Pinsker, University of Illinois at Chicago (UIC)

This workshop presents a program evaluation approach that is both culturally attuned to program realities and faithful to the evidence base required for credible evaluation. Ethnographic orientation and mixed methods are used to build evidence that can support action and policy grounded in cultural complexity. In this workshop, participants will review methods for evidence development and participate in a mix of presentations, group problem solving, and a one-hour laboratory exercise utilizing this approach. The workshop is derived from Dr. Butler’s book, Evaluation: A Culture Systems Approach (Taylor and Francis, 2015).

What you will learn:

  • Design program evaluations that are both culturally sensitive and rigorously evidence-based
  • How to use ethnography as a conceptual framework for mixed-method evaluations of culturally embedded programs
  • Understand the development of evidence in ethnographic evaluation

 

Workshop 12: EVALUABILITY ASSESSMENT BASICS AND BENEFITS

Offered: Tuesday, June 19, 1:30 – 4:30 pm; Wednesday, June 20, 9:00 am – 12:00 pm

Level: All Levels

Speaker: Tamara Walser, UNC Wilmington; Michael S. Trevisan, Washington State University

Do you want to know which evaluation approaches and designs are most appropriate and feasible given program culture and context? Do you want to know if the theory of how a program is intended to work aligns with program reality? Do you want to increase program plausibility and effectiveness? Evaluability assessment is an approach to evaluation that addresses these questions. This workshop will include a brief overview of current evaluability assessment theory and practice, including its global resurgence across disciplines. The focus of the workshop will be the basics of implementing evaluability assessment using our four-component model as a guiding framework. Participants will learn to use evaluability assessment to support culturally responsive evaluation, address program complexity, and build evaluation capacity. Examples, case scenarios, small group activities, and discussion will allow participants to engage with the content and gain insight into how they can incorporate evaluability assessment in their work.

What you will learn:

  • Current theory and uses of evaluability assessment
  • How to implement an evaluability assessment
  • How evaluability assessment can support culturally responsive evaluation, address program complexity, and build evaluation capacity

 

Workshop 13: EVALUATING COALITIONS AND COLLABORATIVES

Offered: Monday, June 18, 9:00 – 12:00 pm; Monday, June 18, 1:30 – 4:30 pm

Level: All Levels

Speaker: Susan Wolfe, Susan Wolfe and Associates, LLC

This workshop is designed for evaluators who evaluate coalitions and community collaboratives. Through lecture, discussion, and exercises, this hands-on, interactive skills development workshop will provide the foundations and tools needed to conduct evaluations of coalitions. The speaker will review topics such as frameworks for evaluating coalitions; measures and tools; and challenges. Participants will take part in role play exercises that highlight some of the challenges and issues. Through real-world experiences and case studies, participants will learn how to apply lessons learned to the situations and settings they will encounter.

What you will learn:

  • The theoretical and methodological frameworks that can be useful to analyze and evaluate coalitions
  • The levels of measurement and stages of coalition development and implications for evaluation
  • A variety of measures and tools available for coalition evaluation
  • Challenges to evaluating coalitions and how they can be overcome
  • Best practices that can be applied to ensure success

 

Workshop 14: EVALUATIVE THINKING: PRINCIPLES AND PRACTICES TO ENHANCE EVALUATION CAPACITY AND QUALITY

Offered: Monday, June 18, 1:30 – 4:30 pm; Tuesday, June 19, 1:30 – 4:30 pm

Level: Beginner

Speaker: Thomas Archibald, Virginia Tech; Jane Buckley, JCB Consulting

How does one “think like an evaluator”? How can program implementers learn to think like evaluators? Recent years have witnessed increased use of the term evaluative thinking (ET), yet this particular way of thinking, reflecting, and reasoning is not always well understood. Michael Quinn Patton warns that as attention to ET has increased, we face the danger that the term “will become vacuous through sheer repetition and lip service” (2010, p. 162). This workshop can help participants avoid that pitfall. Drawing from research and practice in evaluation capacity building, this workshop will feature discussion and hands-on activities to address: what ET is and how it pertains to your context; how to promote and strengthen ET among individuals and organizations; and how to use ET to identify assumptions, articulate program theory, and conduct evaluation with an emphasis on learning and adaptive management.

What you will learn:

  • What ET is and how it pertains to their context
  • How to promote and strengthen ET among individuals and organizations with whom they work
  • How to use ET to identify assumptions, articulate program theory, and conduct evaluation with an emphasis on learning and adaptive management

 

Workshop 15: GOT QI? USE OF THE ANCHOR, ADD, ACTIVATE, APPLY, AND ASK MODEL TO TEACH QUALITY IMPROVEMENT SKILLS TO EVALUATORS

Offered: Monday, June 18, 9:00 – 12:00 pm; Monday, June 18, 1:30 –  4:30 pm

Level: Beginner

Speaker: Beverly Triana-Tremain, Public Health Consulting, LLC

In this hands-on workshop, participants will go deeply into the Quality Improvement (QI) world and learn more about how this underused tool is useful in professional practice. Quality improvement in public health is the use of a deliberate and defined improvement process, such as Plan-Do-Check-Act, which is focused on activities that are responsive to community needs and improve population health. It refers to an ongoing effort to achieve measurable improvements in the efficiency, effectiveness, performance, accountability, outcomes, and other indicators of quality in services or processes which achieve equity and improve the health of the community.

What you will learn:

  • How QI can partner with, and be just as useful as, evaluation
  • How to recognize different intensity levels of problems to solve, from individual to systemic
  • How to ask the questions of the PDCA Cycle that lead to problem solving within a program or serve as a process evaluation tool
  • Up to 40 individual tools designed to help use and present data as evidence that a problem was reduced or improved after the solution was implemented

 

Workshop 16: INNOVATIVE EVALUATION REPORTING

Offered: Monday, June 18, 1:30 – 4:30 pm; Tuesday, June 19, 9:00 – 12:00 pm

Level: All Levels

Speaker: Kylie Hutchinson, Community Solutions Planning & Evaluation

How much thought do you give to communicating your results effectively? Do you consider your job complete after submitting a final report? Reporting is an important skill for evaluators who care about seeing their results disseminated widely and recommendations actually implemented, but there are alternatives to the traditional lengthy report. This interactive workshop will present an overview of four key principles for effective reporting and engage participants in a discussion of its role in effective evaluation. Participants will leave with an expanded repertoire of innovative reporting techniques.

What you will learn:

  • The role of effective reporting in good evaluation practice
  • Four key principles for communicating results effectively
  • Three alternative techniques for communicating your results

 

Workshop 17: INTRODUCTION TO COMMUNITIES OF PRACTICE (CoPs)

Offered: Monday, June 18, 9:00 – 12:00 pm; Tuesday, June 19, 9:00 – 12:00 pm

Level: Beginner

Speaker: Leah Neubauer, Northwestern University, Feinberg School of Medicine; Thomas G. Archibald

This interactive skill-building workshop will introduce Communities of Practice (CoPs) and demonstrate their application for monitoring, evaluation, learning, and professional development for the evaluation community. CoPs engage learners in a process of knowledge construction around common interests, ideas, passions, and goals. Through identifying the three core CoP elements (domain, community, and practice), members work to generate a shared repertoire of knowledge and resources. These three core elements often provide the foundation for monitoring, evaluation, and evaluative thinking. This workshop will explore CoP development and implementation for evaluators focused on understanding experience, increasing knowledge, and improving practice. Workshop speakers will highlight examples from the fields of evaluation, health promotion, and extension education. Participants will engage in a series of hands-on inquiry-oriented techniques and analyze how CoPs can be operationalized in relation to their respective areas of practice.

What you will learn:

  • How to describe key theories and models guiding CoPs
  • How to apply the CoP core elements
  • How to analyze tensions in CoP implementation
  • How to identify questions to guide the planning and evaluation of CoPs
  • Essential fundamentals of developing and sustaining a CoP

 

Workshop 18: INTRODUCTION TO CREATING INFOGRAPHICS FOR EVALUATION

Offered: Monday, June 18, 9:00 – 12:00 pm; Monday, June 18, 1:30 – 4:30 pm

Level: Beginner

Speaker: Stephanie Wilkerson, Magnolia Consulting, LLC; Anne Cosby, Magnolia Consulting

Do you want to learn how to use infographics to communicate evaluation findings in an effective and engaging way? This workshop introduces infographic basics, best practices, and practical tips for using low-cost tools to produce well-designed infographics for a variety of evaluation stakeholders. Participants will learn about the purpose, features, and use of infographics in evaluation as well as criteria for reviewing infographics. Participants will also have the opportunity to view demonstrations of tools and steps for developing an infographic and will gain hands-on experience in creating an infographic. No experience with graphic design or infographics required.

What you will learn:

  • The purpose of infographics
  • Best practices for using infographics in evaluation
  • How to use various tools and resources to create an infographic

 

Workshop 19: IT’S NOT THE PLAN, IT’S THE PLANNING: STRATEGIES FOR EVALUATION PLANS AND PLANNING

Offered: Tuesday, June 19, 1:30 – 4:30 pm; Wednesday, June 20, 9:00 am – 12:00 pm

Level: Beginner

Speaker: Sheila Robinson, Greece Central School District

Few evaluation texts explicitly address the act of evaluation planning as independent from evaluation design or evaluation reporting. This interactive workshop will introduce participants to an array of evaluation activities that comprise evaluation planning and preparing a comprehensive evaluation plan. Participants will leave with an understanding of how to identify stakeholders and primary intended users of evaluation, the extent to which they need to understand and be able to describe the evaluand (the program), strategies for conducting literature reviews, strategies for developing broad evaluation questions, considerations for evaluation designs, and using the Program Evaluation Standards and AEA’s Guiding Principles for Evaluators in evaluation planning. Participants will be introduced to a broad range of evaluation planning resources including templates, books, articles, and websites.

What you will learn:

  • How to identify types of evaluation activities (e.g. questions to ask of potential clients) that comprise evaluation planning
  • Potential components of a comprehensive evaluation plan
  • How to identify key considerations for evaluation planning (e.g. client needs, collaboration, procedures, agreements, etc.)

 

Workshop 20: MIXED METHODS DESIGN IN EVALUATION

Offered: Tuesday, June 19, 1:30 – 4:30 pm; Wednesday, June 20, 9:00 am – 12:00 pm

Level: Intermediate

Speaker: Donna M Mertens, Gallaudet University

Developments in the use of mixed methods have extended beyond the practice of combining surveys and focus groups. The sophistication of mixed methods designs in evaluation will be explained and demonstrated through illustrative examples taken from diverse sectors and geographical regions. Participants will have the opportunity to create mixed methods designs using evaluation vignettes for evaluation studies focused on determining the nature of a problem and development of an intervention and determination of its effectiveness.

What you will learn:

  • How to identify the components of mixed methods designs in evaluation for the purpose of determining the context of the evaluation, the nature of the problem to be addressed, and the appropriateness and effectiveness of an intervention
  • How to analyze an evaluation scenario to determine an appropriate mixed methods design
  • How to apply the concepts of mixed methods design for a specific context using a case study

 

Workshop 21: PRACTICAL SURVEY DESIGN: FROM IDEA TO ROUGH DRAFT IN THREE STEPS

Offered: Tuesday, June 19, 1:30 – 4:30 pm; Wednesday, June 20, 9:00 am – 12:00 pm

Level: Beginner

Speaker: Melissa Cater, Louisiana State University AgCenter

This interactive workshop will lead participants through a three-step process for designing a survey instrument that can be used for program evaluation or research. Workshop participants will conceptualize constructs or domains of interest based on program outcomes or research objectives, generate survey items that connect to those constructs, and select appropriate response categories. At the conclusion of this workshop, participants will have a rough draft of a survey scale.

What you will learn:

  • How to frame survey constructs using theory and data driven approaches
  • How to generate items that assess a construct and its domains
  • How to select response categories that are appropriate for the specific construct

 

Workshop 22: STRATEGIES FOR EVALUATING PROGRAMS THAT SERVE YOUTH

Offered: Monday, June 18, 9:00 – 12:00 pm; Monday, June 18, 1:30 – 4:30 pm

Level: All levels

Speaker: Krista Collins, Boys & Girls Clubs of America; Tiffany Berry, Claremont Graduate University/Claremont Evaluation Center

As a guiding principle for evaluation practice, evaluators have the responsibility to acknowledge and honor differences among participants. Given the plethora of educational, health, and social programs that serve youth (under the age of 18), it is important to discuss how evaluation practice may change in response to the age of the participants. Evaluators should be mindful about how to engage with stakeholders (including youth) as well as how to design and implement a developmentally sensitive evaluation that not only measures youth outcomes accurately but also leverages the evolving capacity of young people to inform the effectiveness of their own program. This workshop will introduce participants to best practices that lie at the intersection of youth development and program evaluation. Participants will use the Centers for Disease Control and Prevention’s (CDC) evaluation framework to discuss how the evaluation process changes when you consider age differences, youth development principles, youth voice, and developmental science throughout the evaluation lifecycle. Using lecture, case studies, and hands-on activities, participants will discuss effective evaluation practice with youth, while simultaneously modeling high-quality youth development practices that can easily be incorporated into the evaluations of youth-serving programs.

What you will learn:

  • Elements of developmental science (i.e., age, youth development principles, and youth voice) that are important to youth-serving programs
  • How to design evaluations in response to these developmental science elements and how to be sensitive to youths’ evolving skillset
  • Innovative methodologies to collect and showcase youth voice in a valid and authentic way

 

Workshop 23: THE BASICS OF USING THEORY TO IMPROVE EVALUATION PRACTICE

Offered: Monday, June 18, 9:00 – 12:00 pm; Monday, June 18, 1:30 pm –  4:30 pm

Level: Beginner

Speaker: John LaVelle, University of Minnesota; Stewart I. Donaldson, Claremont Evaluation Center

This workshop will help practicing evaluators use theory to improve evaluation practice. Through lecture, exercises, and discussions, participants will learn how to apply evaluation theories, social science theories, and stakeholder theories of change to enhance the accuracy and usefulness of their evaluations. A range of examples from evaluation practice will be provided to illustrate main points and take-home messages.

What you will learn:

  • How to define and describe evaluation theory, social science theory, and program theory
  • How theory can be used to improve evaluation practice
  • How implicit and explicit social science theories can be used to guide evaluation decisions
  • How to describe the components and processes of several commonly used social science theories that have been used to develop and evaluate programs
  • How developing stakeholders’ theories of change can be used to improve evaluation practice

 

Workshop 24: THE INTERACTIVE SPORT OF QUALITATIVE DATA ANALYSES AND VISUALIZATION

Offered: Monday, June 18, 1:30 – 4:30 pm; Tuesday, June 19, 9:00 – 12:00 pm

Level: Intermediate

Speaker: Ayana Perkins, Infinite Services and Solutions

Qualitative methods are recognized as important strategies for evaluation. In this workshop, participants will explore the use of cost-effective qualitative analyses and visualization strategies to improve evaluation effectiveness. Divided into three parts, this workshop will review qualitative standards of evidence, different types of data analyses, the strengths and limitations of different qualitative software, and how to establish interrater reliability; guide participants through a group exercise in thematic analysis; and ask participants to work independently on a qualitative data visualization exercise using the free 30-day trial of NVivo 11 software.
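
For readers new to interrater reliability, the hypothetical Python sketch below shows one widely used statistic, Cohen's kappa, which corrects two coders' raw agreement on thematic codes for the agreement expected by chance. The codes and data are invented for illustration and are not workshop materials.

    # Minimal Cohen's kappa sketch for two coders; data are invented.
    from collections import Counter

    def cohens_kappa(coder_a, coder_b):
        """Agreement between two raters, corrected for chance agreement."""
        n = len(coder_a)
        observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
        freq_a, freq_b = Counter(coder_a), Counter(coder_b)
        expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
        return (observed - expected) / (1 - expected)

    # Two coders' thematic codes for the same ten interview excerpts
    a = ["barrier", "support", "support", "barrier", "other",
         "support", "barrier", "support", "other", "barrier"]
    b = ["barrier", "support", "barrier", "barrier", "other",
         "support", "barrier", "support", "support", "barrier"]
    print(f"Cohen's kappa: {cohens_kappa(a, b):.2f}")   # ~0.68

Values near 1 indicate strong agreement; many teams treat a kappa above roughly 0.6 as acceptable for qualitative coding.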

What you will learn:

  • Qualitative standards of evidence
  • Strengths and weaknesses of popular qualitative software
  • Interrater reliability
  • Steps in thematic analyses

 

Workshop 25: THEORIES, DESIGNS, AND INSTRUMENTS FOR EVALUATING ADVOCACY AND POLICY CHANGE INITIATIVES

Offered: Monday, June 18, 9:00 – 12:00 pm; Tuesday, June 19, 9:00 – 12:00 pm

Level: All levels

Speaker: Annette Gardner, University of California, San Francisco

There is a need for evaluators to be skilled in designing appropriate advocacy and policy change (APC) evaluations to meet diverse stakeholder needs: increased foundation interest in supporting APC initiatives to achieve systems change; evaluation of democracy-building initiatives worldwide; and diffusion of advocacy capacity beyond the traditional advocacy community (such as service providers). Evaluators have met these needs with great success, building a new field of evaluation practice, adapting and creating evaluation concepts and methods, and shaping advocate, funder, and evaluator thinking on advocacy and policy change in all its diverse manifestations. This workshop will build on this foundation of evaluation thought and practice, and expand individual evaluation capacity. The speaker has combined the plethora of concepts, definitions, designs, tools, empirical findings, and lessons learned thus far into a three-hour, practice-focused workshop. Using the recently published book, Advocacy and Policy Change Evaluation: Theory and Practice (Gardner and Brindis), as a guide, the workshop will address the varied evaluation needs of stakeholders by presenting a wide array of options specific to evaluating advocacy and policy change initiatives. It will also address the challenges associated with evaluation practice, such as the complexity and moving target of the context in which advocacy activities occur and the challenges of attribution and the identification of causal factors.

What you will learn:

  • Public policy and advocacy scholarship and concepts important to developing an evaluation theory of change and to informing the development of evaluation questions, designs, and instruments
  • A range of applicable designs, outcomes, and methods, as well as the challenges to designing advocacy and policy change evaluations, namely issues of rigor, complexity, and uncertainty
  • How to strengthen partnerships with advocates and funders, and approaches for supporting stakeholder learning and assessing advocacy effectiveness

 

Workshop 26: TIGHT IN DESIGN AND LOOSE IN PRACTICE: MAKING YOUR EVALUATION QUESTIONS YOUR BEST FRIENDS

Offered: Monday, June 18, 9:00 – 12:00 pm; Tuesday, June 19, 9:00 – 12:00 pm

Level: All levels

Speaker: Keri Culver

Evaluation questions (EQs) are the hub around which the evaluation wheel rolls. Clients have an unending list of questions, but they need evaluators to ask the right questions to serve the purposes of the organization being evaluated. This workshop is designed with a little bit of theory and a whole lot of practice. Participants will break into groups by sector to practice working with clients to specify and hone their EQs, develop the EQs into an evaluation plan to capture relevant data, and report knowledgeably on the EQs using a clear analytical structure.

What you will learn:

  • What makes an evaluation question worth answering
  • How an evaluator works with clients to refine evaluation questions
  • How to be sure the evaluation collects the right data

 

Workshop 27: TOOLS AND TECHNIQUES FOR ASSESSING AND STRENGTHENING NONPROFITS' EVALUATION CAPACITY

Offered: Monday, June 18, 9:00 – 12:00 pm; Monday, June 18, 1:30 pm – 4:30 pm

Level: All levels

Speaker: Jennifer Ballentine, Highland Nonprofit Consulting; Ann Webb Price, Community Evaluation Solutions

This workshop will provide evaluators with useful tools and techniques for assessing nonprofits’ evaluation and organizational capacity, using assessment results to provide targeted technical assistance, and working with nonprofits to streamline and strengthen their overall evaluation efforts.

What you will learn:

  • How to assess nonprofit capacity using various tools including the CES Evaluation Capacity Measure, the TCC Group’s Core Capacity Assessment Tool (CCAT), and the Marguerite Casey Foundation’s Organizational Capacity Assessment Tool
  • How to effectively use capacity assessment results and other techniques to provide targeted technical assistance to nonprofits and strengthen capacity
  • How to work with nonprofits to develop and implement measurement models that streamline and strengthen their overall evaluation efforts

 

Workshop 28: USING POSITIVE AND APPRECIATIVE PERSPECTIVES AND APPROACHES TO IMPROVE EVALUATION PRACTICE

Offered: Tuesday, June 19, 1:30 – 4:30 pm; Wednesday, June 20, 9:00 am – 12:00 pm

Level: All levels

Speaker: Stewart Donaldson, Claremont Graduate University; Tessie Tzavaras Catsambas, EnCompass LLC

This workshop will help practicing evaluators use positive and appreciative perspectives and approaches to improve evaluation practice.  Mini-lecture, exercises, and discussions will help participants learn how to apply positive and appreciative perspectives and approaches to improve the comprehensiveness, accuracy and usefulness of their evaluations.  A range of examples from evaluation practice will be provided to illustrate main points and take-home messages.

What you will learn:

  • How to define and describe positive and appreciative evaluation perspectives and approaches
  • How positive perspectives and approaches can improve evaluation practice
  • How appreciative perspectives and approaches can improve evaluation practice
  • How positive and appreciative perspectives can be used to guide evaluation decisions
  • How to use positive and appreciative perspectives to improve the implementation of the CDC Evaluation Framework

 

Workshop 29: UTILIZATION OF A RACIAL EQUITY LENS TO HELP GUIDE STRATEGIC ENGAGEMENT AND EVALUATION

Offered: Monday, June 18, 9:00 – 12:00 pm; Tuesday, June 19, 9:00 – 12:00 pm

Level: All levels

Speaker: Paul Elam, Public Policy Associates; LaShaune Johnson, Creighton University; Mindelyn Anderson, American University

The field of evaluation is being challenged to move from its traditional role, and its perceived objectivity, to a process that considers who is being evaluated and who is conducting the evaluation. Over the past three years, Public Policy Associates, Inc. (PPA) has worked to develop useful frameworks, tools, and approaches that evaluators can use to focus on the ways that race and culture might influence an evaluation process. This workshop focuses on the practical use of a racial equity lens when conducting evaluation. The framework argues that culture and race are important considerations when conducting an evaluation because there are both critical and substantive nuances that are often missed, ignored, and/or misinterpreted when an evaluator is not aware of the culture of those being evaluated. Participants will be provided with a template for analyzing programs through a culturally responsive and racial equity lens, designed to focus deliberately on an evaluation process that takes race, culture, equity, and community context into consideration. The speakers will also share a “how-to process” focused on the cultural competencies of individuals conducting evaluations, how such competencies might be improved, and strategies for doing so. This “how-to process” is the result of thinking around developing a self-assessment instrument for evaluators, is based primarily on the cultural-proficiencies literature, and relates specifically to components of the template. Through small-group exercises, participants will apply the concepts contained in the template to real-world evaluation processes.

What you will learn:

  • A framework for conducting evaluation using a racial equity lens
  • Key elements of the framework that can improve the quality of evaluation in diverse settings
  • How to assess individual cultural backgrounds and explore how they may influence design choices in evaluation work

 

Workshop 30: FOCUS GROUPS FOR QUALITATIVE TOPICS

Offered: Tuesday, June 19, 1:30 – 4:30 pm; Wednesday, June 20, 9:00 am – 12:00 pm

Level: All levels

Speaker: Michelle Revels

 

Workshop 31: ECONOMIC EVALUATION OVERVIEW

Offered: Monday, June 18, 9:00 – 12:00 pm; Monday, June 18, 1:30 – 4:30 pm

Level: All levels

Speaker: Meenu Anand

Have you ever said, “This is not worth my time”? If yes, you are using the most basic principles of “economic evaluation” – a term that may have made you nervous! Understanding the costs and benefits of your program is not only good stewardship of donor and/or taxpayer dollars but an excellent way to make a case for your program’s legitimacy. In this session, participants will learn the foundational concepts of economic evaluation and work through a case study to apply these principles using CDC’s six-step evaluation framework, which is familiar to most. No Greek formulas – only basic concepts and their application!

Workshop 32: SOCIAL IMPACT MEASUREMENT VS EVALUATION: ARE THEY THE SAME? ARE THEY DIFFERENT? HOW DO EVALUATORS PLAY IN THIS NEW SPACE?

Offered: Monday, June 18, 9:00 am – 12:00 pm; Tuesday, June 19, 9:00 am – 12:00 pm

Level: All levels

Speaker: Veronica Olazaba; Jane Reisman

Over the past few years, social impact measurement has surfaced as the main approach to measuring and managing impact across non-traditional social sector actors such as social enterprises, impact investors, fund managers, and philanthropies. As the pool of capital invested for social and environmental good continues to rise, we predict it will become even more mainstream. What is social impact measurement and how is it the same and/or different from evaluation? What are the most critical evaluation skills needed to deliver an effective impact measurement strategy? What are the new risks that evaluators should look out for? Grounded in the current evidence base, this course will unpack social impact measurement, map out its history, and introduce the skills needed to strengthen your evaluation toolkit.

What you will learn:

  • The parallel evolution of social impact measurement and evaluation
  • M&E and analytical tools that are transferable across the two areas (social impact measurement and evaluation)
  • How to move beyond impact measurement to impact management, and the implications for evaluation
  • How to play in this new space: key players and trends

Workshop 33: AWAKENING OUR VISUAL THINKING

Offered: Tuesday, June 19, 9:00 am – 12:00 pm; Tuesday, June 19, 1:30 pm – 4:30 pm

Level: All levels

Speaker: Sara Vaca

Ever since we were children (before we even knew how to write), we have communicated and received information through images. The problem is that once we learn to read and write, we seem to forget (and no one teaches us) how to communicate more complex concepts with visuals. This workshop aims to awaken (or strengthen) a visual thinking dimension that has been there all along but hasn't received proper attention. By discovering (or remembering) the basic foundations of Data Visualization, and seeing actual ways of communicating M&E concepts with visuals, you will open your mind and imagination to let the visual dimension enter parts of your daily work and your way of thinking and analyzing.

The workshop will take a hands-on approach (mostly sketching on paper) in which we will play with concepts and ideas visually. By seeing and analyzing other people's work with the facilitator's guidance, you will learn (or reinforce) criteria for judging the relationship between form and content, so you can iterate on and improve your visuals and judge the visuals you see. Finally, some tips and tools will be shared to ease the transition to applying this new, creative approach to parts of your reports and work.

What you will learn:

  • Purposes of Data Visualization
  • The basic neuroscience behind Data Visualization
  • Challenging the way we communicate certain types of information
  • Criteria for discerning effective/non-effective visuals



Workshop 34: EXERCISING VISUAL MAKING USING POWERPOINT/EXCEL

Offered: Wednesday, June 20, 9:00 am – 12:00 pm

Level: All levels

Speaker: Sara Vaca

This workshop aims to awaken (or strengthen) our visual thinking dimension, but mostly to help you translate your visual ideas into reality with Microsoft Office programs. By discovering (or remembering) the basic foundations of Data Visualization and seeing actual ways of communicating M&E concepts with visuals, we will set the basis for creating effective visuals.

The workshop will take a hands-on approach (mostly working in MS Excel and MS PowerPoint) in which we will create (or re-do, if you have material you wish to improve) effective visuals, learning some simple yet very useful tips in Excel and PowerPoint that will let you create more sophisticated dataviz. By seeing and analyzing other people's work with the facilitator's guidance, you will learn (or reinforce) criteria for judging the relationship between form and content, so you can iterate on and improve your visuals and judge the visuals you see.

Finally, some tips and tools will be shared to ease the transition to applying this new, creative approach to parts of your reports and work.

What you will learn: 

  • Indicators and "formula" for Data Visualization
  • Analysis of areas where Dataviz could be useful in our work
  • Simple but very useful tools for making infographics in PowerPoint