AEA Newsletter: August 2017

Message from the Executive Director

Executive Director Search and Policy Governance

From Denise Roosendaal, AEA Executive Director

The search for the full-time Executive Director is now in full swing. The Job Specifications Committee worked with President-Elect Leslie Goodyear to develop a list of additional experience and qualifications for the new, full-time Executive Director role. The committee included 17 AEA members representing a variety of disciplines, backgrounds, races and ethnicities, geographies, work settings, and lengths of AEA tenure. AEA is grateful for their work, thoughtfulness, and inspired input. Members can continue to submit input, which will be considered by the Search Committee responsible for interviewing the final candidates.

This final job description – which includes both SmithBucklin’s qualifications for an association executive director and the more specific AEA-generated specifications – can be found in the staff section of the AEA website. Input from members is still being accepted at AEAEDSearch@eval.org.

I thought I would take a moment to review AEA’s governing structure. Among the requested qualifications for the Executive Director is experience working with volunteer boards that employ the Policy Governance™ model. By way of background, this model, developed by John Carver, is a governing methodology that focuses a Board of Directors on visionary and leadership activities, leaving operational decision-making to the Executive Director. An extensive literature exists on the methodology. The AEA Board adopted this approach in 2009 and continues to refine its understanding and practice, investing both time and resources so it can focus on the challenges and opportunities facing evaluators and the evaluation field. Within this structure there is a clear delineation of roles: the Management Team (staff) is charged with executing operations in pursuit of the AEA strategic plan, with guidance from the Board (e.g., annual budgets, input into new programming, member input); the Board is charged with creating and following policy guidelines, developing and monitoring targeted outcomes, and staying in touch with members to understand their views on the future of the profession, evaluation policy, and the overall community.

Member input is critical to AEA’s health as an organization, and both the Management Team and the Board encourage members to share their views, ideas, and concerns. Over the past few years, you may have encountered an important aspect of this governing method when bringing an idea or concern to the Board. Honoring the role clarity that the Policy Governance model brings, you may have been referred to the Management Team for a conversation or resolution if the topic was in fact operational. If the item was more strategic in nature, it may have made its way into a board meeting discussion or been assigned to a Task Force for further examination (Working Groups advise the Management Team; Task Forces provide input on specific strategic elements to the Board). This separation of tasks organizes the Board’s and the Management Team’s work in a way designed to maximize everyone’s roles, skills, and talents.

We may not always have gotten this separation right (we’re human, of course); applying this methodology remains a work in progress for the Board.

As always, we appreciate getting guidance and input from members, which is an important aspect of Policy Governance™ as well. Those wishing to make a comment or offer a suggestion related to governance, leadership, or visioning topics may contact President Kathy Newcomer. Those wishing to make a comment or offer a suggestion to the Management Team on current or possible programs, or other aspects of operations, may contact Denise Roosendaal. We appreciate the opportunity to serve you and the organization. 

AEA 2017 Election Announcement

Thank you to all who voted!

Thank you for taking the time to vote for your 2018 AEA leadership. The race was close and we're certain the choice among all of the talented candidates was not an easy decision to make. It is with great pleasure that we share with you the outcome of the 2017 election for the American Evaluation Association Board of Directors.

AEA President-Elect:

  • Tessie Catsambas, EnCompass LLC

AEA Board Members-at-Large, 2018-2020:

  • Eric Barela, Salesforce
  • Jara Dean-Coffey, Luminare Group
  • Bianca Montrosse-Moorhead, University of Connecticut

The election was open from July 5 through August 4 and received 1,048 votes (14% response rate). For comparison, recent historical voting rates are noted below.

Recent Historical Voting Percentages:
2016 - 15.2% of eligible members, 1,131 votes cast
2015 - 18.9% of eligible members, 1,327 votes cast
2014 - 16.4% of eligible members, 1,173 votes cast

Congratulations to our newest AEA Board of Directors and thank you to those who took the time to vote!

Stay tuned for more information on the new AEA Directors in upcoming newsletters!
 

Introducing the 2017 AEA Award Winners

AEA offers awards in eight distinct areas to acknowledge outstanding colleagues and outstanding work. Through identifying those who exemplify the very best in the field, we honor the practitioner and advance the discipline. In any given year, there may be no winner or multiple winners for a particular award. Each proposal is judged on its individual merit.

Below are the winners of AEA's 2017 awards:

Thomas Archibald, Assistant Professor & Extension Specialist, Virginia Tech
2017 AEA Marcia Guttentag Promising New Evaluator Award

Stewart I. Donaldson, Dean of the School of Social Science, Claremont Graduate University
2017 AEA Robert Ingle Service Award

Stephanie Evergreen, CEO, Evergreen Data
2017 AEA Alva and Gunnar Myrdal Evaluation Practice Award

Rodney K.M. Hopson, Professor, Division of Educational Psychology, Research Methods, and Education Policy, College of Education and Human Development and Senior Research Fellow, Center for Education Policy and Evaluation, George Mason University
2017 AEA Paul F. Lazarsfeld Evaluation Theory Award

Apollo M. Nkwake, Associate Research Professor of International Affairs, Institute for Disaster and Fragility Resilience, Elliott School of International Affairs, The George Washington University
2017 AEA Marcia Guttentag Promising New Evaluator Award

Michael Quinn Patton, Founder and Director, Utilization-Focused Evaluation
2017 AEA Research on Evaluation Award

Parliamentarians Forum for Development Evaluation - South Asia
2017 AEA Advocacy and Use Evaluation Award

Abraham Wandersman, PI
Jonathan P. Scaccia, Program Director
The SCALE Formative Evaluation Team
2017 AEA Outstanding Evaluation Award

Join us for the presentation of this year's AEA Awards at Evaluation 2017 on Friday, November 10, 2017.

This is a ticketed event. You can purchase tickets ($55) online or at registration. See you there!
 

Face of AEA

Meet Norma Martinez-Rubin


Name:  Norma Martinez-Rubin

Affiliation: Evaluation Focused Consulting

Degrees:  M.B.A. in Strategic Management, M.P.H. in Population, Family and International Health / Behavioral Sciences and Health Education, B.S. in Biology

Years in the Evaluation Field: 12

Joined AEA: March 2006

Why do you belong to AEA?

I belong to AEA because it links me to what is current in the field. The organization’s Topical Interest Groups (TIGs) offer ample opportunities for involvement with colleagues. These TIGs have become a professional home based on shared evaluation methods or emphasis. Access to academic journals, professional education opportunities, and the annual conference encourages my learning and continual development. But the greater appeal is the approachability of AEA members. I’ve met researchers, theorists, and practitioners without any haughtiness about them. What a delight to be part of an organization whose members’ accomplishments include being genuinely good people.

What is the most memorable or meaningful evaluation you have been a part of?
The most memorable evaluation I’ve been part of was a community assessment of Latino parents’ awareness of childhood obesity. The data collection, via focus groups and key informant interviews, ensured we captured firsthand what parents believed caused obesity. Interviews with community agency administrators revealed local agencies’ priorities for addressing it. We also inquired about their interest in furthering agencies’ work in collaboration with the study sponsor. Our findings informed how outreach and health education programs could remain culturally responsive, described how the study sponsor was perceived by community organizations, and confirmed the need to advocate for healthier food options in school settings.

This work was meaningful on several levels: our approach highlighted how community members can be involved when regarded as research partners; the study findings supported community advocates’ work for healthier food and beverages in school settings; and I was reminded how being culturally responsive to parents and agency administrators with different priorities requires a good deal of flexibility and humility to remain attentive to what study participants divulge.

What advice would you give to those new to the field?
Formal training is an entrée to a field. Actually doing the work tests your abilities, helps you sort your priorities, and increases your proficiency. From the time you identify a prospective evaluation project or are assigned to one, how you approach it and proceed combines formally acquired lessons with skills gained from the realities of on-the-ground fieldwork.

Diplomacy and tact are helpful for interactions required in evaluation work. One’s sense of urgency is not always similarly felt by staff of the program or service to be evaluated. This can make even the most patient among us impatient and curt. Catching one's breath and retreating a bit to reflect on what's most important provides an opportunity to reconsider, reorient, and perhaps, too, reconstruct an evaluation plan. A plan guides one’s work, but ultimately the conversations and interactions experienced because of the evaluation largely influence whether its results are considered useful or not for service and quality improvement.
 

Policy Watch

Evidence-Based Policymaking Commission Report & AEA Forum

From Cheryl Oros, Consultant to the Evaluation Policy Task Force (EPTF)


The Commission on Evidence-Based Policymaking, established by the Evidence-Based Policymaking Commission Act of 2016, was charged with examining how to increase the availability and use of government data to build evidence and inform program design, while protecting the privacy and confidentiality of those data. More specifically, the Commission was required to:

  • Determine how to integrate administrative and survey data and to make those data available to facilitate research, program evaluation, analysis, policy-relevant research, and cost-benefit analyses by qualified researchers and institutions while protecting privacy and confidentiality;
  • Recommend how to overcome legal and administrative barriers, improve data infrastructure to facilitate data merging and access for research purposes, ensure database security, and modify statistical protocols to best fulfill the integration and increased availability of data;
  • Recommend how best to incorporate “rigorous evaluation” (i.e., “outcomes measurement, randomized controlled trials, and rigorous impact analysis”) into program design; and
  • Consider whether a Federal clearinghouse should be created for government survey and administrative data; which types of researchers, officials, and institutions should have access to data; and what limitations should be placed on the use of data provided. 

The 15-member Commission, composed of economists, lawyers, data security and confidentiality experts, academic researchers, and data managers, has had bipartisan support and is due to submit to Congress and the President a report with its recommendations for legislation or administrative actions on September 7, 2017. AEA offered its support via a letter when the bill was introduced, provided another letter summarizing AEA’s policy positions on evaluation to the Commission, and testified on evaluation issues last November.

AEA will sponsor a discussion forum on the report’s findings, “Building on the Recommendations of the Commission on Evidence-Based Policymaking,” on Thursday, September 21, 2017, from 8:30 to 10:00 AM ET at the George Washington University Marvin Center in Washington, D.C. Featured guests will include Ron Haskins, Co-Chair of the Commission, and Nancy Potok, Chief Statistician at OMB and a member of the Commission. The forum will give a panel drawn from the professions most affected by the Commission’s recommendations an opportunity to discuss how professional societies and other non-governmental organizations can build on the momentum of those recommendations. We hope you will join us to hear the ideas raised and to comment.

Of equal or greater importance, AEA will, both on its own and in concert with various groups, provide advice to the new Administration on evaluation policies. If you have suggestions of groups to contact, please contact me at EvaluationPolicy@eval.org.

Also, please plan to join the EPTF to learn more about its policy work and join in discussions to offer your suggestions at the AEA annual conference, the afternoon of Friday, November 10, 2017, in Washington, D.C.
 

Diversity 

CREA Brings Professional Development Workshops Focused on Culturally Responsive Evaluation to Eval 2017

From Zachary Grays, AEA Headquarters

The American Evaluation Association (AEA) is pleased to announce its continued partnership with the Center for Culturally Responsive Evaluation and Assessment (CREA) to offer a unique thread of professional development training options as part of the pre- and post-conference offerings during Evaluation 2017, November 6-11, 2017, in Washington, D.C. CREA was established in 2011 in the College of Education at the University of Illinois at Urbana-Champaign, with Stafford Hood, Ph.D., Sheila M. Miller Professor, serving as its founding director.

CREA is a culturally diverse and interdisciplinary global community of researchers and practitioners in the areas of evaluation and assessment. CREA’s primary focus is to address the growing need for policy-relevant efforts that take seriously the influences of cultural norms, practices, and expectations in the design, implementation, and evaluation of social and educational interventions.

What can attendees expect from this AEA-CREA partnership? Take a look at the courses being offered this year:

With the generous support of the W.K. Kellogg Foundation, CREA will also host the final Race and Dialogue as a part of their Fourth Annual International Conference. We invite you to join AEA and CREA for this final panel that will take place in Chicago, Illinois, on Thursday, September 28, 2017 at 6:00 PM CT. Although this event is free to the public, you will need to complete your registration to confirm your attendance, either in person or via live-stream. Those who register to participate via live-stream will receive a link to join in their email as we get closer to the event. This will be the final dialogue in this yearlong series before the culminating plenary to be held at Evaluation 2017.
 

Potent Presentations Initiative

New French Versions of our FREE p2i Presentation Tools!

From Sheila B. Robinson, Potent Presentations Initiative Coordinator 

Bonjour et bienvenue à l'Initiative des Présentations Puissantes! (Translation: Hello and welcome to the Potent Presentations Initiative!)

Two p2i tools translated into French!

I’m brushing up on my French because we are so pleased to share that two of our most popular tools, the Slide Design Guidelines and the Presentation Assessment Rubric, are now available in French! Andrealisa Belzer, a Credentialed Evaluator who works as Senior Evaluation Advisor with the Government of Canada, worked with Canadian federal translation services to format and translate the two tools. Andrea alerted us to the lack of good bilingual resources on presentations and slide design and offered to facilitate the translation: “I am very excited for the opportunity to make your resource more widely available, and to refer my federal colleagues to these tools!”

Andrea also reached out to Kylie Hutchinson of Community Solutions Planning and Evaluation, and Kylie has now posted a French translation of her free resource: You Can Be a Better Presenter: 25 Tips for Better Presentations.

Profitez de ces nouvelles ressources! (Translation: Take advantage of these new resources!)

Conference time is drawing near! Time to gather your tools!

Are you starting to think about your conference presentations? While we’re talking tools, let me whet your appetite for learning by introducing you to some free tools that can be used as alternatives or, in some cases, supplements to PowerPoint. Nine Free Presentation Software Alternatives offers a brief overview of, and links to, Canva, Visme, Adobe Spark, LibreOffice Impress, Zoho Show, Google Slides, SlideDog, and Haiku Deck.

Like any tool, each of these appears to have its advantages and limitations for any given user or purpose, and many feature both free and paid versions. SlideDog, for example, allows users to combine media types in one presentation but is only compatible with Windows. Visme offers a YouTube channel chock full of good videos on various aspects of presentation design. AEA members have shared their experiences working with two of the tools – Canva and Haiku Deck – on AEA’s daily blog, AEA365 Tip-A-Day By and For Evaluators.

Have YOU tried any of these tools? If so, I’d love to hear about your experiences. If you try a new presentation tool and would like to share what you learned in a brief blog post, please contact me at aea365@eval.org.

Don’t forget to visit the P2i site to check out all of our free tools, and contact me at p2i@eval.org with any questions or comments!

 

International Evaluation Update

Coffee Breaks and AEA365

From Cindy Clapp-Wincek and Veronica Olazabal

In September and early October, there are several ways to learn about global evaluation – what we are learning from our colleagues around the world and what they are learning from us. There will be three Coffee Breaks presented by the International Working Group, and an AEA365 blog the week of September 24, hosted by the International Cross-Cultural Evaluation Topical Interest Group (ICCE TIG), featuring previews of their sessions at Evaluation 2017.

AEA’s International Working Group will present three Coffee Break sessions in September and early October to share our insights, inspire you with what is happening globally, and point you to ways to get involved. We’ll cover topics such as how AEA supports the growth of evaluation in the international arena, evaluation with international youth, how our challenges compare with those of our international colleagues, and more. Below is a list of available webinars. Be sure to visit the AEA Coffee Break page for details and links to register.

  • September 26, 2017, 2 PM - 2:20 PM ET: AEA Partnerships to Expand Evaluation Around the Globe 
  • September 28, 2017, 2 PM - 2:20 PM ET: EvalPartners and the Networks: EvalYouth, EvalGender+, EvalIndigenous, EvalSDG 
  • October 3, 2017, 2 PM - 2:20 PM ET: Q&A on Global Evaluation 

During ICCE TIG's AEA365 week, colleagues will provide a sneak preview of ICCE sessions at Evaluation 2017. The conference, “From Learning to Action,” takes place in Washington, D.C., November 6-11, 2017, and will explore ways the community can learn from evaluation to create better practices and outcomes. ICCE TIG will sponsor a strand of sessions focusing on their areas of interest. Be sure to follow the AEA365 blog during the week of September 24 for more information. The International Working Group and the ICCE TIG hope to see you at the conference!
 

Join the Multiethnic Issues in Evaluation (MIE) TIG Mentoring Program!

The Multiethnic Issues in Evaluation (MIE) TIG invites you to join the MIE TIG Mentorship Program! The goal of the program is to provide support and guidance to AEA members on multiethnic issues and the use of culturally responsive approaches (e.g., Culturally Responsive Evaluation) to evaluate initiatives. Additionally, the aim is to facilitate networking and connections between seasoned program evaluators and new and emerging evaluators in addressing and attending to aspects of culture, diversity, equity, and inclusion within evaluation practice and research.

To achieve this goal and be respectful of the amount of responsibilities mentors and mentees have outside of this program, we have developed a simple structure.

  • First, those interested in being mentored will complete a brief survey and be paired, as best as possible, with a mentor by the MIE TIG Mentorship Program committee. 
  • Second, because we envision this mentorship program to be a self-monitoring process, mentees will be encouraged to contact their assigned mentors to introduce themselves. 
  • Following that, mentors and mentees are encouraged to settle on a means of communication (e.g., email, text, phone calls, or video calls) that works best for both.

If you have any questions or are interested in participating, please don’t hesitate to contact Gabriela Garcia at gabrielagarcia1004@gmail.com.
 

September Coffee Breaks

We encourage you to take advantage of the various Coffee Break webinars offered this September. In addition to the three internationally focused Coffee Breaks mentioned above, two other sessions are available. Learn how to discuss the importance of evaluation and evidence in policy discourse with Dr. Brian Yoder on September 14, or how qualitative analysis can derive deeper insights that inform decision-making with Dr. Anupama Shekar and Dr. Matt Pierson on September 19.

  • Thursday, September 14, 2017, 2 PM - 2:20 PM ET: EvalAction2017: Visit the Office of Your Congressional Representative and Discuss the Importance of Evaluation and Evidence
    • If you believe evaluation and evidence is an important part of policy discourse, and want to discuss this with policymakers, join us for this 20-minute webinar hosted by Dr. Brian Yoder, Director of Assessment, Evaluation, and Institutional Research at the American Society for Engineering Education (ASEE), and chair of the EvalAction2017 initiative. Dr. Yoder will share an overview of EvalAction2017, how to get involved, and what to expect when visiting the office of your congressional representative.
  • Tuesday, September 19, 2017, 2 PM - 2:20 PM ET:  Unpacking the "Why": The Power of Collaboration Coding with Qualitative Data
    • Hosted by Dr. Anupama Shekar & Dr. Matt Pierson, this webinar will help participants understand the power of combining multiple qualitative coding techniques to drive analysis, and share actionable tips to conduct rigorous qualitative analysis to derive deeper insights that impact programming and decision-making.

 More details and links to register are available on the AEA Coffee Break page.
