AEA Newsletter: January 2018

Message from the President

From Leslie Goodyear, AEA President  


Happy New Year, AEA! As I think about closing the door on 2017 and looking ahead to the possibilities of 2018, I am honored and excited to have the opportunity to serve as AEA president. Amidst the challenges that 2017 brought, our association made some great strides in important areas and starts 2018 poised to build on those successes. Before I talk about the theme for Evaluation 2018 and opportunities to engage the theme throughout the year, let me first mention a few highlights of our collective work this past year.

The biggest news is that, after a process that engaged members and solicited input at multiple points, we are pleased to announce that we have selected Anisha Lewis as executive director of the association, starting February 19. Learn more about this announcement here.

Other exciting highlights include:

  • Attendance at our November Evaluation 2017 conference was higher than ever: 4,300 people attended and the energy at the conference was electric!
  • The Evaluation Policy Task Force (EPTF) had multiple successes in efforts to educate and influence federal policy makers about the value of evaluation, including quotations and citations in the Commission on Evidence-Based Policymaking’s 2017 report.
  • The Competencies Task Force is homing in on a final set of Evaluator Competencies thanks to member input and the hard work of the Task Force members.
  • The Diversity, Membership Engagement, and Leadership Task Force will be sharing its recommendations with the Board at the January meeting.
  • The Guiding Principles Task Force is forging ahead with its work to revise the Guiding Principles for Evaluators, having solicited member input through a fall survey.
  • The innovative series of Race and Class Dialogues hosted in 2017 culminated in a dynamic and inspirational plenary session hosted by Drs. Melvin Hall, Jennifer Greene, and Beverly Parsons. These dialogues, and the associated compilation video that was shared at the conference, set the stage for important conversations about the roles evaluators can play in addressing societal inequities and the importance of evaluators keeping these critical issues at the center of our work.

Thank you to all who have given their time and energy to inform and advance the work of our association and our field. A membership association such as ours is nothing without its members, and I am grateful for the passion, brilliance and innovation you bring to all you do as evaluators and AEA members!

A special thanks to Kathy Newcomer for her inspiring leadership as president of AEA in 2017. And to the board members whose terms ended this year – past president John Gargani, and board members Aimee White, Corrie Whitmore, and Tom Grayson – a big thank you for your contributions. We welcome to the board this year Jara Dean-Coffey, Bianca Montrosse-Moorhead and Eric Barela; no doubt they will bring new insights and energy to our work.

If you had the chance to read the back page of the paper program at Evaluation 2017, you saw that the theme for Evaluation 2018 is Speaking Truth to Power. With its roots in the American Friends Service Committee’s 1955 pamphlet on nonviolence and the writings of civil rights activists, I hope this theme will inspire us to explore such questions as:

  • What responsibilities do we have as evaluators for speaking truth to power? When? In what contexts or situations? With what consequences? At what risk or cost? To whom, with what expectations?
  • What is power? Who has it, and how can they best be influenced? What is the power held by evaluators and evaluation?
  • What is truth? Whose truth? How can we best discover these truths?
  • And, what is speaking? Whispering? Public pronouncements? Influence? Activism? And by whom on behalf of whom?

Look for ways to learn and engage the theme throughout the year, including virtual town hall meetings, aea365 blogs, conversations at regional conferences, and more.

On the topic of Evaluation 2018, did I mention that it’s going to be in Cleveland? The Local Arrangements Working Group is already on the job, connecting with local organizations for possible evaluation field trips during the conference; finding local sponsors for the silent auction; scouting restaurants and other venues for exploration during our conference there; and working with Cleveland leaders to welcome AEA to their city of great stories. They will share updates in this newsletter throughout the year to get us ready for a great conference in Cleveland.

Here’s to the year ahead. I look forward to working together – learning from each other, asking important questions and advancing the practice of evaluation.


Face of AEA: Meet Nicole R. Bowman, PhD

Nicole R. Bowman, PhD

Affiliation: Bowman Performance Consulting

Degrees: Bachelor’s, Education; Master’s, Curriculum and Instruction (Lesley University); PhD, Educational Leadership & Policy Analysis (University of Wisconsin-Madison)

Joined AEA: March 1998

Why do you belong to AEA?

Initially I joined AEA because of the federal policy changes under President George W. Bush, which required disaggregated data to be made publicly available by racial/ethnic group, socioeconomic status, and special education status. This policy change was very valuable for truly understanding data and evidence that were out there but normally not accessible. And beyond this policy change, I simply had to learn skills, competencies, and knowledge that were not readily available in my post-secondary training (generally, and especially for culturally responsive evaluation). Presently, I’m still in AEA to learn and understand how evidence-based policies and governance impact Tribal Nations and Indigenous communities. Although we are making progress in terms of culturally responsive and Indigenous evaluation studies, we still have a long way to go for the 566 sovereign tribal governments who need to be invited to the table and be regular participants when other public governments are meeting. As a member of AEA, I can engage through my profession not only the field, but also state, federal, and international governments and stakeholders.

What is the most memorable or meaningful evaluation you have been a part of?
It truly is hard to pick just one because we choose projects that are culturally centered and at the community level, and most of our projects meet these criteria. However, one very memorable evaluation was with the Ho-Chunk Nation of Wisconsin. We evaluated almost 10 years’ worth of language revitalization and preservation work funded through the U.S. Department of Health and Human Services’ Administration for Native Americans program. Toward the end of this nearly 10-year cycle, we got to see community members’ evaluation capacity and value for data increase, which made them more empowered and self-sufficient. We watched a decade of funding help them become stronger from an institutional and systems standpoint as they developed effective policies, curriculum, and community resources that would be sustained long after the funding ended. But the most beautiful part was the language nest that we supported through evaluation in the Ho-Chunk community.

This language nest included nearly 20 Ho-Chunk families that spoke nothing in their homes except Ho-Chunk, including to any in-utero babies. We witnessed four to five generations of language speakers talking to infants and toddlers in this program. Some of the toddlers were able to speak back in their first language, which was Ho-Chunk, not English. That was not only symbolic of the project achieving its intended goals and outcomes, but it also allowed us to bear witness to the rebirth of the original Ho-Chunk language in this contemporary world.

For our work with the Ho-Chunk, we were gifted a blanket from the Culture and Language Commission and the community during a drumming ceremony. To this day I still cherish this blanket because it reminds me of the power of culture, community context, and evaluation when it is done right.

What advice would you give to those new to the field?
Find a mentor (or mentors) you connect with on more than just an intellectual, theoretical, or methodological level. These relationships should last a lifetime and be supportive “Aunties” and “Uncles,” as we say in a traditional/Indigenous way, to help nurture and facilitate your growth as a human and professional evaluator. Mentors should give you constructive feedback as well as praise. Our elders tell us both celebrations and challenges are there to teach us. It is up to us whether we are going to walk that path with academic and cultural humility. This is what makes evaluation a beloved calling and not just a career.

Policy Watch

Evaluation Guidelines for Federal Foreign Assistance Agencies

From Cheryl Oros, Consultant to the Evaluation Policy Task Force (EPTF)

The Foreign Aid Transparency and Accountability Act of 2016 (FATAA) required, in sections 3(b) and 3(d), that the President establish guidelines for measurable goals, performance metrics, and evaluation plans for U.S. foreign development and economic assistance. The bill had bipartisan support in both houses, garnering unanimous approval in the Senate.

This bill provides a good example of EPTF efforts to support the adoption of federal evaluation policies consistent with AEA principles and practices. The EPTF monitored Congressional efforts to draft and move this bill beginning in December 2012: providing congressional staff with the AEA Roadmap, discussing options with them, reviewing and commenting on drafts, meeting with staff, and drafting support letters for the AEA president to send to the committee chairs who were the bill’s primary authors and sponsors. The resulting guidelines reflect many of AEA’s recommended policies.

This legislation is unusual and heartening in its scope across many foreign aid programs, rather than only addressing evaluation program by program. All foreign assistance programs already covered by the Guidance on Collection of U.S. Foreign Assistance Data (OMB Bulletin 12-01) are included in these guidelines. Foreign assistance programs covered by FATAA include: 

  1. Assistance authorized for many programs under Part I of the Foreign Assistance Act of 1961;
  2. Economic Support Fund;
  3. Millennium Challenge Act of 2003; and
  4. Food for Peace Act. 

On January 11, 2018, OMB issued the first cross-agency monitoring and evaluation guidelines for federal agencies that administer foreign assistance (OMB M-18-04). On January 22, 2018, AEA sent correspondence to OMB praising the new guidelines as a valuable resource for federal staff engaged in evaluation activities for the purposes of improving accountability, transparency, and learning. Key highlights of the guidelines include:

  1. Establish annual evaluation objectives and timetables to plan and manage the process of evaluating, analyzing progress, and applying learning toward achieving results;
  2. Develop specific project evaluation plans, including measurable goals and metrics, and identify the resources necessary to conduct such evaluations, which should be covered by program costs;
  3. Apply rigorous evaluation methodologies to such programs, as appropriate, that clearly define program logic, inputs, outputs, intermediate outcomes, and end outcomes;
  4. Disseminate guidelines for the development and implementation of evaluation of programs to all personnel, including roles and responsibilities, the use and dissemination of findings and preservation of lessons learned;
  5. Establish methodologies for the collection of data;
  6. Evaluate, at least once in their lifetime, all programs whose dollar value equals or exceeds the median program size for the relevant office or bureau to ensure the majority of program resources are evaluated;
  7. Conduct impact evaluations on all pilot programs before replicating, or conduct performance evaluations and provide a justification for not conducting an impact evaluation when such an evaluation is deemed inappropriate or impracticable;
  8. Develop a clearinghouse capacity for the collection, dissemination, and preservation of knowledge and lessons learned to guide future programs (agencies should make information on program plans, monitoring data, and evaluation findings available to the public, other foreign assistance agencies, implementing partners, the donor community, and aid recipient governments);
  9. Internally distribute evaluation reports for learning and analysis;
  10. Publicly report within 90 days each evaluation, including a description of the evaluation methodology, key findings, appropriate context, including quantitative and qualitative data when available, and recommendations made;
  11. Undertake collaborative partnerships and coordinate efforts, as appropriate, with the academic community, implementing partners, and national and international institutions that have expertise in program monitoring, evaluation, and analysis, when such partnerships provide needed expertise or significantly improve the evaluation and analysis;
  12. Ensure verifiable, reliable, and timely data are available to evaluation personnel to permit the objective evaluation of programs, including an assessment of assumptions and limitations in such evaluations (agency policies should encourage engagement of beneficiaries, partner country governmental or non-governmental stakeholders, and implementing partners in monitoring and evaluation processes where feasible);
  13. Ensure that the standards of professional evaluation organizations (AEA is specifically mentioned) are employed in evaluation efforts, including ensuring the integrity and independence of evaluations, permitting and encouraging the exercise of professional judgment, and providing for quality control and assurance in the monitoring and evaluation process (professional standards are intended to improve the quality of evaluation processes and products and to facilitate collaboration);
  14. Define evaluation terms within each agency; and
  15. Report annually on findings through the OMB budget submission process.

 If you have any questions or comments, please contact me at

Evaluation Policy Task Force Openings

Reminder: Self-Nominations Due February 5

In case you missed it: The following article ran in the AEA December 2017 newsletter. Self-nominations for EPTF Openings will be accepted through February 5.

The Evaluation Policy Task Force (EPTF), established in 2009, has been the driving force behind AEA’s influence on evaluation-related policy at the federal level. The EPTF now has three openings (three-year terms each) on the twelve-member body. We invite you to apply.

Under the leadership of George Grob and on behalf of AEA’s membership, the EPTF has influenced the understanding and use of evaluation at the federal level in recent years. The EPTF created AEA’s Evaluation Roadmap and informed the findings of the U.S. Commission on Evidence-Based Policymaking. The Commission’s final report, issued in September, includes a variety of positive references to evaluation. (For other policy-related articles on the Commission report, see the September AEA newsletter.)

After eight years leading the EPTF, George Grob is stepping down. AEA owes him a debt of gratitude for his wise counsel and dedicated leadership.

“We have been truly fortunate and have benefited greatly from George's leadership of the Task Force,” AEA President Kathy Newcomer said. “He has deep knowledge of our government and the public policy deliberations space to have ensured that the interests of AEA and the broader evaluation community were brought to the forefront. Thank you, George!”

Succeeding George as chair of the EPTF is Nick Hart, a director at the Bipartisan Policy Center with a decade of experience working on evaluation policy issues in the federal government. Other EPTF members continuing in 2018 include Co-Chair Mel Mark, Katrina Bledsoe, Katherine Dawes, George Julnes, Stephanie Shipman, and Executive Director Denise Roosendaal. Mary Hyde and Jonathan Breul are both coming off the Task Force; we greatly appreciate their input and dedication. The group is also assisted by AEA policy consultant Cheryl Oros.

The Task Force meets regularly for approximately an hour; in 2017, the group typically met on the third Friday of each month. Per the appointment policy, the Executive Committee, the chair of the EPTF, and the Executive Director will review the applications and appoint the new members of the EPTF. Please submit a self-nomination, including the reasons why you would like to serve and a description of your relevant experience or qualifications, by February 5, 2018. Please email your submission to

International Evaluation Update

Reflecting Back and Looking Ahead: What’s In Store For International Evaluation in 2018?

 From Cindy Clapp-Wincek and Shawna Hoffman


Happy New Year from the International and Cross-Cultural Evaluation (ICCE) Topical Interest Group (TIG) and the International Working Group (IWG)!

Last year ended on a high note, with the 6th EvalMENA Evaluation Conference in Amman, Jordan, in October, followed by the Joint Global Conference on the Evaluation of the Sustainable Development Goals (SDGs) in Guanajuato, Mexico, in December. Excitingly, the latter was a joint effort by the International Development Evaluation Association (IDEAS) and two Latin America and Caribbean evaluation networks – ReLAC and REDLACME – to bring together evaluation professionals to promote the exchange of international, regional and local knowledge and experience on the progress and challenges of M&E for the SDGs.

Coming up in 2018
After successful events in 2017, neither the African Evaluation Association nor the Community of Evaluators South Asia will be hosting its regional association conference this year. However, you should keep your eyes peeled for events organized by our European (EES), Australasian (AES), and Middle East and North African (MENA) regional counterpart networks. And, if you’re interested in more country-focused evaluation work, the EvalPartners and IOCE websites share more about what’s going on in the global evaluation community, including a directory of the 113 Voluntary Organizations for Professional Evaluation (VOPEs) that have registered with IOCE.

Interested in getting involved?
The easiest way to plug in to the international conversation at AEA is by joining the ICCE TIG. This is a great way to connect with colleagues from across the world, or those from the U.S. doing evaluation work internationally. The TIG is also always looking for volunteers to help review conference proposals and travel award applications, as well as to be part of the International Buddy Program, which connects U.S.-based evaluators with international evaluators to share experiences and ensure that international participants feel welcomed at the conference. For more information, or to sign up and get involved, contact ICCE TIG Chair Veronica Olazabal at. If you are interested in participating in the International Working Group, which advises the AEA executive director on international perspectives on AEA policies, email Cindy Clapp-Wincek at; she will let the working group chair know of your interest.

Cheers to a great year of international evaluation ahead!

Potent Presentations Initiative

New Year, New Presentation Organization Strategy?  

From Sheila B. Robinson, Potent Presentations Initiative Coordinator 


Happy New Year! Have you noticed what's been on sale at your local stores this month? Storage containers, shelving units, and other home goods for organization. After all the holiday entertaining, decorations and presents, retailers know we need new organization strategies for all of that stuff. That got me thinking about organizing our presentation libraries in preparation for creating our potent presentations.

Of course, it’s easy to line up books on a shelf, but most of my presentation resources are online. With our p2i tools organized into three main elements – Message, Design, and Delivery – plus Audience Engagement, it makes sense to organize our online resources using these categories.

What are some tools we can use to organize great articles, websites, and notes on presentations? Browser bookmarks? Pinterest? Evernote? Google products? Each comes with a distinct set of advantages and disadvantages, and some with a price. The key is knowing what you’re looking for in an organization strategy and committing to it.

What are your online organization needs?
Do you want to be able to sync across devices? Be able to add not only websites, but documents, as well? Want to know which tool can import and organize your handwritten notes? Are you looking for a free tool, or are you willing to pay for certain features?

Once you determine your needs and budget, selecting a tool becomes much easier. My online presentation resources are currently housed in Chrome bookmarks, on Pinterest and in a Dropbox folder. But, I also have notes in the Notes app on my phone and, of course, on about a gazillion pieces of paper in various notebooks! One of my New Year’s resolutions is to organize these in one place for easier access when I need them.

Tools for Organizing Resources
Check out 10 of the Best Bookmarking Tools for the Web for comparisons of bookmarking tools, and 11 Brilliant Tools for Organizing, Developing & Sharing Your Ideas for note-taking tools, visual organization tools, collaboration tools, and general organization tools. In these articles, you’ll find brief annotations and links to all the tools – some paid, and many free. There’s just a bit of overlap between the two articles, as both feature Pinterest, Evernote, and Trello, arguably some of the most popular tools. To complement these two lists of recommendations, in How to read and organize articles (without driving yourself crazy), content marketer Greg Ciotti describes in detail how he uses a combination of tools to organize online reading and notes.

What tool or combination of tools do you find most helpful for organizing resources?

p2i Needs Your Help!

  • Have you successfully used p2i tools or p2i principles in your presentations?
  • Do you have “before” and “after” slide examples you would be willing to share?
  • Do you have ideas for, or are you interested in writing a blog article on Potent Presentations?
  • Do you have an interest in sharing your tips for Potent Presentations through a brief video or webinar?

Please contact me at and let’s talk! I’m happy to help, offer guidance, or collaborate on any of these.

Part of my presentation library, loosely organized by Message and Delivery, followed by Design.


Upcoming eStudy Sessions

Have you made it a goal to invest in your professional development this year? Here are a few eStudy sessions AEA is offering in February and March. 


February 6, 8, 13 and 15
Using Correlation and Regression: Mediation, Moderation, and More
Multiple regression is a powerful and flexible tool with wide applications in evaluation and applied research. Join Dale E. Berger, PhD, for a four-day workshop series that will explore preparing data for analysis, selecting models, running analyses, interpreting results, and presenting your findings to a non-technical audience. The workshop will include demonstrations by the facilitator, as well as detailed handouts to guide future applications.


March 20 and March 22 
Developing Quality Survey Questions
Surveys are a popular data collection tool for their ease of use and the promise of reaching large populations with a potentially small investment of time and technical resources. However, as survey fatigue grows, evaluators must be even more judicious in using surveys to yield meaningful data. This eStudy will tackle the survey design process and develop an understanding of the cognitive aspects of survey response and design. Participants will increase their ability to craft high quality survey questions and leave with resources to further develop their skills, including a copy of the facilitators’ updated draft checklist for crafting quality questions, soon to be published in a textbook.

More information and registration pages for these sessions can be found on the AEA eStudy page.

Now Available: EPTF Webinar

The Evaluation Policy Task Force (EPTF) held a webinar on January 23 to recap their presentation from Evaluation 2017. Led by EPTF Consultant Cheryl Oros and EPTF Chair Nick Hart, the webinar reviewed the advocacy work the EPTF accomplished in 2017. You can view the webinar here using your membership login.

The Evaluation Policy Task Force drafts evaluation policies for AEA; conducts outreach to advocate for evaluation to governments at the federal, state, and international levels; and interfaces with AEA members to obtain input on policies. The EPTF has provided an update and invited discussion at each AEA conference since its inception in 2007.
