American Evaluation Association Guiding Principles For Evaluators

2018 Updated Guiding Principles 
Revisions herein ratified by the AEA membership in August 2018. 

Below you will find the Guiding Principles for Evaluators in their entirety. Brochures of the Guiding Principles are available from AEA, free of charge, in both hardcopy and PDF. To obtain hardcopies, please contact the AEA office at 202-367-1166.

I. Preface

Purpose of the Guiding Principles: The Guiding Principles reflect the core values of the American Evaluation Association (AEA) and are intended as a guide to the professional ethical conduct of evaluators. 

Focus and Interconnection of the Principles: The five Principles address systematic inquiry, competence, integrity, respect for people, and common good and equity. The Principles are interdependent and interconnected. At times, they might even conflict with one another. Therefore, evaluators should carefully examine how they justify professional actions. 

Use of Principles: The Principles govern the behavior of evaluators in all stages of the evaluation from the initial discussion of focus and purpose, through design, implementation, reporting, and ultimately the use of the evaluation. 

Communication of Principles: It is primarily the evaluator's responsibility to initiate discussion and clarification of ethical matters with relevant parties to the evaluation. The Principles can be used to communicate to clients and other stakeholders what they can expect in terms of the professional ethical behavior of an evaluator. 

Professional Development about Principles: Evaluators are responsible for undertaking professional development to learn to engage in sound ethical reasoning. Evaluators are also encouraged to consult with colleagues on how best to identify and address ethical issues. 

Structure of the Principles: Each Principle is accompanied by several sub-statements that amplify the meaning of the overarching principle and provide guidance for its application. These sub-statements do not include all possible applications of that principle, nor are they rules that provide the basis for sanctioning violators. The Principles are distinct from Evaluation Standards and evaluator competencies. 

Evolution of Principles: The Principles are part of an evolving process of self-examination by the profession in the context of a rapidly changing world. They have been periodically revised since their first adoption in 1994. Once adopted by the membership, they become the official position of AEA on these matters and supersede previous versions. It is the policy of AEA to review the Principles at least every five years, engaging members in the process. These Principles are not intended to replace principles supported by other disciplines or associations in which evaluators participate. 

II. Glossary of Terms

Common Good - the shared benefit for all or most members of society including equitable opportunities and outcomes that are achieved through citizenship and collective action. The common good includes cultural, social, economic, and political resources as well as natural resources involving shared materials such as air, water, and a habitable earth. 

Contextual Factors  - geographic location and conditions; political, technological, environmental, and social climate; cultures; economic and historical conditions; language, customs, local norms, and practices; timing; and other factors that may influence an evaluation process or its findings. 

Culturally Competent Evaluator  - "[an evaluator who] draws upon a wide range of evaluation theories and methods to design and carry out an evaluation that is optimally matched to the context. In constructing a model or theory of how the evaluation operates, the evaluator reflects the diverse values and perspectives of key stakeholder groups."1

Environment - the surroundings or conditions in which a being lives or operates; the setting or conditions in which a particular activity occurs. 

Equity  - the condition of fair and just opportunities for all people to participate and thrive in society regardless of individual or group identity or difference. Striving to achieve equity includes mitigating historic disadvantage and existing structural inequalities. 

Guiding Principles vs. Evaluation Standards  - the Guiding Principles pertain to the ethical conduct of the evaluator, whereas the Evaluation Standards pertain to the quality of the evaluation.

People or Groups  - those who may be affected by an evaluation including, but not limited to, those defined by race, ethnicity, religion, gender, income, status, health, ability, power, underrepresentation, and/or disenfranchisement.

Professional Judgment  - decisions or conclusions based on ethical principles and professional standards for evidence and argumentation in the conduct of an evaluation. 

Stakeholders  - individuals, groups, or organizations served by, or with a legitimate interest in, an evaluation including those who might be affected by an evaluation. 

1 American Evaluation Association (2011). Public Statement on Cultural Competence in Evaluation. Washington, DC: Author. p. 3.

III. The Principles

A. Systematic Inquiry: Evaluators conduct data-based inquiries that are thorough, methodical, and contextually relevant. 

A1. Adhere to the highest technical standards appropriate to the methods being used while attending to the evaluation's scale and available resources. 

A2. Explore with primary stakeholders the limitations and strengths of the core evaluation questions and the approaches that might be used for answering those questions. 

A3. Communicate methods and approaches accurately, and in sufficient detail, to allow others to understand, interpret, and critique the work. 

A4. Make clear the limitations of the evaluation and its results.

A5. Discuss in contextually appropriate ways the values, assumptions, theories, methods, results, and analyses that significantly affect the evaluator's interpretations of the findings. 

A6. Carefully consider the ethical implications of the use of emerging technologies in evaluation practice.

B. Competence: Evaluators provide skilled professional services to stakeholders.

B1. Ensure that the evaluation team possesses the education, abilities, skills, and experiences required to complete the evaluation competently. 

B2. When the most ethical option is to proceed with a commission or request outside the boundaries of the evaluation team's professional preparation and competence, clearly communicate any significant limitations to the evaluation that might result. Make every effort to supplement missing or weak competencies directly or through the assistance of others.

B3. Ensure that the evaluation team collectively possesses or seeks out the competencies necessary to work in the cultural context of the evaluation. 

B4. Continually undertake relevant education, training, or supervised practice to learn new concepts, techniques, skills, and services necessary for competent evaluation practice. Ongoing professional development might include: formal coursework and workshops, self-study, self- or externally-commissioned evaluations of one's own practice, and working with other evaluators to learn and refine evaluative skills and expertise. 

C. Integrity: Evaluators behave with honesty and transparency in order to ensure the integrity of the evaluation. 

C1. Communicate truthfully and openly with clients and relevant stakeholders concerning all aspects of the evaluation, including its limitations. 

C2. Disclose any conflicts of interest (or appearance of a conflict) prior to accepting an evaluation assignment and manage or mitigate any conflicts during the evaluation. 

C3. Record and promptly communicate any changes to the originally negotiated evaluation plans, the rationale for those changes, and the potential impacts on the evaluation's scope and results. 

C4. Assess and make explicit the stakeholders', clients', and evaluators' values, perspectives, and interests concerning the conduct and outcome of the evaluation. 

C5. Accurately and transparently represent evaluation procedures, data, and findings. 

C6. Clearly communicate, justify, and address concerns related to procedures or activities that are likely to produce misleading evaluative information or conclusions. Consult colleagues for suggestions on proper ways to proceed if concerns cannot be resolved, and decline the evaluation when necessary. 

C7. Disclose all sources of financial support for an evaluation, and the source of the request for the evaluation. 

D. Respect for People: Evaluators honor the dignity, well-being, and self-worth of individuals and acknowledge the influence of culture within and across groups. 

D1. Strive to gain an understanding of, and treat fairly, the range of perspectives and interests that individuals and groups bring to the evaluation, including those that are not usually included or are oppositional. 

D2. Abide by current professional ethics, standards, and regulations (including informed consent, confidentiality, and prevention of harm) pertaining to evaluation participants. 

D3. Strive to maximize the benefits and reduce unnecessary risks or harms for groups and individuals associated with the evaluation. 

D4. Ensure that those who contribute data and incur risks do so willingly, and that they have knowledge of and opportunity to obtain benefits of the evaluation. 

E. Common Good and Equity: Evaluators strive to contribute to the common good and advancement of an equitable and just society. 

E1. Recognize and balance the interests of the client, other stakeholders, and the common good while also protecting the integrity of the evaluation. 

E2. Identify and make efforts to address the evaluation's potential threats to the common good especially when specific stakeholder interests conflict with the goals of a democratic, equitable, and just society. 

E3. Identify and make efforts to address the evaluation's potential risks of exacerbating historic disadvantage or inequity.

E4. Promote transparency and active sharing of data and findings with the goal of equitable access to information in forms that respect people and honor promises of confidentiality. 

E5. Mitigate the bias and potential power imbalances that can occur as a result of the evaluation's context. Self-assess one's own privilege and positioning within that context. 

More information about the work of the 2017-2018 Guiding Principles Task Force (GPTF) is available from AEA, including the updated Evaluators' Ethical Guiding Principles, a comparison of the 2004 Guiding Principles and the updated version, a summary timeline of the work done by the 2017-18 GPTF, a list of FAQs about the revised Guiding Principles, and a recording of the virtual Town Hall conducted with members of the GPTF.

Members of the 2017-2018 GPTF: 

  • Beverly Parsons, InSites (Chair)
  • Lisa Aponte-Soto, Mile Square Health Center
  • Lori Bakken, University of Wisconsin
  • Eric Barela
  • Leslie Goodyear, EDC, Inc.
  • Tom Kelly, Hawai'i Community Foundation
  • Michael Morris, University of New Haven
  • Kathy Tibbets, Lili'uokalani Trust
  • Valerie Williams, University Corporation for Atmospheric Research (UCAR)

For questions on the process of the GPTF, or on the 2018 updates to the Guiding Principles, please contact Natalie DeHart, AEA Staff.


2013 Guiding Principles Review and Report

From 2011-2013 the Guiding Principles Review Task Force worked to determine: 1) To what extent are the ethical standards of the evaluation profession, referring to the Guiding Principles, reflected throughout AEA's governance documents? And 2) What are member perceptions regarding the value and use of the Guiding Principles?

The full 2013 report is available from AEA; for background on the review process and task force membership, see page 10 of the report.


Guiding Principles Historical Background

In 1986, the Evaluation Network (ENet) and the Evaluation Research Society (ERS) merged to create the American Evaluation Association. ERS had previously adopted a set of standards for program evaluation (published in New Directions for Program Evaluation in 1982), and both organizations had lent support to the work of other organizations on evaluation guidelines. However, none of these standards or guidelines were officially adopted by AEA, nor were any other ethics, standards, or guiding principles put into place. Over the ensuing years, the need for such guiding principles was discussed by both the AEA Board and the AEA membership. Under the presidency of David Cordray in 1992, the AEA Board appointed a temporary committee chaired by Peter Rossi to examine whether AEA should address this matter in more detail. That committee issued a report to the AEA Board on November 4, 1992, recommending that AEA should pursue this matter further. The Board followed that recommendation, and on that date created a Task Force to develop a draft of guiding principles for evaluators. The task force members were:

William Shadish, Memphis State University (Chair)
Dianna Newman, University at Albany/SUNY
Mary Ann Scheirer, Private Practice
Chris Wye, National Academy of Public Administration

The AEA Board specifically instructed the Task Force to develop general guiding principles rather than specific standards of practice. Their report, issued in 1994, summarized the Task Force's response to the charge.

Process of Development. Task Force members reviewed relevant documents from other professional societies, and then independently prepared and circulated drafts of material for use in this report. Initial and subsequent drafts (compiled by the Task Force chair) were discussed during conference calls, with revisions occurring after each call. Progress reports were presented at every AEA board meeting during 1993. In addition, a draft of the guidelines was mailed to all AEA members in September 1993 requesting feedback; and three symposia at the 1993 AEA annual conference were used to discuss and obtain further feedback. The Task Force considered all this feedback in a December 1993 conference call, and prepared a final draft in January 1994. This draft was presented and approved for membership vote at the January 1994 AEA board meeting.

Resulting Principles. Given the diversity of interests and employment settings represented on the Task Force, it is noteworthy that Task Force members reached substantial agreement about the following five principles. The order of these principles does not imply priority among them; priority will vary by situation and evaluator role.

A. Systematic Inquiry: Evaluators conduct systematic, data-based inquiries about whatever is being evaluated.

B. Competence: Evaluators provide competent performance to stakeholders.

C. Integrity/Honesty: Evaluators ensure the honesty and integrity of the entire evaluation process.

D. Respect for People: Evaluators respect the security, dignity and self-worth of the respondents, program participants, clients, and other stakeholders with whom they interact.

E. Responsibilities for General and Public Welfare: Evaluators articulate and take into account the diversity of interests and values that may be related to the general and public welfare.

Recommendation for Continued Work. The Task Force also recommended that the AEA Board establish and support a mechanism for the continued development and dissemination of the Guiding Principles, to include formal reviews at least every five years. The Principles were reviewed in 1999 through an EvalTalk survey, a panel review, and a comparison to the ethical principles of the Canadian and Australasian Evaluation Societies. The 2000 Board affirmed this work and expanded dissemination of the Principles; however, the document was left unchanged.

Process of the 2002-2003 Review and Revision. In January 2002 the AEA Board charged its standing Ethics Committee with developing and implementing a process for reviewing the Guiding Principles that would give AEA’s full membership multiple opportunities for comment. At its Spring 2002 meeting, the AEA Board approved the process, carried out during the ensuing months. It consisted of an online survey of the membership that drew 413 responses, a “Town Meeting” attended by approximately 40 members at the Evaluation 2002 Conference, and a compilation of stories about evaluators’ experiences relative to ethical concerns told by AEA members and drawn from the American Journal of Evaluation. Detailed findings of all three sources of input were reported to the AEA Board in A Review of AEA’s Guiding Principles for Evaluators, submitted January 18, 2003.

In 2003 the Ethics Committee continued to welcome input and specifically solicited it from AEA’s Diversity Committee, Building Diversity Initiative, and Multi-Ethnic Issues Topical Interest Group. The first revision reflected the Committee’s consensus response to the sum of member input throughout 2002 and 2003. It was submitted to AEA’s past presidents, current board members, and the original framers of the Guiding Principles for comment. Twelve reviews were received and incorporated into a second revision, presented at the 2003 annual conference. Consensus opinions of approximately 25 members attending a Town Meeting are reflected in this, the third and final revision that was approved by the Board in February 2004 for submission to the membership for ratification. The revisions were ratified by the membership in July of 2004.

The 2002 Ethics Committee members were:

Doris Redfield, Appalachia Educational Laboratory (Chair)
Deborah Bonnet, Lumina Foundation for Education
Katherine Ryan, University of Illinois at Urbana-Champaign
Anna Madison, University of Massachusetts, Boston

In 2003 the membership was expanded for the duration of the revision process:

Deborah Bonnet, Lumina Foundation for Education (Chair)
Doris Redfield, Appalachia Educational Laboratory
Katherine Ryan, University of Illinois at Urbana-Champaign
Gail Barrington, Barrington Research Group, Inc.
Elmima Johnson, National Science Foundation