By Denise Roosendaal
Are you taking advantage of your membership benefits? One important benefit is learning from your colleagues and finding opportunities to share knowledge. AEA members come to the organization from a variety of places. While the major groupings are not surprising, the variety of other sources of members may surprise some readers. How do you connect with members from different types of organizations? According to the 2016 Membership Survey, the second most commonly cited reason for belonging to AEA was to “network and build relationships.”
Primary Work Setting
Here are some other aspects of membership benefits that you might find interesting:
TIG Membership: Your AEA membership allows you to join up to five of the 56 Topical Interest Groups. The AEA membership survey illustrated the popularity of TIGs: 75 percent of respondents reported finding them a useful resource. Sadly, 70 percent of members who do not renew their membership never joined a TIG, missing out on one of the great benefits of membership. Joining a TIG is easy and can be managed here. Simply log in and join the TIGs that best suit your interests. To join a TIG, click 'Join' next to the TIG name. Your request will be queued for approval to ensure you have not exceeded the limit of five, and you will be notified when your request is approved. To leave a TIG to which you are already assigned, click 'Member' next to the TIG name, and then click the 'Leave Community' button on the following page.
University Programs: AEA has developed a new program to encourage participation by individual universities. The University Membership Program allows a university to offer one free AEA membership for attracting and maintaining 13 student memberships. AEA also wants to promote all university programs in evaluation, so a webpage is dedicated to listing those programs and their associated details. If you are part of a university with an evaluation program, be sure it is listed here. Contact Zachary Grays to update or add your information.
Find-an-Evaluator Feature: The AEA website’s Find-an-Evaluator feature is one of its most popular. It allows you and your organization to be found by people outside of AEA who may be searching for an evaluator. This enhanced, members-only benefit allows prospective evaluation users to search for you by name, location, or self-selected keywords. Be sure you’re being seen by potential evaluation users by updating your listing here.
Membership Profiles: AEA members can update their personal profiles with contact information and a photo. According to the website statistics, the member profile search is a popular feature, and it allows members to personalize their information. The search capability is a great way to find fellow AEA members. Sadly, many members do not take full advantage of the member profile feature. If you have questions about updating your own member profile, contact Zachary Grays or Katherine Vogelsang.
Social Media: AEA is active on multiple social media platforms, including Twitter, Facebook, and LinkedIn. These platforms provide quick access to association news; insightful articles on evaluation practices, tips, and tools; and opportunities for members to connect around common interests.
If you’re not already taking advantage of these membership benefits, I encourage you to explore them!
From Zachary Grays, AEA Staff
Deadline: April 1, 2017
Program Purpose and Goals
The Graduate Education Diversity Internship Program provides paid internship and training opportunities during the academic year. The GEDI program works to engage and support students from groups traditionally under-represented in the field of evaluation. The goals of the GEDI Program are to:
- Expand the pool of graduate students of color and from other under-represented groups who have extended their research capacities to evaluation.
- Stimulate evaluation thinking concerning under-represented communities and culturally responsive evaluation.
- Deepen the evaluation profession’s capacity to work in racially, ethnically and culturally diverse settings.
Interns may come from a variety of disciplines including public health, education, political science, anthropology, psychology, sociology, social work, and the natural sciences. Their commonality is a strong background in research skills, an interest in extending their capacities to the field of evaluation, and a commitment to thinking deeply about culturally responsive evaluation practice.
The interns gather for the first time for three days in late August or early September to receive an orientation to the program, to evaluation, and to culturally responsive evaluation practice. The students attend the AEA Annual Conference for a full week in the fall, where they attend pre-conference workshops and multiple education sessions. They also meet with the program director to give updates on their evaluation-focused service learning projects.
The students gather again in January or February for three days to receive further training as well as coaching and feedback on their progress related to their internship projects. They gather for a final time for four days in June at the AEA Summer Institute to present and receive feedback on their final projects, attend the Institute, and take part in a commencement ceremony. Concurrently, they participate in a 10-month site-based internship placement that provides them with real-time, hands-on practice in evaluation skills.
Program and Leadership History
Duquesne University (2003-09)
- Rodney Hopson (Director)
- Prisca Collins (Coordinator)
- Tanya Brown (Coordinator)
University of North Carolina, Chapel Hill (2009-11)
- Rita O’Sullivan and Michelle Jay (Co-Directors)
Claremont Graduate University (2011-present)
- Stewart Donaldson (Director)*
- Katrina Bledsoe (Co-Director)
- Ashaki M. Jackson (Co-Director)*
- John LaVelle (Coordinator)
- Richard Dowlat (Coordinator)*
*Denotes those currently serving
Current leadership proposes basing the program at strong academic sites where evaluation programming is offered and resources can be shared to support the site as a home base.
Program Director Role and Responsibilities
The program director role is fundamental to achieving the present and future goals of AEA’s GEDI internship program. Therefore, it requires a deep investment in developing interns while consistently meeting specified program objectives. To ensure operational excellence, past program directors have elected to designate a program co-director to assist with administrative tasks, cohort oversight, and maintaining relationships with the host sites. This assistance is highly recommended.
The program director is responsible for the following key facets of the internship program:
- Overseeing the curriculum: As program director, thought leadership is indispensable. Therefore, the program director will build on the existing curriculum to refine a curricular framework for the program that spans the two stand-alone trainings and is supplemented by thoughtful workshop and session selection for the conference and Summer Institute.
- Facilitating training: The program director will provide introductory evaluation training and coaching to the interns during the two stand-alone trainings and any conference-call follow-ups, and will work with AEA staff to arrange for outside speakers and facilitators to supplement these offerings. For example, the program director might facilitate or coach approximately 50 percent of the opening workshop training and 25 percent of the winter training, with the director’s content supplemented during the training retreats by site visits, discussion groups, and presentations by other facilitators. Presumably, co-directors would together facilitate a larger percentage, drawing on the expertise of two leaders.
- Serving as coach: While the program director is not responsible for serving as a mentor per se, their investment in helping students move through the program and connect with other professionals and resources is key to the success of the internship program. The program director will thus guide interns through the program via monthly conference calls and regular email exchanges.
- Serving as host: The program director is the champion for fostering strong relationships with interns. Therefore, the program director serves as host at all four events and will be responsible for connecting with the interns, welcoming them, and encouraging networking and professional growth.
AEA staff will be responsible for the operational aspects of the program, including overseeing student recruitment, handling logistics for all meetings, and funds management. The AEA staff and program director will work collaboratively on selecting and securing placement sites. The program director/co-director will work collaboratively with the AEA executive director and staff to set policies regarding recruitment, student selection, and general guidance for the program. The extent to which the director participates in the selection and advisory process is at the director’s discretion.
Program Director Criteria
The program director should meet the following criteria:
- Have completed a master’s or doctoral degree and be teaching or practicing in the field of evaluation
- Have experience of successful teaching or training in the field of evaluation or related areas
- Possess knowledge of the needs and experiences of traditionally underrepresented students and/or students of color
- Have knowledge of culturally responsive evaluation practices
- Have the ability to guide or coach students or young professionals
- Be available to attend four training opportunities each year for the next two years as described above
- Be affiliated with an academic institution with a strong evaluation program/component
This is primarily a service-focused position. The program director(s) will receive travel support (airfare, accommodations, and registration) for the Summer Institute and for either of the stand-alone trainings not held within driving distance. The program director(s) will also be reimbursed for a hotel for up to three nights at the Annual Conference, recognizing the need to arrive early or stay late in support of the student interns. In addition, the program director will receive a stipend of $20,000 in recognition of the facilitation work involved in leading this program.
In most recent years, the director has also created opportunities for members of their internal team to participate in the process to learn how the program works and to offer administrative support and guidance.
Program Co-Director Role and Responsibilities
The program co-director plays an integral role in ensuring that the vision and mission of the American Evaluation Association are realized. The role is designed not only to assist the program director with key administrative functions but also to serve as a conduit for the continued growth of the program. The position is funded through a total stipend of $5,000. No additional funding is provided by AEA.
The following are the key responsibilities for the program co-director:
Serving as key liaison: The program co-director will maintain direct communication with supervisors throughout the year regarding program updates and upcoming activities, including in-person meetings when requested. The individual will also distribute monthly announcements to the GEDI community about pertinent news as well as job and grant opportunities. The program co-director will frequently check in with scholars to assess their initial program experience and to answer questions and requests throughout the year. As liaison, the program co-director will work diligently to ensure that all concerns from the field are promptly addressed. The program co-director will also be asked to deliver a speech at the yearly winter conference and to introduce a variety of conference speakers, including theorists, scholars, and practitioners.
Coordinating meetings: A key responsibility of the program co-director will be planning and coordinating GEDI webinars, seminars, and conferences throughout the year. This includes, but is not limited to, logistical preparations such as identifying, inviting, and hosting GEDI partners to instruct webinars, as well as creating itineraries, taking notes, and disseminating information about previously recorded and upcoming meetings. The program co-director will also foster continued relationships with AEA management, instructors, and host sites.
Creating value: The program co-director will have the opportunity to create value for the yearly conference by attending workshops and recommended sessions, collecting sample handouts, and networking with potential conference instructors. The program co-director will also meet with scholar groups socially to discuss pre-determined topics.
Setting up online systems: Core to the administrative functions will be continuously updating submission and return deadlines, as well as ensuring the distribution of comprehensive bi-weekly program messages with resources. The program co-director will also be essential to maintaining the electronic portfolio for each scholar, which will include copies of all group and individual deliverables to be distributed by the program’s end.
Ensuring excellence: The program co-director will distribute key deliverable instructions and explain them to the scholars verbally. This will include reading and rigorously editing all drafts for excellence in content logic, quality, and clarity. He or she will also take the initiative to coordinate a team of readers if the deliverables cannot otherwise be reviewed in a timely manner.
Serving in these roles comes with a great deal of responsibility and is seen as an invaluable contribution to the evaluation profession, to AEA, and to expanding awareness of and opportunity for diverse and underrepresented communities. Regarding compensation, the program director will receive $20,000, payable in two (2) installments of $10,000. The program co-director will receive $5,000, payable in two (2) installments of $2,500. The first stipend installment will be paid at the beginning of each cohort and the second upon successful completion of each program year.
To apply on or before April 1, 2017, please submit a letter of interest and a curriculum vitae or résumé via email to email@example.com. The letter should be no more than three pages in length and should detail:
- the ways in which you meet each of the six specified criteria;
- why you are interested in leading this program;
- any unique qualities, experiences, or background that you bring to this opportunity that would further enhance your ability to fulfill the role; and
- any specific support your place of employment/institution plans to offer you in this endeavor.
Questions? Please contact Denise Roosendaal, AEA executive director, at firstname.lastname@example.org or (202) 367-1166.
From Sheila B. Robinson, Potent Presentations Initiative Coordinator
Here are nearly two dozen ways to captivate your audience from the first moment:
- Tell a story
- Ask a thought-provoking question
- State a shocking statistic
- Share a powerful quote
- Show a gripping photo
- Use a prop or creative visual aid
- Play a short video
- Present a “what if” or an “imagine” scenario
- Let the room go silent
- Share a brief overview or outline of your presentation
- Give a dramatic example
- Connect personally: share your personal view or a personal story or anecdote
- Use humor
- Give an expert opinion
- Use a relevant sound effect
- Share a testimony or success story
- Refer to a current event
- Refer to a historical event
- Refer to a well-known person
- Refer to a recent, relevant conversation you have had
- Quote from recent research
- Open with a problem to be solved
- Get the audience talking to each other
For more information on these and a few additional nuggets of wisdom, check out these articles on opening your presentation:
- 7 Brilliant Ways to Start Any Presentation
- 7 Memorable Ways to Open A Speech Or Presentation
- 4 ways to Start a Speech Strongly
- How to Start a Speech – 12 Foolproof Ways to Grab Your Audience
- 15 Ways to Start a Speech + Bonus Tips
Can you think of other ways to start?
We need your help!
- Have you successfully used p2i tools or p2i principles in your presentations?
- Do you have “before” and “after” slide examples you would be willing to share?
- Do you have ideas for, or are you interested in, writing a blog article on Potent Presentations?
- Do you have an interest in sharing your tips for Potent Presentations through a brief video or webinar?
Please contact me at email@example.com and let’s talk! I’m happy to help, offer guidance, or collaborate on any of these.
Valerie Caracelli has been a professional evaluator for over 30 years. Much of her career has been at the U.S. Government Accountability Office (GAO), where she works in the Applied Research and Methods Team, in the Center for Evaluation Methods and Issues. In addition to conducting congressionally requested studies with Stephanie Shipman, she consults with other GAO Teams on evaluation design issues and assists in examining the quality of evaluation studies.
She has been a member of the American Evaluation Association since its founding. With Jennifer Greene, her scholarly interests in mixed methods resulted in several articles that framed mixed methods evaluation designs (see, for example, their co-edited volume Advances in Mixed-Method Evaluation, NDE no. 74). She has served as chair of the Evaluation Use TIG, now known as the Use and Influence of Evaluation TIG, and maintains active engagement. In 2000, she and Hallie Preskill conducted a survey of evaluators about evaluation use. This led to a New Directions for Evaluation volume (NDE no. 88), “The Expanding Scope of Evaluation Use.”
Valerie has also served as an informal ambassador for AEA. In 2002, she was an invited keynote speaker at the annual conference of the Associazione Italiana di Valutazione, and in 2005, she gave invited addresses/workshops at the Australasian Evaluation Society conference, Queensland Auditor-General's Office, and in Wellington, New Zealand. As a member of the AEA Board of Directors from 2007-2009, Valerie served as liaison to the Ethics Committee and was deeply involved in the development of additional case study training materials on the Guiding Principles for Evaluators. Valerie has been active in the Washington Evaluators, one of AEA's first local affiliates, for over 20 years, where she has served as president, secretary, board member, program chair and currently as a program committee member. An interest in evaluation quality, metaevaluation, and evaluation synthesis, shared with Leslie Cooksy, has led to several co-authored articles, most recently in NDE 138. Among other roles at conference sessions, Valerie has served AEA as co-chair of the local arrangements committee for D.C.-based conferences and 2010 conference program co-chair. Like many of her peers, she has served on a variety of editorial boards, including Evaluation Practice/American Journal of Evaluation, New Directions for Evaluation, and currently for Evaluation Review. She currently serves on AEA's Oral History Project Team, which publishes articles in AJE on the professional development of scholars who have enlightened evaluation theory and practice. Valerie received her undergraduate degree in psychology in 1979 and her Ph.D. in developmental psychology from Cornell University in 1988.
INTEGRITY. AEA Statement: “We value high quality, ethically defensible, culturally responsive evaluation practices…” Integrity is our bedrock; it undergirds all that we do. It provides confidence in our work, particularly when we deliver defensible findings that may not be what the client would have preferred to hear. Without integrity we are no more than public relations professionals packaging a product to appease a client. I am proud to provide a service that meets the request-for-proposal requirement, common to many solicitations, of including a “neutral, third-party” evaluator.
CLARITY. AEA Statement: “The American Evaluation Association values…utilization of evaluation findings.” A report that meets a requirement and goes unread has little value. The AEA TIGs concerning utilization and data visualization have reinforced our responsibility to not only write competent reports, but to write reports that people want to read and that provide salient implications for their practice. I take great pride when customers return to write new proposals with me because of the usefulness of the reports that we have written for them on previous projects.
LOCALITY. AEA Statement: “We value inclusiveness and diversity.” Independent evaluation consultants are a diverse group, and I treasure our differences. We include international diversity experts, community psychologists, quantitative experts, and qualitative analysts. I have learned that if you carefully identify a market need, you can be highly specialized in a small niche…and this can be a very successful endeavor. For my part, I serve the evaluation and data needs of the educational community in Southwestern Illinois (across the Mississippi River from St. Louis). Beyond evaluation and educational methodology, I define myself by my professional devotion to this distinct Illinois community.
CONTINUITY. AEA Statement: “We value the continual development of evaluation professionals…” Continual development of ourselves, our projects, and our relationships with clients and one another follows logically from the three previous concepts. Through integrity, clarity of reporting, and commitment to our niche (defined as locality for me), our relationships develop, mature, and provide new opportunities. It is almost magical how this happens, because it is difficult to explain. In my experience, the best evaluators don’t market themselves heavily; a successful evaluator succeeds through dedication to their personal and professional values.
AEA is a welcoming community of evaluators who are deeply committed to the application and exploration of many forms of evaluation. I fondly remember my first AEA conference and how struck I was by the affable nature and generosity of members who shared their time, knowledge, and expertise with me. As an AEA member, one never feels alone when wrestling with an evaluation issue (e.g., discerning ways to increase evaluation use in varied contexts), as an entire professional evaluation community is accessible in a variety of ways. In my ongoing effort to help demystify what evaluation is, I find the AEA website one of my most useful tools for introducing individuals to the field, especially those who are unfamiliar with evaluation or reluctant to engage in it. They are often surprised by the expansiveness and importance of the field when made aware of AEA documents such as AEA’s Guiding Principles for Evaluators and the Evaluation Roadmap for a More Effective Government.
What’s the most memorable or meaningful evaluation you have been a part of?
The evaluation of a faculty development program for online course re/design, offered through a center for teaching and learning (CTL) at an urban research university, is perhaps the most memorable and meaningful evaluation I have been a part of. It is memorable because an exploratory evaluation approach (i.e., evaluability assessment) was used to assess the extent to which a highly visible program was ready for future evaluation. Accordingly, a program analysis was undertaken as a first step to help intended users and key stakeholders clarify the program’s design, goals, and intended outcomes, and to agree upon the evaluation’s purpose, its focus, and the intended uses of evaluation information. The evaluability assessment was especially meaningful and valuable for me as the evaluator because the approach taken to get ready for program evaluation became an exemplar that cross-pollinated into other higher education contexts (i.e., multidisciplinary settings within academe and, externally, the National Collegiate Athletic Association). During the process, key primary stakeholders began to recognize that program evaluation, assessment, research, and performance measurement serve different purposes. Stakeholders also became aware that a program’s life cycle, its design, and the questions asked inform the evaluation’s design.
In preparation for the faculty development program evaluation, dynamic and iterative evaluation planning models (i.e., Navigational Map, Front-end Evaluation Planning Framework) were introduced as quick references to help build evaluation capacity and encourage evaluative thinking among primary stakeholders. The ensuing evaluability assessment was highly collaborative: it involved collecting information and reviewing documents to identify the program’s multiple components, and it facilitated discussions and interviews with program staff who were familiar with the program’s design. This work resulted in a conceptual framework, the stakeholders’ program theory, a logic model, and an evaluable model used to determine a set of evaluation options. It culminated in a Utilization-Focused Evaluation (UFE) plan, which identified what data to collect, the data collection instruments to be developed, and the data analysis to be conducted, all of which were necessary to receive the IRB approval needed to carry out the evaluation and disseminate its findings.
What advice would you give to those new to the field?
Below are the advice, resources, and references I would share with those new to the field of evaluation:
- Regularly consult basic AEA evaluation documents such as: 1) AEA’s Guiding Principles for Evaluators, 2) Evaluation Roadmap for a More Effective Government, 3) AEA Evaluator Competencies, and 4) American Evaluation Association Statement On Cultural Competence In Evaluation.
- Be mindful to not assume that a policy or program is conceptualized or designed well. Policies and/or programs are often complex and/or poorly defined—especially, in government, nonprofit sectors, and education related contexts (e.g., secondary and postsecondary education, professional development, re/training and/or corporate learning programs, etc.).
- Consider conducting an evaluability assessment to determine whether a program is ready for an evaluation. Front-end evaluation planning is key to successful evaluations. Time invested at the front-end yields invaluable returns when analyzing data and averts garbage-in / garbage-out (GIGO).
- Conceptualization generates powerful graphic tools that describe the main things to evaluate. Conceptual frameworks and document models are underutilized tools that can make a world of difference in the quality of an evaluation. A complete understanding of what is to be evaluated helps get key stakeholders on the same page. There are a variety of conceptualization methods and tools available; learn about them and use them. Equally, there are many theories of and approaches to evaluation; learn about them as well. Remember, it is important to choose an evaluation design that can be realistically undertaken.
- Primary intended users of evaluation information should agree upon questions and their purpose—at the beginning. Evaluation questions reflect an evaluation’s purpose.
- Always, always, always stay abreast of evaluator competencies. Professional development is important! As such, continually seek out opportunities to stretch and grow professionally—even when it is inconvenient or feels uncomfortable.