Dear AEA Colleagues,
Wow! In only six weeks, we’ll be gathering in Denver for our 2014 conference!
While the AEA staff under Denise Roosendaal’s leadership have been busy working out the conference schedule and many logistics, Matt Keene, Lovely Dhillon (the conference program chair and co-chair, respectively), and I have been busy developing the presidential strand, drawing on recommendations from the leaders of the Topical Interest Groups. I’d like to expand on what this year’s conference theme is all about, and provide you with more detail on the presidential strand sessions.
As you recall, this year’s conference theme is Visionary Evaluation for a Sustainable, Equitable Future.
What do we mean by a sustainable, equitable future? There is no single definition; we each have our own. I envision a desired future as one in which we are able to creatively balance our economic, social, and natural environments to provide a healthy world for our current population, as well as for generations to come.
Also, visionary evaluation is not a particular method – it is shorthand for encouraging evaluators to pay special attention to creative twists they can put on their work to support movement toward a desired future.
We hope that you will leave the conference with both a renewed sense of what a sustainable, equitable future means to you and how you can use a visionary approach to evaluation to contribute to that future. With that in mind, we have a bevy of educational sessions for you to partake in at the conference.
I’d like to focus on the presidential strand. This strand is composed of two sections: presentation sessions and discussion sessions. The presentation sessions feature presentations followed by questions and answers with the presenters. The sessions give examples of initiative/program evaluations that seek to contribute to a sustainable, equitable future. They tie to the plenary sessions and tend to build from examples to theory as the week progresses.
The discussion sessions feature discussion in a variety of formats, from think tanks to World Cafés, in which the larger group is divided into small groups of four or five people for multiple rounds of intimate conversation focused on questions of importance. The sessions encourage participants to share their evaluation examples and create their own visionary evaluation for supporting a sustainable, equitable future relevant to the context in which they work. These sessions provide special opportunities to network and build new relationships with other conference attendees.
Both the presentation and the discussion sessions focus on one or more of the three areas of emphasis in the conference theme: systems thinking, building relationships, and equitable and sustainable living. They also prepare you to be an active participant in the closing plenary about how AEA can support visionary evaluation for a sustainable, equitable future for all.
I encourage you to learn more about the conference and the presidential sessions by visiting our website.
Also, I’d like to ask for your input on items to include in the annual meeting agenda. Please send those items to me at email@example.com.
Finally, I’d like to congratulate our newly elected board members. I am confident that AEA has a bright future under their leadership. Stay tuned for more information on the new AEA directors in upcoming newsletters.
I hope to see you at Evaluation 2014!
AEA 2014 President
Are you familiar with AEA's values statement? What do these values mean to you in your service to AEA and in your own professional work? Each month, we'll be asking a member of the AEA community to contribute her or his own reflections on the association's values.
AEA's Values Statement
The American Evaluation Association values excellence in evaluation practice, utilization of evaluation findings, and inclusion and diversity in the evaluation community.
i. We value high quality, ethically defensible, culturally responsive evaluation practices that lead to effective and humane organizations and ultimately to the enhancement of the public good.
ii. We value high quality, ethically defensible, culturally responsive evaluation practices that contribute to decision-making processes, program improvement, and policy formulation.
iii. We value a global and international evaluation community and understanding of evaluation practices.
iv. We value the continual development of evaluation professionals and the development of evaluators from under-represented groups.
v. We value inclusiveness and diversity, welcoming members at any point in their career, from any context, and representing a range of thought and approaches.
vi. We value efficient, effective, responsive, transparent, and socially responsible association operations.
Being asked to reflect on AEA’s Values Statement provided an opportunity for me to re-examine the personal and professional journeys that have brought me to this moment. Certainly, if nothing else, this experience has reaffirmed why – of all the possible areas of study, professions, and career choices that were available to me – I chose evaluation.
The Association’s values resonate with those that shaped my early development and, now, evaluation practice. I grew up in an environment where five different languages were spoken, where two sets of religious beliefs and cultural practices competed with each other, and where there was great diversity in terms of physical and cognitive abilities. The seeds of sympathy, flexibility, curiosity, skepticism, and resolve were sown in this space. Over time, these values and virtues became principles that I emphasize in all aspects of my professional life, especially how I go about conducting evaluation – to strive for excellence, to establish and maintain transparency, and through these efforts to engender a healthy sense of curiosity and openness to continuous learning. (Not to mention evaluation also provides a structured and dynamic space where learning opportunities are built in. This is incredibly appealing to someone with a relatively low threshold for boredom like myself.)
As an “n” of 1, the experiences that give shape and texture to my journey are not representative. Still, these moments of reflection led me to believe that while there may be “accidental evaluators,” perhaps the “accident” isn’t an accident in the truest sense of the word. Rather, we pursue our personal and professional interests based on what our sense of purpose is; what resonates with our worldviews and experiences; what appeals to us intellectually and emotionally; and what we value and are willing to endure and suffer through – our passion.
When I was about six years old, my dad asked a question that I am certain many of us were asked as children: “What do you want to be when you grow up?” I remember sitting in his lap, winding up a plastic, golf ball-sized yellow chick with black, beady eyes and an orange beak. It was the kind of flea market toy that would chirp and fall over in a failed attempt to hop mechanically as soon as you set it down. With his chin resting on top of my fresh bowl cut, I told him I wanted to be a doctor. “Why?” Because I want to help people.
Over 20 years later, I found myself sitting next to Marv Alkin, my graduate advisor, at an oval, honey-colored conference table in his office. We had been talking about possible classes I would take over the next year and a half and how I planned to direct several evaluation projects that he had assigned to me. I was halfway through my capacity-building plan for a university-run college access program when Marv asked, “What do you want to be when you grow up?” Taken aback, I sank into the 1970s-era light-blue bucket chair and took a moment. What do I want to be when I grow up? I want to be a professor and evaluator. “Why?” Because I want to find ways to help people do what they are doing better.
A longstanding challenge for evaluators is alleviating the anxiety that the prospect of an evaluation conjures among some stakeholders. My experience and instinct suggest that explicitly incorporating elements of AEA’s Values Statement into interactions where these feelings must be negotiated can be quite impactful. My hope is that, as the field continues to grow, evaluators will be better able to communicate and share what drives our work – our passion – with others. And, as a result, that those outside of this field will genuinely value, rather than fear, evaluation and what it can contribute.
Anne T. Vo is associate director of the University of California Educational Evaluation Center at UCLA. As an evaluation researcher, Dr. Vo’s substantive interests lie at the intersection of comparative evaluation theory, evaluation capacity building, and organizational development. Her work contributes to the field's understanding of how evaluation can be practiced better; where and how social science theory and evaluation science dovetail into each other; and how this knowledge can be leveraged to drive change. She has taught graduate-level courses on research methodology and design, as well as special topics seminars in evaluation. Dr. Vo has published in journals such as the “American Journal of Evaluation,” “Evaluation and Program Planning,” and “New Directions for Evaluation.” She also serves as co-editor of the “American Journal of Evaluation” section on teaching evaluation and is co-director of the Southern California Evaluation Association.
Name: Bernadette Wright
Affiliation: Meaningful Evidence, LLC
Degrees: Ph.D. and master’s in public policy, evaluation, and analytical methods (University of Maryland, 2002 and 1997); B.A. in sociology and public policy (University of Maryland – Baltimore County, 1992)
Years in the Evaluation Field: 17
Joined AEA: 1999
Why do you belong to AEA?
I belong to AEA for several reasons: (1) To connect with others in the evaluation community at the AEA conference, on the EVALTALK discussion list, in the AEA LinkedIn group, and in the Topical Interest Groups; (2) to keep up with new developments and trends in the field through the Coffee Break webinars and evaluation journals; and (3) to be part of advancing best practices in evaluation through the AEA Guiding Principles for Evaluators, Statement on Cultural Competence, and other AEA public statements.
Why do you choose to work in the field of evaluation?
I’ve always wanted to help create more rational, efficacious, and equitable approaches to solving problems in our world. I’m idealistic enough to believe that evaluation can provide information that people can use to create better solutions to social problems.
What's the most memorable or meaningful evaluation that you have been a part of?
I’ve worked on many evaluations that meant a lot to me, but I’d say the most memorable is my doctoral dissertation. It was an evaluation of an interracial dialogue/action group program on racism in the Baltimore region. I was able to provide the organization sponsoring the program with useful information on stakeholders’ goals and recommendations for future changes to the curriculum. In addition, I learned a great deal about the nature of the problem of racism and the need for action, existing efforts to address it, and what can be done in policy and research to overcome it.
What advice would you give to those new to the field?
Never lose your idealism.
It was exactly one year ago that AEA ushered in the 13th Minority Serving Institution (MSI) cohort. Under the direction of Dr. Art Hernandez, MSI fellows Dr. Tamara Bertrand-Jones; Professor Denise Gaither-Hardy; Andrea Guajardo, MPH; Dr. Ana Pinilla; and Dr. Edilberto Raynes will complete the MSI Fellowship requirements when they gather one last time at Evaluation 2014 to present their final projects. This year’s cohort has been on an incredible journey that they will not soon forget. It has been an absolute pleasure getting to know these talented fellows. I encourage you all to do the same here, on the AEA website.
This year’s cohort has taken a special interest in the advancement of the initiative and its future as a prime resource for faculty members seeking to expand their evaluation knowledge. Outside of the rigor of the program, they have worked together to develop experiential additions that will give potential fellows the opportunity to see first-hand what it is like to participate. Though many of these additions are still in development, we are excited to introduce a brand new Frequently Asked Questions list developed by this year’s cohort to guide MSI prospects. The group developed the FAQs as the first of many steps to connect past and future cohorts and to promote better preparedness for the program. The group will soon be at work on a new MSI toolkit for incoming fellows, in hopes of extending the olive branch to future cohorts and sharing how the program has contributed to their successful evaluation work at their respective institutions.
The aim of the MSI Fellowship Initiative is to recruit faculty from minority-serving and underrepresented institutions and to train and develop their program evaluation skills and competencies. These institutions often lack the resources to facilitate such significant knowledge dissemination and professional development support for their faculty and students. The MSI cohort year is rigorous and rewarding, with fellows participating in a multitude of webinars and culminating activities to meet the requirements of the initiative. The cohort year ends with the fellows’ final presentations at the AEA annual meeting, where they “graduate” into the ranks of the more than 50 MSI alumni. “According to all associated with it, the AEA MSI program has had a significant impact on the fellows, which has benefited students, programs, and colleagues alike,” says Art Hernandez, program director for over five years and a former MSI fellow. “Fellows have been and continue to be involved in efforts related to the Statement on Cultural Competence, as well as individual TIG efforts, and are at the forefront of engaging in and promoting culturally competent practice. AEA has demonstrated its strong commitment to diversity and cutting edge practice through its ongoing support for the AEA MSI Faculty Fellows Initiative, and I for one am honored and pleased to be an alumnus.”
We are now seeking applications from those interested in participating in the 2014-2015 MSI Faculty Initiative. Those interested should click here to apply. Evaluation 2014, in Denver, is less than two months away. I encourage you all to attend the cohort’s session, Reflections on Program Evaluation Experiences of a New MSI Cohort, during which the fellows will share and reflect on their experiences during the fellowship year. The group’s work will be their legacy and contribution to the next cohorts. Dr. Joann Yuen, a former MSI fellow, once told me that the MSI program gave her wings. This year’s cohort is certainly ensuring that cohorts to come may also earn their wings.
While keeping its primary focus on evaluation policy at the U.S. federal level, the EPTF has recently expanded its scope of work to include evaluation policy issues at the international level. I start with a reminder that 2015 is the International Year of Evaluation, or EvalYear. The upcoming AEA conference will provide opportunities to hear how AEA will join in this effort to advocate for and promote evaluation and evidence-based policymaking at all government levels. Among other things, EvalYear is intended to raise awareness of the importance of embedding evaluation systems in the development and implementation of goals, thereby better positioning evaluation in the policy arenas of countries.
Even before expanding its focus, the EPTF has dealt with US federal evaluation policy with international implications. For example, the AEA “Roadmap for a More Effective Government” was used by the US Agency for International Development (USAID) as it developed a new policy for a multiyear effort to expand evaluation capacity and upgrade evaluation efforts. This effort has been underway the past three years. To learn more about USAID efforts, you might want to visit two sessions at the AEA 2014 conference: Learning by Doing: Experiences with Mixed Evaluation Teams at the United States Agency for International Development (USAID) on Oct. 16 at 4:45 p.m. and Complexity-Aware Monitoring in the United States Agency for International Development (USAID) on Oct. 16 at 8:00 a.m.
More generally, evaluation capacity building is a major priority for many countries and the organizations partnering with them. The production and use of evaluations requires capacities, including individual skills and knowledge, organizational systems and policies, and an enabling environment. Capacity development is a long-term, complex, internal change process for a country. Capacity development efforts may need to take into account the specific national context and related systems of management, governance, accountability, knowledge management, and learning.
The Organization for Economic Cooperation and Development, or OECD, is one notable actor in this field. OECD is focused on developing and improving partnerships to strengthen the relevance, coherence, and impact of evaluation capacity development. To this end, OECD is establishing a strong evidence base of “what works” in capacity development, sharing and identifying opportunities for international collaboration, and developing joint initiatives. Work is also underway to enhance evaluation policy development by exploring the national context, including governance environment, public sector reform activities, cultural norms, state and nonstate institutions, and enabling environment. OECD is working to understand what strategies and activities work best to support partner processes to improve evaluation systems, using the specific experiences of various countries.
Various strategies can contribute to international evaluation capacity building. For example, countries with an extensive evaluation infrastructure, such as the U.S., can share positive examples from their own experiences or facilitate learning opportunities among developing countries. Rather than conducting evaluations with outside staff only, donor countries can help foster capacity development by involving staff from developing countries and by using country systems for monitoring, data collection, and analysis during evaluations. A number of international organizations, including EvalPartners, the International Organization for Cooperation in Evaluation, and the United Nations Evaluation Group, working in partnership with UN Women, the OECD Development Assistance Committee Network on Development Evaluation, UNICEF, the Ministry for Foreign Affairs of Finland, and USAID, have created a toolkit to develop an advocacy strategy for evaluation policy in a country (“Advocating for Evaluation: A toolkit to develop advocacy strategies to strengthen an enabling environment for evaluation”). Here’s hoping for improved evaluation policy around the world!
Please share with us your experience, efforts, and ideas regarding developing evaluation capacity in other countries, particularly in setting national evaluation policies, at EvaluationPolicy@eval.org.
Whether you are in the audience or part of the presentation, your AEA conference session will run more smoothly with a prepared and organized session chair. Many of us have seen these people in session rooms before. They just raise the cards to tell the presenter when time is up, right? Yes – and there’s so much more that session chairs can do to support a rockstar conference presentation.
Ever been annoyed that the last session is still running over when you arrive to grab a seat? Session chairs make sure things end on time!
Ever been irked at the time wasted as presenters plug in their flash drives and struggle to open their slideshows? Session chairs coordinate files in advance!
Ever wondered how seemingly unrelated papers got grouped into one session? Session chairs make it clear!
Find out what else session chairs should do – and when – in our brand new Session Chairs Checklist. Like most things, it involves some preparation and urges chairs to start early. So don’t wait! Download the checklist now and email it to your session chair. If you don’t know your chair, check the listing for your session in the conference program. (Maybe you are chairing a session!) Search the program for your name, start ticking the boxes on the checklist, and facilitate an awesome session.
Rodney A. Wambeam is an AEA member and the author of "The Community Needs Assessment Workbook," a new book published by Lyceum Books.
From the Publisher’s Site:
In planning community and social services, perceptions of need come from many sources – from the local news to political interest groups – but the first step in conducting efficient and effective community interventions is to look beyond perceptions and identify the actual needs based on available evidence. Creating a comprehensive needs assessment is essential for securing funding and designing programs in governmental agencies and nonprofit organizations. This workbook helps community groups, social service organizations, and government agencies collect, analyze, prioritize, and present local data in a way that will ensure that a community's needs are understood and met.
Employing a learn-by-doing approach, the book walks readers through the actual steps of creating a comprehensive needs assessment. The workbook offers thorough background information and provides step-by-step activities to address the entire process beginning with the planning stage, followed by data collection and analysis, and concluding with preparing your report and implementing findings. Whether in a classroom setting or in the workplace, this is the book that practitioners will use throughout their entire careers.
From the Author:
I wrote “The Community Needs Assessment Workbook” to help communities and social service organizations collect, analyze, and prioritize local data in a simple and logical way. The goal is to build community capacity to work with data while completing a rigorous needs assessment that can be used when writing a funding proposal or when using the best possible data to plan social service delivery. The workbook is not simply a text on how to conduct social research; rather, it serves as a step-by-step guide to completing a real needs assessment. It differs from other texts in its focus on actually creating a comprehensive community needs assessment as one reads it. It contains numerous tasks, worksheets, demonstrations, exercises, and tools to aid the reader in completing the research project. In short, it is called a workbook because the reader creates products while he or she learns. In the end, the workbook rewards the user with a useful document to guide community efforts.
The primary audience for the book is professionals who provide social services and participate in community organizing. They can use this book to conduct an actual and useful comprehensive community needs assessment. Secondarily, college students in a variety of social service disciplines, from counseling and social work to health sciences and criminal justice, will find this a useful guide when preparing for a career helping people and communities. Classes can make use of the workbook style to teach the needs assessment process, and individuals can utilize the workbook itself to complete larger projects like a senior thesis.
About the Author:
Rodney Wambeam is a senior research scientist at the Wyoming Survey and Analysis Center and adjunct professor in the Department of Political Science, both at the University of Wyoming. He has served as health and human services advisor to the governor of Nebraska, on numerous councils and coalitions, and as technical assistance provider to states and communities throughout America. He chairs Wyoming’s State Epidemiological Workgroup, and also teaches policy analysis, evaluation, and needs assessment. On a daily basis he uses research to help a variety of organizations and communities better understand their problems and the impact of their work.