| Session Title: Cultural Competency in Evaluation: Discussion of the American Evaluation Association's Public Statement on the Importance of Cultural Competence in Evaluation |
| Think Tank Session 242 to be held in Lone Star A on Thursday, Nov 11, 10:55 AM to 12:25 PM |
| Sponsored by the Presidential Strand |
| Presenter(s): |
| Cindy Crusto, Yale University, cindy.crusto@yale.edu |
| Discussant(s): |
| Katrina Bledsoe, Walter R McDonald and Associates Inc, katrina.bledsoe@gmail.com |
| Karen E Kirkhart, Syracuse University, kirkhart@syr.edu |
| Elizabeth Whitmore, Carleton University, ewhitmore@connect.carleton.ca |
| Jenny Jones, Virginia Commonwealth University, jljones@vcu.edu |
| Katherine A Tibbetts, Kamehameha Schools, katibbet@ksbe.edu |
| Abstract: This highly interactive think tank is the capstone event in the Association’s member review of the AEA Public Statement on the Importance of Cultural Competence in Evaluation. The Statement is the result of five years of work by an AEA task force in consultation with a range of experts and members. The think tank discussion will help shape the final version of this statement and create a vision for moving toward wider attention to culture in evaluation. After a brief introduction to the statement and its history, participants will be asked to provide feedback on the statement and create a vision of how best to support evaluators in translating it into action. A technique known as graphic facilitation will be used along with traditional recording of the discussions to capture the main concepts and ideas generated. (Graphic facilitation is a tool for capturing and organizing a group’s ideas with pictures.) |
| Session Title: Improving the Practice of Theory-driven Evaluation: Understanding the Role of Stakeholders and Context in Evaluation Settings |
| Multipaper Session 243 to be held in Lone Star B on Thursday, Nov 11, 10:55 AM to 12:25 PM |
| Sponsored by the Program Theory and Theory-driven Evaluation TIG |
| Chair(s): |
| Cindy Gilbert, United States Government Accountability Office, gilbertc@gao.gov |
| Uda Walker, Gargani + Company, uda@gcoinc.com |
| Session Title: What We Don’t Say Can Hurt Us: Working with Undiscussables |
| Think Tank Session 244 to be held in Lone Star C on Thursday, Nov 11, 10:55 AM to 12:25 PM |
| Sponsored by the Collaborative, Participatory & Empowerment Evaluation TIG |
| Presenter(s): |
| Alexis Kaminsky, Kaminsky Consulting, akaminsky@comcast.net |
| Discussant(s): |
| Hazel Symonette, University of Wisconsin, hsymonette@odos.wisc.edu |
| Jennifer Dewey, James Bell Associates, dewey@jbassoc.com |
| Virginia Dick, University of Georgia, vdick@cuiog.uga.edu |
| Maggie Dannreuther, Stennis Space Center, maggied@ngi.msstate.edu |
| Ranjana Damle, Albuquerque Public Schools, damle@aps.edu |
| Elena Polus, Iowa State University, elenap@iastate.edu |
| Abstract: Silence--what we don’t, won’t or can’t say--is one means of controlling information, reflection, and action in evaluation. Frequently, silences grow out of issues related to gender, race, and class but not always. They can happen any time people from different backgrounds, organizations, aptitudes, and interests come together. However, the greater the differences, the more pronounced the challenge to break the silence. How do we help ourselves and our evaluation participants open up spaces to explore the silences that keep us “safe” but stuck? Participants in this session will explore disrupting silences in small groups based on short practice-based vignettes shared by the session’s presenters. Small group discussion will examine: (1) strengths and limitations of different approaches, (2) identification of additional approaches, and (3) identification of critical contextual factors that foster or inhibit movement. The whole group will reconvene to share observations and elucidate questions that merit further attention. |
| Session Title: Evaluating Social Service Programs for Government and Foundations |
| Multipaper Session 245 to be held in Lone Star D on Thursday, Nov 11, 10:55 AM to 12:25 PM |
| Sponsored by the Non-profit and Foundations Evaluation TIG |
| Chair(s): |
| Beth Stevens, Mathematica Policy Research, bstevens@mathematica-mpr.com |
| Session Title: Quantitative Methods Theory and Design TIG Business Meeting and Presentation: What is New in Multiple Comparison Procedures |
| Business Meeting Session 246 to be held in Lone Star E on Thursday, Nov 11, 10:55 AM to 12:25 PM |
| Sponsored by the Quantitative Methods: Theory and Design TIG |
| TIG Leader(s): |
| Patrick McKnight, George Mason University, pmcknigh@gmu.edu |
| George Julnes, University of Baltimore, gjulnes@ubalt.edu |
| Karen Larwin, University of Akron, Wayne, drklarwin@yahoo.com |
| Raymond Hart, Georgia State University, rhart@gsu.edu |
| Chair(s): |
| Dale Berger, Claremont Graduate University, dale.berger@cgu.edu |
| Presenter(s): |
| Roger Kirk, Baylor University, roger_kirk@baylor.edu |
| Abstract: Prof. Kirk is a renowned author in statistics and experimental design. He will trace the development of multiple comparison procedures from Fisher's work through current research on the false discovery rate. |
| Session Title: Internal Evaluation TIG Business Meeting and Presentation: A Decade of Internal Evaluation in One School District - How Times Change |
| Business Meeting Session 247 to be held in Lone Star F on Thursday, Nov 11, 10:55 AM to 12:25 PM |
| Sponsored by the Internal Evaluation TIG |
| TIG Leader(s): |
| Boris Volkov, University of North Dakota, bvolkov@medicine.nodak.edu |
| Wendy DuBow, National Center for Women and Information Technology, wendy.dubow@colorado.edu |
| Presenter(s): |
| Jean A King, University of Minnesota, kingx004@umn.edu |
| Roundtable Rotation I: Grad Students on Grad Students: Evaluating Peers in a Professional Context |
| Roundtable Presentation 248 to be held in MISSION A on Thursday, Nov 11, 10:55 AM to 12:25 PM |
| Sponsored by the Graduate Student and New Evaluator TIG |
| Presenter(s): |
| Matthew Linick, University of Illinois at Urbana-Champaign, mlinic1@gmail.com |
| Marjorie Dorime-Williams, University of Illinois at Urbana-Champaign, dorime1@illinois.edu |
| Seung Won Hong, University of Illinois at Urbana-Champaign, hong29@illinois.edu |
| Abstract: The College of Education at a major research I university will be holding its first college-wide graduate student conference this spring. The conference was developed and organized by an interdisciplinary planning committee composed of College of Education graduate students. This committee commissioned a separate team of graduate students to complete a formative and summative evaluation of the conference. Conducting an evaluation of a fellow graduate student initiative in the same college created an interesting context for the evaluation team that offered unique challenges and opportunities. As graduate students and beginning evaluators we struggled with assumptions about our expertise and roles as evaluators, as well as negotiating a professional evaluator-client relationship with our peers, friends, and colleagues. |
| Roundtable Rotation II: A Student-Generated Collaborative Approach to Developing New Evaluator Competencies |
| Roundtable Presentation 248 to be held in MISSION A on Thursday, Nov 11, 10:55 AM to 12:25 PM |
| Sponsored by the Graduate Student and New Evaluator TIG |
| Presenter(s): |
| Jason Black, University of Tennessee, Knoxville, jblack21@utk.edu |
| Pam Bishop, University of Tennessee, Knoxville, pbaird@utk.edu |
| Shayne Harrison, University of Tennessee, Knoxville, sharrison1976@comcast.net |
| Susanne Kaesbauer, University of Tennessee, Knoxville, skaesbau@utk.edu |
| Thelma Woodard, University of Tennessee, Knoxville, twoodar2@utk.edu |
| Abstract: The purpose of this discussion is to provide an effective method for improving new evaluator skills through collaborative efforts between new and advanced graduate students. Traditional academic models for training in evaluation often include coursework, simulations, role-play, and a practicum (Trevisan 2004). In some programs, evaluation students are taught evaluation fundamentals and simultaneously required to conduct evaluation projects independently from start to finish during the first year of graduate school. Although new evaluators may be knowledgeable about evaluation competencies, the knowledge, skills, and abilities involved in conducting an entire evaluation are often beyond their expertise. Nadler and Cundiff (2009) assert that these skills cannot be sharpened through academic-based training alone. In this roundtable discussion, students will discuss alternative qualitative and quantitative approaches to the same evaluation problem as a collaborative approach to improving their evaluator competencies. |
| Session Title: Issues and Models: Evaluating Universal Design for Learning |
| Panel Session 249 to be held in MISSION B on Thursday, Nov 11, 10:55 AM to 12:25 PM |
| Sponsored by the Special Needs Populations TIG |
| Chair(s): |
| Bob Hughes, Seattle University, rhughes@seattleu.edu |
| Abstract: Universal Design for Learning (UDL) is a framework that provides principles to guide teaching practices and curriculum development. Both P-12 and higher education UDL projects are being developed through grants and initiatives to improve materials and teaching practices that support a range of learner needs. UDL has been formally evaluated since its development in the mid-1990s. However, this work has been completed by a few organizations and is often reported to funding agencies without general distribution. This panel will bring together four evaluators with varied levels of experience with UDL. Panelists will describe their work designing or implementing both higher education and P-12 UDL evaluations; how they have approached the confluence of UDL principles and the measurement of learning outcomes; and how they have evaluated UDL’s impacts on teacher behaviors and perspectives, as well as on organizational issues that affect instruction. |
| Session Title: Systems Theories in Evaluation Planning: Differentiating Planning Process from Evaluation Plan |
| Think Tank Session 250 to be held in BOWIE A on Thursday, Nov 11, 10:55 AM to 12:25 PM |
| Sponsored by the Systems in Evaluation TIG |
| Presenter(s): |
| Claire Hebbard, Cornell University, cer17@cornell.edu |
| Discussant(s): |
| William M Trochim, Cornell University, wmt1@cornell.edu |
| Thomas Archibald, Cornell University, tga4@cornell.edu |
| Monica Hargraves, Cornell University, mjh51@cornell.edu |
| Abstract: Systems evaluation has many definitions, but common themes include identifying multiple stakeholder perspectives, viewing a program as a dynamic system nested within larger systems, and defining a program’s boundaries. Many evaluators have identified how a systems approach to evaluation may affect the process of planning an evaluation. But is that planning process revealed in the evaluation plan itself? Is there a way to differentiate a systems-based evaluation plan from plans created through other approaches? We will outline some of our thoughts and struggles regarding assessing evaluation plans developed by programs going through our Systems Evaluation Protocol, then facilitate one or more workgroups to identify and discuss developing a rubric for identifying systems-based evaluation plans. |
| Session Title: Race, Class, and Power: Bringing The Issues Into Discussion and Evaluation |
| Panel Session 251 to be held in BOWIE B on Thursday, Nov 11, 10:55 AM to 12:25 PM |
| Sponsored by the Multiethnic Issues in Evaluation TIG |
| Chair(s): |
| Dawn Smart, Clegg & Associates, dsmart@cleggassociates.com |
| Abstract: Part of the jigsaw puzzle in making change in communities is the effect of race, class and power differences among groups. Historical and present-day inequities translate into significant barriers to participation in decision-making affecting people’s lives. Bringing these issues into the conversation can be difficult and uncomfortable, but without direct attention to them, community change efforts often stall. Evaluating race, class and power — what these issues look like in their community context, how people feel about them, and how the dynamics may change over the course of an initiative — is equally challenging. Putting race, class and power on the discussion table is a first step. Learning together what is relevant and how to measure it is a second. Examining the findings and interpreting their meaning is a third. This session will look at the efforts of two national organizations engaged in this process and provide a forum for further exploration. |
| Session Title: Culturally Responsive Evaluation: Three Cases From Aboriginal Peoples and First Nations in Canada |
| Multipaper Session 252 to be held in BOWIE C on Thursday, Nov 11, 10:55 AM to 12:25 PM |
| Sponsored by the Indigenous Peoples in Evaluation TIG |
| Chair(s): |
| Andrea LK Johnston, Johnston Research Inc, andrea@johnstonresearch.ca |
| Discussant(s): |
| Andrea LK Johnston, Johnston Research Inc, andrea@johnstonresearch.ca |
| Roundtable Rotation I: Theory of Change Evaluation in the Real World: Lessons Learned from Applying (and Modifying) the TOC Approach in the Evaluation of the Tobacco Policy Change Program |
| Roundtable Presentation 253 to be held in GOLIAD on Thursday, Nov 11, 10:55 AM to 12:25 PM |
| Sponsored by the Advocacy and Policy Change TIG |
| Presenter(s): |
| Andrea Anderson-Hamilton, Anderson Hamilton Consulting, andersonhamilton@gmail.com |
| Abstract: This roundtable will discuss the challenges and lessons learned from applying the Theory of Change approach to the evaluation of the Robert Wood Johnson Foundation’s Tobacco Policy Change Program (TPC), which provided grants to 75 tobacco advocacy coalitions during the 2004 to 2008 grant period. The TPC evaluation was designed to produce lessons for several audiences: the public health field; the philanthropic community; the staff at RWJF; and the grantees themselves. We now understand that this evaluation can offer important lessons to our field as well, particularly around how to modify the commonly understood "theory of change approach" to accommodate the reality of evaluating a program with multiple sites, multiple goals, multiple definitions of success and multiple theories of change operating at different levels. |
| Roundtable Rotation II: Evaluating Foundation Advocacy Strategies: When Theory and Practice Collide |
| Roundtable Presentation 253 to be held in GOLIAD on Thursday, Nov 11, 10:55 AM to 12:25 PM |
| Sponsored by the Advocacy and Policy Change TIG |
| Presenter(s): |
| Catherine Borgman-Arboleda, City University of New York (CUNY), cborgman.arboleda@gmail.com |
| Rachel Kulick, City University of New York (CUNY), rkulick@brandeis.edu |
| Abstract: Grantmakers committed to social change are faced with the need to support work that moves beyond short-term policy change to expanding involvement in the process of making change, and more broadly in democracy. Tension often results in practice as program officers attempt to support an ecosystem of work with often differing goals, values, and timeframes. Based on evaluations conducted for the Robert Wood Johnson Foundation, the Funding Exchange, and the Social Science Research Council, we will explore these challenges and discuss how foundation theories of change can inform evaluations and, in turn, how findings can shift and refine these theories, potentially leading to more effective grantmaking. We will also examine some important factors to consider in the assessment of foundation support of social movement building work, including grantee selection criteria and internal foundation decision-making processes as they relate to building coalitions, network capacities, power sharing, education, and leadership. |
| Roundtable Rotation I: Choices of Research and Development (R&D) Evaluation Approaches in Chinese Academy of Sciences (CAS) Institutes: We Reap What We Sow? |
| Roundtable Presentation 254 to be held in SAN JACINTO on Thursday, Nov 11, 10:55 AM to 12:25 PM |
| Sponsored by the Research on Evaluation TIG and the Pre-K - 12 Educational Evaluation TIG |
| Presenter(s): |
| Xiaoxi Xiao, Chinese Academy of Sciences, xiaoxiaoxi@casipm.ac.cn |
| Changhai Zhou, Chinese Academy of Sciences, chzhou@cashq.ac.cn |
| Tao Dai, Chinese Academy of Sciences, daitao@casipm.ac.cn |
| Abstract: The choice of R&D evaluation approach is a major decision for a research institute’s managers, one that reflects their orientation for the institute’s future. One question then arises: will the institute develop according to that orientation? Or, as the saying goes, do we reap what we sow? We chose institutes A and B from CAS as two cases. The two institutes were both initially founded on basic research, but in recent decades their evaluation approaches gradually diverged, with an applied-research orientation forming in institute A and a pure basic-research orientation in institute B. In this empirical study, we first retrospectively identify the management orientations reflected in the evaluation approaches and the major achievements of the two institutes, using expert seminars, text coding, and historical analysis; we then conduct a correlation analysis between the orientations and achievements to explore the question above. |
| Roundtable Rotation II: Producing Evidence of Effectiveness Data in the Real World of Early Childhood Education |
| Roundtable Presentation 254 to be held in SAN JACINTO on Thursday, Nov 11, 10:55 AM to 12:25 PM |
| Sponsored by the Research on Evaluation TIG and the Pre-K - 12 Educational Evaluation TIG |
| Presenter(s): |
| Cindy Lin, HighScope Educational Research Foundation, clin@highscope.org |
| Marijata Daniel-Echols, HighScope Educational Research Foundation, mdaniel-echols@highscope.org |
| Abstract: In this session, the presenters will lead a discussion on the question, “How can early childhood education program evaluators balance the demand for data about what works with real-world issues in the field that present challenges to study design and data analysis?” The current policy focus on investments in education has increased funding in early childhood education. Part of these new and increased funding streams is a requirement to collect evidence of improvements in children’s school readiness. During the roundtable session, the presenters will draw on 15 years of experience evaluating Michigan’s state-funded preschool program for at-risk four-year-olds to discuss the evaluation design and data analysis challenges of producing accountability data for early childhood programs. Current work using both a regression discontinuity design and a quasi-experimental design with two-level hierarchical linear modeling from a statewide random sample of programs will be the basis of the discussion. |
| Session Title: Assessing Student Learning Outcomes II: Three Sides of the Coin |
| Multipaper Session 255 to be held in TRAVIS A on Thursday, Nov 11, 10:55 AM to 12:25 PM |
| Sponsored by the Assessment in Higher Education TIG |
| Chair(s): |
| Audrey Rorrer, University of North Carolina at Charlotte, arorrer@uncc.edu |
| Discussant(s): |
| Jeanne Hubelbank, Independent Consultant, jhubel@evalconsult.com |
| Session Title: Visualizing Data for Strategic Planning |
| Multipaper Session 256 to be held in TRAVIS B on Thursday, Nov 11, 10:55 AM to 12:25 PM |
| Sponsored by the Integrating Technology Into Evaluation TIG |
| Chair(s): |
| Metta Alsobrook, University of Texas, Dallas, metta.alsobrook@utdallas.edu |
| Session Title: Measuring the Immeasurable: Lessons for Building Grantee Capacity to Evaluate Hard-to-assess Efforts |
| Panel Session 257 to be held in TRAVIS C on Thursday, Nov 11, 10:55 AM to 12:25 PM |
| Sponsored by the Non-profit and Foundations Evaluation TIG |
| Chair(s): |
| Lande Ajose, BTW Informing Change, lajose@informingchange.com |
| Abstract: Nonprofit organizations, especially those engaged in policy and advocacy, play a critical role in advancing fundamental social reform, but too often they are limited by their capacity to reflect on whether and how they are making progress towards their goals. This session will present the highlights from the newly published monograph Measuring the Immeasurable: Lessons for Building Grantee Capacity to Evaluate Hard-to-Assess Efforts, which details the William and Flora Hewlett Foundation’s innovative efforts to design an evaluation process that would generate a high sense of ownership for grantees and high engagement in the evaluation process, with the goal that the evaluation data would have a better chance of being used for organizational improvement. Hear from the Hewlett Foundation, an evaluator, and a grantee about their perspectives on how this approach to evaluation capacity building, and a newly developed tool, enables grantee organizations to reflect on and improve their work. |
| Session Title: Evaluation in Action: A Sampler of Tracking and Timing Methodologies in Museums, Culturals, and Informal Education Settings |
| Panel Session 258 to be held in TRAVIS D on Thursday, Nov 11, 10:55 AM to 12:25 PM |
| Sponsored by the Evaluating the Arts and Culture TIG |
| Chair(s): |
| Kathleen Tinworth, Denver Museum of Nature & Science, kathleen.tinworth@dmns.org |
| Discussant(s): |
| Kathleen Tinworth, Denver Museum of Nature & Science, kathleen.tinworth@dmns.org |
| Abstract: Following a successful and well-attended panel session in 2009, members of the Visitor Studies Association (VSA) will return to AEA to showcase a variety of studies used to evaluate the visitor experience in museum and cultural settings. In particular, tracking and timing methodologies will be presented, illustrating the utility of this vehicle across disciplines. While it is not necessary to attend both sessions, this panel complements a proposed demonstration session where several tracking and timing methodologies will be exhibited. Participants will have the opportunity both to hear case studies where tracking and timing was utilized in a visitor studies setting and to experiment first-hand with collecting and analyzing the resulting data and applying it to their own work. |
| Session Title: Implementing Quality Randomized Control Trials in Human Service Evaluations: Applications Addressing Challenges and Barriers |
| Multipaper Session 259 to be held in INDEPENDENCE on Thursday, Nov 11, 10:55 AM to 12:25 PM |
| Sponsored by the Human Services Evaluation TIG and the Quantitative Methods: Theory and Design TIG |
| Chair(s): |
| Todd Franke, University of California, Los Angeles, tfranke@ucla.edu |
| Discussant(s): |
| Tania Rempert, University of Illinois at Urbana-Champaign, trempert@illinois.edu |
| Session Title: Including Everyone: Lesbian, Gay, Bisexual, Transgender, People of Color, and Double Winners |
| Multipaper Session 260 to be held in PRESIDIO A on Thursday, Nov 11, 10:55 AM to 12:25 PM |
| Sponsored by the Lesbian, Gay, Bisexual, Transgender Issues TIG |
| Chair(s): |
| Kathleen McKay, Connecticut Children's Medical Center, kmckay@ccmckids.org |
| Discussant(s): |
| Kathleen McKay, Connecticut Children's Medical Center, kmckay@ccmckids.org |
| Session Title: Building, Enhancing, and Sustaining Evaluation Quality for Organizational Learning |
| Multipaper Session 261 to be held in PRESIDIO B on Thursday, Nov 11, 10:55 AM to 12:25 PM |
| Sponsored by the Organizational Learning and Evaluation Capacity Building TIG |
| Chair(s): |
| Beverly A Parsons, InSites, bparsons@insites.org |
| Discussant(s): |
| Beverly A Parsons, InSites, bparsons@insites.org |
| Session Title: Analytic and Measurement Approaches in Substance Abuse and Mental Health Evaluation |
| Multipaper Session 262 to be held in PRESIDIO C on Thursday, Nov 11, 10:55 AM to 12:25 PM |
| Sponsored by the Alcohol, Drug Abuse, and Mental Health TIG |
| Chair(s): |
| Roger Boothroyd, University of South Florida, boothroyd@fmhi.usf.edu |
| Roundtable Rotation I: Evaluation as a Management and Learning Tool for the Successful Development and Scaling of Innovative Program Models |
| Roundtable Presentation 263 to be held in BONHAM A on Thursday, Nov 11, 10:55 AM to 12:25 PM |
| Sponsored by the Non-profit and Foundations Evaluation TIG |
| Presenter(s): |
| Helen Davis Picher, William Penn Foundation, hdpicher@williampennfoundation.org |
| Sandra Adams, William Penn Foundation, sadams@williampennfoundation.org |
| Abstract: Demonstration of outcomes, while a key ingredient in the development and scaling of promising social programs, is not enough to ensure success. Replication and scalability are critical components of a successful model. The William Penn Foundation will present a framework developed to assess which pilot demonstrations are primed for success and which may fail, if mid-course corrections are not made. A discussion will consider the usefulness and application of the framework in a variety of sectors and how the framework can be operationalized for a program or cluster of programs targeting the same system change. |
| Roundtable Rotation II: Evaluating Enterprising Nonprofits: The Social Return on Investment |
| Roundtable Presentation 263 to be held in BONHAM A on Thursday, Nov 11, 10:55 AM to 12:25 PM |
| Sponsored by the Non-profit and Foundations Evaluation TIG |
| Presenter(s): |
| Goutham Menon, University of Texas, San Antonio, goutham.menon@utsa.edu |
| Maureen Rubin, University of Texas, San Antonio, maureen.rubin@utsa.edu |
| Abstract: Today, running a non-profit is more complicated than it used to be. With funding cuts, rising demands for performance measures from foundations, and corporations asking for better partnerships to meet their responsibilities, some non-profits are taking the prospect of becoming social enterprises more seriously. Leaders of these organizations take their roles as social entrepreneurs in stride and provide a vision and direction for their organizations. Social entrepreneurs must ensure that their organization is fiscally strong while keeping its social mission intact. One way to keep track of this and prevent mission drift is to evaluate the social return on investment (SROI) of any venture. This paper will provide an overview of SROI and how it can be used in organizations of any size. It will highlight the role of SROI in evaluating the functioning of non-profits through examples. |
| Session Title: Using Evaluation to Improve the Quality of the Initial Implementation of a Statewide Community and State Level Policy and Systems Change Initiative |
| Panel Session 264 to be held in BONHAM B on Thursday, Nov 11, 10:55 AM to 12:25 PM |
| Sponsored by the Non-profit and Foundations Evaluation TIG |
| Chair(s): |
| Astrid Hendricks, California Endowment, ahendricks@calendow.org |
| Discussant(s): |
| Marie Colombo, Skillman Foundation, mcolombo@skillman.org |
| Abstract: TCE’s BHC has largely succeeded in integrating evaluation knowledge and skills into the initial implementation of BHC. This has helped improve the performance of TCE, the communities, and grantees, while also providing valuable information about the factors needed to prepare communities for implementing policy and systems change strategies. We will provide lessons learned that will be valuable to the field. The Aspen Institute’s (2009) recent review of CCIs notes that we lack evidence that community-based work alone can trigger large-scale systems change. The TCE BHC seeks to develop synergies by addressing strategies at the Place level, as well as at the State and statewide levels, through integration strategies based upon the use of goal- and outcome-oriented logic models. |
| Session Title: Group and Cluster Randomized-Control Experimental Interventions in Educational Evaluation Studies |
| Multipaper Session 265 to be held in BONHAM C on Thursday, Nov 11, 10:55 AM to 12:25 PM |
| Sponsored by the Pre-K - 12 Educational Evaluation TIG |
| Chair(s): |
| Anane Olatunji, Fairfax County Public Schools, aolatunji@fcps.edu |
| Discussant(s): |
| Melissa Chapman, University of Iowa, melissa-chapman@uiowa.edu |
| Session Title: Integrating High Quality Evaluation Into a National Integrated Services in Schools Initiative |
| Panel Session 266 to be held in BONHAM D on Thursday, Nov 11, 10:55 AM to 12:25 PM |
| Sponsored by the Pre-K - 12 Educational Evaluation TIG |
| Chair(s): |
| Keith McNeil, University of Texas, El Paso, kamcneil2@utep.edu |
| Abstract: Elev8 is a national initiative funded by Atlantic Philanthropies that provides wrap-around services in selected middle schools to students and their families. The one national quantitative evaluator and four local qualitative evaluators will participate in the panel and address three topics that have surfaced. The first topic is the need for relationship building: relationships need to be built not only with school staff, but also with a large number of project staff from a large number of organizations that are providing services to students and their families. The second topic is the tension between maximizing integration and maximizing the effects of the program. The third topic is the pressure for outcomes to support and sustain the program. The evaluation was designed without any kind of comparison group, so the effectiveness of the program has to rely on quantitatively collected participation rates and qualitatively collected data. |
| Session Title: Environmental Education Evaluation: Examining Citizen Collected Data, Mixed Method Designs, and Professional Development |
| Multipaper Session 267 to be held in BONHAM E on Thursday, Nov 11, 10:55 AM to 12:25 PM |
| Sponsored by the Environmental Program Evaluation TIG |
| Chair(s): |
| Annelise Carleton-Hug, Trillium Associates, annelise@trilliumassociates.com |
| Session Title: International Approaches to Government Evaluation |
| Multipaper Session 268 to be held in Texas A on Thursday, Nov 11, 10:55 AM to 12:25 PM |
| Sponsored by the Government Evaluation TIG |
| Chair(s): |
| Susan Berkowitz, Westat, susanberkowitz@westat.com |
| Session Title: Circular Dialogue and Other Dialectical Methods of Inquiry |
| Skill-Building Workshop 269 to be held in Texas B on Thursday, Nov 11, 10:55 AM to 12:25 PM |
| Sponsored by the Qualitative Methods TIG |
| Presenter(s): |
| Richard Hummelbrunner, OEAR Regional Development Consultants, hummelbrunner@oear.at |
| Abstract: Circular Dialogues are systemic forms of communication among people with different perspectives. They use participants’ perspectives and views as a resource, for example for appraising and validating different experiences or for identifying joint solutions. After an introduction to the principles and rules to be followed in a Circular Dialogue, participants will be invited to break into sub-groups, agree on an issue per group, and identify three relevant perspectives. Two rounds of dialogue sessions will then be run in parallel to give participants hands-on experience, which will be briefly reflected upon at the end. Circular Dialogue is an example of a dialectical method, one that uses opposing viewpoints to generate meaning. In the final session, other dialectical methods of inquiry will be briefly outlined, namely Convergent Interviewing, Contradiction Analysis, and Option One-And-A-Half. The session will end with a final round on the utility of dialectical methods in evaluation and with sources of further information. |
| Session Title: Better Evaluation - A Toolbox of Evaluation Methods and Applications That Supports Quality and Methodological Diversity |
| Demonstration Session 270 to be held in Texas C on Thursday, Nov 11, 10:55 AM to 12:25 PM |
| Sponsored by the |
| Presenter(s): |
| Patricia Rogers, Royal Melbourne Institute of Technology, patricia.rogers@rmit.edu.au |
| Suman Sureshbabu, Rockefeller Foundation, ssureshbabu@rockfound.org |
| Abstract: Evaluation quality depends on having both skilled evaluators and savvy evaluation commissioners. Both groups need ongoing learning about how to select, adapt, and implement evaluation methods to suit a range of situations. This session will demonstrate a new online resource, “BetterEvaluation”, that supports evaluators and evaluation commissioners in making better choices of evaluation approaches and methods, implementing those methods better, and contributing to learning from practice by sharing their experiences. It is built explicitly on Web 2.0 principles of interactive information sharing, user-centered design (including the use of folksonomies, or user-developed classifications), dynamic content, and scalability. These principles align with and reinforce the project’s emphasis on and commitment to methodological diversity, adaptation, and innovation to suit diverse evaluation requirements. The session will present the site’s main features and purposes and discuss challenges such as avoiding paradigmatic schisms and ensuring the quality, accessibility, and utility of material. |
| Session Title: Analysis and Evaluation of Research Portfolios Using Quantitative Science Metrics: Theory | ||||||
| Panel Session 271 to be held in Texas D on Thursday, Nov 11, 10:55 AM to 12:25 PM | ||||||
| Sponsored by the Research, Technology, and Development Evaluation TIG | ||||||
| Chair(s): | ||||||
| Israel Lederhendler, National Institutes of Health, lederhei@od.nih.gov | ||||||
| Discussant(s): | ||||||
| Gretchen Jordan, Sandia National Laboratories, gbjorda@sandia.gov | ||||||
| Abstract: This panel addresses theoretical gaps and opportunities in enhancing the understanding of scientific research using portfolios as a unit of analysis. Increasing attention is being given to analyzing, managing, and evaluating scientific investment using a quantitative portfolio approach, yet many of the analyses seem to lack theoretical reasons for using particular methods to answer particular questions. The presentations are intended to raise the issue of how portfolio analysis, research evaluation, and scientometrics are complementary. |
|
| Session Title: Mainstreaming Evaluation in Diverse Organizational Contexts | ||||
| Panel Session 272 to be held in Texas E on Thursday, Nov 11, 10:55 AM to 12:25 PM | ||||
| Sponsored by the Evaluation Use TIG , the Organizational Learning and Evaluation Capacity Building TIG, and the Research on Evaluation TIG | ||||
| Chair(s): | ||||
| Gloria Sweida-DeMania, Claremont Graduate University, gloria.sweida-demania@cgu.edu | ||||
| Abstract: Jim Sanders defined mainstreaming evaluation as "…making evaluation a part of the work ethic, the culture, and the job responsibilities of stakeholders at all levels of the organization" (2009, p. 1). Organizational efforts toward mainstreaming evaluation represent the ultimate level of evaluation use. The presenters in this panel will describe methods of embedding evaluation in a variety of contexts. Sam Held, program evaluator for the Oak Ridge Institute for Science and Education, will discuss a backward design approach. Ellen Iverson and Randahl Kirkendall, evaluators at Carleton College’s Science Education Resource Center (SERC), will focus on their efforts to mainstream evaluation within their organization and to assist other organizations to do the same. Rachel Muthoni, evaluator for the Pan African Bean Research Alliance (PABRA), will describe mainstreaming in the agricultural development context. Amy Gullickson will discuss her dissertation research, presenting on four organizations that exemplify evaluation mainstreaming. Reference: Sanders, J. R. (2009, May). Mainstreaming evaluation. Keynote address at the Fourteenth Annual Michigan Association of Evaluators Annual Conference, Lansing, MI. |
|
| Session Title: Truth, Beauty, and Justice: Thirty Years Later | ||||
| Multipaper Session 273 to be held in Texas F on Thursday, Nov 11, 10:55 AM to 12:25 PM | ||||
| Sponsored by the AEA Conference Committee | ||||
| Chair(s): | ||||
| Timothy Cash, University of Illinois at Urbana-Champaign, tjcash2@illinois.edu | ||||
|
| Session Title: The Role of Evaluation During Tough Fiscal Times: Sage Advice From Evaluation Leaders | |||
| Panel Session 274 to be held in CROCKETT A on Thursday, Nov 11, 10:55 AM to 12:25 PM | |||
| Sponsored by the Extension Education Evaluation TIG | |||
| Chair(s): | |||
| Joseph Donaldson, University of Tennessee, jldonaldson@tennessee.edu | |||
| Discussant(s): | |||
| Nancy Franz, Virginia Tech, nfranz@vt.edu | |||
| Abstract: Fiscal challenges affect program evaluation. One evaluator scales back an evaluation design mid-stream to accommodate reduced budgets. A second becomes a strategic planner because stakeholders want to capitalize on the evaluator’s expertise to make the best use of public funds. A third is taxed with building individual capacity for evaluation within an organization. Yet while the American economy drags through a recession of historic proportions, fiscal challenges are unique neither to this country’s economy nor to evaluation leaders. This panel will offer advice for evaluation practice, training, and utilization in the context of concurrent events including downsized university budgets, the rise of the Internet, and increased accountability at all levels of government. Panel members will discuss the roles, responsibilities, and reactions of evaluators during tough fiscal times, drawing on their varied experiences with an emphasis on evaluation in Land Grant University settings, especially Cooperative Extension and Outreach. |
|
| Session Title: Evaluating Leadership Development in Organizations | |||||||||||||||||||||
| Multipaper Session 275 to be held in CROCKETT B on Thursday, Nov 11, 10:55 AM to 12:25 PM | |||||||||||||||||||||
| Sponsored by the Business and Industry TIG | |||||||||||||||||||||
| Chair(s): | |||||||||||||||||||||
| Eric Abdullateef, Directed Study Services, eric.abdullateef@mac.com | |||||||||||||||||||||
| Ray Haynes, Indiana University, rkhaynes@indiana.edu | |||||||||||||||||||||
|
| Session Title: National Evaluation Capacity Development | |||
| Panel Session 276 to be held in CROCKETT C on Thursday, Nov 11, 10:55 AM to 12:25 PM | |||
| Sponsored by the International and Cross-cultural Evaluation TIG | |||
| Chair(s): | |||
| Hallie Preskill, FSG Social Impact Advisors, hallie.preskill@fsg-impact.org | |||
| Discussant(s): | |||
| Michael Quinn Patton, Utilization-Focused Evaluation, mqpatton@prodigy.net | |||
| Abstract: Within the social policy reform debate occurring in several countries, much attention has been given to policy advising and formulation, as well as to policy (and budget) decision making. However, the real challenge appears to be implementing policy reforms: “translating” policy statements into development results for vulnerable populations, including poor children and women. Strengthening national social systems to implement policies is therefore paramount. A strong national evaluation system is crucial for providing essential information and analysis to ensure that such policies are being implemented in the most effective and efficient manner, for reviewing policy implementation and design, and for detecting bottlenecks and informing adjustments that enhance systemic capacities to deliver results. While more and more countries are designing and implementing national evaluation systems, the technical capacity to develop evaluation systems that meet international quality standards is often weak. National strategies to strengthen evaluation capacities are therefore needed. These strategies should be comprehensive and integrated, addressing both the technical and the political side, as well as the three levels of capacity development: individual, institutional, and the enabling environment. |
|
| Session Title: National Aeronautics and Space Administration (NASA) Office of Education’s Portfolio Evaluation Approach: Focus on Questions That Provide High Value Answers | ||||
| Multipaper Session 277 to be held in CROCKETT D on Thursday, Nov 11, 10:55 AM to 12:25 PM | ||||
| Sponsored by the Government Evaluation TIG | ||||
| Chair(s): | ||||
| Joyce Winterton, National Aeronautics and Space Administration, joyce.l.winterton@nasa.gov | ||||
|
| Session Title: Evaluating Education, Health Education and Agriculture Education Around the Globe | |||||||||||||||||||||||
| Multipaper Session 278 to be held in REPUBLIC A on Thursday, Nov 11, 10:55 AM to 12:25 PM | |||||||||||||||||||||||
| Sponsored by the International and Cross-cultural Evaluation TIG | |||||||||||||||||||||||
|
| Session Title: Keep an Eye on the Basics: The Importance of Evaluating Public Health Program Infrastructure | |||||||
| Panel Session 279 to be held in REPUBLIC B on Thursday, Nov 11, 10:55 AM to 12:25 PM | |||||||
| Sponsored by the Health Evaluation TIG | |||||||
| Chair(s): | |||||||
| Leslie Fierro, SciMetrika, let6@cdc.gov | |||||||
| Abstract: When presented with evaluation questions about how a public health program is improving population health, many public health practitioners involved in evaluative activities naturally gravitate toward designing an evaluation focused on one or more public health interventions. Despite the potentially important contribution of interventions to improving population health, many other activities are conducted as part of public health programs. High-quality evaluation portfolios of public health programs will include evaluations of partnerships and surveillance efforts intended to provide a solid infrastructure that practitioners can draw upon to plan, implement, evaluate, and sustain interventions. Presenters will discuss collaborative efforts between the Centers for Disease Control and Prevention and representatives from state asthma programs to develop and disseminate materials helpful for designing and implementing evaluations focused on infrastructure. State asthma program evaluation experiences will be shared, with a special emphasis on how the information was used to improve public health practice. |
|
| Session Title: A Checklist for Planning, Implementing and Evaluating Implementation Quality | |||||||||||
| Multipaper Session 280 to be held in REPUBLIC C on Thursday, Nov 11, 10:55 AM to 12:25 PM | |||||||||||
| Sponsored by the Collaborative, Participatory & Empowerment Evaluation TIG | |||||||||||
| Chair(s): | |||||||||||
| Jason Katz, University of South Carolina, katzj@email.sc.edu | |||||||||||
| Discussant(s): | |||||||||||
| Sandra Naoom, National Implementation Research Network, sandra.naoom@unc.edu | |||||||||||
|