Session Title: Equity and Quality in Evaluation: Ideas and Illustrations From the Field
Panel Session 502 to be held in Lone Star A on Friday, Nov 12, 9:15 AM to 10:45 AM
Sponsored by the Presidential Strand and the Research on Evaluation TIG
Chair(s):
Jennifer Greene, University of Illinois at Urbana-Champaign, jcgreene@illinois.edu
Discussant(s):
Valerie Williams, The Globe Program, vwilliams@globe.gov
Abstract: As a judgmental practice, evaluation inherently advances particular values. The values may be those of methodological integrity and defensibility, political independence and credibility, usefulness, cultural responsiveness, democratization, or some combination thereof. These different value stances well reflect the theoretical pluralism of the evaluation field. The values of evaluation show up in practice through the evaluation's purpose and audience, the key questions asked, and especially the criteria used to make judgments of program quality. This panel explores the justification for and characteristics of an evaluation practice that intentionally and explicitly advances the value of equity, and that practice's contributions to evaluation quality. Equity refers to the explicit representation of the interests of the stakeholders least well served in the context at hand, toward greater fairness in opportunity and accomplishment for these stakeholders. The panel features examples from the context of STEM education program evaluation.
Session Title: Model Forms, Program Theory, And Unexpected Behavior: What Are the Implications For Program Implementation and Evaluation?
Think Tank Session 503 to be held in Lone Star B on Friday, Nov 12, 9:15 AM to 10:45 AM
Sponsored by the Systems in Evaluation TIG
Presenter(s):
Jonathan Morell, Vector Research Center, jonny.morell@newvectors.net
Discussant(s):
Jonathan Morell, Vector Research Center, jonny.morell@newvectors.net
Sanjeev Sridharan, University of Toronto, sridharans@smh.toronto.on.ca
Abstract: Visual forms of logic models (e.g., flowchart, system, input/output) constrain and influence both beliefs about program theory and choices about methodologies and measures. All this has a powerful influence on expectations about what a program will do and what evaluation can reveal, and thus on the surprises and unexpected program behaviors that lie in wait for evaluators. This think tank will explore these relationships. Participants will be presented with different logic model forms for the same program, accompanied by a discussion of the presenters' beliefs about the implications of each form for theory, methodology, and measurement. Our focus will be on what kinds of representations are good enough to assist program implementation and evaluation, and on the relative advantages of the different forms. Literature from the field of knowledge translation will inform the discussion. Breakouts will probe, critique, and modify the presenters' assertions. Two scenarios will be presented and discussed.
Session Title: Youth Led Evaluation in Action: Stomping Out the Stigma of Mental Illness
Demonstration Session 504 to be held in Lone Star C on Friday, Nov 12, 9:15 AM to 10:45 AM
Sponsored by the Collaborative, Participatory & Empowerment Evaluation TIG
Presenter(s):
Cheri Hoffman, Centerstone Research Institute, cheri.hoffman@centerstoneresearch.org
James Martin, Mule Town Family Network, jmartin@tnvoices.org
Abstract: Youth from the Mule Town Family Network System of Care project in Maury County, TN, are planning a week-long "research camp" for the summer of 2010. Professional evaluators will train 8-10 young people in a participatory program evaluation curriculum known as "Stepping Stones." Community experts in poetry/spoken word, music, and dance will join with youth to create a performance about the stigma associated with mental illness. Youth will present a community performance of their work, and then lead focus groups exploring the topic of stigma and how the artistic representations of youth's experience of mental illness have changed people's perceptions. In this session, youth will share their experiences, the evaluation results of their project, and how initiating and carrying out an evaluation project has affected their personal development. Staff will share the process of engaging youth in evaluation and the successes and challenges in completing a youth-led evaluation project.
Session Title: Strategic Learning: An Embedded Approach for Evaluating Complex Change
Panel Session 505 to be held in Lone Star D on Friday, Nov 12, 9:15 AM to 10:45 AM
Sponsored by the Non-profit and Foundations Evaluation TIG
Chair(s):
Gale Berkowitz, David and Lucile Packard Foundation, gberkowitz@packard.org
Discussant(s):
Gale Berkowitz, David and Lucile Packard Foundation, gberkowitz@packard.org
Abstract: This panel will discuss strategic learning, an approach to evaluation that works well with complicated and complex strategies that evolve over time and have multiple causal paths or ways of achieving outcomes. These strategies present unique challenges to conventional program evaluation and require fresh thinking and new approaches to ensure that evaluation is relevant and useful. Strategic learning means using evaluation to help organizations or groups learn in real-time and adapt their strategies to the changing circumstances around them. It means making evaluation a part of the intervention—embedding it so that it influences the process. The panel will describe the concept and principles of strategic learning and how it differs from traditional evaluation approaches. Presenters will describe what strategic learning looks like in practice based on their experiences, and will discuss innovative tools and methods that can be used to promote strategic learning.
Session Title: Promoting Quality Impact Studies: Constructive, Context-Appropriate Policies for Strengthening Research Designs for Impact Evaluations
Panel Session 506 to be held in Lone Star E on Friday, Nov 12, 9:15 AM to 10:45 AM
Sponsored by the Quantitative Methods: Theory and Design TIG
Chair(s):
George Julnes, University of Baltimore, gjulnes@ubalt.edu
Abstract: The recent controversy over the role and value of random assignment experiments, particularly with regard to the U.S. Department of Education, has raised the issues of what constitutes a strong evaluation design (one more likely to yield valid impact estimates and judgments of quality) and when such designs should be employed. As the evaluation community has grappled further with these issues, some tentative resolution of the controversy has resulted. Across its four presentations, this session will provide a framework for assessing the context as it relates to the value of alternative evaluation designs, report on a government assessment of when different designs are most appropriate, provide an example of mixing methods to strengthen a particular research design, and suggest multiple dimensions to consider in evaluating the value of different designs.
Session Title: The American Evaluation Association and Its Local Affiliates: Shaping Our Future Together
Think Tank Session 507 to be held in Lone Star F on Friday, Nov 12, 9:15 AM to 10:45 AM
Sponsored by the AEA Conference Committee
Presenter(s):
Rachel Hickson, Montgomery County Public Schools, rachel_a_hickson@mcpsmd.org
Discussant(s):
Michael Hendricks, Independent Consultant, mikehendri@aol.com
Beverly A Parsons, InSites, bparsons@insites.org
Stewart Donaldson, Claremont Graduate University, stewart.donaldson@cgu.edu
Abstract: Local affiliates continue their dynamic role in AEA. Local affiliate rosters closely reflect overall AEA membership as well as the evaluation profession. The AEA 2010 work plan addresses policy that will shape the future of the relationship between affiliates and AEA. Members of the Board policy work group on affiliates policy will be invited to this session to discuss their work and its status to date. A World Café format will then be used to discuss local affiliates' strategies and goals for their work, within the context of AEA's broad policies. Representatives of AEA affiliates at different stages of development will be invited to comment on affiliates' needs.
Roundtable Rotation I: Making Decisions About Program Continuation: A Step-by-Step Process
Roundtable Presentation 508 to be held in MISSION A on Friday, Nov 12, 9:15 AM to 10:45 AM
Sponsored by the Human Services Evaluation TIG
Presenter(s):
Carla Clasen, Wright State University, carla.clasen@wright.edu
Betty Yung, Wright State University, betty.yung@wright.edu
Carl Brun, Wright State University, carl.brun@wright.edu
Katherine Cauley, Wright State University, katherine.cauley@wright.edu
Cheryl Meyer, Wright State University, cheryl.meyer@wright.edu
Abstract: Decisions frequently have to be made about program continuation when a program's initial funding ends or is reduced. When such decisions must be made about multiple programs competing for limited ongoing funding, decision makers must take program effectiveness, cost, and popularity into account. Evaluators can assist stakeholders in identifying the relevant factors and provide data that will assist in making sometimes difficult choices among programs. This presentation will describe a process that helps stakeholders identify the specific factors that should be taken into account, the relative weight of each factor in contributing to decision making, and a method of measuring each factor.
Roundtable Rotation II: New Innovations in Understanding and Measuring Transfer of Learning in Human Services Skills-based Training
Roundtable Presentation 508 to be held in MISSION A on Friday, Nov 12, 9:15 AM to 10:45 AM
Sponsored by the Human Services Evaluation TIG
Presenter(s):
Robin Leake, University of Denver, rleake@du.edu
Cathryn Potter, University of Denver, cathryn.potter@du.edu
Kathryn Schroeder, University of Denver, kathryn.schroeder@du.edu
Abstract: This roundtable will address the topic of transfer of learning in child welfare training. Because training is considered one of the key drivers for implementing practice and policy changes in an organization, evaluators must have a good understanding of the individual and organizational climate factors that influence learning and the transfer of learning to the job. Facilitators will discuss how Holton's (1996) transfer model and Learning Transfer Systems Inventory are being used in a training evaluation of the National Child Welfare Workforce Institute's leadership academy for supervisors and managers, and will describe the design, methods, and preliminary results of this ongoing evaluation. Participants will be invited to share models for conceptually understanding transfer of learning and innovative strategies for measuring transfer-of-learning outcomes.
Session Title: How Evaluation Policies Affect Evaluation Quality in a Texas Public School District
Panel Session 509 to be held in MISSION B on Friday, Nov 12, 9:15 AM to 10:45 AM
Sponsored by the
Chair(s):
Karen Looby, Austin Independent School District, karen.looby@austinisd.org
Discussant(s):
Maria Whitsett, Moak, Casey & Associates, mwhitsett@moakcasey.com
Abstract: Federal and state departments of education and local school boards regularly institute policies requiring the evaluation of educational programs. Often, these policies support the development and implementation of high-quality program evaluations. However, they may also have unintended consequences that undermine evaluation quality. Program evaluators from the Austin Independent School District will illustrate how evaluation policies affect program evaluation work in the district. Panelists will describe policy influences on district program evaluations and highlight theoretical and practical issues that arise in the work. The establishment of district evaluation priorities and the evaluations of the district's teacher pay-for-performance program, American Recovery and Reinvestment Act of 2009 (ARRA)-funded programs, and externally provided programs operating within the district will be used as illustrations. The panelists' presentations will set the stage for a collegial discussion about conducting quality evaluation work that is characterized by integrity, accuracy, and usefulness within a policy-driven environment.
Session Title: Enhancing the Quality of Evaluation Design, Data Collection, and Reports Through Peer Review
Think Tank Session 510 to be held in BOWIE A on Friday, Nov 12, 9:15 AM to 10:45 AM
Sponsored by the Independent Consulting TIG
Presenter(s):
Sally Bond, The Program Evaluation Group LLC, usbond@mindspring.com
Discussant(s):
Sally Bond, The Program Evaluation Group LLC, usbond@mindspring.com
Courtney Malloy, Vital Research LLC, courtney@vitalresearch.com
Abstract: This think tank responds directly to two elements of this year's conference theme of evaluation quality: (1) How is evaluation quality conceptualized and operationalized? (2) How do we ensure evaluation quality in our practice? Since AEA 2004, the Independent Consulting TIG has operated a Peer Review process for its members. Having established an effective process for reviewing evaluation reports, the current co-chairs of the IC TIG's Peer Review propose to expand the service to include the review of evaluation designs and data collection tools. After a brief presentation about the existing framework for reviewing evaluation reports, the co-chairs will present two new draft frameworks for reviewing and providing feedback on evaluation designs and data collection tools. The purpose of the think tank is to invite comments on the new frameworks and refine them accordingly.
Session Title: Utilizing Evaluation Methods to Provide Quality Health Care Services to Underserved Populations
Multipaper Session 511 to be held in BOWIE B on Friday, Nov 12, 9:15 AM to 10:45 AM
Sponsored by the Multiethnic Issues in Evaluation TIG
Chair(s):
Kevin E Favor, Lincoln University, kfavor@lincoln.edu
Session Title: Improving Quality of Programs and Evaluation: Examples From the Field
Multipaper Session 512 to be held in BOWIE C on Friday, Nov 12, 9:15 AM to 10:45 AM
Sponsored by the
Chair(s):
Lennise Baptiste, Kent State University, lbaptist@kent.edu
Discussant(s):
Wendy DuBow, National Center for Women and Information Technology, wendy.dubow@colorado.edu
Roundtable Rotation I: Big Money, More Scrutiny: How to Forge Evaluator-Early Childhood Education Program Partnerships in Order to Produce Clear, Relevant, and Useful Data to Inform Policy and Practice
Roundtable Presentation 513 to be held in GOLIAD on Friday, Nov 12, 9:15 AM to 10:45 AM
Sponsored by the Advocacy and Policy Change TIG and the Research on Evaluation TIG
Presenter(s):
Marijata Daniel-Echols, HighScope Educational Research Foundation, mdaniel-echols@highscope.org
Abstract: As public attention to the importance of early childhood education has risen, so has the pressure for preschool programs to show measurable results. This focus on accountability translates into greater demand for research and evaluation projects. This larger context has led to more opportunities for evaluators and programs to partner in ways they may not have in the past. These partnerships can be both a point of strength and a challenge. It is essential to have clear expectations of what each partner stands to gain or lose and what each must contribute to the evaluation process. This session will use examples from Head Start and state-funded preschool evaluation projects to explore lessons learned on how to forge successful evaluator-program partnerships that produce clear, relevant, and useful data that can be used to inform both policy and practice.
Roundtable Rotation II: A Study on the Indicator of High Quality Papers: The Case of the Chinese Academy of Sciences (CAS)
Roundtable Presentation 513 to be held in GOLIAD on Friday, Nov 12, 9:15 AM to 10:45 AM
Sponsored by the Advocacy and Policy Change TIG and the Research on Evaluation TIG
Presenter(s):
Haijun Zheng, Chinese Academy of Sciences, haijzheng@casipm.ac.cn
Zhongcheng Guan, Chinese Academy of Sciences, guan@casipm.ac.cn
Haiyang Hu, Chinese Academy of Sciences, hyhu@cashq.ac.cn
Bing Shi, Chinese Academy of Sciences, bshi@cashq.ac.sn
Abstract: The number of SCI papers is one of the most commonly used indicators in R&D evaluation. Theoretically, papers published in journals with high impact factors (according to JCR statistics) are of high quality. In the evaluation practice of CAS, papers in the top 15% of SCI journals as ranked by JCR are called "high quality papers." In this study, we first examine the consistency between highly cited papers and "high quality papers" in CAS, and the consistency between work with important social impact (e.g., awards) and "high quality papers." Second, we describe the distribution of CAS papers among SCI journals by JCR rank, and study how the distribution pattern changed before and after the indicator was adopted. Furthermore, we compare the pattern with that of other national research institutes. We can thus examine how the adoption of this indicator has affected the publishing behavior of CAS researchers.
Roundtable Rotation I: Practices for Working With and Building Capacity of Local Evaluation Consultants in International Development
Roundtable Presentation 514 to be held in SAN JACINTO on Friday, Nov 12, 9:15 AM to 10:45 AM
Sponsored by the International and Cross-cultural Evaluation TIG
Presenter(s):
Elizabeth Hutchinson, Land O'Lakes International Development, erhutchinson@landolakes.com
Meredith Blair, Humanity United, mblair@humanityunited.org
Abstract: Many international development programs work with host-country external consultants who bring valuable localized knowledge and expertise in evaluation. Ensuring the quality of these evaluations and the mutual satisfaction of the partnership rests on thoughtful and thorough preparation. Successfully working with local evaluators encompasses two main approaches: 1) strong start-up systems and strategies and 2) a commitment to strengthening the capacity of local consultants as needed. Managing this process well pays off in robust data collection and analysis, and it further strengthens local capacity, fosters sustainability, and ensures quality evaluations. This roundtable aims to provide an opportunity for participants to share valuable insights on the challenges, limitations, practices, and opportunities that have emerged in their own work in international contexts. The discussion, facilitated by Land O'Lakes International Development and Humanity United, will include recommendations, practices, and lessons learned to improve the practice of working with local evaluators in international settings.
Roundtable Rotation II: Exploring Evaluation Quality in International Development Evaluation: An Examination of How International Development Organizations Issue and Contract Evaluations
Roundtable Presentation 514 to be held in SAN JACINTO on Friday, Nov 12, 9:15 AM to 10:45 AM
Sponsored by the International and Cross-cultural Evaluation TIG
Presenter(s):
Anne Cullen, Western Michigan University, anne.cullen@wmich.edu
Daniela Schroeter, Western Michigan University, daniela.schroeter@wmich.edu
Michele Tarsilla, Western Michigan University, michele.tarsilla@wmich.edu
Jim Rugh, Independent Consultant, jimrugh@mindspring.com
Abstract: Recent studies have shown that donor dominance of the international development evaluation process can pose serious limitations to the independence of evaluators. Specifically, rigid evaluation terms of reference (TORs) and requests for proposals (RFPs) limit evaluators' ability to determine independently (a) how programs should be evaluated, (b) which evaluation methods are most appropriate for use, (c) how to sample stakeholders for interviews or consultations, and (d) how the evaluation is to be conducted. Moreover, in many cases, access to TORs/RFPs is limited to a select number of vendors/consultants. This session explores the implications of the issuing and contracting processes of international development evaluations for evaluation quality. As an example, we present the results of a 2010 study on TORs and RFPs issued by international development organizations. Presenters will pose a number of questions to roundtable participants to highlight strengths, weaknesses, and areas of improvement for international development evaluation contracting.
Session Title: Beyond the Classroom: Assessment in Non-Traditional Settings
Multipaper Session 515 to be held in TRAVIS A on Friday, Nov 12, 9:15 AM to 10:45 AM
Sponsored by the Assessment in Higher Education TIG
Chair(s):
Howard Mzumara, Indiana University-Purdue University Indianapolis, hmzuymara@iupui.edu
Session Title: The Integration of Video and In Situ Simulation Practices to Evaluate Organizational Processes
Skill-Building Workshop 516 to be held in TRAVIS B on Friday, Nov 12, 9:15 AM to 10:45 AM
Sponsored by the Integrating Technology Into Evaluation TIG
Presenter(s):
William Hamman, William Beaumont Hospital, william.hamman@beaumonthospitals.com
Jill Stefaniak, William Beaumont Hospital, j_stefaniak@hotmail.com
Abstract: Continuing education is a mandatory requirement for many positions across a variety of industries. A key challenge in training and development is providing training through non-traditional means. As technological advances are made, simulation is becoming a recurring teaching method used for assessment. Utilizing different assessment tools to link training initiatives with appropriate goals and objectives, we have detailed a process for defining curricula targeted to trainees' developmental levels. These instructional strategies integrate innovative instructional design processes, through low- and high-fidelity simulations, with traditional learning methodologies. These innovative methods will allow participants to analyze current training and debriefing practices, re-evaluate learning outcomes using in situ simulation to identify risk and process issues, and develop new plans to integrate activities and validate assessment metrics to enhance learning and transfer of learning, and ultimately to decrease error and improve performance.
Session Title: Evaluation of Efforts to Create Safer Environments for Lesbian, Gay, Bisexual, and Transgender (LGBT) Youth
Multipaper Session 517 to be held in TRAVIS C on Friday, Nov 12, 9:15 AM to 10:45 AM
Sponsored by the Lesbian, Gay, Bisexual, Transgender Issues TIG
Chair(s):
Joseph Kosciw, Gay, Lesbian & Straight Education Network, jkosciw@glsen.org
Session Title: Ecologies of Collaboration in the Arts
Multipaper Session 518 to be held in TRAVIS D on Friday, Nov 12, 9:15 AM to 10:45 AM
Sponsored by the Evaluating the Arts and Culture TIG
Chair(s):
Min Zhu, University of South Carolina, helen970114@gmail.com
Discussant(s):
Katie Steedly, Steedly Consulting, k.steedly@rcn.com
Session Title: Challenges and Solutions in Implementing and Conducting Quality Evaluations on Children, Youth, and Families
Multipaper Session 519 to be held in INDEPENDENCE on Friday, Nov 12, 9:15 AM to 10:45 AM
Sponsored by the Human Services Evaluation TIG
Chair(s):
Margaret Polinsky, Parents Anonymous Inc, ppolinsky@parentsanonymous.org
Session Title: Does Sensemaker Make Sense? Evaluating Development Initiatives Through Narrative Capture and Tagging in Kenya and Latin America
Demonstration Session 520 to be held in PRESIDIO A on Friday, Nov 12, 9:15 AM to 10:45 AM
Sponsored by the Systems in Evaluation TIG and the Qualitative Methods TIG
Presenter(s):
John Hecklinger, GlobalGiving Foundation, jhecklinger@globalgiving.org
Irene Guijt, Learning by Design, iguijt@learningbydesign
Abstract: John Hecklinger and Irene Guijt will demonstrate how GlobalGiving in one experiment, and the Centro Latinoamericano para el Desarrollo Rural (RIMISP) in another, explored the possibility of engaging community members, implementing organizations, researchers, donors, and grantmakers in a cost-effective, real-time evaluation effort of small-scale projects and policy-influencing research processes in the developing world. We will demonstrate how we used Cognitive Edge's SenseMaker software on the ground in Africa and Latin America to capture and tag stories gathered from community members, researchers, and social change agents, enabling us to visually depict patterns of impact and change. Subsequent analysis led to strategic questions. Rooted in complexity theory, which looks at systems that are inherently unpredictable, and in cognitive science, which considers how people make sense, this experiment explores how multiple perspectives illuminate underlying patterns when more traditional means of evaluation are not workable.
Session Title: Evaluation Capacity Building (ECB) Models, Measures, And Outcomes: Taking Stock to Forge Ahead
Panel Session 521 to be held in PRESIDIO B on Friday, Nov 12, 9:15 AM to 10:45 AM
Sponsored by the Organizational Learning and Evaluation Capacity Building TIG
Chair(s):
Tina Taylor-Ritzler, Dominican University, tina.ritzler@gmail.com
Discussant(s):
Hallie Preskill, FSG Social Impact Advisors, hallie.preskill@fsg-impact.org
Abstract: Evaluation capacity building (ECB) has become an important process for organizations to respond to a myriad of accountability demands. As such, the ECB literature has grown to include models, measures and reported outcomes. This panel provides an analysis of what we know about current ECB efforts and identifies future directions for ECB research. The first presentation by Labin et al. reports the results of a research synthesis of the ECB literature. The second by Taylor-Ritzler et al. reports the results of a mixed-methods ECB model validation study and discusses a validated survey. The third presentation by Suarez-Balcazar et al. reports the results of a qualitative study that was conducted to further specify elements of the model presented in the second presentation and discusses implications of the study findings for future model validation efforts. Finally, discussant Hallie Preskill identifies implications of the presentations for current and future ECB research and practice.
Session Title: Evaluating National Substance Abuse Prevention Programs
Multipaper Session 522 to be held in PRESIDIO C on Friday, Nov 12, 9:15 AM to 10:45 AM
Sponsored by the Alcohol, Drug Abuse, and Mental Health TIG
Chair(s):
Trena Anastasia, University of Wyoming, tanastas@uwyo.edu
Roundtable Rotation I: Cracking Black Box Health System Performance Evaluations: Potential Practices From Field Applications of Management Oriented Evaluation Models
Roundtable Presentation 523 to be held in BONHAM A on Friday, Nov 12, 9:15 AM to 10:45 AM
Sponsored by the Theories of Evaluation TIG and the Health Evaluation TIG
Presenter(s):
Jacob Kawonga, Management Science for Health, jkawonga@gmail.com
Abstract: Objective: To demonstrate field-level applications of management-oriented evaluation models that have the potential to improve health systems performance evaluation. Design: Exploratory, literature-based study. Results: Cases of field-level applications of management-oriented evaluation approaches in Malawi, Uganda, Kenya, Tanzania, Rwanda, and Namibia point to approaches with the capacity to demonstrate evidence of effective health systems strengthening interventions, a challenge that has not been resolved by expenditure-based evaluation approaches.
Roundtable Rotation II: Towards Translational Process Evaluation: Implementation, Fidelity, Integration, and Sustainability – A Roundtable Discussion
Roundtable Presentation 523 to be held in BONHAM A on Friday, Nov 12, 9:15 AM to 10:45 AM
Sponsored by the Theories of Evaluation TIG and the Health Evaluation TIG
Presenter(s):
Oliver Massey, University of South Florida, massey@fmhi.usf.edu
Abstract: In the last decade, behavioral health researchers and practitioners have come to recognize the critical importance of using service interventions that have established evidence of their efficacy. Unfortunately, it is now recognized that effective programs are not always readily adopted, and that there are significant gaps in the translation of theoretically sound best practices into workable programs in the field. The translation of research into practice involves recognizing and solving complex problems that deal with the technology of implementation. For evaluators, there are significant leverage points for a renewal of the value of process evaluation, interpreted through the lens of implementation science. In this roundtable I will briefly review issues in translational science and its relevance for process evaluation. The roundtable will then provide an opportunity to discuss and explore potential roles for evaluators in the explicit process-related roles of implementation, fidelity, integration, and sustainability.
Session Title: Aligning Priorities of Diverse Stakeholders Using Collaborative Evaluation Planning
Think Tank Session 524 to be held in BONHAM B on Friday, Nov 12, 9:15 AM to 10:45 AM
Sponsored by the Non-profit and Foundations Evaluation TIG
Presenter(s):
Katie Zaback, JVA Consulting LLC, katie@jvaconsulting.com
Discussant(s):
Randi Nelson, JVA Consulting LLC, randi@jvaconsulting.com
Nancy Zuercher, JVA Consulting LLC, nancy@jvaconsulting.com
Julia Alvarez, JVA Consulting LLC, julia@jvaconsulting.com
Abstract: In this Think Tank session, participants will explore the challenges of planning a multipurpose evaluation that meets the needs of diverse stakeholders, specifically the needs of multiple funders. The chair will present a case study and ask participants to engage in collaboration and consensus building to develop an evaluation plan that is responsive to the needs of all parties and succeeds in measuring the outcomes of the initiative. Session participants will join facilitated breakout groups that represent individual stakeholders—a nonprofit parent organization, statewide affiliates and various funders—and will be asked to design the beginning phases of an evaluation plan. Breakout groups will present their plans to the larger audience, and then collaborate to incorporate all of the small group evaluation plans into one. The session concludes with attendees sharing their own experiences designing high quality multipurpose evaluations that meet the needs of multiple audiences.
Session Title: Evaluating Science, Technology, Engineering, and Mathematics (STEM) Initiatives in K-12 Education
Multipaper Session 525 to be held in BONHAM C on Friday, Nov 12, 9:15 AM to 10:45 AM
Sponsored by the Pre-K - 12 Educational Evaluation TIG
Chair(s):
James P Van Haneghan, University of South Alabama, jvanhane@usouthal.edu
Discussant(s):
Tom McKlin, The Findings Group LLC, tom.mcklin@gmail.com
Session Title: Evaluating Literacy Curricula for Adolescents: Results From Three Years of Striving Readers
Panel Session 526 to be held in BONHAM D on Friday, Nov 12, 9:15 AM to 10:45 AM
Sponsored by the Pre-K - 12 Educational Evaluation TIG
Chair(s):
Stefanie Schmidt, United States Department of Education, stefanie.schmidt@ed.gov
Discussant(s):
David Francis, University of Houston, david.francis@times.uh.edu
Abstract: This panel will discuss the findings from three years of Striving Readers program evaluations from five sites across the country. Panel members will report findings based on experimental research designs that provide the most rigorous evaluations to date of a number of adolescent reading interventions. These independent evaluations provide a wealth of detailed information to policymakers and school administrators on the important but under-researched area of adolescent literacy education. The papers being presented add significantly to our understanding of middle school literacy education and the potential for several intervention strategies to be effective. They also provide insights into the challenges of maintaining a high-quality experimental research design in the field. Despite substantial obstacles to conducting rigorous experiments in school settings, evaluators have been able to negotiate compromises that do not diminish the quality of their evaluation designs. The results advance our understanding of both adolescent literacy and practical research methodology.
Session Title: Challenges and Best Practices in Benefit Cost Studies of Research and Technology Programs
Panel Session 527 to be held in BONHAM E on Friday, Nov 12, 9:15 AM to 10:45 AM
Sponsored by the Research, Technology, and Development Evaluation TIG
Chair(s):
Rosalie Ruegg, TIA Consulting Inc, ruegg@ec.rr.com
Abstract: Using a case study of a renewable energy technology program, this panel of experts and practitioners will discuss challenges in benefit-cost studies and how they can best be met. Specific challenges to be addressed include extending the scope beyond projects to programs and portfolios of projects, inclusion of multiple types of benefits, development of the next-best alternative from which to calculate benefits, strategies for tackling attribution, and data collection and assumptions for credible analysis. The U.S. Department of Energy (DOE) has drafted a Guide aimed at incorporating best practices in benefit-cost analysis that was followed in four recent retrospective studies. One of these studies--a benefit-cost analysis of DOE's investment in solar photovoltaic energy systems--will be presented as a case study. Panelists who have a broad view of best practice in RTD evaluation in the U.S. and Europe, including benefit-cost analysis, will offer their opinions in a lively discussion.
Session Title: The Weakest Link: Does Good Evaluation Lead to Good Decisions? How to Assess Your Organization
Skill-Building Workshop 528 to be held in Texas A on Friday, Nov 12, 9:15 AM to 10:45 AM
Sponsored by the Government Evaluation TIG
Presenter(s):
Thea C Bruhn, United States Department of State, bruhntc@state.gov
Abstract: GIGO or Garbage In-Garbage Out is a familiar notion: bad data result in ineffective or inappropriate follow-on actions. Does the antithesis apply? If they have good evaluation data, will leaders make good decisions? The use of data for strategic decision making is Business Intelligence (BI). BI also includes providing decision makers with intuitive methods for monitoring and analyzing data on an ongoing basis. Studies of what managers actually do, as opposed to what they are supposed to do, or what they say they do, have shown that even successful managers rarely, if ever, employ rational approaches. How does your organization measure up? In this workshop, participants will apply tools to look at decision-making in their organizations and the role and impact of evaluation data.
Session Title: Taking Control of Your Evaluation Career
Skill-Building Workshop 529 to be held in Texas B on Friday, Nov 12, 9:15 AM to 10:45 AM
Sponsored by the Evaluation Managers and Supervisors TIG
Presenter(s):
George Grob, Center for Public Program Evaluation, georgefgrog@cs.com
Ann Maxwell, United States Department of Health and Human Services, ann.maxwell@oig.hhs.gov
Abstract: This session will engage participants in a series of exercises designed to help them understand the many possibilities of a rewarding lifelong career in the field of evaluation; to identify both the broad and specific skills, knowledge, and experience conducive to achieving it; to evaluate where they currently stand; and to set goals for their own future career development. The tools aim to open each participating evaluator's vision to his or her roles and potential as an analyst/methodologist, substantive program expert, and manager/administrator/advisor. The session will explain how these skills naturally develop over a lifetime of evaluation practice, and how an evaluator can plan for and enjoy an expanding role of professionalism, influence, and stature over his or her career.
Session Title: Ensuring High-Quality Data Processes in Evaluation: Examples From Qualitative, Quantitative and Mixed Methods Work
Multipaper Session 530 to be held in Texas C on Friday, Nov 12, 9:15 AM to 10:45 AM
Sponsored by the
Chair(s):
Jacklyn Altuna, Berkeley Policy Associates, jacklyn@bpacal.com
Session Title: Recent Developments in Research and Development Evaluation: The Academic Side
Multipaper Session 531 to be held in Texas D on Friday, Nov 12, 9:15 AM to 10:45 AM
Sponsored by the Research, Technology, and Development Evaluation TIG
Chair(s):
Juan Rogers, School of Public Policy Georgia Institute of Technology, jdrogers@gatech.edu
Discussant(s):
Juan Rogers, School of Public Policy Georgia Institute of Technology, jdrogers@gatech.edu
Session Title: Clients Speak Out About Evaluation
Multipaper Session 532 to be held in Texas E on Friday, Nov 12, 9:15 AM to 10:45 AM
Sponsored by the Evaluation Use TIG
Chair(s):
Susan Tucker, Evaluation & Development Associates, sutucker1@mac.com
Discussant(s):
Lyn Shulha, Queen's University at Kingston, lyn.shulha@queensu.ca
Session Title: The Gaining Early Awareness and Readiness for Undergraduate Programs (GEAR UP) Experience: Exploring the Promise of Multi-site Evaluation
Panel Session 533 to be held in Texas F on Friday, Nov 12, 9:15 AM to 10:45 AM
Sponsored by the College Access Programs TIG
Chair(s):
Yvette Lamb, Academy for Educational Development, ylamb@aed.org
Discussant(s):
Melissa Panagides-Busch, Academy for Educational Development, mbusch@aed.org
Abstract: Gaining Early Awareness and Readiness for Undergraduate Programs (GEAR UP) is a federally funded program that provides services to prepare low-income middle and high school students to enter and succeed in postsecondary education. The Academy for Educational Development (AED) is working with multiple partnerships and a state agency to conduct external evaluations of GEAR UP programs. Data from six school districts and 40 schools will be used to conduct a multisite analysis to better understand the feasibility of various approaches to program evaluation. The session begins with our conceptualization of multisite evaluation of the GEAR UP program. The second paper presents an overview of GEAR UP programs managed by three agencies in three states. The final paper presents our process for data collection and analysis, including feedback we received from our clients in pursuing the multisite evaluation. The session will end with a discussion of key design questions.
Session Title: Evaluation and Program Quality
Multipaper Session 534 to be held in CROCKETT A on Friday, Nov 12, 9:15 AM to 10:45 AM
Sponsored by the Extension Education Evaluation TIG
Chair(s):
Lisa Townsend, University of New Hampshire, lisa.townsend@unh.edu
Session Title: Teaching About Specific Aspects of Evaluation
Multipaper Session 535 to be held in CROCKETT B on Friday, Nov 12, 9:15 AM to 10:45 AM
Sponsored by the Teaching of Evaluation TIG
Chair(s):
John Stevenson, University of Rhode Island, jsteve@uri.edu
Session Title: Evaluation Within Contested Spaces
Panel Session 536 to be held in CROCKETT C on Friday, Nov 12, 9:15 AM to 10:45 AM
Sponsored by the International and Cross-cultural Evaluation TIG
Chair(s):
Ross VeLure Roholt, University of Minnesota, rossvr@umn.edu
Abstract: International, humanitarian, and other aid agencies require evaluation for accountability and program improvement. Increasingly, evaluation has to be undertaken in communities under conditions of violent division. There is practice wisdom about how to conceptualize and implement this work, but it is not easily available; the same is true of guidance on social research under such conditions. This panel will offer a public, professional space for describing, clarifying, and understanding this work, suggesting practical strategies, tactics, and tools. Research on evaluation practice under these conditions will also be covered. A relevant bibliography will be distributed.
Session Title: Assessing the Health of and Improving the Evaluation Function Across the Government of Canada Through the Management Accountability Framework (MAF)
Panel Session 537 to be held in CROCKETT D on Friday, Nov 12, 9:15 AM to 10:45 AM
Sponsored by the Government Evaluation TIG
Chair(s):
Anne Routhier, Treasury Board of Canada, anne.routhier@tbs-sct.gc.ca
Abstract: In 2003, the Management Accountability Framework (MAF) was created by the Treasury Board of Canada Secretariat (TBS) to outline the expectations for good management held of senior public service managers. The MAF is structured around 10 key elements comprising 21 Areas of Management (AoMs). These elements are assessed on a periodic basis (either annually or tri-annually), and the results are reported to Deputy Heads of departments to assist them in identifying priority management issues. In this panel presentation, the TBS Centre of Excellence for Evaluation (CEE) will present its methodology and experience in assessing MAF AoM 6 (Quality and Use of Evaluation) in departments and agencies of the Government of Canada, and will be joined by representatives from Industry Canada, Canadian Heritage, and Indian and Northern Affairs Canada, who will share their experience in undergoing and utilizing MAF to improve the evaluation function in their departments.
Session Title: Assessing the Quality of Research Instruments Using Cognitive Lab Methodology: A Practical Discussion and Lessons Learned
Demonstration Session 538 to be held in SEGUIN B on Friday, Nov 12, 9:15 AM to 10:45 AM
Sponsored by the Qualitative Methods TIG
Presenter(s):
Joanna Gilmore, University of South Carolina, jagilmor@mailbox.sc.edu
Heather Bennett, University of South Carolina, bennethl@mailbox.sc.edu
Karen Price, University of South Carolina, pricekj@mailbox.sc.edu
Abstract: When considering "evaluation quality" it is important to critically analyze the appropriateness of research instruments used to garner data on a particular subject. One method employed by researchers from the University of South Carolina to review instruments is the cognitive lab methodology. Cognitive labs involve asking individuals to report their decision-making processes and analyzing the resulting verbal data to garner information about the cognitive processes that an individual uses to complete a task (Van Someren, Barnard, & Sandberg, 1994). This demonstration will describe the purpose of the cognitive labs, how cognitive labs were employed in two projects, and the methods by which data were collected and analyzed. The presenters will also share lessons learned in conducting cognitive labs. Additionally, participants will be provided with an opportunity to observe a "mock" cognitive lab and will be invited to pose questions and comments throughout the presentation.
Session Title: Complementary Approaches to Evaluating Social Safety Nets at the World Bank
Panel Session 539 to be held in REPUBLIC A on Friday, Nov 12, 9:15 AM to 10:45 AM
Sponsored by the International and Cross-cultural Evaluation TIG
Chair(s):
Cheryl Gray, World Bank, cgary@worldbank.org
Abstract: This session will illustrate the World Bank's Independent Evaluation Group's (IEG) approach to evaluating the World Bank's support to social safety nets world-wide. Specifically, the panel will demonstrate the various building blocks of the evaluation, the main themes of questions posed, and the approaches used in addressing the questions. The presentations will explore how different qualitative and quantitative methods complement each other to provide a rich set of evidence.
Session Title: Tools for Aligning National-Level and Local-Level Evaluations: Helping Grantees Evaluate Their Public Health Interventions
Skill-Building Workshop 540 to be held in REPUBLIC B on Friday, Nov 12, 9:15 AM to 10:45 AM
Sponsored by the Health Evaluation TIG
Presenter(s):
Shyanika Rose, Battelle Memorial Institute, rosesw@battelle.org
Joanne Abed, Battelle Memorial Institute, abedj@battelle.org
Carlyn Orians, Battelle Memorial Institute, orians@battelle.org
Linda Winges, Battelle Memorial Institute, winges@battelle.org
Abstract: Public health grant programs often encourage grantees to implement a range of interventions tailored to local needs. This represents a challenge to technical assistance providers charged with aligning local- and national-level evaluations. We present two tools that integrate varied strategies into a comprehensive intervention framework. One uses intervention pathways to categorize interventions into different types, pursuing multiple pathways toward a set of shared health outcomes. The other uses an intervention mapping matrix to categorize interventions by setting and type of change desired. Where setting and change type intersect, a "profile" can be accessed that contains ideas for evaluation questions, indicators, and data sources. Both approaches facilitate a clearer understanding of what to evaluate and lead to more appropriate and consistent evaluation across diverse interventions. Session participants will use the two methods to characterize their own (or their grantees') interventions and will be asked for input on how the tools can be improved.
Session Title: Engaging Participants in the Evaluation Process: A Participatory Approach
Multipaper Session 541 to be held in REPUBLIC C on Friday, Nov 12, 9:15 AM to 10:45 AM
Sponsored by the Collaborative, Participatory & Empowerment Evaluation TIG
Chair(s):
Seriashia Chatters, University of South Florida, schatter@mail.usf.edu