Session Title: Fuzzy Set Analysis: A Method for Understanding Causality in Complex Systems
Demonstration Session 894 to be held in Centennial Section A on Saturday, Nov 8, 3:05 PM to 3:50 PM
Sponsored by the Systems in Evaluation TIG and the Quantitative Methods: Theory and Design TIG
Presenter(s):
Kurt Moore, Walter R McDonald and Associates Inc, kmoore@wrma.com
Abstract: Governments, programs, and organizations are systems consisting of numerous interacting elements. Evaluations often seek to understand the behaviors of these complex systems. A primary task for evaluation is to decipher the causal combinations that create particular outcomes. Many assessment tools deliver continuous data that can be analyzed statistically; however, there are other ways to describe the behaviors of systemic elements. Evaluators gather qualitative data and often use instruments that deliver non-ratio measurement scores. Informed analysts can code disparate types of data according to their degrees of membership in sets - for example, the set called 'this organization is currently in a highly chaotic state.' Combinations of these fuzzy sets are then analyzed with Boolean truth tables, allowing evaluators to discover the complex causal combinations that are necessary and/or sufficient to produce certain outcomes. This demonstration will explain these methodological tools, including software that facilitates analysis.
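To make the sufficiency idea concrete, here is a minimal sketch of the standard fuzzy-set consistency measure (the proportion of a causal set's membership that lies within the outcome set); the membership scores and set names are hypothetical illustrations, not the software demonstrated in the session:

    # Minimal sketch of fuzzy-set sufficiency consistency (Ragin's measure:
    # sum of min(x, y) over sum of x). All membership scores are hypothetical.
    def consistency(cause, outcome):
        """How consistently high membership in 'cause' is matched by high
        membership in 'outcome'; values near 1.0 suggest sufficiency."""
        return sum(min(x, y) for x, y in zip(cause, outcome)) / sum(cause)

    # Degrees of membership for five cases in the set 'this organization is
    # currently in a highly chaotic state' and in an outcome set (illustrative).
    chaotic = [0.9, 0.7, 0.2, 0.8, 0.1]
    outcome = [0.8, 0.9, 0.3, 0.7, 0.2]
    print(f"sufficiency consistency = {consistency(chaotic, outcome):.2f}")

In a full analysis, each row of the truth table (each combination of fuzzy sets) would be scored this way before solving for necessary and/or sufficient combinations.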
Session Title: The Application of Survival Analysis Methods for Evaluation of Programs With Variable Program Entry and Exit Points
Expert Lecture Session 895 to be held in Centennial Section B on Saturday, Nov 8, 3:05 PM to 3:50 PM
Sponsored by the Quantitative Methods: Theory and Design TIG
Chair(s):
George Julnes, Utah State University, george.julnes@usu.edu
Presenter(s):
Jack Barnette, Colorado School of Public Health, jack.barnette@ucdenver.edu
Anne Wallis, University of Iowa, anne-wallis@uiowa.edu
Abstract: We present the application of survival analysis methods to the evaluation of programs in which participants enter at different times and remain enrolled for varying lengths of time. Survival analysis is specifically tailored to examining differential entry into and exit from a program over a given time period, with the ability to predict the odds of attaining the desired outcome based on logistic regression. While the primary use of survival analysis is in predicting the odds of survival or death, the methodology can be applied to other situations where participants enter and exit a program at different times and for different reasons. Kaplan-Meier curves will be introduced, along with Cox proportional hazards modeling, in an example of how this method can be used in program evaluation.
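As a rough illustration of the product-limit logic behind Kaplan-Meier curves, the sketch below estimates a survival curve from hypothetical program-exit data; the durations, censoring flags, and names are assumptions for illustration, not the presenters' data or software:

    # Minimal Kaplan-Meier sketch in plain Python; hypothetical data only.
    def kaplan_meier(durations, events):
        """Return (time, survival) pairs; events[i] is 1 if the outcome
        occurred at durations[i], 0 if the participant was censored."""
        survival, curve = 1.0, []
        for t in sorted({d for d, e in zip(durations, events) if e}):
            at_risk = sum(1 for d in durations if d >= t)
            occurred = sum(1 for d, e in zip(durations, events) if d == t and e)
            survival *= 1 - occurred / at_risk
            curve.append((t, survival))
        return curve

    # Months until program exit; 0 marks a participant still enrolled (censored)
    months = [3, 5, 5, 8, 12, 12, 15]
    exited = [1, 1, 0, 1, 0, 1, 1]
    for t, s in kaplan_meier(months, exited):
        print(f"month {t:>2}: S(t) = {s:.3f}")

The estimator's key property for evaluation is visible here: censored participants (those who entered late or are still enrolled) contribute to the at-risk count until they drop out of observation rather than being discarded.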
Session Title: Discussion of Student Awards Essays
Think Tank Session 896 to be held in Centennial Section C on Saturday, Nov 8, 3:05 PM to 3:50 PM
Sponsored by the Presidential Strand
Chair(s):
Leslie J Cooksy, University of Delaware, ljcooksy@udel.edu
Presenter(s):
Shanelle Boyle, Claremont Graduate University, shanelle.boyle@gmail.com
Courtney Coleman, Emory University, courtney_coleman1@hotmail.com
Tysza Gandha, University of Illinois Urbana-Champaign, tgandha2@uiuc.edu
S Lory Hovsepian, University of Montreal, sarine.lory.hovsepian@umontreal.ca
Lacy Mayes, University of Maryland, Baltimore County, lacy@carsonresearch.com
Patricia Moore Shaffer, College of William and Mary, pmshaf@wm.edu
Jerim Obure, University of Amsterdam, jerotus@yahoo.com
Cynthia Roberts, University of Rhode Island, cynthia_roberts@mail.uri.edu
Renelda Roberson, Texas Southern University, reneldaroberson@aol.com
Jingye Zhou, Syracuse University, jzhouid@gmail.com
Session Title: What Have We Learned About Randomized Control Trials (RCTs), Gold Standards, and Credible Evidence: Moving Beyond the Debates to Improve Policy and Practice
Expert Lecture Session 897 to be held in Centennial Section D on Saturday, Nov 8, 3:05 PM to 3:50 PM
Sponsored by the Research on Evaluation TIG
Chair(s):
Melvin Mark, Pennsylvania State University, m5m@psu.edu
Presenter(s):
Stewart I Donaldson, Claremont Graduate University, stewart.donaldson@cgu.edu
Discussant(s):
Christina Christie, Claremont Graduate University, tina.christie@cgu.edu
Abstract: This expert lecture will summarize the key findings from a new book, 'What Counts as Credible Evidence in Applied Research and Evaluation Practice' (Donaldson, Mark, & Christie, 2008). Many thorny debates about what counts as evidence have occurred in recent years, but few have sorted out the issues in a way that directly informs evaluation practice. In this volume, internationally renowned evaluators explore the challenges of designing and executing high-quality evaluations in contemporary evaluation practice. This lecture will summarize what can be learned from the chapter authors about the strengths and weaknesses of both experimental and non-experimental approaches for gathering credible and actionable evidence. A proposal to revise the notion of an 'Experimenting Society' to an 'Evidence-based Global Society', which includes replacing the 'RCT Gold Standard' with the gold standard of 'Methodological Appropriateness', will be offered as an avenue toward improving evaluation policy and practice.
Session Title: The Arizona Department of Education's Dropout Prevention Toolkit: An Innovative Interactive Tool That Disseminates Best Practice Findings From 39 Dropout Prevention Programs
Demonstration Session 898 to be held in Centennial Section E on Saturday, Nov 8, 3:05 PM to 3:50 PM
Sponsored by the Integrating Technology Into Evaluation TIG
Presenter(s):
Claire Brown, LeCroy and Milligan Associates Inc, clairenbrown@aol.com
Robert Coccagna, Arizona Department of Education, robert.coccagna@azed.gov
Abstract: We will present the Dropout Prevention Toolkit, an innovative CD-ROM that presents evaluation findings on best practices in dropout prevention identified among thirty-nine projects funded by the Arizona Department of Education. Results from in-depth surveys and site visits are combined with literature on best practices in an interactive resource designed for continuous learning and improvement of programs. A network of practitioners can peruse practices and innovations described by their counterparts in other schools and institutions based on topical themes. The tool also provides links to active Internet websites addressing dropout prevention, as well as four exemplary site profiles. We will discuss how the Toolkit was developed, how the funder and program staff are making use of it, and the strengths and weaknesses of this type of product and approach.
Session Title: Serving Elites and Civil Society: Evaluation Imaginaries in a Globalized World
Expert Lecture Session 899 to be held in Centennial Section F on Saturday, Nov 8, 3:05 PM to 3:50 PM
Sponsored by the Theories of Evaluation TIG
Chair(s):
Timothy Cash, University of Illinois Urbana-Champaign, tjcash2@uiuc.edu
Presenter(s):
Thomas Schwandt, University of Illinois Urbana-Champaign, tschwand@uiuc.edu
Abstract: Current globalizing influences profoundly challenge Western, Anglo-American evaluation imaginaries. We adopt Charles Taylor's notion of imagination, which refers to the ways in which people envision their social existence, and extend that understanding of imagination to the practice of evaluation. Our conception of globalization, owing much to world polity theory, comprises a worldview in which today's world culture, whose origins lie in the Western European Enlightenment tradition, creates and shapes two existing and competing Western, Anglo-American evaluation imaginaries: one that espouses an ethic of service to elites and the other an ethic of service to civil society. This lecture argues that the globalizing influences on evaluation expose both the strengths and weaknesses of these two evaluation imaginaries. As a welcome result, these globalizing influences contribute to a shift away from simply debating methods and models and toward a meaningful dialogue about the moral-ethical concerns surrounding the practice of evaluation.
Session Title: SIMPACT (Satisfaction-Impact-Action): A Web-based Organizational Culture Assessment System That Links Work and Nonwork Life Needs to Employee Turnover
Demonstration Session 900 to be held in Centennial Section G on Saturday, Nov 8, 3:05 PM to 3:50 PM
Sponsored by the Business and Industry TIG
Presenter(s):
Michael Schwerin, RTI International, schwerin@rti.org
Abstract: Employee turnover costs organizations economically through the cost to recruit and train an employee, the unfulfilled investment in training, loss of productive employees, additional job demands placed on others when an employee leaves an organization, the loss of intellectual capital, and the potential loss of diversity in the workplace. Understanding factors affecting employee turnover intent is essential for human resources and organizational development decision makers to develop employee retention and human resources strategies for organizations. SIMPACT is a Web-based decision support tool that identifies life needs related to employee turnover and helps leaders take action to improve employee retention. Additionally, results are used in predictive modeling that allows leaders to conduct 'what if' scenarios to explore the potential outcome of interventions that increase the satisfaction and impact of life needs on retention plans.
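As a rough sketch of the kind of 'what if' scenario described above, the following fits a toy logistic model of turnover intent and re-scores an employee after a simulated one-point gain in one satisfaction rating; the features, data, and model choice are illustrative assumptions, not SIMPACT's actual predictive model:

    # Toy 'what if' turnover-intent model; all data and features hypothetical.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Columns: satisfaction with pay, satisfaction with work/nonwork balance (1-5)
    X = np.array([[2, 1], [4, 4], [3, 2], [5, 5], [1, 2], [4, 3], [2, 3], [5, 4]])
    y = np.array([1, 0, 1, 0, 1, 0, 1, 0])  # 1 = reported intent to leave

    model = LogisticRegression().fit(X, y)

    employee = np.array([[2.0, 2.0]])
    baseline = model.predict_proba(employee)[0, 1]

    # Scenario: an intervention raises work/nonwork-balance satisfaction by 1
    scenario = employee + np.array([[0.0, 1.0]])
    improved = model.predict_proba(scenario)[0, 1]
    print(f"turnover-intent risk: {baseline:.2f} -> {improved:.2f}")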
Session Title: Evaluating Impacts of Climate Change
Multipaper Session 901 to be held in Centennial Section H on Saturday, Nov 8, 3:05 PM to 3:50 PM
Sponsored by the Environmental Program Evaluation TIG
Chair(s):
Katherine Dawes, United States Environmental Protection Agency, dawes.katherine@epa.gov
Session Title: IHEs Confront Technology's Impact: Online Surveys and Digital Resources
Multipaper Session 902 to be held in Mineral Hall Section A on Saturday, Nov 8, 3:05 PM to 3:50 PM
Sponsored by the Integrating Technology Into Evaluation TIG
Chair(s):
Christopher Migotsky, University of Illinois, migotsky@uiuc.edu
Session Title: Expanding Advocacy Capacity: Findings From the Evaluation of The California Endowment Clinic Consortia Policy and Advocacy Program
Demonstration Session 904 to be held in Mineral Hall Section C on Saturday, Nov 8, 3:05 PM to 3:50 PM
Sponsored by the Advocacy and Policy Change TIG
Presenter(s):
Annette Gardner, University of California San Francisco, annette.gardner@ucsf.edu
Claire Brindis, University of California San Francisco, claire.brindis@ucsf.edu
Lori Nascimento, The California Endowment, lnascimento@calendow.org
Abstract: In 2001, The California Endowment funded 19 California clinic consortia for two three-year funding cycles to undertake policy advocacy activities and improve the financial stability of their member clinics. The Institute for Health Policy Studies at the University of California, San Francisco has evaluated these activities since 2002 using a combination of quantitative longitudinal measures and qualitative interviews with grantees, partner organizations, the media, and targets of grantee advocacy activities. In addition, UCSF developed three policy advocacy case studies and 17 best practice case studies describing exemplary grantee activities. In this demonstration, we describe the evaluation design and results for the years 2001-2006. The findings indicate that, individually and collectively, grantees are achieving not only short-term outcomes, such as increased policymaker awareness of clinic policy issues, but also longer-term outcomes, such as increased funding to clinics and increased access to care.
Session Title: Use of Social Networks for K-12 Evaluation Capacity Building
Multipaper Session 905 to be held in Mineral Hall Section D on Saturday, Nov 8, 3:05 PM to 3:50 PM
Sponsored by the Organizational Learning and Evaluation Capacity Building TIG and the Pre-K - 12 Educational Evaluation TIG
Chair(s):
Michelle Osowski, Albuquerque Public Schools, osowski@aps.edu
Discussant(s):
Rebecca Gajda, University of Massachusetts Amherst, rebecca.gajda@educ.umass.edu
Session Title: Participation and Policy: Evaluation in Multi-Site Programs
Panel Session 907 to be held in Mineral Hall Section F on Saturday, Nov 8, 3:05 PM to 3:50 PM
Sponsored by the Cluster, Multi-site and Multi-level Evaluation TIG
Chair(s):
Martha Ann Carey, Maverick Solutions LLC, marthaann123@sbcglobal.net
Discussant(s):
Leonard Bickman, Vanderbilt University, leonard.bickman@vanderbilt.edu
Abstract: Partnerships between communities and researchers/evaluators pose one type of challenge in evaluation and research. Adding another level of complexity, interest, and potential power is the partnership between a national-level organization and local entities. As the nation's leading nonprofit advocacy organization focusing on mental health for all people, Mental Health America works with over 320 affiliates to ensure that people's voices are heard by policy makers and that evidence is utilized in developing resource allocation policies at the local, state, and federal levels. With an emphasis on project management and federal goals, this panel includes a second paper on the experience of developing and managing multi-site services research programs with input from diverse stakeholders. The discussant is an internationally recognized expert in mental health services with extensive relevant publications and extensive experience in federally funded multi-site studies.
Session Title: Using the Getting To Outcomes Framework to Implement Continuous Quality Improvement in Community-Based Prevention
Multipaper Session 908 to be held in Mineral Hall Section G on Saturday, Nov 8, 3:05 PM to 3:50 PM
Sponsored by the Collaborative, Participatory & Empowerment Evaluation TIG
Chair(s):
Matthew Chinman, RAND Corporation, chinman@rand.org
Abstract: This session explores the use of continuous quality improvement (CQI) as an evaluation tool in community-based substance abuse prevention settings. While efforts to improve quality are widespread, often little attention is paid to engaging staff to embrace these efforts. To address this, the first paper will review multiple approaches to improving quality that focus on staff motivation and show how such approaches can be used in CQI. In addition, guidance for conducting CQI in community-based substance abuse prevention is lacking. Getting To Outcomes (GTO) is a 10-step process, consistent with empowerment evaluation, that helps substance abuse prevention practitioners better plan, implement, and evaluate their programming; Step 9 provides guidance for conducting CQI. The second paper describes a GTO-CQI intervention and the research conducted to document its implementation. Lessons learned about using CQI methods in community-based prevention settings will be highlighted.
Ideas from Near and Far with Implications for Continuous Quality Improvement
Gordon Hannah, Finger Lakes Law and Social Policy Center Inc, gordonjhannah@gmail.com
While the call for continuous quality improvement (CQI) has been widespread, the literature on how to implement effective CQI processes is relatively new and undeveloped. Literature on improving performance, however, exists in many domains and has many implications for designing CQI processes. A crucial component of improving performance and increasing the quality of programs that is often overlooked in CQI processes is staff motivation. While evidence-based techniques for enhancing motivation exist, they are seldom considered in CQI processes. This presentation will first review various approaches to enhance staff motivation for quality improvement from a broad literature, including existing CQI models (such as the Plan-Do-Study-Act cycle), models of organizational change and accountability (such as the Concerns-Based Adoption Model), the performance improvement literature, and motivational enhancement literature. Then the presentation will discuss how these different approaches to staff motivation can be used in concrete CQI efforts as prescribed by the GTO-CQI intervention.
The Getting To Outcomes-Continuous Quality Improvement Demonstration and Evaluation
Matthew Chinman, RAND Corporation, chinman@rand.org
Sarah B Hunter, RAND Corporation, shunter@rand.org
Patricia Ebener, RAND Corporation, pateb@rand.org
Incorporating the Institute for Healthcare Improvement (IHI) framework, experience from CQI in substance abuse treatment, and input from a Work Group of prevention practitioners, a Getting To Outcomes-CQI intervention (GTO-CQI) was developed and tested by 10 prevention programs. The intervention involved semi-annual trainings and quarterly technical assistance sessions. Participating programs developed action plans and were instructed in plan-do-study-act cycles, IHI's method for making small, rapid improvements and then assessing their impact. Ten participating program directors were interviewed at 3, 6, and 9 months following the initial training about their progress on their plans. During these interviews, research staff also provided technical assistance on CQI. Results highlight the variability across the 10 programs in the types and scope of improvements attempted, staff enthusiasm, and implementation level. The resulting lessons learned about improving CQI implementation in these settings will be presented.
Session Title: Let's Make it Human: Evaluating Live Interpretation and the Visitor Experience
Expert Lecture Session 909 to be held in the Agate Room Section B on Saturday, Nov 8, 3:05 PM to 3:50 PM
Sponsored by the Evaluating the Arts and Culture TIG
Chair(s):
Kathleen Tinworth, Denver Museum of Nature and Science, ktinworth@dmns.org
Presenter(s):
Kathleen Tinworth, Denver Museum of Nature and Science, ktinworth@dmns.org
Abstract: In autumn 2007, the Denver Museum of Nature and Science conducted two in-house evaluations on live interpretation: one examining visitor interactions with first-person enactors in a temporary exhibition (Titanic); the second investigating visitor impressions of live stage shows in a permanent space science exhibition. Different in size, scope, methodology, process, and approach, both evaluations looked rigorously and creatively at how live interpretation can be used in a museum setting. Both studies utilized empirical methods and statistical analyses to quantitatively and qualitatively address live museum interpretation as a vehicle to deliver content in unique, non-traditional, and compelling ways, opening up discussion about best practice and enhancing informed decision making for future exhibitions and programs.
Session Title: Reflecting and Moving Forward on Movement Building Evaluations
Think Tank Session 910 to be held in the Agate Room Section C on Saturday, Nov 8, 3:05 PM to 3:50 PM
Sponsored by the Non-profit and Foundations Evaluation TIG
Presenter(s):
Hanh Cao Yu, Social Policy Research Associates, hanh_cao_yu@spra.com
Discussant(s):
Hanh Cao Yu, Social Policy Research Associates, hanh_cao_yu@spra.com
Heather Lewis-Charp, Social Policy Research Associates, heather@spra.com
Abstract: With increased interest in social justice philanthropy, a new surge of foundation investments has spawned growth in movement building work. In this session, we will lay out a framework for discussion based on our evaluations of youth organizing, the women's funding movement, the reproductive justice movement, and human genetics and reproductive rights technology. After the initial theories and frameworks are laid out, we will invite participants to share their ideas and evaluation work to address the following questions: What do we mean by movement building? What are the components/activities of movement building? What indicators of change are we using? How can groups on the ground track their impact and success?
Session Title: Health Promotion Evaluation
Multipaper Session 911 to be held in the Granite Room Section A on Saturday, Nov 8, 3:05 PM to 3:50 PM
Sponsored by the Health Evaluation TIG
Chair(s):
Kai Young, Centers for Disease Control and Prevention, deq0@cdc.gov
Session Title: Evaluating Large Research Initiatives in Health
Multipaper Session 912 to be held in the Granite Room Section B on Saturday, Nov 8, 3:05 PM to 3:50 PM
Sponsored by the Health Evaluation TIG
Chair(s):
Jeannette Oshitoye, Nemours Health and Prevention Services, joshitoy@nemours.org
Session Title: An Association to Improve Evaluation of Development Aid
Expert Lecture Session 913 to be held in the Granite Room Section C on Saturday, Nov 8, 3:05 PM to 3:50 PM
Sponsored by the International and Cross-cultural Evaluation TIG
Chair(s):
Ronald Visscher, Western Michigan University, ronald.s.visscher@wmich.edu
Presenter(s):
Paul Clements, Western Michigan University, clements@wmich.edu
Abstract: For 60 years the evaluation of development assistance programs has been controlled mainly by donor agencies. These agencies face profoundly mixed incentives, however, when it comes to evaluating their own programs. Although the need for learning and accountability in this field is particularly great, there is considerable evidence that in practice aid evaluations are inconsistent and often weak and/or positively biased. This lecture argues that development aid could be greatly improved if the management of development programs and projects were governed by an effective orientation to cost-effectiveness. This in turn could be substantially achieved if evaluations routinely made consistent and reliable estimates of each program's total impacts and cost-effectiveness. In order to accomplish this, the lecture proposes the establishment of a professional association of development program evaluators along the lines of associations of accountants and auditors, and it discusses the structure of and steps towards establishing such an association.
Roundtable: Is it Possible to Avoid the “Everything but the Kitchen Sink” Online Survey?
Roundtable Presentation 914 to be held in the Quartz Room Section A on Saturday, Nov 8, 3:05 PM to 3:50 PM
Sponsored by the Extension Education Evaluation TIG
Presenter(s):
Daniel McDonald, Arizona Cooperative Extension, mcdonald@ag.arizona.edu
Donna Peterson, University of Arizona, pdonna@ag.arizona.edu
Daniel Ferguson, University of Arizona, dferg@email.arizona.edu
Abstract: This roundtable will discuss lessons learned from developing two online surveys that evaluated the effectiveness of two very different programs. One of the dilemmas often encountered when designing surveys to assess the usefulness of programs is the desire to add items and layers of complexity to the survey instrument. This problem may be amplified by the ease of adding questions or entire sections to online surveys when there are no concerns over printing or mailing costs. The purpose of this roundtable is to share experiences related to the development of online surveys for program evaluation. Issues regarding survey length, individual page layout, skip patterns, section instructions, and consent language will be discussed. Recommendations made as part of the Dillman method will be reviewed. Specific information from each survey process will be provided, such as response rates and comments made during survey piloting.
Roundtable: Teaching Evaluation to Working Professionals: A Hybrid Approach
Roundtable Presentation 915 to be held in the Quartz Room Section B on Saturday, Nov 8, 3:05 PM to 3:50 PM
Sponsored by the Teaching of Evaluation TIG
Presenter(s):
Rick Axelson, University of Iowa, rick-axelson@uiowa.edu
Abstract: This session will discuss solutions to the challenges involved with designing an evaluation course for busy and geographically dispersed medical education professionals. As a starting point for discussion, the issues faced with the design and delivery of the University of Iowa - Carver College of Medicine's new "Research Design and Evaluation" course will be reviewed. This course will be offered for the first time in the fall of 2008. It will be delivered via a mixture of online and face-to-face sessions. The online sessions will provide opportunities for students to read and reflect on the foundational principles and issues of research design and evaluation. The face-to-face sessions (about 4 per semester) will focus on applying these principles to evaluation projects. After a brief status report on this course, roundtable participants will have the remainder of the session to discuss successful practices in evaluation courses for professionals.