| Session Title: Identifying, Articulating and Incorporating Values in a Program Theory |
| Demonstration Session 301 to be held in Avalon A on Thursday, Nov 3, 11:40 AM to 12:25 PM |
| Sponsored by the Program Theory and Theory-driven Evaluation TIG |
| Presenter(s): |
| Sue Funnell, Performance Improvement, funn@bigpond.com |
| Abstract: Increasingly, program evaluations are being driven by program theories, but a program theory can be constructive, unhelpful, or even destructive for evaluation. The processes used to conceptualize and portray a program theory can influence whether it gives rise to useful evaluation questions and can affect whose voices are heard when identifying evaluation criteria and making judgments about a program's success and its wider effects. This session will show how workshops can be used alongside other techniques to develop a program theory that incorporates a range of value perspectives and poses useful evaluation questions. It will demonstrate questions that can be used, how to arrange the answers into an outcomes chain, the Ideas Writing technique for identifying different perspectives on what constitutes success, how to deal with divergent views, and how to incorporate unintended outcomes. |
| Session Title: Extreme Genuine Evaluation Makeovers (XGEMs) |
| Demonstration Session 303 to be held in California A on Thursday, Nov 3, 11:40 AM to 12:25 PM |
| Sponsored by the Evaluation Managers and Supervisors TIG |
| Presenter(s): |
| Jane Davidson, Real Evaluation Ltd, jane@realevaluation.com |
| Abstract: "Value for money" applies not just to programs and policies, but to evaluations themselves. This demonstration will show how evaluations can be designed and implemented in ways that deliver real value for money for their clients. The demonstration uses humor and metaphor to guide the audience through the main species of waste-of-money evaluation, their natural habitats and distinguishing features. These evaluations frequently lack incisive evaluation questions; get lost in the details; skip over the crucial 'values' step; uncritically accept stated objectives as the only evaluative criteria; fail to adequately triangulate and weave sources of evidence; and toss causation into the 'too hard basket'. The session will demonstrate some practical guidelines for doing Extreme Genuine Evaluation Makeovers (XGEMs). The emphasis is on being realistic and humble about what is feasible, but resisting the urge to do non-genuine evaluation when the needs and the constraints are challenging. |
| Session Title: Modern Western Evaluation Imaginary Meets the Pascua Yaqui: An Interview With Fileberto Reynaldo Lopez - By Peter Dahler-Larsen |
| Expert Lecture Session 304 to be held in California B on Thursday, Nov 3, 11:40 AM to 12:25 PM |
| Sponsored by the Collaborative, Participatory & Empowerment Evaluation TIG |
| Chair(s): |
| Peter Dahler-Larsen, University of Southern Denmark, pdl@sam.sdu.dk |
| Presenter(s): |
| Fileberto Reynaldo Lopez, University of Arizona, lopezf1@email.arizona.edu |
| Abstract: In the modern Western world, most evaluation is carried out based on a fundamental world view or "evaluation imaginary" (Schwandt) that is difficult for most evaluators to see, discuss, or thematize, simply because this world view increasingly defines what evaluation is and can be and how it should be carried out. Interesting views or reflections that allow us to thematize the modern evaluation imaginary may be found if we step outside the beaten path defined by the modern evaluation imaginary itself. I have had the privilege to meet one individual from the Pascua Yaqui of the Sonoran Desert. He is with us today. His name is Dr. Fileberto Lopez. Fileberto has agreed to be interviewed, and we hope that through the interview, we will all learn a bit about the Modern Western Evaluation Imaginary, especially as seen from the outside. |
| Session Title: Exploring the Value of Careers in Evaluation |
| Expert Lecture Session 305 to be held in California C on Thursday, Nov 3, 11:40 AM to 12:25 PM |
| Sponsored by the Graduate Student and New Evaluator TIG |
| Chair(s): |
| John LaVelle, Claremont Graduate University, john.lavelle@cgu.edu |
| Presenter(s): |
| Stewart Donaldson, Claremont Graduate University, stewart.donaldson@cgu.edu |
| Discussant(s): |
| Christina Christie, University of California, Los Angeles, tina.christie@ucla.edu |
| Abstract: The demand for evaluation and evaluation services has increased dramatically over the past decade. As evaluation practice has blossomed world-wide, universities, evaluation professional associations, government agencies, foundations, non-profit and for-profit organizations have become actively involved in providing professional development workshops, certificates and professional designations, and masters and doctoral degrees in evaluation. What is the value of the careers that can result from this advanced training and education in evaluation? Professor Stewart Donaldson will provide answers to this question as well as explore a range of career development issues that are directly relevant for graduate students and working professionals interested in working in the diverse transdisciplinary field of evaluation. |
| Session Title: Evaluation Use and Knowledge Translation: An Exchange for the Future |
| Expert Lecture Session 306 to be held in Pacific A on Thursday, Nov 3, 11:40 AM to 12:25 PM |
| Sponsored by the Presidential Strand |
| Chair(s): |
| Gail Barrington, Barrington Research Group Inc, gbarrington@barringtonresearchgrp.com |
| Presenter(s): |
| Melanie Barwick, The Hospital for Sick Children, melanie.barwick@sickkids.ca |
| Discussant(s): |
| Daniel Stufflebeam, Western Michigan University, dlstfbm@aol.com |
| Abstract: Evaluators have long valued evaluation use, both by decision makers and practitioners in the contexts at hand. But what about the broader community? The concept of knowledge translation addresses the steps between the creation of new knowledge and its application to outcomes including benefits for citizens, effective services and products, and strengthened social systems. It explores the exchange of knowledge between researchers and users to accelerate the knowledge-to-action process. Dr. Melanie Barwick is a Registered Psychologist and Health Systems Research Scientist at The Hospital for Sick Children in Toronto, Ontario. Since 2001 she has implemented an outcome measure in 117 children's mental health service provider organizations and provided training to over 5,000 practitioners. In this practice context she studies innovative health knowledge translation strategies and has developed the Scientist Knowledge Translation Training Program. She currently leads a 5-year innovative project in Knowledge Translation for Child and Youth Mental Health. |
| Session Title: Using PhotoVoice for Participatory Community Evaluation |
| Demonstration Session 307 to be held in Pacific B on Thursday, Nov 3, 11:40 AM to 12:25 PM |
| Sponsored by the Qualitative Methods TIG and the Collaborative, Participatory & Empowerment Evaluation TIG |
| Presenter(s): |
| Amanda Purington, Cornell University, ald17@cornell.edu |
| Jacqueline Davis-Manigaulte, Cornell University, jad23@cornell.edu |
| Jane Powers, Cornell University, jlp5@cornell.edu |
| Abstract: This demonstration will provide an overview of how the New York State ACT for Youth Center of Excellence utilizes PhotoVoice methodology for evaluation with youth and adult staff from programs designed to reduce sexually transmitted infections, HIV infection, and unintended pregnancy among youth while promoting their optimal sexual health. Youth and adult staff are using PhotoVoice to evaluate issues in their communities impacting youth sexual health. Youth leaders, in collaboration with adult program staff, are charged with conducting this community evaluation, interpreting findings, and developing action plans to address issues highlighted by this evaluation. Engagement in the evaluation process ensures the youth perspective is included and valued in these assessments and helps youth see their potential roles as catalysts for change. Participants in this demonstration will have the opportunity to learn about the PhotoVoice process and explore its potential use for evaluation within their work settings. |
| Session Title: Structural Equation Modeling as a Valuable Tool in Evaluation |
| Multipaper Session 308 to be held in Pacific C on Thursday, Nov 3, 11:40 AM to 12:25 PM |
| Sponsored by the Quantitative Methods: Theory and Design TIG |
| Chair(s): |
| Frederick L Newman, Florida International University, newmanf@fiu.edu |
| Session Title: Evaluation Coaches: Designing and Evaluating for the Future |
| Panel Session 309 to be held in Pacific D on Thursday, Nov 3, 11:40 AM to 12:25 PM |
| Sponsored by the Non-profit and Foundations Evaluation TIG |
| Chair(s): |
| Charles Gasper, Missouri Foundation for Health, cgasper@mffh.org |
| Discussant(s): |
| Katrina Bledsoe, Education Development Center Inc, katrina.bledsoe@gmail.com |
| Abstract: Evaluation Coaching is an evolved approach to evaluation derived from the spirit and lessons learned from Empowerment, Participatory, and Collaborative Evaluation. The approach directly supports the needs of nonprofits and funders to be educated about and engaged in evaluation. With pressure on organizations to provide results from rigorous evaluation, funders have resorted to engaging external evaluators to assess the performance and process of their initiatives and portfolios. Internal evaluators, meanwhile, have been saddled with the perception that their work could be tainted by their relationship with their organization. Evaluation Coaching differs from traditional evaluation approaches in that it focuses on organizational evaluation capacity building. The critical friendship between the Evaluation Coach and the organization extends beyond one or two projects, and the educational methods are highly participatory. The presenters explain this evolutionary approach, focusing on development of internal evaluation capacity, rigor of evaluation design and implementation, and use by organizations; a community-based non-profit evaluation coach will discuss the implications for the work. |
| Roundtable: The Use of Coaching, Co-authorship, and Mixed Media to Structure and Support Data Use Within a Teacher Induction Program |
| Roundtable Presentation 310 to be held in Conference Room 1 on Thursday, Nov 3, 11:40 AM to 12:25 PM |
| Sponsored by the Collaborative, Participatory & Empowerment Evaluation TIG |
| Presenter(s): |
| Paul St Roseman, DataUse Consulting Group, paul@mydatause.com |
| Rachelle Rogers-Ard, Teach Tomorrow in Oakland, rachelle.rogers-ard@ousd.k12.ca.us |
| Abstract: This roundtable presents approaches that support 'data use' within the values framework, operational apparatus, and decision-making processes of a teacher induction program located in Oakland, California. This case example will demonstrate how: (1) administrative coaching supports efforts to develop, interpret, and utilize evaluation products; (2) co-authorship and presentations are utilized as a process for data analysis; and (3) mixed media and web-based resources are utilized to facilitate collaboration. This presentation is most appropriate for evaluation practitioners who collaborate with administrators and their staff to design, implement, sustain, and utilize evaluation products. |
| Roundtable: Challenges and Solutions Associated With Participant Recalibration or Reprioritization of Self-Report Data |
| Roundtable Presentation 311 to be held in Conference Room 12 on Thursday, Nov 3, 11:40 AM to 12:25 PM |
| Sponsored by the Health Evaluation TIG |
| Presenter(s): |
| Cady Berkel, Arizona State University, cady.berkel@asu.edu |
| Angelica Tovar-Huffman, Phoenix Children's Hospital, atovarhuffman@phoenixchildrens.com |
| Abstract: Programs aimed at improving health-related behaviors often not only change the targeted program outcomes but also create response shifts in participants' perspectives on those outcomes. In such cases, self-report data may appear to indicate null or iatrogenic program effects, because participants' gains in understanding of the construct equal or outweigh their gains in the construct itself. For example, in the CareConnect AZ program at Phoenix Children's Hospital, participants initially reported high levels of communication with health providers, which declined once they were exposed to new skills for communicating with providers. This is a major problem for evaluators, who must untangle 'true' change from apparent change due to recalibration or reconceptualization. In the proposed roundtable session, we will discuss the challenges we faced with recalibration of self-report responses. Attendees will be invited to share similar challenges and discuss different strategies for dealing with this problem. |
| Session Title: Evaluating International Trafficking Programs: The Role of Evaluability Assessments in Determining Program Readiness and Documenting Program Strategy Evolution |
| Panel Session 312 to be held in Conference Room 13 on Thursday, Nov 3, 11:40 AM to 12:25 PM |
| Sponsored by the International and Cross-cultural Evaluation TIG |
| Chair(s): |
| Beth Rabinovich, Westat, bethrabinovich@westat.com |
| Discussant(s): |
| Casey Branchini, United States Department of State, BranchiniCA@State.Gov |
| Abstract: Evaluability assessment (EA) has traditionally been used to determine the logical basis of a program; its readiness for implementation, outcome, or impact evaluation; the changes needed to increase its readiness; and evaluation approaches most suitable for measuring program performance and outcomes. In this presentation we discuss how EAs can help track the initiation of new programs (often using models borrowed from other disciplines) in international trafficking as well as document the development and evolution of program strategies. The presentations in this session will focus on recent EAs of international trafficking programs sponsored by the Department of State, Office to Combat Trafficking in Persons (G/TIP) and how the three-pronged program strategy - prosecuting trafficking offenders, protecting victims, and preventing trafficking - is being implemented or the changes required for implementation. Case studies (countries identified by region only) discussing the program challenges and successes will be presented. |
| Session Title: Evaluators as Partners in Technology Program Design |
| Multipaper Session 313 to be held in Conference Room 14 on Thursday, Nov 3, 11:40 AM to 12:25 PM |
| Sponsored by the Research, Technology, and Development Evaluation TIG |
| Chair(s): |
| Mary Beth Hughes, Science and Technology Policy Institute, m.hughes@gmail.com |
| Session Title: Even Teenagers Value Evaluation!: How Service Recipients Use Outcome and Evaluation Data at the Latin-American Youth Center |
| Expert Lecture Session 314 to be held in Avila A on Thursday, Nov 3, 11:40 AM to 12:25 PM |
| Sponsored by the Evaluation Use TIG and the Human Services Evaluation TIG |
| Chair(s): |
| Isaac Castillo, Latin American Youth Center, isaac@layc-dc.org |
| Presenter(s): |
| Isaac Castillo, Latin American Youth Center, isaac@layc-dc.org |
| Abstract: The Latin American Youth Center (LAYC) is a multi-service nonprofit in Washington, DC that evaluates each of its 71 programs internally. Results from these evaluations have been used to improve programming for years. More recently, LAYC has worked with high-risk and high-need service recipients to empower them in the use of the evaluation data in their daily lives. These youth and young adults share personal level outcomes (and in some instances program level outcomes) with judges, probation officers, and teachers to demonstrate how they have turned their lives around. This session will share LAYC's internal evaluation work, how the outcomes are shared with service recipients, and how youth and young adults use this information to prove to others (and themselves) that they are improving their lives. |
| Session Title: Using Data Visualization Software to Engage Stakeholders in Evaluation |
| Demonstration Session 315 to be held in Avila B on Thursday, Nov 3, 11:40 AM to 12:25 PM |
| Sponsored by the Data Visualization and Reporting TIG |
| Presenter(s): |
| Bryn Sadownik, Vancity Community Foundation, bryn_sadownik@vancity.com |
| Abstract: The Demonstrating Value Initiative has been exploring whether data visualization software can promote the use of evaluation results and performance monitoring data in organizations and programs with limited technical capacity. We were particularly interested in whether this format could improve decision-making and more clearly communicate the learning from evaluation and performance monitoring activities, and whether low-cost software programs are available that can be used with minimal technical skill. We found that this software is a powerful tool that is within the skill set of most evaluators and small organizations. In this session, I will demonstrate how to apply one software package, SAP Crystal Presentation Design, to develop a simple interactive report, including how to model relationships and set up 'what-if' scenarios. SAP Crystal Presentation Design builds directly on Excel and produces Flash files that can be incorporated into websites, PowerPoint presentations, and PDF files. |
| Roundtable: Introducing a Practical Guidebook for Values-Engaged, Educative Evaluation |
| Roundtable Presentation 316 to be held in Balboa A on Thursday, Nov 3, 11:40 AM to 12:25 PM |
| Sponsored by the Theories of Evaluation TIG |
| Presenter(s): |
| Jeehae Ahn, University of Illinois at Urbana-Champaign, jahn1@illinois.edu |
| Ayesha Boyce, University of Illinois at Urbana-Champaign, boyce3@illinois.edu |
| Jennifer C Greene, University of Illinois at Urbana-Champaign, jcgreene@illinois.edu |
| Abstract: This presentation showcases a newly developed practical guidebook for a values-engaged, educative approach to evaluating science, technology, engineering, and mathematics (STEM) and other education programs. This evaluation approach is anchored in our dual commitments to (a) active engagement with values of diversity and equity, and (b) being educative in our work, that is, conducting evaluations that advance meaningful learning about the program being evaluated. Our guidebook foregrounds the distinctive role of the values-engaged, educative evaluator, and features step-by-step guidelines for the practice of values-engaged, educative evaluation, along with multiple illustrations of these guidelines from our varied field tests of this approach. In presenting this guidebook, we invite other education program evaluators to our roundtable to share their thoughts in an informal, interactive format that can further inform and enlighten our thinking about values engagement in evaluation practice. |
| Session Title: Evaluation Model for a Capacity Building Website to Assess Use and Information Transfer |
| Demonstration Session 317 to be held in Balboa C on Thursday, Nov 3, 11:40 AM to 12:25 PM |
| Sponsored by the Non-profit and Foundations Evaluation TIG |
| Presenter(s): |
| Andria Zaverl, AIDS Project Los Angeles, azaverl@apla.org |
| Oscar Marquez, AIDS Project Los Angeles, omarquez@apla.org |
| Abstract: Shared Action (SA), a non-profit capacity building assistance (CBA) program, developed a system to evaluate its online (website) capacity building services, including information transfer. The SA evaluation is a mixed methods model that assesses "use" of acquired knowledge and skills as well as the effectiveness of knowledge transfer. Data collected online include items downloaded (e.g., recorded webinars and educational materials), type and frequency of users, and interactions with blogs and forums. Preliminary findings showed which pages clients use frequently, where they access the information, and the content of feedback provided by clients. The service was set up for organizations in the USA; however, the evaluation showed SA impacted other countries: 5% of users were international, with Germany the most frequent. Lesson learned: online evaluations are possible, but evaluators need to be creative, use mixed methods, and always be prepared for unexpected results. The demonstration will present the model, the findings, the analysis of data, and the application. |
| Session Title: Toward Integration-Organizational Learning From Within and Among the Network of Funded Partners |
| Demonstration Session 318 to be held in Capistrano A on Thursday, Nov 3, 11:40 AM to 12:25 PM |
| Sponsored by the Organizational Learning and Evaluation Capacity Building TIG |
| Presenter(s): |
| Aimee White, Trident United Way, awhite@tuw.org |
| Eileen Rossler, Trident United Way, erossler@tuw.org |
| Abstract: This demonstration will offer attendees an opportunity to learn about an organizational learning and capacity building process and a subsequent set of tools and resources for the integration of program areas within service organizations and funders. The system includes a series of trainings, technical assistance processes, and continuous quality improvement processes that will assist agencies performing under multiple programming areas, or funders funding under multiple program areas, in the daunting task of integrating. The strength of this approach is that it was designed with systems theory in mind and therefore has a flexibility and adaptability that can translate across a variety of program areas and funding directives. Attendees will learn best practices associated with implementing the system of organizational learning and capacity building. The perspective is uniquely United Way in nature, but translates widely. |
| Session Title: Are We There Yet? How Internal & External Evaluation Work Together to Assess Progress |
| Panel Session 319 to be held in Capistrano B on Thursday, Nov 3, 11:40 AM to 12:25 PM |
| Sponsored by the Internal Evaluation TIG |
| Chair(s): |
| Kathleen Tinworth, Denver Museum of Nature and Science, kathleen.tinworth@dmns.org |
| Discussant(s): |
| Kathleen Tinworth, Denver Museum of Nature and Science, kathleen.tinworth@dmns.org |
| Abstract: The internal and external evaluator of a geographically dispersed national organization will discuss the ways they ensure equal valuing of internal and external findings. Panelists will discuss tools they use to measure the organization's progress in achieving its goal of gender equity in STEM education and workforce. They will also explain the strategic planning and negotiation necessary to deliver a unified message to the organization's leadership and staff to help stakeholders gauge progress toward achievement of the organization's social-justice mission. This intermingling and merging of perspectives is critical for the organization's program improvement and reporting needs. In addition, sharing data and perspectives enables the organization to better embrace the multiplicity of factors that contribute to the success of a social movement. Panelists will encourage discussion with audience members about internal/external evaluation cooperation as well as the complexity of evaluating a social-justice movement. |
| Session Title: National Evaluation of the Stay on Track Program: Examining the Unique Outcomes of Adolescents From Military Families |
| Skill-Building Workshop 320 to be held in Carmel on Thursday, Nov 3, 11:40 AM to 12:25 PM |
| Sponsored by the Alcohol, Drug Abuse, and Mental Health TIG |
| Presenter(s): |
| Melissa Rivera, National Center for Prevention and Research Solutions, mrivera@ncprs.org |
| Abstract: This session will outline the comprehensive approach to drug prevention education utilized in the Stay on Track curriculum, the measurement strategy employed to assess its effectiveness, and the outcomes of the latest large-scale national implementation. The results are further analyzed to differentiate students reporting having family members actively serving in the military as well as those reporting a current family member's deployment. The comprehensive approach encompasses the development and utilization of evaluation quality tools, fidelity measures, and engagement with certified implementers throughout the evaluation cycle. Additionally, a summary of the attitude and intention outcomes associated with illicit substance use, and the complex findings associated with students reporting family member deployment will be provided. Overall results for 36,664 sixth, seventh, and eighth graders who participated in the program will also be presented. |
| Session Title: Understanding Community Capacity and Readiness for Evaluation |
| Multipaper Session 322 to be held in El Capitan A on Thursday, Nov 3, 11:40 AM to 12:25 PM |
| Sponsored by the Health Evaluation TIG |
| Chair(s): |
| Jeanette Treiber, University of California, Davis, jtreiber@ucdavis.edu |
| Session Title: Is Outcome Measurement Possible in the Peacebuilding Field? |
| Panel Session 323 to be held in El Capitan B on Thursday, Nov 3, 11:40 AM to 12:25 PM |
| Sponsored by the International and Cross-cultural Evaluation TIG |
| Chair(s): |
| Gretchen Shanks, Mercy Corps, gshanks@mercycorps.org |
| Abstract: Monitoring and evaluation of peacebuilding programs presents unique challenges, which often inspire resistance from practitioners. While some challenges are more perceived than real, there are numerous barriers, including dynamic conflict contexts, a lack of impact indicators, the challenges associated with measuring prevention, and ethical constraints. Despite these challenges, measuring the results of peacebuilding programs remains more important than ever. This panel session will offer two examples where peacebuilding practitioners are pushing themselves and their partners to move beyond the comfortable, to develop and test indicators, tools, and theories of change. These teams conducted research on key causal mechanisms in peacebuilding programming, and developed indicators, survey tools, and practical data collection forms to track some of these outcomes. While the research and the tools are imperfect, things are trending in the right direction. This panel will discuss what worked, what didn't, and where we might go from here. |
| Roundtable: A Model of Total Survey Error: Examining the Inferential Value in Survey and Questionnaire Data and its Implications on Evaluation Findings |
| Roundtable Presentation 324 to be held in Exec. Board Room on Thursday, Nov 3, 11:40 AM to 12:25 PM |
| Sponsored by the Quantitative Methods: Theory and Design TIG |
| Presenter(s): |
| Michelle Bakerson, Indiana University South Bend, mmbakerson@yahoo.com |
| Abstract: Charged with making judgments about the quality, merit, or worth of a program, many evaluators use surveys or questionnaires to gather data; these are among the most important and uniquely informative data collection tools available. The significance placed on this data collection tool is examined, as limitations and biases inherently exist. Virtually all surveys will contain some form of error that harms the inferential value of the data. A model of total survey error is examined, including sampling error and non-sampling error, with a major focus on non-response error. Practical attention to survey methodology, in particular techniques for reducing and correcting for non-response, will strengthen evaluators' knowledge and ability to make informed decisions regarding their data and, in turn, their ability to inform stakeholders. |
| Session Title: Exploring How Values, Identity and Gender Influence Evaluator Approach and Role |
| Think Tank Session 325 to be held in Huntington A on Thursday, Nov 3, 11:40 AM to 12:25 PM |
| Sponsored by the Feminist Issues in Evaluation TIG |
| Presenter(s): |
| Jara Dean-Coffey, jdcPartnerships, jara@jdcpartnerships.com |
| Discussant(s): |
| Jill Casey, jdcPartnerships, jill@jdcpartnerships.com |
| Summer Jackson, Independent Consultant, snjackson22@gmail.com |
| Nicole Farkouh, jdcPartnerships, nicole@jdcpartnerships.com |
| Abstract: This session explores the role of identity and values in our work as female evaluators. Participants will identify their own values and identities, naming the primary ones and looking at the ways in which these values and identities interact with their work as evaluators. Session leaders represent a diverse array of backgrounds, and particularly different life choices common among professional women. They will share how their values and identities simultaneously enhance and challenge the process and product of their work, attending to their internal processes as well as how they are perceived by clients. Participants will then break into groups to examine their values and dimensions of their own identities. Participants will have an opportunity for personal reflection and processing and will leave with a personal values statement to use as they wish. |
| Session Title: Stakeholder Engagement in Government Evaluations |
| Multipaper Session 326 to be held in Huntington B on Thursday, Nov 3, 11:40 AM to 12:25 PM |
| Sponsored by the Government Evaluation TIG |
| Chair(s): |
| James Newman, Idaho State University, newmjame@isu.edu |
| Session Title: Measurement Challenges |
| Multipaper Session 327 to be held in Huntington C on Thursday, Nov 3, 11:40 AM to 12:25 PM |
| Sponsored by the Cluster, Multi-site and Multi-level Evaluation TIG |
| Chair(s): |
| Allan Porowski, ICF International, aporowski@icfi.com |
| Session Title: Building Evaluation Capacity in College Access and Success Programs |
| Panel Session 328 to be held in La Jolla on Thursday, Nov 3, 11:40 AM to 12:25 PM |
| Sponsored by the College Access Programs TIG |
| Chair(s): |
| Wendy Erisman, Strix Research LLC, werisman@strixresearch.com |
| Abstract: This session examines lessons learned from two strategies designed to build evaluation capacity in college access and success programs. The first strategy is an online toolkit that helps program staff design and implement effective evaluations with or without the help of external evaluators. The second strategy involves offering funding to college access and success grantees for evaluation capacity-building through technical assistance provided by external consultants. The session will examine the strengths and weaknesses of each strategy, provide recommendations for using the strategies successfully, and discuss future plans for evaluation capacity-building work in the college access and success field. |
| Session Title: The Value of Knowledge Management in Evaluation: A Research Perspective |
| Multipaper Session 329 to be held in Laguna A on Thursday, Nov 3, 11:40 AM to 12:25 PM |
| Sponsored by the Research on Evaluation TIG |
| Chair(s): |
| Karen Widmer, Claremont Graduate University, karen.widmer@cgu.edu |
| Session Title: Applying Universal Design for Learning Principles to Evaluation |
| Multipaper Session 330 to be held in Laguna B on Thursday, Nov 3, 11:40 AM to 12:25 PM |
| Sponsored by the Disabilities and Other Vulnerable Populations TIG and the Collaborative, Participatory & Empowerment Evaluation TIG |
| Chair(s): |
| Don Glass, Boston College, donglass@gmail.com |
|
| Roundtable: Evaluation of an Online Versus Classroom Based Undergraduate Social Psychology Course |
| Roundtable Presentation 331 to be held in Lido A on Thursday, Nov 3, 11:40 AM to 12:25 PM |
| Sponsored by the Assessment in Higher Education TIG |
| Presenter(s): |
| Jessica Carlson, Western New England College, jcarlson@wnec.edu |
| Craig Outhouse, University of Massachusetts, Amherst, craigouthouse@gmail.com |
| Abstract: Findings regarding the effectiveness of online versus classroom based courses in higher education have been inconsistent, with some revealing higher exam performance in online courses (e.g., Maki, Maki, Patterson, & Whittaker, 2000; Poirier & Feldman, 2004), others discovering an advantage for students in 'live' classes (e.g., Edmonds, 2006; Wang & Newlin, 2000), and some indicating no performance difference between the two (e.g., Waschull, 2001). However, there is evidence overall of similar instructor evaluations by students in both delivery formats (Knight, Ridley, & Davies, 1998; Ridley, 1995). This roundtable will begin with the presentation of results from a study investigating online versus classroom based instruction in a social psychology course at a private northeastern college. Results of this study will be discussed in the context of evaluation for both practitioners and researchers, with thoughts and feedback solicited from the audience. |
| Session Title: Is Not Killing Patients Cost-effective? The Economics of Quality Improvement in Health Care |
| Expert Lecture Session 332 to be held in Lido C on Thursday, Nov 3, 11:40 AM to 12:25 PM |
| Sponsored by the Health Evaluation TIG |
| Chair(s): |
| Mary Gutmann, EnCompass LLC, mgutmann@encompassworld.com |
| Presenter(s): |
| Edward Broughton, University Research Co, ebroughton@urc-chs.com |
| Abstract: Countless thousands of patients die unnecessarily each year from medical errors and other lapses in the quality of their health care. Quality improvement interventions can address health service dysfunction and improve patient outcomes. But many decision makers believe such interventions are too expensive and inefficient. Making the business case for improving health care quality with sound economic analyses is becoming more important as budgets are tightened and administrators strive to improve efficiency. Using examples from US and international health care settings, this lecture will discuss methods of cost-effectiveness analysis for such programs: why they are done, how they are performed, and how to interpret their results. This information is crucial to anyone interested in understanding and performing economic evaluations of programs to make health care work better for everyone. |
| Session Title: Assessing Additionality of Public Support of Industrial Research and Development |
| Multipaper Session 333 to be held in Malibu on Thursday, Nov 3, 11:40 AM to 12:25 PM |
| Sponsored by the Research, Technology, and Development Evaluation TIG |
| Chair(s): |
| Cheryl Oros, Oros Consulting LLC, cheryl.oros@gmail.com |
| Session Title: Connections and Hawaiian Culture: Evaluators as Boundary Spanners |
| Panel Session 334 to be held in Manhattan on Thursday, Nov 3, 11:40 AM to 12:25 PM |
| Sponsored by the Indigenous Peoples in Evaluation TIG |
| Chair(s): |
| Martha Ann Carey, Kells Consulting, marthaanncarey@gmail.com |
| Abstract: Evaluators often serve as a bridge between the client and the resource organization, spanning two cultures with different needs, perceptions, expectations, and occasionally somewhat different goals. Boundary spanning is a type of social network connection that can be effective in program development and documentation of a program's success. Factors important to effectiveness include appreciation of the different cultural values and perspectives arising from gender, power distance, individual traits of dominance, and group cohesiveness. In two applied settings in Hawai'i, the researchers/evaluators learned to share the boundary spanner role with community members. The first presentation describes work with a nonprofit organization in planning and evaluation using a logic model, needs assessment, environmental scan, and plans for assessing outputs and outcomes. The second presentation involves boundary spanning across academic and community cultures in a study of Hawaiian elders' understanding of wellness. |
| Session Title: Strategies for Tackling Complexity in Environmental Programs and Evaluations |
| Multipaper Session 335 to be held in Monterey on Thursday, Nov 3, 11:40 AM to 12:25 PM |
| Sponsored by the Environmental Program Evaluation TIG |
| Chair(s): |
| Johanna Morariu, Innovation Network, jmorariu@innonet.org |
| Session Title: Leading the Horse to Water, Part III: Embedding Evaluation in a Knowledge Management Project |
| Think Tank Session 336 to be held in Oceanside on Thursday, Nov 3, 11:40 AM to 12:25 PM |
| Sponsored by the Business and Industry TIG |
| Presenter(s): |
| Thomas Ward, United States Army, tewardii@aol.com |
| Discussant(s): |
| Rhoda Risner, United States Army, rhoda.risner@us.army.mil |
| Abstract: This is the third in a topical series of Think Tanks and Roundtables. The first examined how to guide an organization in its initial consideration of a knowledge management project. The second focused on how to ensure such a project starts off with an effective front-end needs assessment that both determines specific needs and identifies outcomes to observe and measure for subsequent project evaluation. This Think Tank will present lessons learned from a long-term implementation of a knowledge management initiative. The focus is on two issues: using the results of the needs assessment to prioritize effort, and the practical aspects of building in evaluation during early implementation phases. It will highlight both quantitative and qualitative measurements and dealing with the messiness of human interaction, especially when institutional inertia is a major factor. The session will be half presenter time and half participant brainstorming in small groups. |
| Session Title: Using Advocacy Evaluation and Learning Processes in Countries With Limited Political Space to Understand Actors, Identify Openings, and Achieve Policy Advances |
| Panel Session 342 to be held in San Clemente on Thursday, Nov 3, 11:40 AM to 12:25 PM |
| Sponsored by the Advocacy and Policy Change TIG |
| Chair(s): |
| Laura Roper, Brandeis University, l.roper@rcn.com |
| Discussant(s): |
| Sono Aibe, Pathfinder International, saibe@pathfinder.org |
| Abstract: This session takes several advocacy case examples - a family planning campaign in Tanzania, a nascent disability movement in Vietnam, and a gender-based violence prevention campaign in El Salvador - in which civil society actors must navigate complex political dynamics, not only with an array of formal and informal power structures, but also amongst both domestic and international non-governmental actors. In each case we discuss the role that evaluation and learning have played and, from there, address more broadly the ways in which advocacy planning, monitoring, and evaluation tools need to be refined to be more useful in political settings that range from limited democracy to more authoritarian systems. |
| Session Title: Growing Your Business in the Current Economic Climate |
| Panel Session 343 to be held in San Simeon A on Thursday, Nov 3, 11:40 AM to 12:25 PM |
| Sponsored by the Independent Consulting TIG |
| Chair(s): |
| Patricia Mueller, Evergreen Evaluation & Consulting LLC, pat@evergreenevaluation.net |
| Discussant(s): |
| Patti Bourexis, The Study Group, studygroup@aol.com |
| Abstract: This presentation will describe strategies used to grow an independent consulting practice into a viable corporation that continues to expand, despite the current economic climate. Critical business principles will be highlighted that have proven successful in the growth and development of an education-focused evaluation small business. Participants will gain an understanding of how the business started 15 years ago and grew into a corporate business structure with the associated labor, legal, and financial needs and requirements. Topics to be addressed include: ensuring ongoing guidance from a senior mentor; the importance of professional development for all employees; contract, project, and time management; diversifying the business portfolio; implications of technological innovations; and cash flow and sleepless nights! The presentation's value to the audience will be a combination of the real and the practical: the how-tos, with a focus on constraints and pitfalls, and suggestions and solutions for business growth in today's economic climate. |
| Session Title: Use Technology to Monitor Programs as They are Implemented: A Moodle and SAS Approach |
| Demonstration Session 344 to be held in San Simeon B on Thursday, Nov 3, 11:40 AM to 12:25 PM |
| Sponsored by the Pre-K - 12 Educational Evaluation TIG |
| Presenter(s): |
| Bo Yan, Blue Valley School District, byan@bluevalleyk12.org |
| Mike Slagle, Blue Valley School District, mslagle@bluevalleyk12.org |
| Abstract: Many districts adopt and implement intervention programs to help at-risk students. However, educators usually do not know whether a program works, or what issues exist within it, until an evaluation at the end of implementation. To address this problem, we developed a data system that collects program implementation data using the database module of Moodle, and automatically and intelligently delivers alerts and reports to stakeholders using the SAS Business Intelligence framework. With the system, stakeholders receive alerts whenever issues occur in program implementation, as well as regular reports on the program's effects. In this session, we first introduce the concept of program monitoring and then demonstrate how we use this approach to monitor the gifted program and a math intervention program in our district. |
| Roundtable: Action Learning: An Intervention to Enhance Cultural Competencies of Evaluators |
| Roundtable Presentation 345 to be held in Santa Barbara on Thursday, Nov 3, 11:40 AM to 12:25 PM |
| Sponsored by the Multiethnic Issues in Evaluation TIG |
| Presenter(s): |
| Kyehyeon Cho, University of Illinois at Urbana-Champaign, kcho20@illinois.edu |
| Abstract: Action learning is a learning intervention based on questioning and critical reflection. It aims to provide opportunities to learn how to deal with issues arising from actual work situations (O'Neil & Marsick, 2007). Marsick and Maltbia (2009) further argued that action learning can contribute to participants' transformative learning. In spite of a lack of agreement on its definition (Cho & Egan, 2010), action learning has been accepted as a method for examining one's own perceptions and generating discussion about discrepancies among different ways of thinking. In this sense, action learning appears to be a suitable intervention for developing the cultural agency of evaluators. This literature review explores the theoretical possibility of utilizing an action learning intervention as a tool to develop evaluators' cross-cultural competencies. Furthermore, this study aims to draw out the implications of cross-cultural learning interventions for evaluators. |
| Session Title: Statistical Methods for Interrupted Time-Series Analysis: Using the Auto-Regressive Integrated Moving Average (ARIMA) Technique in Program and Policy Evaluation |
| Demonstration Session 346 to be held in Santa Monica on Thursday, Nov 3, 11:40 AM to 12:25 PM |
| Sponsored by the Quantitative Methods: Theory and Design TIG and the Crime and Justice TIG |
| Presenter(s): |
| Derek Cohen, University of Cincinnati, cohendk@mail.uc.edu |
| Abstract: This demonstration is designed to present attendees with the "bare bones" essentials of using ARIMA, a sophisticated interrupted time-series statistical model. Attendees will first be briefly educated in the mechanics and assumptions of the ARIMA technique. Then, the presenter will demonstrate how to compile or acquire an appropriate dataset as well as how to prepare it for statistical analysis. Once the data are prepared, attendees will be shown the basics of model construction, as well as how to alter a model to minimize autocorrelation. The output will then be interpreted and explained so that attendees will be able to draw conclusions from their own data and models. The demonstration will be conducted using the SPSS/PASW statistical software package. The data used in the demonstration are from a work-in-progress evaluation of firearm policy in the state of Ohio. |
| Session Title: Outcomes for Youth in Residential Treatment and Foster Youth Education Programs |
| Multipaper Session 347 to be held in Sunset on Thursday, Nov 3, 11:40 AM to 12:25 PM |
| Sponsored by the Human Services Evaluation TIG |
| Chair(s): |
| James Sass, Rio Hondo College, jimsass@earthlink.net |
| Discussant(s): |
| Michel Lahti, University of Southern Maine, mlahti@usm.maine.edu |