| Session Title: Incorporating Values Into Program Theory and Logic Models |
| Demonstration Session 351 to be held in Avalon A on Thursday, Nov 3, 1:35 PM to 2:20 PM |
| Sponsored by the Program Theory and Theory-driven Evaluation TIG |
| Presenter(s): |
| Patricia Rogers, Royal Melbourne Institute of Technology, patricia.rogers@rmit.edu.au |
| Abstract: Logic models are commonly used to communicate how programs and projects are understood to work - articulating the change mechanisms involved in producing the intended outcomes, and how program activities have been constructed to activate these mechanisms. However, it can be difficult to identify diverse values about desirable and undesirable standards of performance, outcomes/impacts, processes, and distributions of costs and benefits, and even harder to represent these in logic models and use them in evaluations. This demonstration will share examples where diverse values have not only been identified but also incorporated in logic models, program logic matrices, and evaluation plans. It includes examples from diverse sectors, including labor, aged care, early childhood early intervention, and natural resource management. The session will discuss how values might be addressed differently depending on the intended use for program theory and the nature of the intervention - in particular, its complicated or complex aspects. |
| Session Title: Introducing Tools to Measure Trainers' Cultural Competence in Training Events with Community Organizations |
| Demonstration Session 352 to be held in Avalon B on Thursday, Nov 3, 1:35 PM to 2:20 PM |
| Sponsored by the Multiethnic Issues in Evaluation TIG |
| Presenter(s): |
| Jeanette Treiber, UC Davis Center for Evaluation and Research, jtreiber@ucdavis.edu |
| Robin Kipke, University of California, Davis, rakipke@ucdavis.edu |
| Veronica Acosta-Deprez, California State University, Long Beach, vacosta@csulb.edu |
| Abstract: In 2010 the Center for Evaluation and Research at UC Davis tested, with a Los Angeles community organization, a training observation instrument that measures trainers' cultural competence. In May 2011 an enhanced version of this tool, along with a corresponding trainer self-assessment and participant questionnaire, will be tested in three different training events in California and then finalized and released for use. In this AEA demonstration session, a trainer will first introduce the tools, which focus on the relationship between trainer and participants (e.g., creating an environment where participants feel comfortable asking questions; using examples that are relevant to the audience; incorporating learners' knowledge and background into training), followed by a brief training demonstration with participants on the subject of cultural competence in evaluation. Then the trainer, participants, and an observer will rate the cultural competence of the session using the tools. This will be followed by a discussion. |
| Session Title: Beyond PowerPoint: Keep the Big Picture in Your Presentation Without Losing the Details |
| Demonstration Session 353 to be held in California A on Thursday, Nov 3, 1:35 PM to 2:20 PM |
| Sponsored by the Data Visualization and Reporting TIG |
| Presenter(s): |
| Lyn Paleo, First 5 Contra Costa, lpaleo@firstfivecc.org |
| Abstract: Imagine you are preparing for a presentation. Rather than a pack of PowerPoint slides that slice and dice your presentation into small, equally sized rectangles, you get an enormous whiteboard onto which you place a title, text, sets of graphs, photos, PDF files, videos, and portions of an Excel sheet. You show the audience the "big picture," and with a click you zoom in to examine details such as a paragraph of a document, then zoom out to show how this detail relates to the big-picture concept. Prezi is a new alternative to PowerPoint: free, simple to use, and engaging. It extends the principles of Tufte and Few from graphs to entire presentations. This demonstration will walk through a Prezi presentation related to evaluation concepts and findings, then show how simple it is to make a presentation. By the end of the session, participants will be able to create their own presentations. |
| Session Title: Moving Beyond Fragmentation: Building a Set of Participatory Principles for Evaluation |
| Think Tank Session 354 to be held in California B on Thursday, Nov 3, 1:35 PM to 2:20 PM |
| Sponsored by the Collaborative, Participatory & Empowerment Evaluation TIG |
| Presenter(s): |
| Elizabeth Whitmore, Carleton University, elizabeth_whitmore@carleton.ca |
| Lyn Shulha, Queen's University, lyn.shulha@queensu.ca |
| J Bradley Cousins, University of Ottawa, bcousins@uottawa.ca |
| Discussant(s): |
| Lyn Shulha, Queen's University, lyn.shulha@queensu.ca |
| Elizabeth Whitmore, Carleton University, elizabeth_whitmore@carleton.ca |
| Michael Harnar, Claremont Graduate University, michaelharnar@gmail.com |
| Abstract: The last 25 years have seen increasing purposeful engagement of stakeholders in evaluation. Given recent efforts to make definitive distinctions among different approaches to this work, now seems like an appropriate time to consider whether there may be any merit in moving beyond the refinement of unique collaborative models. In this session, we propose to explore with participants the potential of a coherent, overarching set of principles that would encompass the wide range of existing participatory and collaborative approaches. Initially working in small, facilitated groups, participants will be asked to draw on their own experiences and to record their ideas on the wide variety of decision points embedded in collaborative/participatory work. The session will conclude with a review of responses and a discussion of whether there is value in identifying a set of principles about the nature of collaborative evaluative inquiry to be tested out in future discussions. |
| Session Title: Using R for the Management of Survey Data and Statistics in Evaluation |
| Demonstration Session 355 to be held in California C on Thursday, Nov 3, 1:35 PM to 2:20 PM |
| Sponsored by the Graduate Student and New Evaluator TIG |
| Presenter(s): |
| Lindsey Dunn, University of North Carolina, Greensboro, l_dunn@uncg.edu |
| Lauren Fluegge, University of North Carolina, Greensboro, lbfluegg@uncg.edu |
| Korinne Chiu, University of North Carolina, Greensboro, k_chiu@uncg.edu |
| Abstract: R is a free, open-source software environment for statistics and graphics that offers many statistical packages specific to analyses useful to evaluators. This session will include an introduction to R, how to import data files into R, and how to export data files into other formats. In addition, we will demonstrate how to compute descriptive statistics, recode and re-label variables, and produce descriptive tables in R. We will also provide an overview of R's more recent qualitative data analysis capabilities. R syntax will be reviewed, and results from R will be compared with those from familiar packages such as SPSS and Excel. |
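| For a rough sense of the import / describe / recode / export workflow this session demonstrates, here is an analogous sketch in Python with pandas (the session itself uses R, whose syntax is not reproduced in this program; the file and column names below are hypothetical). |
```python
import pandas as pd

# Hypothetical survey file and column names, for illustration only.
df = pd.read_csv("survey.csv")

# Descriptive statistics for every numeric column.
print(df.describe())

# Recode and re-label a variable: collapse a 5-point scale to three categories.
df["q1_recoded"] = df["q1"].map({1: "disagree", 2: "disagree",
                                 3: "neutral", 4: "agree", 5: "agree"})
print(df["q1_recoded"].value_counts())

# Export to another format (writing .xlsx requires an engine such as openpyxl).
df.to_excel("survey_recoded.xlsx", index=False)
```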
| Session Title: Influencing Evaluation Policy and Evaluation Practice: A Progress Report From the American Evaluation Association's (AEA) Evaluation Policy Task Force |
| Panel Session 356 to be held in California D on Thursday, Nov 3, 1:35 PM to 2:20 PM |
| Sponsored by the AEA Conference Committee |
| Chair(s): |
| Patrick Grasso, World Bank Group, pgrasso45@comcast.net |
| Discussant(s): |
| Jennifer C Greene, University of Illinois at Urbana-Champaign, jcgreene@illinois.edu |
| Abstract: The Board of Directors of the American Evaluation Association (AEA) established the Evaluation Policy Task Force (EPTF) in September 2007 to enhance AEA's ability to identify and influence policies that have a broad effect on evaluation practice, and to establish a framework and procedures for accomplishing this objective. The EPTF has issued key documents promoting a wider role for evaluation in the Federal Government, influenced federal legislation and executive policy, and informed AEA members and others about the value of evaluation through public presentations and newsletter articles. This session will provide an update on the task force's work and invite member input on its plans and actions. |
| Session Title: Examining Intercultural Communications and Its Implications for Effective Evaluation With Stella Ting-Toomey |
| Expert Lecture Session 357 to be held in Pacific A on Thursday, Nov 3, 1:35 PM to 2:20 PM |
| Sponsored by the Presidential Strand |
| Chair(s): |
| Melvin Hall, Northern Arizona University, melvin.hall@nau.edu |
| Presenter(s): |
| Stella Ting-Toomey, California State University, Fullerton, sting@exchange.fullerton.edu |
| Discussant(s): |
| Melvin Hall, Northern Arizona University, melvin.hall@nau.edu |
| Abstract: Stella Ting-Toomey, an internationally recognized expert in intercultural communication, will participate in an interactive dialogue with AEA member and presidential strand co-chair Melvin Hall. This session will explore issues and insights associated with conducting evaluations using a culture-sensitive theoretical framework. Improving intercultural communication competence will be the focus of the dialogue, with key concepts explained, including cultural value dimensions, identity layers, and communication styles. Each aspect of the dialogue will be grounded in evaluation practice issues. Ting-Toomey is perhaps best known for her work on "mindfulness" and "facework" in cross-cultural communication, in particular her face-negotiation theory, which deals with the ways people negotiate their communication identities during interactions with each other. The author or editor of 15 books and over 90 journal articles and chapters, Ting-Toomey is an active trainer, consultant, and certified mediator who has conducted intercultural conflict competence workshops for corporations, universities, and non-profit centers/institutes. |
| Session Title: Extending the Focus Group Method |
| Skill-Building Workshop 358 to be held in Pacific B on Thursday, Nov 3, 1:35 PM to 2:20 PM |
| Sponsored by the Qualitative Methods TIG |
| Presenter(s): |
| Katherine Ryan, University of Illinois at Urbana-Champaign, k-ryan6@illinois.edu |
| Peter Muhati, University of Illinois at Urbana-Champaign, mmuhati2@illinois.edu |
| Tysza Gandha, University of Illinois at Urbana-Champaign, tgandha2@illinois.edu |
| Abstract: Focus groups are fundamentally group interviews or collective conversations about a limited set of topics (Bloor, Frankland, Thomas, & Robson, 2001; Kamberelis & Dimitriadis, 2005; Morgan, 2002). While this method encompasses several approaches, this workshop will specifically compare two focus group types: 1) the more common 'theory-building' focus group, and 2) the 'narrative' focus group, with which evaluators might be less familiar. By (re)introducing attendees to multiple focus group approaches, we hope to expand the possibilities for evaluators to use focus groups to gather rich and meaningful data. In this interactive session, brief protocol and data analysis examples will be shared to illustrate each focus group type. Attendees will have the opportunity to practice writing questions/probes for the narrative focus group and to participate in analyzing a taped excerpt from a narrative focus group. References to additional information and tools will be provided for further study. |
| Session Title: Maximizing Validity for Evaluation Quality |
| Multipaper Session 359 to be held in Pacific C on Thursday, Nov 3, 1:35 PM to 2:20 PM |
| Sponsored by the Quantitative Methods: Theory and Design TIG |
| Chair(s): |
| Tony Lam, University of Toronto, tonycm.lam@utoronto.ca |
| Session Title: One Size Does Not Fit All: Capturing Change in Non-profit Capacity |
| Panel Session 360 to be held in Pacific D on Thursday, Nov 3, 1:35 PM to 2:20 PM |
| Sponsored by the Non-profit and Foundations Evaluation TIG |
| Chair(s): |
| Clare Nolan, Harder+Company Community Research, cnolan@harderco.com |
| Discussant(s): |
| Len Finocchio, The California HealthCare Foundation, lfinocch@chcf.org |
| Shane Goldsmith, Liberty Hill Foundation, sgoldsmith@libertyhill.org |
| Abstract: As more foundations become interested in capacity-building as a means of helping nonprofits cope with current economic pressures, evaluation has been challenged to find meaningful measures of organizational effectiveness, along with methods that capture the true impact of capacity-building interventions. This session will compare and contrast approaches used to evaluate two very different foundation-sponsored initiatives to increase nonprofit capacity: (1) Liberty Hill Foundation's efforts to strengthen fundraising and advocacy capacity among minority-led and minority-serving organizations in Los Angeles, and (2) California HealthCare Foundation's efforts to strengthen the management and financial capacity of California safety net dental clinics. The presentations will highlight how different objectives of capacity-building and the strategies used to achieve these objectives affect evaluation design and measurement. Merits and limitations of standardized nonprofit capacity metrics will also be discussed. Staff from both foundations will serve as discussants for this session, which will also invite audience discussion and feedback. |
| Roundtable: A Logic Model Framework for Programs with Systems Change Intents |
| Roundtable Presentation 361 to be held in Conference Room 1 on Thursday, Nov 3, 1:35 PM to 2:20 PM |
| Sponsored by the Advocacy and Policy Change TIG |
| Presenter(s): |
| Anna Lobosco, New York State Developmental Disabilities Planning Council, anna.lobosco@ddpc.ny.gov |
| Dianna Newman, University at Albany, SUNY, dnewman@uamail.albany.edu |
| Abstract: Recently, strands of several frameworks have been combined into a global logic model framework to guide systemic change efforts. These include: a) the Route to Success framework (PADDC, 2009) for effecting systems change, which includes improving the knowledge base, selecting social strategies, engaging stakeholders, supporting policy entrepreneurs, and using unexpected events (or 'tipping points'); b) a model for change (Newman & Lobosco, 2008) that identifies four domains of successful systems change: policies and procedures, infrastructure, design and delivery of services, and expectations of outcomes and experiences; and c) Scheirer's (2010) delineation of four dimensions of sustainability (individual, organization, community, and population) that emphasizes concepts or programmatic philosophy rather than funding. These strands, when braided together, form a logic model framework useful for education and human services programs with systems change intents. The purpose of this paper is to summarize the strands, define the logic model, and provide examples of use. |
| Roundtable: Measuring Changes in Organizational Capacity: Recent Findings and Ideas for the Future |
| Roundtable Presentation 362 to be held in Conference Room 12 on Thursday, Nov 3, 1:35 PM to 2:20 PM |
| Sponsored by the Non-profit and Foundations Evaluation TIG |
| Presenter(s): |
| Amy Minzner, Abt Associates Inc, amy_minzner@abtassoc.com |
| Kimberly Francis, Abt Associates Inc, kimberly_francis@abtassoc.com |
| Abstract: Organizational capacity building has become an accepted mechanism to increase the quality and quantity of services delivered by nonprofit organizations. While anecdotal information abounds about capacity building's effectiveness, rigorous evaluations have been limited. Capacity building is difficult to evaluate, which has constrained the number of evaluations completed. In this context, three recent evaluations of large-scale capacity building programs are notable. This roundtable will present the evaluation designs and research findings of these evaluations (the Compassion Capital Fund Demonstration outcome and impact evaluations and the Communities Empowering Youth longitudinal outcome evaluation). It will also discuss the authors' measurement challenges, lessons learned, and ideas about moving the field forward in terms of measurement. The discussion portion of the roundtable will focus on participants' questions, experiences measuring capacity, ideas about defining a limited core set of organizational capacities to be used as indicators of broad capacity, and recommendations for future evaluation. |
| Session Title: Evaluating Sustainability of Programs in Developing Countries: What do we Measure and How? The Case of Healthcare Quality Improvement in Niger |
| Expert Lecture Session 363 to be held in Conference Room 13 on Thursday, Nov 3, 1:35 PM to 2:20 PM |
| Sponsored by the International and Cross-cultural Evaluation TIG |
| Presenter(s): |
| Lynne Franco, EnCompass LLC, lfranco@encompassworld.com |
| Abstract: All public health programs seek to have lasting positive impact on the populations they serve, particularly in developing countries where the need is great. Yet little evaluation and research has been done to characterize the extent of, and factors for, sustainability or institutionalization. Institutionalization can be conceptualized as establishing and maintaining something as an integral, sustainable part of a system or organization, woven into the fabric of daily activities and routine. This expert lecture will explore several key issues in evaluating institutionalization: 1) what is our desired result - how would institutionalization manifest itself at various levels of the system? and 2) what evidence can we gather, and how can we measure institutionalization? Based on an evaluation case of institutionalization of quality improvement approaches in the Niger healthcare system, a framework for evaluating institutionalization will be discussed, and illustrative findings from the application of this framework will be presented. |
| Session Title: Using Bibliometrics for Research Evaluation of Countries, Institutions, and Researchers: A Review of Statistics, Visualizations, and Guidelines |
| Demonstration Session 364 to be held in Conference Room 14 on Thursday, Nov 3, 1:35 PM to 2:20 PM |
| Sponsored by the Research, Technology, and Development Evaluation TIG |
| Presenter(s): |
| Ann Kushmerick, Thomson Reuters, ann.kushmerick@thomsonreuters.com |
| Abstract: This demonstration will review a selection of metrics and visualizations used in bibliometric research evaluation. Thomson Reuters, formerly the Institute for Scientific Information, is the originator of the first multidisciplinary citation index (the Science Citation Index) and has been working in the field of bibliometrics for over 50 years. Over this period, the use of bibliometric techniques to measure the output and impact of the scholarly research of countries, institutions, research groups, and individuals has become an established method of quantitative research assessment. Bibliometrics have been adopted widely around the globe because they reflect key values of sound assessment: they are repeatable, transparent, and easily understood. We will review well-established metrics such as the journal impact factor, as well as newer metrics like the h-index, Eigenfactor™, and the aggregate performance indicator. Methods of visualizing bibliometric data for the purposes of decision making will also be discussed. |
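| Of the metrics named above, the h-index has the simplest definition: a body of work has index h if h of its papers have at least h citations each. A minimal illustrative sketch of that computation (not a Thomson Reuters tool): |
```python
def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    h = 0
    # Walk papers from most to least cited; rank r qualifies while citations >= r.
    for rank, cites in enumerate(sorted(citations, reverse=True), start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

print(h_index([10, 8, 5, 4, 3]))  # 4: four papers each cited at least 4 times
```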
| Session Title: Translating Science to Practice: An Evaluation Perspective |
| Panel Session 365 to be held in Avila A on Thursday, Nov 3, 1:35 PM to 2:20 PM |
| Sponsored by the Evaluation Use TIG and the Health Evaluation TIG |
| Chair(s): |
| Stephanie Gruss, Centers for Disease Control and Prevention, inf6@cdc.gov |
| Abstract: The two presentations in this session explore the evaluation approaches and methods used in the translation of science into practice in areas of public health. The projects included are at different points on the continuum of the translation process, from translation of knowledge into actionable products to implementation and institutionalization. Evaluative criteria based on the RE-AIM framework have been used to identify, rank, and prioritize the most effective strategies for diabetes prevention and control; the IOM recommendations project is using evaluation methods such as focus groups, key informant interviews, and surveys to assess the key steps in bridging the gaps between science and practice. The audience will learn the different evaluative approaches and methods used in various aspects of translation work. |
| Roundtable: Whose Evaluations Count? Lessons Learned From Process Facilitation Applied in Evaluation |
| Roundtable Presentation 367 to be held in Balboa A on Thursday, Nov 3, 1:35 PM to 2:20 PM |
| Sponsored by the Systems in Evaluation TIG |
| Presenter(s): |
| Ritu Shroff, Independent Consultant, ritushroff2003@yahoo.co.uk |
| Bob Williams, Independent Consultant, bob@bobwilliams.co.nz |
| Abstract: Methods drawing on systems thinking and process facilitation (for designing evaluation frameworks and areas of inquiry, collecting data, analyzing data, and generating recommendations) have been successfully applied in conducting evaluations in Africa and Asia. These evaluation methods facilitate exploring inter-relationships (between stakeholders, and between intervention and context), eliciting and engaging with the perspectives and stakes of multiple stakeholders, and undertaking boundary critique. Experiences with a) multi-stakeholder engagement on program theory of change, b) mixing photography and theatre-for-development techniques with other methods for data collection, and c) large-group facilitation for data analysis and recommendations offer lessons learned on the consequences/effects of evaluation, particularly usefulness and uptake. The roundtable will focus on the values that such methods are based on, the relationships they foster between evaluator and client, and effects such as greater internalization and application of findings and recommendations. |
| Session Title: Government Evaluation Issues in K-12 Education |
| Multipaper Session 368 to be held in Balboa C on Thursday, Nov 3, 1:35 PM to 2:20 PM |
| Sponsored by the Government Evaluation TIG and the Pre-K - 12 Educational Evaluation TIG |
| Chair(s): |
| Samuel Held, Oak Ridge Institute for Science and Education, sam.held@orau.org |
| Session Title: How to Infuse Learning in Non-profits: Developing a Framework for Learning Based on Three Non-profit Case Illustrations |
| Think Tank Session 369 to be held in Capistrano A on Thursday, Nov 3, 1:35 PM to 2:20 PM |
| Sponsored by the Organizational Learning and Evaluation Capacity Building TIG |
| Presenter(s): |
| Joelle Cook, Organizational Research Services, jcook@organizationalresearch.com |
| Discussant(s): |
| Astrid Hendricks, Hendricks Consulting, ahendricks2@gmail.com |
| Cameron Clark, Organizational Research Services, cclark@organizationalresearch.com |
| Abstract: Evaluators talk about "learning" in many ways (e.g., learning organizations, organizational learning, strategic learning). To date, most of the relevant literature and work around organizational learning has focused on foundations and for-profit companies. The goal of this think tank is to discuss and develop frameworks for learning through evaluation in a nonprofit setting. Session facilitators will share an overview of relevant literature and frameworks and present three case examples of advocacy organizations that have successfully used evaluation to support learning in different ways: 1) for operations, 2) to measure the effectiveness of programs and tactics, and 3) to define strategy (ORS, 2010). Following this presentation, session facilitators will lead participants in an exercise to further define a useful, unified framework for learning in these different areas and to gather input on other learning models and practices that would benefit nonprofits that wish to develop this capacity. |
| Session Title: Developing Internal Evaluation Capacity |
| Multipaper Session 370 to be held in Capistrano B on Thursday, Nov 3, 1:35 PM to 2:20 PM |
| Sponsored by the Internal Evaluation TIG |
| Chair(s): |
| Amanda Greene, National Institutes of Health, amanda.greene@nih.gov |
| Discussant(s): |
| Boris Volkov, University of North Dakota, boris.volkov@med.und.edu |
| Session Title: Assessing Your Agency's Capacity to Serve Injecting Drug Users |
| Demonstration Session 371 to be held in Carmel on Thursday, Nov 3, 1:35 PM to 2:20 PM |
| Sponsored by the Alcohol, Drug Abuse, and Mental Health TIG |
| Presenter(s): |
| Adam Viera, Harm Reduction Coalition, viera@harmreduction.org |
| Abstract: For nearly all of the CDC-identified evidence-based interventions (EBIs) and public health strategies, organizations can avail themselves of a readiness assessment to determine their existing capacity and capacity-building needs. However, these tools often assume a previous relationship with injection drug users (IDUs) and neglect to assess staff and agency capacity for working with injecting drug users and communities impacted by drug use. Harm Reduction Coalition (HRC) will present a conceptual framework for assessing organizational capacity to serve populations of injecting drug users. This conceptual framework establishes four interrelated levels to be assessed (termed assessment domains): the individual staff level, the program level, the organizational level, and the community engagement level. Within each assessment domain, facilitators will discuss the various instruments and methods available for assessment at that level, as well as the different areas of capacity (termed capacity domains). HRC will present corresponding examples of capacity domains for each assessment domain, along with sample questions. HRC will also close with a discussion of resources for assessing and building capacity to serve injecting drug users. |
| Session Title: Technology and Classroom Observation: Bringing the ICOT Up to Date |
| Demonstration Session 372 to be held in Coronado on Thursday, Nov 3, 1:35 PM to 2:20 PM |
| Sponsored by the Distance Ed. & Other Educational Technologies TIG |
| Presenter(s): |
| Talbot Bielefeldt, International Society for Technology in Education, talbot@iste.org |
| Clare Strawn, International Society for Technology in Education, cstrawn@iste.org |
| Brandon Olszewski, International Society for Technology in Education, brandon@iste.org |
| Abstract: For several years, the ISTE Classroom Observation Tool (ICOT) has provided a convenient platform for conducting evaluation observations in technology-using schools. However, the application has aged quickly. With the loss of funding for programming an upgrade, evaluators moved the protocol into a common spreadsheet, incorporating the latest educational technology standards, eliminating glitches from the original tool, and providing a free and flexible platform that can be easily modified by users without programming. This presentation will introduce users to the ICOT, enable them to download the tool, and discuss how to incorporate the tool into evaluation logic models, achieve reliability across observers, aggregate data for analysis, and present results. |
| Session Title: Map it Out: A Visual and Physical Participatory Method for Data Collection |
| Demonstration Session 373 to be held in El Capitan A on Thursday, Nov 3, 1:35 PM to 2:20 PM |
| Sponsored by the Health Evaluation TIG |
| Presenter(s): |
| Jill Lipski Cain, The Improve Group, jill@theimprovegroup.com |
| Stacy Johnson, The Improve Group, stacyj@theimprovegroup.com |
| Abstract: Learn about a hands-on tool that engages stakeholders in creating a comprehensive inventory of assets and areas of need. Two evaluators from the Improve Group will demonstrate the community mapping process they developed to help local practitioners assess health assets and needs in Faribault, Martin, and Watonwan Counties. This work was done for a broader health needs assessment funded by Minnesota's Statewide Health Improvement Program. Stakeholders used icons to map the depth, connectedness, and gaps in resources in their rural communities. Findings were used to identify activities that were most effective in addressing the problems of obesity and tobacco use. This session will address what materials and resources are needed, which stakeholders should participate, what can be measured using this tool, how its findings can be used to inform a broader needs assessment, and how this process can be adapted and applied to other contexts. |
| Session Title: Stages of Evaluation Development in Kazakhstan: Methodology of State Programs Evaluation |
| Demonstration Session 374 to be held in El Capitan B on Thursday, Nov 3, 1:35 PM to 2:20 PM |
| Sponsored by the International and Cross-cultural Evaluation TIG |
| Presenter(s): |
| Sergey Gulyayev, Decenta Public Foundation, sergey@decenta.org |
| Abstract: Program evaluation first appeared in Kazakhstan in the late 1990s, as international projects were implemented in the country, foreign companies brought the tradition with them, Kazakhstani specialists attended trainings and seminars on evaluation, and the first local evaluators emerged as representatives of a profession. In the early 2000s, large projects financed by foreign investors and donors were evaluated as an obligatory condition set by the financing party. In 2002-2005, discussions about the necessity of evaluating the impact of state-financed programs grew, initiated by the NGO sector, and by 2006 there were examples of evaluations of state budget programs. During 2007-2009, the professional community of evaluators from NGOs lobbied for amendments to the national legislation on evaluation of governmental programs. In 2010 the national legislation was amended, making evaluation of the efficiency, effectiveness, and impact of state programs obligatory, and in 2011 it became legally possible to contract evaluators from NGOs to evaluate these programs. |
| Roundtable: A Qualitative Evaluation of Second-Step: A School Violence Prevention Program in Southern Mexico |
| Roundtable Presentation 375 to be held in Exec. Board Room on Thursday, Nov 3, 1:35 PM to 2:20 PM |
| Sponsored by the Qualitative Methods TIG |
| Presenter(s): |
| Enique Polanco-Cabrales, Centro de Estudios de Las Americas, esantacruz@cela.edu.mx |
| Edith Cisneros-Cohernour, University of Yucatan, Merida, cchacon@uady.mx |
| Abstract: This paper presents the findings of a qualitative evaluation study of the implementation of the Second-Step program, a bullying and violence prevention program, in a private school in Southern Mexico. Stake's responsive evaluation approach was used to examine the pertinence of the program to the Mexican context and culture and the quality of its implementation. Findings of the study indicate that the program has increased the pro-social behavior of students and helped to reduce disciplinary problems and emotionally violent behavior. |
| Session Title: 25 Low-cost/No-cost Tools for Evaluation |
| Demonstration Session 376 to be held in Huntington A on Thursday, Nov 3, 1:35 PM to 2:20 PM |
| Sponsored by the Integrating Technology Into Evaluation TIG |
| Presenter(s): |
| LaMarcus Bolton, American Evaluation Association, marcus@eval.org |
| Susan Kistler, American Evaluation Association, susan@eval.org |
| Abstract: Join us for a review of over 25 low-cost/no-cost tools that are useful, used, and user-friendly. Who isn't short on time, short on funds, and short on the patience needed to decide which tools are worth investing the time needed to access and use? Drawing on contributions from over 20 colleagues in different contexts, we'll show examples of tools that are used by practicing evaluators to conduct background research; create and document evaluation plans and logic models; facilitate data cleaning, exploration and analysis; listen to and learn from online exchanges; and promote and enhance collaboration. Each attendee will leave with a handout identifying the tool, its uses, the time needed to learn to use it, and any special considerations. The session will be supplemented by a website with live links to each tool and where and how to learn more. |
| Session Title: Understanding Sponsors and Stakeholders' Interests and Values in High Stakes Evaluations |
| Panel Session 377 to be held in Huntington B on Thursday, Nov 3, 1:35 PM to 2:20 PM |
| Sponsored by the Government Evaluation TIG |
| Chair(s): |
| Rakesh Mohan, Idaho State Legislature, rmohan@ope.idaho.gov |
| Discussant(s): |
| Rakesh Mohan, Idaho State Legislature, rmohan@ope.idaho.gov |
| Abstract: The word fact comes from the Latin factum, meaning something that actually took place or is taking place. In the world of performance evaluations, with their focus on evidence-based findings, uncovering the facts is the first step along the path that ultimately leads to the formulation of recommendations - statements about what should be done. However, as the philosopher David Hume argued, there is no logical, certain or obvious way to make value statements about what should be done based on any set of facts. Whether one agrees with this or not, it is useful to recognize that well-reasoned arguments and fact-based evidence may not be enough to gain acceptance of recommendations. This panel explores how knowledge and consideration of key stakeholders' interests and values, and how these values influence their interpretation of the facts, can help to inform the conduct and contribute to the success of utilization-focused evaluations. |
| Session Title: A Complex Youth Competency Initiative in Cleveland, OH |
| Panel Session 378 to be held in Huntington C on Thursday, Nov 3, 1:35 PM to 2:20 PM |
| Sponsored by the Cluster, Multi-site and Multi-level Evaluation TIG |
| Chair(s): |
| Deborah Volk, Cuyahoga County Family and Children First Council, dvolk@cuyahogacounty.us |
| Abstract: My Commitment, My Community (MyCom) is an eight-community youth competency initiative in Cleveland, Ohio, funded by The Cleveland Foundation and the Cuyahoga County Family and Children First Council. MyCom is composed of 6 major technical assistance organizations, 8 community "lead" agencies (one in each of the eight communities), and 87 funded and volunteer youth service delivery agencies and organizations. This presentation discusses challenges of evaluation ranging from process and outcome measures to organizational structure, organizational capacity, and internal morale, among other elements affecting a geographically widespread and organizationally complex youth program. Members of the evaluation team will review challenges, discuss the development of specifically designed capacity instruments, and describe the application of social network analysis to enhance organizational structure and organization. |
| Session Title: Exploring Praxis in Evaluation |
| Multipaper Session 380 to be held in Laguna A on Thursday, Nov 3, 1:35 PM to 2:20 PM |
| Sponsored by the Theories of Evaluation TIG |
| Chair(s): |
| Bianca Montrosse, Western Carolina University, bemontrosse@wcu.edu |
| Session Title: Rigorous Design for Evaluating Vulnerable Populations |
| Multipaper Session 381 to be held in Laguna B on Thursday, Nov 3, 1:35 PM to 2:20 PM |
| Sponsored by the Disabilities and Other Vulnerable Populations TIG |
| Roundtable: Evaluation of Study Abroad Outcomes |
| Roundtable Presentation 382 to be held in Lido A on Thursday, Nov 3, 1:35 PM to 2:20 PM |
| Sponsored by the International and Cross-cultural Evaluation TIG |
| Presenter(s): |
| Julia Shaftel, University of Kansas, jshaftel@ku.edu |
| Tim Shaftel, University of Kansas, tshaftel@ku.edu |
| Abstract: The Intercultural Student Attitude Scale (ISAS) is intended for use by college and university study abroad programs to assess student outcomes of international study and evaluate program goals related to student growth in cross-cultural skills and attitudes. The assessment of attitude change as a result of study abroad is in its infancy, and no other validated instrument exists for this purpose. The authors plan to make the ISAS available in the public domain for use by university international study offices and individual study abroad programs as well as for research applications. The audience will learn about the development and validation of the ISAS, the role of sex and personality factors in self-selection for international study, and how student attitudes change with study abroad experience. |
| Session Title: Measurement of Interprofessional Education and Health Care Collaboratives |
| Multipaper Session 383 to be held in Lido C on Thursday, Nov 3, 1:35 PM to 2:20 PM |
| Sponsored by the Health Evaluation TIG |
| Chair(s): |
| Robert LaChausse, California State University, San Bernardino, rlachaus@csusb.edu |
| Session Title: Using Logical Framework to Identify Outcomes and Develop Performance Indicators in Science & Technology Program Proposals |
| Skill-Building Workshop 384 to be held in Malibu on Thursday, Nov 3, 1:35 PM to 2:20 PM |
| Sponsored by the Research, Technology, and Development Evaluation TIG |
| Presenter(s): |
| Shan Shan Li, Science & Technology Policy Research and Information Center, ssli@stpi.narl.org.tw |
| Ling-Chu Lee, Science & Technology Policy Research and Information Center, lclee@stpi.narl.org.tw |
| Abstract: Ex ante evaluation is a fundamental tool for effective management and a formal requirement. A good ex ante evaluation calls for a well-built framework that makes the logical connections between goals and indicators in program proposals explicit. The session is divided into two parts. The first part explains the main concepts of the logical framework approach (LFA) and its procedures step by step, including the "tricks" of each step in detail. The second part lets attendees experience, participate in, and discuss the whole process on their own, and a case will be used to demonstrate the process from start to finish. After the session, we are glad to continue sharing and discussing the method via email in order to improve it, and related documents are available as references. Future work will focus on developing the details, such as how to apply SWOT analysis to LFA and how to develop hypotheses. |
| Session Title: I'm With the Brand: A Behind the Scenes Look at the Three T's of Creating an Effective Internet Presence |
| Demonstration Session 385 to be held in Manhattan on Thursday, Nov 3, 1:35 PM to 2:20 PM |
| Sponsored by the Independent Consulting TIG |
| Presenter(s): |
| Richard Eddy, Cobblestone Applied Research & Evaluation Inc, rich.eddy@cobblestoneeval.com |
| Abstract: The purpose of this presentation is to provide an overview of the tools, techniques, and talents useful in developing an effective online presence. An established Internet presence can be used by an independent consultant to build credibility, establish a unique brand, and connect with existing and prospective clients. The presenter will outline a model that implements some of the ideas that an independent practitioner or small consulting practice can use to leverage their online identity, specifically demonstrating what this online presence should look like, including the integration of popular social media platforms such as blogs (Wordpress, Blogger, etc.), Facebook, Twitter, and LinkedIn. Some of the many ways that these tools can be used to enhance the success of an independent consulting practice will also be discussed, in addition to the subjects of privacy, security, and reputation management. |
| Session Title: A Mixed-Methods Approach to Understanding the Impact of Requiring Citizenship Documentation for Medicaid Enrollment |
| Panel Session 386 to be held in Monterey on Thursday, Nov 3, 1:35 PM to 2:20 PM |
| Sponsored by the Mixed Methods Evaluation TIG |
| Chair(s): |
| Robert Phillips, The California Endowment, rphillips@calendow.org |
| Abstract: The federal Deficit Reduction Act of 2005 (DRA) requires citizens applying for or renewing Medicaid coverage to provide documentation establishing citizenship and identity. Implementing this policy affected state and county Medicaid administrators who had to modify existing enrollment processes, as well as current and potential beneficiaries who faced an additional application step before obtaining coverage. This session presents findings from a comprehensive mixed-methods evaluation of the impact of DRA implementation in California, a state that aimed to implement the DRA with as much flexibility as possible to avoid the negative consequences for enrollment that some states reported. To assess the impact of DRA implementation on counties and clients, two surveys of county-level administrators and site visits to six counties were conducted. In combination with Medicaid enrollment records, data collected through the surveys enabled a rigorous quantitative estimate of the DRA's impact on enrollment and retention trends for Medicaid beneficiaries. |
| Session Title: Video in Evaluation: Methodological Opportunities and Technical Tips |
| Demonstration Session 387 to be held in Oceanside on Thursday, Nov 3, 1:35 PM to 2:20 PM |
| Sponsored by the Qualitative Methods TIG |
| Presenter(s): |
| Corey Newhouse, Public Profit, corey@publicprofit.net |
| Abstract: This session will explore opportunities to incorporate video into evaluation, both as a means to complement written findings and as a cognitive elicitation technique. Drawing on the presenter's experiences in an evaluation of a teacher coaching initiative, the session will include a brief summary of the literature, exemplars of video use in evaluation, and tips for successful implementation of this method. The session content is geared toward evaluators who are considering incorporating video into their practice or have just begun to do so. |
| Session Title: Assessing Coalition Building and Relationships Through Social Network Analysis |
| Expert Lecture Session 389 to be held in Palos Verdes A on Thursday, Nov 3, 1:35 PM to 2:20 PM |
| Sponsored by the Social Network Analysis TIG |
| Presenter(s): |
| Todd Honeycutt, Mathematica Policy Research, thoneycutt@mathematica-mpr.com |
| Marykate Zukiewicz, Mathematica Policy Research, mzukiewicz@mathematica-mpr.com |
| Debra Strong, Mathematica Policy Research, dstrong@mathematica-mpr.com |
| Discussant(s): |
| Tom Bartholomay, University of Minnesota, barth020@umn.edu |
| Abstract: Among the many objectives of funding a program, funders and participants want to help build relationships among those involved that could potentially last beyond the initial project funding. The Consumer Voices for Coverage program, funded by the Robert Wood Johnson Foundation initially for three years, sought to help 12 state-level consumer advocacy coalitions address health policy in their states as well as strengthen the relationships among participating organizations. As part of a larger multi-mode evaluation, we used social network analysis (SNA) methods to assess the extent to which participating organizations in each coalition had worked together before the grant and how organizations communicated and worked collaboratively with each other in the first and third grant years. This presentation will describe how we used SNA for the evaluation and compare our findings on coalition building and relationships with other results from the evaluation. |
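| For a flavor of the kind of measure such an analysis can yield, the sketch below compares network density across grant years; the organizations and ties are hypothetical, not the presenters' actual data or code. |
```python
import networkx as nx

ORGS = ["OrgA", "OrgB", "OrgC", "OrgD"]  # hypothetical coalition members

def coalition_density(ties):
    """Share of possible pairwise ties actually reported among coalition members."""
    g = nx.Graph()
    g.add_nodes_from(ORGS)   # include members even if they reported no ties
    g.add_edges_from(ties)
    return nx.density(g)

# Hypothetical survey responses: which organizations reported working together.
year1_ties = [("OrgA", "OrgB"), ("OrgB", "OrgC")]
year3_ties = [("OrgA", "OrgB"), ("OrgB", "OrgC"), ("OrgA", "OrgC"), ("OrgC", "OrgD")]

print(coalition_density(year1_ties))  # 0.33 at baseline
print(coalition_density(year3_ties))  # 0.67 in year 3: a denser collaboration network
```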
| Session Title: The Canadian Federal Evaluation Policy: Drivers, Features for Supporting Quality and Use of Evaluation, and Lessons Learned |
| Expert Lecture Session 391 to be held in Redondo on Thursday, Nov 3, 1:35 PM to 2:20 PM |
| Sponsored by the Evaluation Policy TIG |
| Chair(s): |
| Anne Routhier, Treasury Board of Canada Secretariat, anne.routhier@tbs-sct.gc.ca |
| Presenter(s): |
| Anne Routhier, Treasury Board of Canada Secretariat, anne.routhier@tbs-sct.gc.ca |
| Abstract: On April 1, 2009, a renewed Canadian federal Policy on Evaluation, along with a Directive and Standard, came into effect. The objective of this policy - which applies to most departments and agencies across the Government of Canada - is to create a comprehensive and reliable base of evaluation evidence that is used to support policy and program improvement, expenditure management, Cabinet decision making, and public reporting. In this presentation, the Senior Director of the Treasury Board of Canada Secretariat's Centre of Excellence for Evaluation will provide participants with an overview of the drivers of the renewed policy. Moreover, this presentation will explore the emerging trends in evaluation quality and utilization since the introduction of the policy in 2009, as assessed through the annual review of evaluation functions in federal departments and agencies. Finally, this presentation will review 'lessons learned' that may inform evaluation policy development in other jurisdictions. |
| Session Title: Methods and Models for Evaluating Training Programs |
| Multipaper Session 392 to be held in Salinas on Thursday, Nov 3, 1:35 PM to 2:20 PM |
| Sponsored by the AEA Conference Committee |
| Session Title: Methods in Evaluating Advocacy Efforts: Grantmakers' Perspectives |
| Multipaper Session 393 to be held in San Clemente on Thursday, Nov 3, 1:35 PM to 2:20 PM |
| Sponsored by the Advocacy and Policy Change TIG |
| Chair(s): |
| Aimee White, Trident United Way, awhite@tuw.org |
| Session Title: Responding to Insufficient RFPs |
| Skill-Building Workshop 394 to be held in San Simeon A on Thursday, Nov 3, 1:35 PM to 2:20 PM |
| Sponsored by the Independent Consulting TIG and the Graduate Student and New Evaluator TIG |
| Presenter(s): |
| Nakia James, Western Michigan University, nakia.s.james@wmich.edu |
| Abstract: It is often necessary for nonprofit organizations (NPOs) to formulate a Request for Proposals (RFP) to procure essential services in fulfillment of their grant obligations. This is most common when seeking the services of an external evaluator. Since most NPOs do not have an internal evaluation staff, developing an appropriate RFP can be quite challenging. This may be the first obstacle encountered by both the NPO and the potential consultant: many NPOs are unaware of how to appropriately include key components in the RFP, and the quality of the bidder's response is highly dependent on the quality of the RFP. Accordingly, it is imperative that we strive to better understand the RFP process, from its conception to fulfillment. The experience of potential consultants will also be quite useful in further understanding how RFPs are developed and how to appropriately address and respond to any inadequacies in the RFPs. |
| Session Title: Recruiting Participants for Education Studies: Practical Strategies and Advice |
| Demonstration Session 395 to be held in San Simeon B on Thursday, Nov 3, 1:35 PM to 2:20 PM |
| Sponsored by the Pre-K - 12 Educational Evaluation TIG |
| Presenter(s): |
| Elizabeth Autio, Education Northwest, elizabeth.autio@educationnorthwest.org |
| Jason Greenberg Motamedi, Education Northwest, j.g.motamedi@educationnorthwest.org |
| Abstract: Recruitment of participants to take part in a study is a challenge many evaluators face. Failure to accomplish this task can result in the demise of a study before it even begins. In education, recruitment can be particularly daunting, as multiple layers of stakeholders must buy into the study. There are also sometimes ethical concerns about providing or denying the treatment to students. Finally, a plethora of programs and initiatives compete for educators' time and attention. In this session, evaluators who successfully recruited schools for a randomized controlled trial (RCT) of an instructional model share their lessons learned. It will include practical strategies and concrete advice for other evaluators. Topics will include budgeting, developing an approach, creating systems and materials, outreach, interacting with potential participants, delivery of the message, and welcoming, tracking, and retaining participants. These lessons learned can be applied to recruitment for all studies, and RCTs in particular. |
| Roundtable: Evaluation of a Multi-Site Caregiver Training Program in Rural Arkansas: Challenges and Lessons Learned |
| Roundtable Presentation 396 to be held in Santa Barbara on Thursday, Nov 3, 1:35 PM to 2:20 PM |
| Sponsored by the Human Services Evaluation TIG |
| Presenter(s): |
| Jasna Vuk, University of Arkansas, jvuk@uams.edu |
| Amyleigh Overton-McCoy, University of Arkansas, overtonmccoyamyl@uams.edu |
| Sherry White, University of Arkansas, slwhite2@uams.edu |
| Robin McAtee, University of Arkansas, mcateerobine@uams.edu |
| Abstract: The Home Caregiver Training Project was funded in 2009 for a three-year period at four Regional Centers on Aging in the state of Arkansas for the purpose of training caregivers for the elderly in their homes. The project replicated an existing training program, the Schmieding Home Caregiver Training Program in Northwest Arkansas. The evaluation plan used the Logic Model framework and was designed to facilitate judgments regarding the merit, value, and impact of the program; guide successful replication; and collect evidence to justify the funding of additional training programs at four other locations. A comprehensive evaluation plan was developed; however, evaluating the program at four sites over multiple years has been challenging. Lessons have been learned from the implementation of the program and the evaluation of the quality of training. The value and impact of the program on communities, caregivers, and the elderly who hire trained caregivers present new challenges for evaluation. |
| Session Title: Hierarchical Linear Modeling as a Valuable Tool in Evaluation |
| Multipaper Session 397 to be held in Santa Monica on Thursday, Nov 3, 1:35 PM to 2:20 PM |
| Sponsored by the Quantitative Methods: Theory and Design TIG |
| Session Title: Finances and Institutions: Finding Value Through Evaluation |
| Multipaper Session 398 to be held in Sunset on Thursday, Nov 3, 1:35 PM to 2:20 PM |
| Sponsored by the Business and Industry TIG |
| Chair(s): |
| Thomas Ward II, United States Army, thomas.wardii@cgsc.edu |
| Session Title: Selecting Measures and Instruments for Studying Fidelity and Outcomes in Education: Lessons Learned From an Evaluation of a Teacher Professional Development Program |
| Multipaper Session 399 to be held in Ventura on Thursday, Nov 3, 1:35 PM to 2:20 PM |
| Sponsored by the Pre-K - 12 Educational Evaluation TIG |
| Chair(s): |
| Castle Sinicrope, Social Policy Research Associates, csinicrope@spra.com |
| Yasuyo Abe, Berkeley Policy Associates, yasuyo@bpacal.com |
| Abstract: With the current emphasis on rigorous research and experimental designs in education, evaluators must often negotiate different values and perspectives on measuring implementation and outcomes. While outcomes continue to be a primary focus in education evaluation, there is a growing recognition of the importance of understanding fidelity of implementation. In this session, presenters share lessons learned from measuring implementation fidelity and outcomes in a random assignment evaluation of a multi-year teacher professional development program. The presentation broadly reflects on balancing different stakeholder values in selecting measures and instruments to answer questions about fidelity and outcomes. Specific challenges to be covered in the session include identifying critical program components, selecting fidelity criteria, specifying outcome measures, and deciding between developing, adapting, or adopting existing instruments. This session will also share lessons learned from timing data collection efforts relative to program delivery and reporting implementation and outcome findings. |
| Lessons Learned From Documenting Implementation Fidelity: Developing Implementation Fidelity Measures for a Teacher Professional Development Program |
| Vanora Thomas, Berkeley Policy Associates, vanora@bpacal.com |
| Yasuyo Abe, Berkeley Policy Associates, yasuyo@bpacal.com |
| There is a growing recognition of the value of measuring fidelity of implementation in education research. An understanding of fidelity of implementation, the degree to which an intervention is delivered as intended, is essential to a comprehensive understanding of intervention outcomes and the identification of contextual factors that support or hinder implementation. However, few published studies provide details on the development and use of program-specific fidelity measures for intensive school-based interventions. This presentation will outline the lessons learned from measuring implementation fidelity in a multi-year random assignment evaluation of a teacher professional development program. Topics to be covered include the identification of the program's critical components and processes, selection of fidelity criteria, and the development of instruments to measure and monitor implementation. This session will also include the challenges associated with the collection of detailed program records and the potential roadblocks faced when presenting implementation findings. |
| Measuring Program Outcomes: Defining and Focusing Outcomes for a Teacher Professional Development Program |
| Castle Sinicrope, Social Policy Research Associates, csinicrope@spra.com |
| While outcomes are widely studied in education, selecting and measuring outcomes continues to pose challenges for evaluators. This presentation reflects on challenges faced in defining and measuring teacher and student outcomes in a recent multi-year teacher professional development evaluation. Key challenges included identifying outcomes that were aligned with the program theory of change and the values and priorities of the different stakeholders. Topics to be covered during the presentation include selecting short-term versus long-term outcomes, limiting and prioritizing outcomes to minimize multiple comparisons, timing data collection relative to program implementation, and reporting outcome findings. This presentation will also reflect on three additional considerations when selecting teacher and student outcomes for evaluating teacher professional development programs: 1) using established, national student assessments versus local, state-level student assessments; 2) adapting existing instruments versus developing new instruments; and 3) challenges posed by under- and over-alignment of instruments with programs. |