| Session Title: President Obama's Evaluation Policies |
| Expert Lecture Session 542 to be held in Lone Star A on Friday, Nov 12, 10:55 AM to 11:40 AM |
| Sponsored by the Presidential Strand and the Government Evaluation TIG |
| Chair(s): |
| Jennifer Greene, University of Illinois at Urbana-Champaign, jcgreene@illinois.edu |
| Presenter(s): |
| Invited Speaker, , |
| Discussant(s): |
| Patrick Grasso, World Bank, pgrasso45@comcast.net |
| Abstract: This session will provide an explanation of and an update on where things stand with the evaluation policies of President Obama’s administration. |
| Session Title: Examples From the Field: Using Mixed Methodological Frameworks in Theory-Driven Evaluations |
| Multipaper Session 543 to be held in Lone Star B on Friday, Nov 12, 10:55 AM to 11:40 AM |
| Sponsored by the Program Theory and Theory-driven Evaluation TIG |
| Chair(s): |
| Katrina Bledsoe, Walter R McDonald and Associates Inc, katrina.bledsoe@gmail.com |
| Session Title: Youth Participatory Evaluation: Entering the Age of the Internet |
| Think Tank Session 544 to be held in Lone Star C on Friday, Nov 12, 10:55 AM to 11:40 AM |
| Sponsored by the Collaborative, Participatory & Empowerment Evaluation TIG |
| Presenter(s): |
| Robert Shumer, University of Minnesota, rshumer@umn.edu |
| Discussant(s): |
| Robert Shumer, University of Minnesota, rshumer@umn.edu |
| Kim Sabo Flores, Evaluation Access and ActKnowledge, kimsabo@aol.com |
| Abstract: Youth participatory evaluation has been evolving over the past two decades. While still young, the engagement of youth has taken some new twists, especially involving the use of electronic media to facilitate its expansion and improve methodology. In this Think Tank we explore two new efforts to use electronic systems to increase the capacity of programs to engage youth in the evaluation process. One system involves the use of an on-line education/training program to help adult mentors/educators work with youth to develop participatory evaluation projects. The second involves the development of e-Portfolios to capture and evaluate learning and social change enacted through service-learning and civic engagement programs. The audience will have the opportunity to react to both systems and then make recommendations for change/improvement so youth participatory evaluation, in the electronic age, can be even more effective. Youth participatory evaluation is a field in the making (Sabo, 2003). Ever since the Wingspread conference on youth participatory evaluation (Checkoway, 2003), more and more people have been engaging youth in the evaluation process. From Youth in Focus in San Francisco, to the engagement of youth in critical praxis through youth/community studies in California (Duncan-Andrade and Morrell, 2008), youth are becoming more involved in evaluating the programs and worlds they inhabit. Preparing them to do a solid job is a challenge, especially since resources are so limited. In order to address this challenge, Kim Sabo Flores has begun a series of on-line seminars to prepare adults to work with youth in various stages of participatory evaluation. She is piloting a series of lessons designed to help adults become true facilitators of evaluation/learning and to implement a reasonable project that demonstrates an understanding of the use of praxis in youth engagement and review.
In this Think Tank the audience will have the opportunity to review the sample lessons/program materials and critique the approach and the substance of the program. The goal is to ensure more public input in the development of the lessons, especially from a critical group of evaluators who know and understand youth participatory evaluation. A second approach will be presented for critique and comment by Rob Shumer, evaluator from the University of Minnesota. Shumer is experimenting with the development of e-Portfolios as a mechanism to document and record student experiences with service-learning from middle school through undergraduate education. Each student will have the use of a personal portfolio, developed by the University of Minnesota, with which to record and organize information about how service-learning has been a transformative experience. In this part of the Think Tank the audience will have an opportunity to both critique the model for evaluation and comment on the utility of such an instrument and process for recording the learning and impact of the programs on the individual and the community. It will also provide a source for discussion of the use of e-Portfolios as a large data source for complex learning initiatives, such as service-learning and civic engagement. Each of these projects should help to promote the kind of discussion that will expand and improve the delivery of youth participatory evaluation in all settings. By obtaining public input on the training materials and the e-Portfolio system, the field of youth participatory evaluation will be greatly improved, making it a more suitable option for the many settings in which youth are engaged in the evaluation process. |
| Session Title: Funder’s Use of Network Analysis to Build Intentional Collaboration |
| Panel Session 545 to be held in Lone Star D on Friday, Nov 12, 10:55 AM to 11:40 AM |
| Sponsored by the Non-profit and Foundations Evaluation TIG |
| Chair(s): |
| David J Dobrowski, First 5 Monterey, dobrowski@gmail.com |
| Abstract: Once you have network analysis maps, what do you do with them? They are useful tools for creating a picture of the nature and depth of relationships. We will share how a funder used network analysis maps with a group of twenty-three funded agencies. The agencies were able to use the maps to make strategic organizational decisions about how, and with whom, it made sense to coordinate and collaborate. Techniques for helping grantees read and understand the network maps - as well as facilitating discussions with them - will be shared. At a different level, how the funder used the maps to understand shifts in the service delivery system for young children will also be shared. |
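The mechanics behind such maps can be sketched briefly. The snippet below computes degree centrality - the share of other agencies each agency is directly tied to - for a toy collaboration network. The agency names, ties, and use of plain Python (rather than whatever tool the funder actually used) are illustrative assumptions only.

```python
from collections import defaultdict

# Reported coordination ties among funded agencies (names invented).
ties = [
    ("Agency A", "Agency B"),
    ("Agency A", "Agency C"),
    ("Agency B", "Agency C"),
    ("Agency C", "Agency D"),
    ("Agency D", "Agency E"),
]

# Count each agency's direct ties.
degree = defaultdict(int)
for a, b in ties:
    degree[a] += 1
    degree[b] += 1

# Degree centrality: ties divided by the number of other agencies.
n = len(degree)
centrality = {agency: d / (n - 1) for agency, d in degree.items()}

# The most central agency is a natural convening or coordination hub.
most_central = max(centrality, key=centrality.get)
print(most_central, centrality[most_central])  # Agency C 0.75
```

A funder reading this map might note that Agency C brokers most ties while Agency E sits on the periphery, which is exactly the kind of strategic observation the session describes grantees making from the maps.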
| Session Title: Using Latent Class Analysis to Target and Tailor Programs to Specific Populations |
| Demonstration Session 546 to be held in Lone Star E on Friday, Nov 12, 10:55 AM to 11:40 AM |
| Sponsored by the Quantitative Methods: Theory and Design TIG |
| Presenter(s): |
| Humphrey Costello, University of Wyoming, humphrey.costello@uwyo.edu |
| Reese Jenniges, University of Wyoming, jenniges@uwyo.edu |
| Abstract: In this demonstration, we present an application of Latent Class Analysis (LCA) to adult smokers in Wyoming. LCA is a probabilistic clustering method for identifying unmeasured class membership using both categorical and continuous variables (see Vermunt & Magidson, 2000). We apply LCA to derive a four-class typology of smoking behavior and intention to quit. We then use logistic regression to identify associations between smoking types and demographic variables. Programming and policies may then be tailored to target the needs of each type of smoker. The demonstration introduces the LCA method, discusses appropriate uses, details steps and diagnostics in developing LCA models, and describes how LCA may enhance both program design and program evaluation. |
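For readers unfamiliar with the method, the sketch below shows the EM machinery behind a latent class model fit to simulated binary survey items. It is a minimal two-class stand-in, not the four-class smoking model or the software used in the session; the data, class structure, and all settings are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate 4 binary items from 2 latent classes (all values invented).
true_theta = np.array([[0.9, 0.9, 0.1, 0.1],   # class 0 endorsement probs
                       [0.1, 0.1, 0.9, 0.9]])  # class 1 endorsement probs
z = rng.integers(0, 2, size=1000)              # true (unobserved) classes
X = (rng.random((1000, 4)) < true_theta[z]).astype(float)

# EM for a 2-class latent class model on binary items.
K, (N, D) = 2, X.shape
pi = np.full(K, 1.0 / K)                       # class-size priors
theta = rng.uniform(0.3, 0.7, size=(K, D))     # random start breaks symmetry

for _ in range(200):
    # E-step: posterior probability of each class for each respondent,
    # computed in log space for numerical stability.
    log_p = X @ np.log(theta).T + (1 - X) @ np.log(1 - theta).T + np.log(pi)
    log_p -= log_p.max(axis=1, keepdims=True)
    resp = np.exp(log_p)
    resp /= resp.sum(axis=1, keepdims=True)
    # M-step: re-estimate class sizes and item-endorsement probabilities.
    pi = resp.mean(axis=0)
    theta = np.clip((resp.T @ X) / resp.sum(axis=0)[:, None], 1e-6, 1 - 1e-6)

labels = resp.argmax(axis=1)
# Agreement with the simulated classes, up to label swapping.
accuracy = max((labels == z).mean(), (labels != z).mean())
```

With classes this well separated, EM recovers membership almost perfectly; real LCA work adds the model-selection and diagnostic steps (e.g., comparing fit across candidate class counts) that the demonstration covers.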
| Session Title: Multi-method Approaches |
| Multipaper Session 547 to be held in Lone Star F on Friday, Nov 12, 10:55 AM to 11:40 AM |
| Sponsored by the Cluster, Multi-site and Multi-level Evaluation TIG |
| Chair(s): |
| John Hitchcock, Ohio University, hitchcoc@ohio.edu |
| Roundtable: Controlling Quality Within Museums: Coordinating Internal Evaluation Departments |
| Roundtable Presentation 548 to be held in MISSION A on Friday, Nov 12, 10:55 AM to 11:40 AM |
| Sponsored by the Evaluation Managers and Supervisors TIG |
| Presenter(s): |
| Sarah Cohn, Science Museum of Minnesota, scohn@smm.org |
| Anna Lindgren-Streicher, Museum of Science, Boston, alstreicher@mos.org |
| Abstract: While the basic demand on evaluators is to assess the merit and worth of the products a client produces, an internal evaluation department has the added demand of being appropriately structured and positioned to fit the community and culture of practice at work in its organization. Internal evaluation departments at museums have the added responsibility of answering not only to the informal learning setting in which their projects reside but also to the more widely accepted world of formal education. Inherent in this position is the need for evaluations to be both high quality and flexible, as the needs of the project team, the museum, the community, and the nature of informal learning shift with time. This discussion focuses on how two museums manage and communicate their evaluations to be most effective in both the smaller and larger communities of practice at work. |
| Session Title: Issues Impacting the Quality of Program Evaluation and Programs Serving Youth and Young Adults With Significant Disabilities |
| Panel Session 549 to be held in MISSION B on Friday, Nov 12, 10:55 AM to 11:40 AM |
| Sponsored by the Special Needs Populations TIG |
| Chair(s): |
| Michael Du, Human Development Institute, zdu@email.uky.edu |
| Discussant(s): |
| Brent Garrett, Pacific Institute for Research and Evaluation (PIRE), bgarrett@pire.org |
| Abstract: This session will focus on two presentations, each of which deals with quality and evaluation in a distinctive manner. One presentation will discuss the issues that affect the quality of evaluation findings when collecting badly needed interview data from youth and young adults with significant cognitive disabilities. It will also describe the strategies evaluators developed to resolve these issues and to obtain interview data that allowed the stories of participants’ experiences in the program to be told in the participants’ own voices. The second presentation will focus on what evaluation findings reveal about the range of challenges that emerged in administering and implementing a program serving children with significant developmental disabilities, challenges that affected the program’s ability to serve these children in ways consistent with their needs. The purpose of the presentation is to explore what evaluation findings tell us about one program’s efforts to serve children with developmental disabilities using structures and implementation practices designed for typically developing children and/or children with mild to moderate disabilities. The presentations are related in that each discusses issues that arise, in part, from the need for evaluators to grasp the unique and distinctive lived experiences of the groups served by the programs being evaluated. This need forms the background against which quality plays out: in one case, in evaluation practice; in the other, in insights about inconsistencies between program structure and implementation on the one hand and participant needs and characteristics on the other. |
| Session Title: The Adaptive Action Cycle: Bridging the Gap Between Lessons Learned and Lessons Applied |
| Panel Session 550 to be held in BOWIE A on Friday, Nov 12, 10:55 AM to 11:40 AM |
| Sponsored by the Systems in Evaluation TIG and the Human Services Evaluation TIG |
| Chair(s): |
| Mallary Tytel, Healthy Workplaces, mtytel@healthyworkplaces.com |
| Discussant(s): |
| Mallary Tytel, Healthy Workplaces, mtytel@healthyworkplaces.com |
| Abstract: Human Systems Dynamics helps individuals and organizations see and influence patterns of interaction and behavior that surround them. Using a collection of concepts, processes and tools, practitioners can better understand what is happening in everyday connections. The Adaptive Action Cycle (AAC) is a process which asks three simple questions: What? So What? and Now What? These questions can assist program managers and evaluators in capturing good information about what currently exists, recognizing and shaping patterns and behavior, and allowing them to think about change and intervention in a different way. Lessons learned are only part of the solution. Using a community-based case study, this session will follow a multi-tiered process of data collection, analysis and decision making to demonstrate an approach to bridging the gap between lessons learned and lessons applied. |
| Session Title: Using Evaluation to Enhance Program Participation With Underrepresented Groups in Science, Technology, Engineering and Mathematics (STEM) Fields |
| Multipaper Session 551 to be held in BOWIE B on Friday, Nov 12, 10:55 AM to 11:40 AM |
| Sponsored by the Multiethnic Issues in Evaluation TIG |
| Chair(s): |
| Tamara Bertrand Jones, Florida State University, tbertrand@fsu.edu |
| Session Title: How Does Evidence Influence Policy Change? Examining Two Complementary Approaches With Two Complementary Evaluations |
| Panel Session 552 to be held in BOWIE C on Friday, Nov 12, 10:55 AM to 11:40 AM |
| Sponsored by the Advocacy and Policy Change TIG |
| Chair(s): |
| Carlisle Levine, CARE, clevine@care.org |
| Discussant(s): |
| Veena Pankaj, Innovation Network, vpankaj@innonet.org |
| Ehren Reed, Innovation Network, ereed@innonet.org |
| Abstract: If alleviating global poverty depends on successful pro-poor policies, then CARE, like other international humanitarian organizations, can promote these policies by presenting evidence based on decades of working in more than 60 countries. With Gates Foundation support, CARE is testing this hypothesis via two initiatives. CARE's LIFT UP grant aims to build organizational capacity to more systematically use country-level evidence to influence U.S. policymakers. CARE’s Learning Tours grant provides Members of Congress and influential media and “grasstops” leaders with firsthand experiences aimed at increasing their support for improving maternal health and child nutrition globally. Working with external evaluators Innovation Network (Innonet) and Continuous Progress Strategic Services (CPSS), CARE is assessing the effectiveness of these approaches. Panelists will discuss how to measure the effect of country-based evidence on policy change and highlight how CARE’s overlapping evaluations inform each other’s work and increase CARE's ability to influence policy change. |
| Roundtable: Incorporating Gender into Mainstream Projects |
| Roundtable Presentation 553 to be held in GOLIAD on Friday, Nov 12, 10:55 AM to 11:40 AM |
| Sponsored by the Feminist Issues in Evaluation TIG |
| Presenter(s): |
| Sathi Dasgupta, SONA Consulting Inc, sathi@sonaconsulting.net |
| Brian Heilman, International Center for Research on Women, heilman.brian@gmail.com |
| Ronda Schlagen, Independent Consultant, rschlangen@yahoo.com |
| Jim Rugh, Independent Consultant, jimrugh@mindspring.com |
| Pamela Walsh, Eastern Michigan University, walshmgmt@comcast.net |
| Abstract: As gender issues have been mainstreamed, their inclusion in evaluation has often become an afterthought. In this roundtable, we will share specific examples of ways in which gender can be addressed in evaluations of projects or programs with no particular focus on gender. The presenters come from a variety of backgrounds, both male and female, and will discuss their experiences in US and international contexts. Some topics will include: engaging men as allies in gender analysis, safety and ethical concerns for researching gender-based violence, gender and labor, and more! The discussion will focus on the pros and cons of concrete methods and instruments which might enable a stronger gender analysis in an evaluation. |
| Roundtable: Utilizing Metaevaluation to Validate Evaluation Quality: Study of a Grant-Funded Graduate Program for Minority Group Students in Science, Technology, Engineering, and Mathematics (STEM) |
| Roundtable Presentation 554 to be held in SAN JACINTO on Friday, Nov 12, 10:55 AM to 11:40 AM |
| Sponsored by the Research on Evaluation TIG |
| Presenter(s): |
| Angela Watson, Oklahoma State University, angela.watson@okstate.edu |
| Katye Perry, Oklahoma State University, katye.perry@okstate.edu |
| Abstract: The purpose of this session is to present the results of a metaevaluation of a project completed by the presenters within a university setting. More specifically, the authors utilized a discrepancy evaluation model to initially evaluate the program as they also adhered to the Program Evaluation Standards (Joint Committee, 1994). In keeping with the theme of this year’s conference, the authors will present a summary of the processes engaged in during the evaluation followed by cross-validations of these processes against the validity standards advanced by House (1980). In doing so, resulting analyses will help the presenters determine how well the evaluation of the project evidenced “truth” (House, 1980, p. 88), “aesthetic principles” (1980, p. 106) and “justice” (1980, p. 135). House, E. R. (1980). Evaluating with validity. Beverly Hills, CA: Sage. Joint Committee on Standards for Educational Evaluation. (1994). The program evaluation standards (2nd ed.). Thousand Oaks, CA: Sage. |
| Session Title: Crossing Barriers: Engaging Faculty, Staff, and Students Through Online Course Evaluations |
| Panel Session 555 to be held in TRAVIS A on Friday, Nov 12, 10:55 AM to 11:40 AM |
| Sponsored by the Assessment in Higher Education TIG |
| Chair(s): |
| Karissa Oien, California Lutheran University, koien@callutheran.edu |
| Abstract: California Lutheran University has been conducting online course evaluations for the past two years. This session will focus on the strategies used to transition to the online system, mainly through developing an understanding of the process between students, faculty, and staff. Connections between groups were developed through successful student marketing campaigns, cross-committee faculty meetings, a faculty survey, a staff workshop, and the creation of our diverse CoursEval Team. This session will discuss how these strategies created campus-wide awareness of the importance of online course evaluations. |
| Session Title: Using Cost --> Procedure --> Process --> Outcome Analysis (CPPOA) Data to Improve Substance Abuse Prevention Programs and Portfolios |
| Panel Session 556 to be held in TRAVIS B on Friday, Nov 12, 10:55 AM to 11:40 AM |
| Sponsored by the Costs, Effectiveness, Benefits, and Economics TIG |
| Chair(s): |
| Michael Langer, State of Washington, langeme@dshs.wa.gov |
| Discussant(s): |
| Beverlie Fallik, United States Department of Health and Human Services, beverlie.fallik@samhsa.hhs.gov |
| Session Title: Communication: At What Level Does It Help or Hinder Evaluation Capacity? |
| Multipaper Session 557 to be held in TRAVIS C on Friday, Nov 12, 10:55 AM to 11:40 AM |
| Sponsored by the Organizational Learning and Evaluation Capacity Building TIG |
| Chair(s): |
| Eric Barela, Partners in School Innovation, ebarela@partnersinschools.org |
| Discussant(s): |
| Susan Parker, Clear Thinking Communications, susan@clearthinkingcommunications.com |
| Session Title: Is Quality Improvement In Healthcare Cost-Effective? |
| Expert Lecture Session 558 to be held in TRAVIS D on Friday, Nov 12, 10:55 AM to 11:40 AM |
| Sponsored by the Health Evaluation TIG |
| Chair(s): |
| Mary Gutmann, EnCompass LLC, mgutmann@encompassworld.com |
| Presenter(s): |
| Edward Broughton, University Research Company LLC, ebroughton@urc-chs.com |
| Abstract: Many health interventions have been shown to be cost-effective when implemented to evidence-based quality standards. However, several studies show that health care provided for much of the world’s population fails to meet such standards. Quality improvement (QI) interventions can overcome common obstacles to providing high quality care, even in situations where resources are scarce and health systems are weak. Yet many decision makers are skeptical of such interventions; it is therefore crucially important that an economic case can be made for QI. Using examples from US and international health care settings, this lecture will discuss methods of cost-effectiveness analysis for QI programs – why they are done, how they are performed and how to interpret their results. This information is crucial for anyone interested in understanding and performing full evaluations of QI programs to make a business case for working towards improvements in the quality of health care. |
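One building block of such analyses is the incremental cost-effectiveness ratio (ICER): the extra cost of a QI program divided by the extra health effect it buys relative to usual care. The figures below are invented for illustration and are not drawn from the lecture.

```python
def icer(cost_new, effect_new, cost_old, effect_old):
    """Incremental cost-effectiveness ratio: extra cost per extra unit
    of health effect (e.g., per death averted or per QALY gained)."""
    return (cost_new - cost_old) / (effect_new - effect_old)

# Invented comparison of a QI program against usual care:
# usual care costs $50,000 and averts 20 deaths; the QI program
# costs $80,000 and averts 35 deaths.
print(icer(80_000, 35, 50_000, 20))  # 2000.0 dollars per additional death averted
```

The resulting ratio is then compared against a willingness-to-pay threshold; full analyses also discount future costs and effects and test the ratio's sensitivity to the assumptions behind each input.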
| Session Title: None of the Above: Expanding Binary Categorizations |
| Multipaper Session 559 to be held in INDEPENDENCE on Friday, Nov 12, 10:55 AM to 11:40 AM |
| Sponsored by the Lesbian, Gay, Bisexual, Transgender Issues TIG |
| Chair(s): |
| John T Daws, University of Arizona, johndaws@email.arizona.edu |
| Discussant(s): |
| John T Daws, University of Arizona, johndaws@email.arizona.edu |
| Session Title: Technology and Student Outcomes: Mathematics, Language Arts, and Big-District Diversity |
| Multipaper Session 560 to be held in PRESIDIO A on Friday, Nov 12, 10:55 AM to 11:40 AM |
| Sponsored by the Distance Ed. & Other Educational Technologies TIG |
| Chair(s): |
| Talbot Bielefeldt, International Society for Technology in Education, talbot@iste.org |
| Session Title: Herding Cats: Improving the Quality and Quantity of Decentralized Evaluation in a Global Organization Through Capacity Building |
| Demonstration Session 561 to be held in PRESIDIO B on Friday, Nov 12, 10:55 AM to 11:40 AM |
| Sponsored by the Organizational Learning and Evaluation Capacity Building TIG |
| Presenter(s): |
| Thea C Bruhn, United States Department of State, bruhntc@state.gov |
| Abstract: With 38 bureaus and over 240 embassies and missions around the world, the U.S. Department of State (State) does not lend itself to a one-size-fits-all approach to evaluation. At the same time, State faces an increasing demand for credible evidence of the impact of the U.S. Government’s foreign policy activities. In this session, participants will better understand the importance of an integrated model of capacity building for ensuring that very disparate approaches to program evaluation on a global scale meet standards for quality and are responsive to the agency’s needs. Participants will see how such a model at State better enables evaluation to: • Improve effectiveness in achieving U.S. foreign policy goals; • Document project accomplishments and achieved outcomes; • Integrate senior leadership priorities; • Demonstrate “value for money;” and • Help coordinate and focus strategic planning to ensure accountability and transparency. |
| Session Title: Alcohol, Drug Abuse and Mental Health TIG Business Meeting |
| Business Meeting Session 562 to be held in PRESIDIO C on Friday, Nov 12, 10:55 AM to 11:40 AM |
| Sponsored by the Alcohol, Drug Abuse, and Mental Health TIG |
| TIG Leader(s): |
| Marge Cawley, National Development and Research Institutes (NDRI), cawley@ndri-nc.org |
| Diana Seybolt, University of Maryland, Baltimore, dseybolt@psych.maryland.edu |
| Roundtable: Assessing Board Performance: Challenges and Constraints |
| Roundtable Presentation 563 to be held in BONHAM A on Friday, Nov 12, 10:55 AM to 11:40 AM |
| Sponsored by the Business and Industry TIG |
| Presenter(s): |
| Zita Unger, Evaluation Solutions Pty Ltd, zitau@evaluationsolutions.com |
| Abstract: Evaluation of board performance has increased considerably in recent years. Since the collapse of high-profile corporations such as Enron, Tyco and WorldCom, more rigorous forms of accountability and compliance have become standard for public company boards and more commonplace for boards in the public, private, non-profit and for-profit sectors. Whilst the various sectors operate within their own regulatory systems and contexts, all effective boards demonstrate a balance of skills, behaviors, relationships and diversity, as well as structures and processes. Evaluation plays an important role in contributing to their quality and continuous improvement. The intent of this roundtable is to discuss strategic issues for evaluation in the governance space, such as: What are key questions for evaluation of board effectiveness? What are optimal conformance and performance measures? What are important human capital issues? Who should evaluate the board? Is diversity the melting pot of success? |
| Session Title: Low-cost, High-Quality Assessments for Nonprofit Adolescent Pregnancy Prevention Program Planning |
| Panel Session 564 to be held in BONHAM B on Friday, Nov 12, 10:55 AM to 11:40 AM |
| Sponsored by the Non-profit and Foundations Evaluation TIG |
| Chair(s): |
| Shannon Flynn, South Carolina Campaign to Prevent Teen Pregnancy, sflynn@teenpregnancysc.org |
| Abstract: This panel will focus on the challenge of conducting evaluations in settings with limited resources, such as nonprofits. Solid assessment data is required to design effective interventions, but data collection may seem too costly and cumbersome with shrinking evaluation budgets. Recently, the South Carolina Campaign to Prevent Teen Pregnancy (Campaign) conducted two evaluations assessing environmental structures that may impede or promote contraceptive use among 18- to 19-year-old youth; both yielded valuable results for program planning while requiring limited organizational resources: staff, time, and money. The first evaluation examines the availability of sexual health services on college campuses, and the second illustrates the experience of adolescents who purchase condoms. Successes and challenges with the methods will be described. The Campaign is a 15-year-old nonprofit that prevents teen pregnancy by building the capacity of organizations and communities through education, technical assistance, public awareness, advocacy and research. |
| Session Title: Is Working Together Worth It? The Process and Findings of a Longitudinal Evaluation of a Districtwide Professional Learning Community Initiative |
| Expert Lecture Session 565 to be held in BONHAM C on Friday, Nov 12, 10:55 AM to 11:40 AM |
| Sponsored by the Pre-K - 12 Educational Evaluation TIG |
| Chair(s): |
| Rebecca Gajda Woodland, University of Massachusetts, Amherst, rebecca.gajda@educ.umass.edu |
| Presenter(s): |
| Rebecca Gajda Woodland, University of Massachusetts, Amherst, rebecca.gajda@educ.umass.edu |
| Discussant(s): |
| Mark Zito, East Hartford Public School District, zito.mf@easthartford.org |
| Abstract: In this session, Dr. Gajda, a former secondary school teacher and administrator, will present the Teacher Collaboration Improvement Framework (Gajda, 2008; Gajda & Koliba, 2007; Koliba & Gajda, 2009), a field-tested framework for systematically evaluating the quality and improving the performance of teacher collaboration in K-12 school districts. This framework has been utilized to formatively and summatively assess the attributes and achievements of a three-year professional learning community initiative in one New England school district. Evaluation methods included on-site observation of teacher teams, interviews with district administrators and teachers, and a comprehensive district-wide annual survey. Findings of the evaluation, including the correlations among the quality of teacher collaboration, improvements in instruction, and advances in student learning, will be showcased. In addition, participants will learn how district personnel have used the process and findings of the evaluation to make decisions about how to improve supervision of teacher collaboration and professional development. |
| Session Title: Evaluating Education Programs for English Language Learners |
| Multipaper Session 566 to be held in BONHAM D on Friday, Nov 12, 10:55 AM to 11:40 AM |
| Sponsored by the Pre-K - 12 Educational Evaluation TIG |
| Chair(s): |
| Courtney Brown, Indiana University, coubrown@indiana.edu |
| Discussant(s): |
| Julie Sugarman, Center for Applied Linguistics, julie@cal.org |
| Session Title: From Research to Commercialization: Impact Evaluation of Portfolios of Research |
| Multipaper Session 567 to be held in BONHAM E on Friday, Nov 12, 10:55 AM to 11:40 AM |
| Sponsored by the Research, Technology, and Development Evaluation TIG |
| Chair(s): |
| Israel Lederhendler, National Institutes of Health, lederhei@od.nih.gov |
| Session Title: Evaluating Government Research and Technology Policies: Traditional and Emerging Methods |
| Multipaper Session 569 to be held in Texas D on Friday, Nov 12, 10:55 AM to 11:40 AM |
| Sponsored by the Research, Technology, and Development Evaluation TIG |
| Chair(s): |
| Cheryl Oros, Oros Consulting LLC, cheryl.oros@gmail.com |
| Session Title: Joint Evaluation of Private Sector Development Projects: Benefits and Challenges |
| Panel Session 570 to be held in Texas E on Friday, Nov 12, 10:55 AM to 11:40 AM |
| Sponsored by the Evaluation Use TIG and the Organizational Learning and Evaluation Capacity Building TIG |
| Chair(s): |
| Ade Freeman, World Bank, afreeman@ifc.org |
| Discussant(s): |
| Cheryl Gray, World Bank, cgray@worldbank.org |
| Abstract: Increasingly, in the global environment, development banks bring their collective expertise and financial resources together to support public or private sector components of development projects. These projects are often complex and are structured to meet the development objectives of the supporting institutions. Joint evaluation of these projects can yield many advantages, especially with respect to knowledge sharing. It can also reduce the overall cost of evaluation, leverage resources and reduce the evaluation burden on the client. But many challenges must be addressed, including: different evaluation frameworks; inconsistent institutional missions; timing issues; incompatible disclosure policies; and even operating styles and personalities. This panel will convey the presenters' good and bad experiences in conducting joint evaluations and, using cases, will suggest how to successfully engage in joint evaluations. |
| Session Title: Using R for Statistical Analysis |
| Demonstration Session 571 to be held in Texas F on Friday, Nov 12, 10:55 AM to 11:40 AM |
| Sponsored by the Graduate Student and New Evaluator TIG |
| Presenter(s): |
| Kristen Cici, University of Minnesota, denz0018@umn.edu |
| Abstract: Statistical software packages such as SPSS and SAS have long dominated the evaluation field for the analysis of quantitative data. In recent years R, a syntax-based statistical program, has become increasingly common – yet many evaluators have yet to hear about it. One of the greatest benefits of R is that it is open source and available at no cost. This session will include: introducing attendees to R, comparing R to other statistical software, and providing examples of how evaluators can use R in their evaluation work. |
| Session Title: The American Evaluation Association's Journal Editors Discuss Publishing in AEA's Journals | |||
| Panel Session 572 to be held in Crockett A on Friday, Nov 12, 10:55 AM to 11:40 AM | |||
| Sponsored by the AEA Conference Committee | |||
| Chair(s): | |||
| Thomas Schwandt, University of Illinois at Urbana-Champaign, tschwand@illinois.edu | |||
| Abstract: This session is aimed at those interested in submitting manuscripts for publication in either of AEA's sponsored journals, the American Journal of Evaluation or New Directions for Evaluation. The journal editors will discuss the scope of each journal, the submission and review processes, and keys for publishing success. | |||
| Session Title: How to Use Evaluation to Achieve Human Resources (HR) System Alignment |
| Demonstration Session 573 to be held in Crockett B on Friday, Nov 12, 10:55 AM to 11:40 AM |
| Sponsored by the Business and Industry TIG |
| Presenter(s): |
| Stephanie Fuentes, Inventivo Design LLC, stephanie@inventivodesign.com |
| Abstract: Evaluation plays a critical role in ensuring that HR systems are aligned to achieve the maximum benefits for organizations. Too frequently, organizations have mismatched practices in talent management, employee development, performance management, and rewards and recognition. In many cases, for-profit organizations are unfamiliar with the breadth and depth of evaluative capabilities available to them because their only experience involves evaluating training courses. By using evaluative inquiry throughout the system, alignment among the four areas can be managed over time to help the organization reach its strategic goals. This session presents a model and complementary questions for aligning HR systems using an evaluative inquiry approach. Participants will be introduced step-by-step to each of the four areas and will learn how to ask questions and present evaluation information to decision-makers, what challenges to expect in the process and when they are likely to occur, and what conditions influence successful use of the tool. |
| Session Title: Translating Evaluation to Enhance Its Meaning and Use: Examples From Two Indigenous Communities - Urban United States of America and Rural Uganda | |||||||||
| Multipaper Session 574 to be held in Crockett C on Friday, Nov 12, 10:55 AM to 11:40 AM | |||||||||
| Sponsored by the Indigenous Peoples in Evaluation TIG | |||||||||
| Chair(s): | |||||||||
| Joan LaFrance, Mekinak Consulting, lafrancejl@gmail.com | |||||||||
| Discussant(s): | |||||||||
| Joan LaFrance, Mekinak Consulting, lafrancejl@gmail.com | |||||||||
| Session Title: The United States Government Accountability Office's (GAO) New Yellow Book: What's in It for Evaluators? |
| Think Tank Session 575 to be held in Crockett D on Friday, Nov 12, 10:55 AM to 11:40 AM |
| Sponsored by the AEA Conference Committee |
| Presenter(s): |
| Michael Hendricks, Independent Consultant, mikehendri@aol.com |
| Rakesh Mohan, Idaho State Legislature, rmohan@ope.idaho.gov |
| Abstract: Mention the “GAO’s Yellow Book” or the “Government Auditing Standards of the U.S. Government Accountability Office” to 10 evaluators and you will likely receive 10 blank stares. Most evaluators don’t realize this document even exists, and those who do know of it believe it relates only to auditing and accounting, certainly not to evaluation. But they would be wrong. Two of the document’s eight chapters and 50 of its 166 pages (30%) are devoted to “performance auditing,” which is extremely similar to program evaluation. In addition, the document is being revised this year, and the new Yellow Book may contain standards of special interest to evaluators. In this highly interactive Think Tank, two senior AEA members who serve on the US Comptroller General’s advisory council for the Yellow Book will introduce the new standards and lead a discussion of how they are applicable – and useful – to evaluators working both inside and outside government. |
| Session Title: Using Qualitative Methods in Evaluations With Limited Resources | ||||||||||||
| Multipaper Session 576 to be held in Seguin B on Friday, Nov 12, 10:55 AM to 11:40 AM | ||||||||||||
| Sponsored by the Qualitative Methods TIG | ||||||||||||
| Chair(s): | ||||||||||||
| Jennifer Jewiss, University of Vermont, jennifer.jewiss@uvm.edu | ||||||||||||
| Discussant(s): | ||||||||||||
| Jennifer Jewiss, University of Vermont, jennifer.jewiss@uvm.edu | ||||||||||||
| Session Title: Peace Corps’ Volunteer Reporting Tool: Increasing the Capacity for Evidence-based Decision-Making at Multiple Levels of the Peace Corps |
| Demonstration Session 577 to be held in Republic A on Friday, Nov 12, 10:55 AM to 11:40 AM |
| Sponsored by the International and Cross-cultural Evaluation TIG |
| Presenter(s): |
| Eleanor Shirley, Peace Corps, eshirley@peacecorps.gov |
| Abstract: The Volunteer Reporting Tool (VRT) represents a significant step toward the Peace Corps’ goal of rigorously demonstrating the results of our Volunteers’ and Partners’ diverse work worldwide. The Peace Corps’ size and unique structure have made standardized monitoring and evaluation a real challenge for the agency over the years. When the VRT was rolled out to over 65 Peace Corps posts worldwide in 2008-2009, it was the first time in the agency’s history that every post and every Volunteer used a standardized, yet customizable, data collection and data management system. This demonstration will show how the VRT works at each Peace Corps post, and will share successes, challenges, and lessons learned in designing the system to align with Peace Corps work and in training field staff and Volunteers to use it effectively. |
| Session Title: Online Visual System for Strategic Planning and Performance Monitoring: iProgress v Check |
| Demonstration Session 578 to be held in Republic B on Friday, Nov 12, 10:55 AM to 11:40 AM |
| Sponsored by the Health Evaluation TIG |
| Presenter(s): |
| Jianglan White, Georgia Department of Community Health, jzwhite@dhr.state.ga.us |
| Alex Cowell, Georgia Department of Community Health, ajcowell@dhr.state.ga.us |
| Abstract: This paper introduces iProgress v Check, an online visual system for strategic planning and performance monitoring. iProgress v Check is an online data system designed by the Georgia Division of Public Health to develop program strategic plans and to monitor and track community-level health promotion and disease prevention programs funded by the organization. Based on social-ecological approaches and theory-of-change logic models, the system is designed to underpin a program strategic plan, with identified goals, objectives, strategies, and performance indicators, and to track, monitor, and collate program activities and progress toward program objectives consistently across funded grantees. It promotes strategic planning and implementation and actively guides performance-driven decision-making and resource allocation. It helps to identify best-practice strategies in the local community and gaps in resources, policy, and technical assistance. It also strengthens collaboration and communication between state and local staff. |
| Session Title: The Basis for Good Judgment in Evaluation | ||||||||||||||
| Multipaper Session 579 to be held in Republic C on Friday, Nov 12, 10:55 AM to 11:40 AM | ||||||||||||||
| Sponsored by the Theories of Evaluation TIG and the Research on Evaluation TIG | ||||||||||||||
| Chair(s): | ||||||||||||||
| Bianca Montrosse, Western Carolina University, bianca.montrosse@gmail.com | ||||||||||||||