Session Title: Tools for Improving the Quality of Evaluations: Four Examples From the Field
Panel Session 862 to be held in Lone Star A on Saturday, Nov 13, 2:50 PM to 4:20 PM
Sponsored by the Presidential Strand and the Environmental Program Evaluation TIG
Chair(s):
Britta Johnson, United States Environmental Protection Agency, johnson.britta@epa.gov
Abstract: Like their counterparts at many federal agencies, evaluators at the U.S. Environmental Protection Agency (EPA) face a number of constraints that can hamper the quality of our evaluation efforts, including poor data quality, no data, and resource and time constraints. The U.S. EPA’s Evaluation Support Division has used several tools to help mitigate the impact of these constraints. Through the examination of four case studies, this panel session will provide practical examples that describe how evaluability assessment, expert/peer review, and integrating evaluation into the design of a program are valuable tools for improving: 1) the quality of measures, 2) data collection strategies and outcome data, 3) evaluation design, and 4) our understanding of the quality and availability of data for evaluation. The session will also examine how each tool was applied during the conduct of an evaluation and which aspects of evaluation quality were improved.
| ||||
| ||||
| ||||
|
Session Title: Integrating Evaluation Into Everyday Organizational Practice: A Complex Systems Perspective
Think Tank Session 863 to be held in Lone Star B on Saturday, Nov 13, 2:50 PM to 4:20 PM
Sponsored by the Systems in Evaluation TIG
Presenter(s):
Srik Gopalakrishnan, New Teacher Center, srik2004@gmail.com
Discussant(s):
Royce Holladay, Human Systems Dynamics Institute, rholladay@hsdinstitute.org
Abstract: Making evaluation an integral part of an organization’s everyday operations has long been held as an ideal in the field. This would mean that evaluation is integrated into organizational norms and culture and becomes part of the organization’s work ethic. However, organizations are complex systems, and embedding evaluation in organizational culture entails a deep understanding of how complex systems, especially complex human systems, function. This session will explore complexity from a human systems dynamics perspective and engage participants in the question, “What would it take to integrate evaluation into ongoing practice in a complex human system?” Break-out groups will address various facets of complexity, such as self-organization, simple rules, and dynamical change, and the implications of those facets for specific approaches to integrating evaluation into organizational practice. The session will be structured in a modified “world café” format so that participants have a chance to rotate through several discussions.
Session Title: Advances in Stakeholder Consultation for Evaluation Quality
Panel Session 864 to be held in Lone Star C on Saturday, Nov 13, 2:50 PM to 4:20 PM
Sponsored by the Collaborative, Participatory & Empowerment Evaluation TIG
Chair(s):
Laura Leviton, Robert Wood Johnson Foundation, llevito@rwjf.org
Discussant(s):
Laura Leviton, Robert Wood Johnson Foundation, llevito@rwjf.org
Abstract: Evaluation quality depends in part on consultation with the many stakeholder groups that have an interest in a program or its evaluation. Yet more concrete guidance is needed on identifying stakeholders, engaging them in meaningful ways, efficiently incorporating their suggestions into evaluation planning, and reengaging them to make sense of findings. The three presentations represent many years of practitioner experience in doing so. Hallie Preskill will present experience to date in using a concrete, step-by-step process to engage stakeholders. Bill Trochim will reflect on years of experience in using concept mapping for this purpose. Amelie Ramirez will describe the latest in a long-established series of Delphi surveys with Latino community leaders and researchers, setting priorities for research and evaluation on childhood obesity prevention in Latino children. Discussion will focus on the cross-cutting principles and practices that affect evaluation quality.
| ||||
| ||||
|
Session Title: Striving for Quality During Organizational Change: Three Aspects of Responsible Evaluation
Multipaper Session 865 to be held in Lone Star D on Saturday, Nov 13, 2:50 PM to 4:20 PM
Sponsored by the Non-profit and Foundations Evaluation TIG
Chair(s):
Judy Lee, Independent Consultant, judymlee@msn.com
Discussant(s):
Christian Connell, Yale University, christian.connell@yale.edu
Session Title: What Am I Supposed to Do With Three-Way Crosstabs? An Introduction to Log Linear Models
Demonstration Session 866 to be held in Lone Star E on Saturday, Nov 13, 2:50 PM to 4:20 PM
Sponsored by the Quantitative Methods: Theory and Design TIG
Presenter(s):
Eric Canen, University of Wyoming, ecanen@uwyo.edu
Nanette Nelson, University of Wyoming, nnelso13@uwyo.edu
Abstract: A common analysis method for cross-tabulated data is the Chi-Square test for independence. When considering only two factors, the choice of Chi-Square is straightforward; however, with three or more factors, the analysis of independence is more complicated. For instance, you can test whether all the factors are mutually independent or whether one factor is independent of two other factors. Chi-Square methods can be used in these analyses; however, such analyses are not easily completed using common statistical software. Log-linear models offer an easily understood and easily implemented alternative. This demonstration will provide step-by-step guidance on how to implement this analysis technique. The presenters will teach by example using data from an investigation of differences in smoking-related behaviors and attitudes pre- and post-implementation of smoke-free ordinances. Participants will have time to ask questions and will receive a handout to guide them through their own analyses.
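The mutual-independence test the abstract describes can be sketched in a few lines of Python with scipy. The three-way table below is purely hypothetical (invented counts loosely echoing the smoking-ordinance theme), not the presenters' data:

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical 2x2x2 cross-tabulation: ordinance period (pre/post) x
# smoking status (smoker/non-smoker) x attitude (favor/oppose a ban).
# All counts are invented for illustration.
table = np.array([
    [[25, 15],    # pre-ordinance:  smokers favoring / opposing
     [30, 50]],   # pre-ordinance:  non-smokers favoring / opposing
    [[15, 20],    # post-ordinance: smokers favoring / opposing
     [60, 35]],   # post-ordinance: non-smokers favoring / opposing
])

# chi2_contingency accepts an n-dimensional table; for more than two
# dimensions it tests mutual independence of all factors, with expected
# counts computed from the product of the one-way marginals.
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4g}")
```

For a 2x2x2 table this test has 8 - 1 - 3 = 4 degrees of freedom. Testing partial or conditional independence (e.g., one factor independent of the other two jointly) is where full log-linear modeling comes in: fit a Poisson model to the cell counts with the relevant factor terms and compare deviances, which is presumably the workflow the demonstration walks through.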
Session Title: Meeting Needs of Multiple Stakeholders in a High-Scrutiny Multi-site Evaluation: Evaluation of the Communities Putting Prevention to Work (CPPW) Initiative
Panel Session 867 to be held in Lone Star F on Saturday, Nov 13, 2:50 PM to 4:20 PM
Sponsored by the Cluster, Multi-site and Multi-level Evaluation TIG
Abstract: The Centers for Disease Control and Prevention (CDC) has dedicated $650 million in American Recovery and Reinvestment Act (ARRA) funds to a large-scale initiative, Communities Putting Prevention to Work (CPPW). The goal is to implement supportive policies, systems, and environments in states and communities that will drive changes in behavior to reduce risk factors and prevent or delay chronic disease. Funded recipients (all states and 44 selected communities) have 24 months to implement these strategies and to accomplish the intended policy and environmental change outcomes related to each strategy; in addition, states received funds to improve their tobacco quitlines and enhance media related to tobacco cessation. In this session, presenters will discuss the multi-faceted evaluation approach, the stakeholders, and how different stakeholder needs have been reconciled in the design. Other presentations will discuss the challenges and implementation of two facets of the CPPW effort: the policy/environmental change component and the quitline component.
| |||
| |||
| |||
|
Session Title: Building Evaluation Capacity in Nonprofit Organizations Serving Lesbian, Gay, Bisexual and Transgender (LGBT) and HIV+ Clients
Panel Session 868 to be held in MISSION A on Saturday, Nov 13, 2:50 PM to 4:20 PM
Sponsored by the Lesbian, Gay, Bisexual, Transgender Issues TIG
Chair(s):
Anita Baker, Anita Baker Consulting, abaker8722@aol.com
Discussant(s):
Anita Baker, Anita Baker Consulting, abaker8722@aol.com
Abstract: The Building Evaluation Capacity (BEC) program was initiated in the fall of 2006 by the Hartford Foundation for Public Giving’s Nonprofit Support Program (NSP). BEC is a multi-year program operating in two-year cycles. Each cycle is designed to provide comprehensive, long-term training and coaching to increase both evaluation capacity and organization-wide use of evaluative thinking for participating organizations. Through this session, the Evaluation Trainer and representatives from three trainee organizations will present details about what they learned through the actual evaluations they conducted while in training, how they have used their experiences to enhance evaluative thinking in their organizations, and why this is important for organizations serving LGBT and HIV+ clients.
| ||||
| Yvette Bello, Latino Community Services, ybello@lcs-ct.org | ||||
| Erica Roggeveen, Latino Community Services, eroggeveen@lcs-ct.org | ||||
| ||||
|
Session Title: Evaluating Special Education Personnel Development Initiatives in Three Predominately Rural States: Emphasis on Fidelity of Implementation Measures
Multipaper Session 869 to be held in MISSION B on Saturday, Nov 13, 2:50 PM to 4:20 PM
Sponsored by the Special Needs Populations TIG and the Pre-K - 12 Educational Evaluation TIG
Chair(s):
David Merves, Evergreen Educational Consulting LLC, david.merves@gmail.com
Discussant(s):
Patricia Gonzalez, United States Department of Education, patricia.gonzalez@ed.gov
Session Title: Ensuring Quality in Our Work: Techniques Used by Independent Consultants
Multipaper Session 870 to be held in BOWIE A on Saturday, Nov 13, 2:50 PM to 4:20 PM
Sponsored by the Independent Consulting TIG
Chair(s):
Susan M Wolfe, Susan Wolfe and Associates LLC, susan.wolfe@susanwolfeandassociates.net
Discussant(s):
Michelle Baron, The Evaluation Baron LLC, michelle@evaluationbaron.com
Session Title: Avoiding Evaluator Role Confusion: The Case of Evaluating a Complex National and Multi-state, Multi-partner Policy-Change Effort to Improve the Productivity of Higher Education
Think Tank Session 871 to be held in BOWIE C on Saturday, Nov 13, 2:50 PM to 4:20 PM
Sponsored by the Advocacy and Policy Change TIG
Presenter(s):
Melanie Hwalek, SPEC Associates, mhwalek@specassociates.org
Mary Grcich Williams, Lumina Foundation
Discussant(s):
Julia Coffman, Center for Evaluation Innovation, jcoffman@evaluationexchange.org
Gerri Spilka, OMG Center for Collaborative Learning, gerri@omgcenter.org
Patricia Patrizi, Evaluation Roundtable, patti@patriziassociates.com
Abstract: Recently, Lumina Foundation for Education began a large, multi-state and national effort to support policy change to improve productivity in higher education at several different levels and in many different ways. The foundation engaged many national partners to provide fiscal and programmatic oversight, technical assistance, a communications campaign, topic-relevant research, a Web portal for information sharing, and a national evaluation. Each of seven funded implementation states also has its own evaluator. Conducting a national evaluation of this work, with its many levels, layers, complexities, potential redundancies, and tensions, can challenge even the experts. This think tank will engage the audience in discussions with three seasoned policy change evaluators about what would be considered “best practice” for designing and managing the evaluation of this complex endeavor so that it truly adds value and distinguishes the role of the evaluator from that of other data providers.
Roundtable Rotation I: The Mentor/Mentee Relationship: Perspectives and Suggestions for Maintaining Successful Relationships
Roundtable Presentation 872 to be held in GOLIAD on Saturday, Nov 13, 2:50 PM to 4:20 PM
Sponsored by the Graduate Student and New Evaluator TIG
Presenter(s):
Jennifer Morrow, University of Tennessee, Knoxville, jamorrow@utk.edu
Margot Ackermann, Homeward, margot.ackermann@gmail.com
Erin Burr, Oak Ridge Institute for Science and Education, erin.burr@orau.org
Krystall Dunaway, Eastern Virginia Medical School, dunawake@evms.edu
Abstract: In the proposed roundtable, an evaluation faculty member (mentor) and three of her former students who earned their Ph.D.s under her direction (mentees) will lead a discussion on the importance of mentorship in graduate school. Both the mentor and the mentees will offer suggestions to the audience on how to form and maintain a good mentor/mentee relationship, as well as discuss the benefits and possible pitfalls of this relationship. We will also engage the audience in an active discussion about what works and doesn’t work for them when it comes to mentoring or being mentored. Lastly, we will work together to develop strategies for maintaining a successful mentor/mentee relationship.
Roundtable Rotation II: Evaluating a Rite of Passage Program as a Vehicle for Systemic Change in At-Risk Female Youths’ Attitudes and Beliefs
Roundtable Presentation 872 to be held in GOLIAD on Saturday, Nov 13, 2:50 PM to 4:20 PM
Sponsored by the Graduate Student and New Evaluator TIG
Presenter(s):
Kathryn Wilson, Western Michigan University, kathryn.a.wilson@wmich.edu
Mark Kirkpatrick, Western Michigan University, mark.c.kirkpatrick@wmich.edu
Abstract: This roundtable discussion will provide an overview of the Rite of Passage Program. Discussion will center on the advantages and disadvantages of the Rite of Passage program as an effective method for changing attitudes and beliefs in at-risk female youth. The discussion will present the theoretical framework for the evaluation model, as well as the design and methods of the new data sets used by the evaluation team. The various challenges faced by the evaluation team will be described, along with the strategies used to overcome them while assisting clients in reaching clarity on stated program objectives in order to determine program effectiveness, merit, and worth. In conclusion, the methodological and operational issues arising in the evaluation process will be examined and discussed.
Roundtable Rotation I: The Unfocused Focus Group: Evaluation Benefit or Bane?
Roundtable Presentation 873 to be held in SAN JACINTO on Saturday, Nov 13, 2:50 PM to 4:20 PM
Sponsored by the Extension Education Evaluation TIG
Presenter(s):
Nancy Franz, Virginia Tech, nfranz@vt.edu
Abstract: Many program evaluators use focus groups as a method to collect process or impact data on programs. However, successful focus groups often rely on the skills of the facilitator. Some facilitators find working with focus groups to be a combination of science and art. This can be especially true when focus group conversation wanders from the interview protocol. This session will explore the benefits and drawbacks of allowing focus groups to wander away from the protocol into uncharted territory. Personal experiences will be shared to spur discussion on potential best practices for this aspect of working with focus groups.
Roundtable Rotation II: eXtension Evaluation Community of Practice (CoP) Grows Up
Roundtable Presentation 873 to be held in SAN JACINTO on Saturday, Nov 13, 2:50 PM to 4:20 PM
Sponsored by the Extension Education Evaluation TIG
Presenter(s):
Michael Lambur, Virginia Tech, lamburmt@vt.edu
Benjamin Silliman, North Carolina State University, ben_silliman@ncsu.edu
Abstract: The eXtension Evaluation Community of Practice, established early in 2009 by EE-TIG members, created a variety of online educational resources and established several streams of dialogue for community-based Extension educators and evaluation specialists. Formative evaluations of the CoP’s monthly webinar indicate that it fosters evaluation knowledge that is disseminated and applied with diverse populations. Simultaneously, informal feedback from diverse users of CoP resources (webinar, web sites, consultations) and reflection by CoP leaders on non-participation among the broader Extension community are yielding insights on how the CoP can build individual and organizational evaluation capacity. In response, CoP leaders are collaborating with eXtension and university distance education specialists and expanding the use of diverse technologies to educate and engage clients via self-directed and collaborative learning. This roundtable will update participants on CoP activities and responses and engage discussion on building evaluation capacity through online networks and resources.
Session Title: The Importance of Critical Thinking in Assessment in Higher Education
Multipaper Session 874 to be held in TRAVIS A on Saturday, Nov 13, 2:50 PM to 4:20 PM
Sponsored by the Assessment in Higher Education TIG
Chair(s):
Leigh D'Amico, University of South Carolina, kale_leigh@yahoo.com
Session Title: Positionality Matters: Understanding Culture and Context From the Perspective of Key Stakeholders
Think Tank Session 875 to be held in TRAVIS B on Saturday, Nov 13, 2:50 PM to 4:20 PM
Sponsored by the Collaborative, Participatory & Empowerment Evaluation TIG
Presenter(s):
Alyssa Na'im, Education Development Center, anaim@edc.org
Discussant(s):
Araceli Martinez Ortiz, Sustainable Future Inc, araceli@sustainablefuturenow.com
Shannan McNair, Oakland University, mcnair@oakland.edu
Carol Nixon, Edvantia Inc, carol.nixon@edvantia.org
David Reider, Education Design LLC, david@educationdesign.biz
Angelique Tucker Blackmon, Innovative Learning Concepts LLC, ablackmon@ilearningconcepts.com
Pam Van Dyk, Evaluation Resources LLC, evaluationresources@msn.com
Karen L Yanowitz, Arkansas State University, kyanowit@astate.edu
Abstract: This think tank will explore the challenges and best practices of working with and relating to stakeholders in program evaluation. Presenters will share their experiences with engaging stakeholders from various fields, disciplines, or populations. Central to this discussion will be the notion that various stakeholder groups bring a unique and often integrated culture (or way of doing things) and perspectives that should inform the evaluation process; evaluators must be equipped with certain knowledge and skills to navigate and facilitate understanding within and across stakeholder groups while ensuring and balancing standards of quality. This session poses two questions: 1) How do evaluators begin to understand the perspectives of the stakeholders involved in a program evaluation? and 2) What role do or should stakeholders play in the design of the evaluation? Participants will be assigned to one of three stakeholder groups (decision makers, implementers, or recipients) to explore these questions in small group discussions.
Session Title: Practicing Culturally Responsive Evaluation: Graduate Education Diversity Internship (GEDI) Program Intern Reflections on the Role of Competence, Context, and Cultural Perceptions - Part I
Multipaper Session 876 to be held in TRAVIS C on Saturday, Nov 13, 2:50 PM to 4:20 PM
Sponsored by the Multiethnic Issues in Evaluation TIG
Chair(s):
Michelle Jay, University of South Carolina, mjay@sc.edu
Discussant(s):
Rita O'Sullivan, University of North Carolina at Chapel-Hill, ritao@email.unc.edu
Session Title: Translating Visitors' Experiences Through Evaluation
Multipaper Session 877 to be held in TRAVIS D on Saturday, Nov 13, 2:50 PM to 4:20 PM
Sponsored by the Evaluating the Arts and Culture TIG
Chair(s):
Tara Pearsall, Savannah College of Art and Design, tpearsal@scad.edu
Discussant(s):
Kirsten Ellenbogen, Science Museum of Minnesota, kellenbogen@smm.org
Session Title: Community-Derived Research Partnerships: Working Together to Improve Human Services
Panel Session 878 to be held in INDEPENDENCE on Saturday, Nov 13, 2:50 PM to 4:20 PM
Sponsored by the Human Services Evaluation TIG
Chair(s):
Cynthia Flynn, University of South Carolina, cynthiaf@mailbox.sc.edu
Discussant(s):
Diana Tester, South Carolina Department of Social Services, diana.tester@dss.sc.gov
Abstract: Using a new model, Community-Derived Research and Evaluation, the Center for Child and Family Studies in the College of Social Work at the University of South Carolina has re-envisioned the traditional process in which universities identify the evaluation and research questions. With the Community-Derived Research model, the research and evaluation questions come directly from our partner, the SC Department of Social Services. Working side by side, the partners develop and implement each evaluation study. Findings are immediately applied to practice and policy. Three papers describing evaluation projects conducted as part of this partnership are presented by research faculty at the Center. Each presentation highlights how the Center and the state agency work together. The research director at our partner agency will serve as discussant, describing the partnership from the agency perspective for each of the projects presented.
| |||
| |||
|
Session Title: Online Learning in Adult and Postsecondary Education: Theory and Practice
Multipaper Session 879 to be held in PRESIDIO A on Saturday, Nov 13, 2:50 PM to 4:20 PM
Sponsored by the Distance Ed. & Other Educational Technologies TIG
Chair(s):
Talbot Bielefeldt, International Society for Technology in Education, talbot@iste.org
Session Title: Increasing Evaluation Capacity Through Different Levels of Training and Support
Multipaper Session 880 to be held in PRESIDIO B on Saturday, Nov 13, 2:50 PM to 4:20 PM
Sponsored by the Organizational Learning and Evaluation Capacity Building TIG
Chair(s):
Stephanie Evergreen, Western Michigan University, stephanie.evergreen@wmich.edu
Discussant(s):
Stanley Capela, HeartShare Human Services of New York, stan.capela@heartshare.org
Session Title: Evaluating a National Medicaid Children's Mental Health Demonstration Grant Program
Multipaper Session 881 to be held in PRESIDIO C on Saturday, Nov 13, 2:50 PM to 4:20 PM
Sponsored by the Alcohol, Drug Abuse, and Mental Health TIG
Chair(s):
Garrett Moran, Westat, garrettmoran@westat.com
Roundtable Rotation I: Adaptations to Evaluation Design: Two Examples of Ensuring Quality in Practice
Roundtable Presentation 882 to be held in BONHAM A on Saturday, Nov 13, 2:50 PM to 4:20 PM
Sponsored by the Pre-K - 12 Educational Evaluation TIG
Presenter(s):
Stephanie Feger, Brown University, stephanie_feger@brown.edu
Elise Laorenza, Brown University, elise_laorenza@brown.edu
Abstract: Often in external evaluation, quality is initially established through development of an appropriate design, selection of instruments, and planning for data collection and analysis. However, evaluation plans often require adaptations based on program modifications. Evaluation adaptations are frequently needed to clarify key program components discovered over the course of implementation and also to determine benchmarks of program impact. Evaluation adaptations can improve evaluation relevance, an indicator of overall evaluation quality, and provide new tools and/or data to identify program activities that effectively contribute to program goals and outcomes. Through discussion of two evaluation studies of statewide science programs, this roundtable will explore: (1) the development of a student reflection instrument as an evaluation adaptation in the context of program benchmarks, (2) the process for aligning evaluation adaptations with original methods and the integration of results, and (3) the utilization of evaluation adaptations to support program goals and improve program impact.
Roundtable Rotation II: Assessing Program Implementation in Multi-site Educational Evaluations: The Development, Alignment, and Incorporation of Evidence-based Rubrics in Rigorous Evaluation Design
Roundtable Presentation 882 to be held in BONHAM A on Saturday, Nov 13, 2:50 PM to 4:20 PM
Sponsored by the Pre-K - 12 Educational Evaluation TIG
Presenter(s):
Amy Burns, Brown University, amy_burns@brown.edu
Tara Smith, Brown University, tara_smith@brown.edu
Abstract: This roundtable will share a multi-step approach to program implementation assessment developed by the Education Alliance at Brown University. Presenters will provide examples from rigorous evaluations of four districts that received federal Magnet School Assistance Program funding to describe: the development of implementation rubrics that align with districts’ logic models; data sources for evidence-based measures that are used to identify compliance with logic models; rubric scoring processes; and the incorporation of these rubric data into multivariate statistical models. The presenters will promote discussion with the roundtable group on ways to address challenges in “quantifying” implementation data.
Session Title: 2010 Report: State of Evaluation Practice and Capacity in the Nonprofit Sector
Demonstration Session 883 to be held in BONHAM B on Saturday, Nov 13, 2:50 PM to 4:20 PM
Sponsored by the Non-profit and Foundations Evaluation TIG
Presenter(s):
Johanna Morariu, Innovation Network, jmorariu@innonet.org
Myia Welsh, Innovation Network, mwelsh@innonet.org
Lily Zandniapour, Innovation Network, lzandniapour@innonet.org
Abstract: Nonprofits hear a lot of talk about evaluation, and everyone seems to want evaluation results. But there’s a gap in the conversation: What are nonprofits really doing to evaluate their work? How are they using evaluation results? Innovation Network seeks to answer these questions through the first project that seeks to systematically and repeatedly collect data from nonprofits about their evaluation practices: the State of Evaluation project. In this session Innovation Network will present findings from the 2010 State of Evaluation project. Findings will be drawn from national survey data, and will discuss nonprofit evaluation practices, evaluation capacity, challenges to evaluation, and recommendations for strengthening evaluation practices. Results shared in this session will build understanding for nonprofits (e.g. comparison with peers), for donors and funders (e.g. understand capacity of grantees to collect and use data), and for evaluators (e.g. inform design of evaluation projects re: existing evaluation practices and capacities).
Session Title: Evaluating the State of Charter Schools and Public Schools of Choice
Multipaper Session 884 to be held in BONHAM C on Saturday, Nov 13, 2:50 PM to 4:20 PM
Sponsored by the Pre-K - 12 Educational Evaluation TIG
Chair(s):
Juanita Lucas-McLean, Westat, juanitalucas-mclean@westat.com
Discussant(s):
Antionette Stroter, University of Iowa, a-stroter@uiowa.edu
Session Title: Cost Studies in Health Care
Multipaper Session 885 to be held in BONHAM D on Saturday, Nov 13, 2:50 PM to 4:20 PM
Sponsored by the Costs, Effectiveness, Benefits, and Economics TIG
Chair(s):
Mustafa Karakus, Westat, mustafakarakus@westat.com
Nadini Persaud, University of the West Indies, npersaud07@yahoo.com
Session Title: Improving Proposals and Programs by Improving Peer and Stakeholder Review
Multipaper Session 886 to be held in BONHAM E on Saturday, Nov 13, 2:50 PM to 4:20 PM
Sponsored by the Research, Technology, and Development Evaluation TIG
Chair(s):
George Teather, George Teather and Associates, gteather@sympatico.ca
Session Title: Straw, Bricks, Construction: Improving Quality of Education Data, Performance Measures, and Evaluation to Enhance Student Achievement, Reduce Gaps and Increase College Access and Retention
Multipaper Session 887 to be held in Texas A on Saturday, Nov 13, 2:50 PM to 4:20 PM
Sponsored by the Government Evaluation TIG and the Pre-K - 12 Educational Evaluation TIG
Chair(s):
Stephanie Shipman, United States Government Accountability Office, shipmans@gao.gov
Session Title: Web Dialogues: A New Tool for Evaluators
Demonstration Session 888 to be held in Texas B on Saturday, Nov 13, 2:50 PM to 4:20 PM
Sponsored by the Qualitative Methods TIG
Presenter(s):
Jerome Hipps, WestEd, jhipps@wested.org
Laurie Maak, WestEd, lmaak@wested.org
Abstract: Focus groups are a staple in the evaluator’s toolkit, valued for gathering stakeholder responses to key questions. Imagine a scenario where participating stakeholders are situated in multiple locations and engage in the discussion at a time convenient to them. Enter Web Dialogues, a new evaluation resource that harnesses the advantages of Web 2.0 technology to facilitate virtual focus groups. The demonstration examines the rationale behind Web Dialogue-based focus groups and details the steps necessary to build a dialogue. These steps include framing the agenda, facilitating discussions, maintaining participant engagement, and summarizing key themes. We explore back-end features that help evaluators support communication and facilitate qualitative coding of discussions for later analysis. We review a three-day Web Dialogue used in a needs assessment process involving over 150 contributing participants. This review will show how the available tools were used as the dialogue progressed.
Session Title: Challenges and Promises for Using Mixed Methods: Lessons From Implementing Mixed Methods Evaluation
Multipaper Session 889 to be held in Texas C on Saturday, Nov 13, 2:50 PM to 4:20 PM
Sponsored by the
Chair(s):
James Riedel, Girl Scouts of the United States of America, jriedel@girlscouts.org
Discussant(s):
Donna Mertens, Gallaudet University, donna.mertens@gallaudet.edu
Session Title: Evaluation System of Research, Technology, and Development (RT&D) to Induce Innovation: Strategy, Process, and Reflection
Multipaper Session 890 to be held in Texas D on Saturday, Nov 13, 2:50 PM to 4:20 PM
Sponsored by the Research, Technology, and Development Evaluation TIG
Chair(s):
Naoto Kobayashi, Waseda University, naoto.kobayashi@waseda.jp
| Session Title: Thinking Outside the Evaluation Report Box: Transforming Evaluation Results Into a Structural Change Grantmaking Toolkit | |||||
| Panel Session 891 to be held in Texas E on Saturday, Nov 13, 2:50 PM to 4:20 PM | |||||
| Sponsored by the Evaluation Use TIG and the Non-profit and Foundations Evaluation TIG | |||||
| Chair(s): | |||||
| Steven LaFrance, Learning for Action Group, steven@lfagroup.com | |||||
| Abstract: We all hope that our evaluations will inform practice, yet many evaluation reports go unused by the client, let alone a wider audience. This panel presentation will share how the project leadership, facilitator, and evaluator worked together throughout the span of the initiative being evaluated, and how the resulting integration of evaluation frameworks and processes into the project led to a final product: a toolkit that incorporates the lessons learned and will be distributed to, and hopefully used by, a much larger audience than a conventional evaluation report might have reached. The project of focus is Common Vision, initiated by Funders for LGBTQ Issues, whose goal was to lead a group of grantmakers through a process to discover how they can support structural change in the service of widespread equity, and to transfer those lessons to the field of philanthropy to advance similar efforts. | |||||
| |||||
| |||||
| |||||
|
| Session Title: How to Write an Evaluation Plan |
| Skill-Building Workshop 892 to be held in Texas F on Saturday, Nov 13, 2:50 PM to 4:20 PM |
| Sponsored by the Graduate Student and New Evaluator TIG |
| Presenter(s): |
| LaTisha Marshall, Centers for Disease Control and Prevention, lmarshall@cdc.gov |
| Abstract: Learning Objectives: At the end of the presentation, participants will be able to create an evaluation plan outline and understand what elements make a plan useful. An evaluation plan is a document used to guide the planning of activities, processes, and outcomes of a program. It is a dynamic tool that can change over time, as needed, to complement program changes. It creates direction for accomplishing program goals and objectives by linking evaluation and program planning. Further, it facilitates stakeholder buy-in and commitment to program goals, documents programmatic changes, and helps identify and utilize key resources to accomplish evaluation activities. This workshop will introduce new evaluators to a guidance document and tools created by the Centers for Disease Control and Prevention, Office on Smoking and Health, that provide practical guidance on writing an evaluation plan. Each participant will write an evaluation plan outline based on sample exercises. |
| Session Title: Say Goodbye to Power Point and Hello to Gallery Walks! |
| Skill-Building Workshop 893 to be held in CROCKETT A on Saturday, Nov 13, 2:50 PM to 4:20 PM |
| Sponsored by the |
| Presenter(s): |
| Cassandra ONeill, Wholonomy Consulting, cassandraoneill@comcast.net |
| Allison Titcomb, ALTA Consulting, atitcomb@cox.net |
| Abstract: In our work we often need to present information to other people. A Gallery Walk is a fun and engaging way to present information in a meeting, a planning session, during a training, or when teaching. The skill we will be teaching is how to use a Gallery Walk to present information to groups. Posters are made with highlights of the information to be presented, either in advance or during a meeting. In this skill-building session, participants will have the experience of making a poster to share highlights of an article they read and discuss in small groups. This will be followed by a Gallery Walk of all the posters created by the group. Resources will be given to learn more about using Gallery Walks after the session. Say Hello to high-engagement learning with Gallery Walks. |
| Session Title: Expanding Our Knowledgebase: Current Research on Teaching Evaluation | |||||
| Panel Session 894 to be held in CROCKETT B on Saturday, Nov 13, 2:50 PM to 4:20 PM | |||||
| Sponsored by the Teaching of Evaluation TIG | |||||
| Chair(s): | |||||
| Christina Christie, University of California, Los Angeles, tina.christie@ucla.edu | |||||
| Abstract: Recent calls for research on evaluation highlight the importance of exploring professional issues, including evaluation training (Mark, 2008). Although information has been published about teaching evaluation, existing studies tell us little about how individuals who are currently practicing evaluation were trained to do their jobs, the type of evaluation-related training individuals within specific substantive disciplines (e.g., public health and education) receive, or the promise of unique instructional approaches for acquiring competence in evaluation. Such information is valuable for individuals who design academic coursework and professional development trainings. Current research covering each of the aforementioned topics will be presented in an effort to begin filling gaps in the existing knowledgebase and to stimulate ideas for future research. | |||||
| |||||
| |||||
| |||||
|
| Session Title: The Network of Network Funders: Evaluating Networks and Evaluating with a Network Lens |
| Think Tank Session 895 to be held in CROCKETT C on Saturday, Nov 13, 2:50 PM to 4:20 PM |
| Sponsored by the |
| Presenter(s): |
| Astrid Hendricks, California Endowment, ahendricks@calendow.org |
| Discussant(s): |
| Gale Berkowitz, David and Lucile Packard Foundation, gberkowitz@packard.org |
| Mayur Patel, John S and James L Knight Foundation, patel@knightfoundation.org |
| Gigi Barsoum, California Endowment, gbarsoum@calendow.org |
| Abstract: Over the past year, a community of practice of national and local funders has been collaboratively learning to address the challenges of defining, funding, assessing, and documenting networks as a philanthropic investment strategy. This think tank session will begin by introducing the core findings from the 2010 Grantmakers for Effective Organizations report on foundations and networks, and will then engage participants in a discussion of the specific issues of evaluating networks and evaluating within networks. Participants will be asked to discuss the principles and capacity needed to evaluate networks effectively, the best methods and frameworks for defining and assessing networks, examples of effective network evaluations, and recommendations for additional demonstration and documentation of network evaluations. |
| Session Title: Impact Evaluation of Approaches to Affect Local Resiliency to Disasters, Enhanced Public Health Emergency Peer Networks, and First Responder Psychological Recovery | ||||||||||||||||||
| Multipaper Session 896 to be held in CROCKETT D on Saturday, Nov 13, 2:50 PM to 4:20 PM | ||||||||||||||||||
| Sponsored by the Disaster and Emergency Management Evaluation TIG | ||||||||||||||||||
| Chair(s): | ||||||||||||||||||
| Karen Pendleton, University of South Carolina, ktpendl@mailbox.sc.edu | ||||||||||||||||||
|
| Session Title: Feminist Evaluation and Gender-Specific Programs | |||||||||||||
| Multipaper Session 897 to be held in SEGUIN B on Saturday, Nov 13, 2:50 PM to 4:20 PM | |||||||||||||
| Sponsored by the Feminist Issues in Evaluation TIG | |||||||||||||
| Chair(s): | |||||||||||||
| Linda Thurston, National Science Foundation, lthursto@nsf.gov | |||||||||||||
| Discussant(s): | |||||||||||||
| Jan Middendorf, Kansas State University, jmiddend@ksu.edu | |||||||||||||
|
| Session Title: Case Studies in International M&E | ||||||||||||||||||||||||||
| Multipaper Session 898 to be held in REPUBLIC A on Saturday, Nov 13, 2:50 PM to 4:20 PM | ||||||||||||||||||||||||||
| Sponsored by the International and Cross-cultural Evaluation TIG | ||||||||||||||||||||||||||
| Chair(s): | ||||||||||||||||||||||||||
| Jim Rugh, Independent Consultant, jimrugh@mindspring.com | ||||||||||||||||||||||||||
|
| Session Title: Policy Evaluation and Public Health: Multifaceted Approaches and Examples From the Field | ||||
| Multipaper Session 899 to be held in REPUBLIC B on Saturday, Nov 13, 2:50 PM to 4:20 PM | ||||
| Sponsored by the Health Evaluation TIG | ||||
| Chair(s): | ||||
| Karen Debrot, Centers for Disease Control and Prevention, kdebrot@cdc.gov | ||||
|
| Session Title: Strengthening Schools and Youth Through the Use of Evaluation: Issues and Perspectives | ||||||||||||||||||||||||
| Multipaper Session 900 to be held in REPUBLIC C on Saturday, Nov 13, 2:50 PM to 4:20 PM | ||||||||||||||||||||||||
| Sponsored by the Collaborative, Participatory & Empowerment Evaluation TIG | ||||||||||||||||||||||||
| Chair(s): | ||||||||||||||||||||||||
| Liliana Rodriguez-Campos, University of South Florida, liliana@usf.edu | ||||||||||||||||||||||||
|