Session Title: Mixed Methods Contributions to Evaluation Quality
Panel Session 782 to be held in Lone Star A on Saturday, Nov 13, 10:55 AM to 12:25 PM
Sponsored by the Presidential Strand
Chair(s):
Donna Mertens, Gallaudet University, donna.mertens@gallaudet.edu
Discussant(s):
Melvin Mark, Pennsylvania State University, m5m@psu.edu
Kataraina Pipi, Independent Consultant, kpipi@xtra.co.nz
Abstract: Evaluators have been mixing methods for decades, or even longer. As the most applied of all social inquirers, evaluators base methodological decisions, in large part, on contextual practicality and design potency for answering evaluation questions. If both structured surveys and open-ended interviews are needed in a given context, then evaluators – unproblematically and without losing sleep – build these different methods into their design. So, it is not always apparent what the emerging ‘theory’ of mixing methods has to offer to the evaluation community. This session will present an argument that the conceptual ideas and aspirations of mixed methods inquiry can indeed make vital contributions to evaluation practice, and will focus this argument on mixed methods’ contributions to evaluation quality. The session will engage diverse understandings of the philosophical, theoretical, and methodological frameworks in mixed methods and their relevance to the meanings of evaluation quality, with illustrations from practical exemplars.
Session Title: Writing Effective Items for Survey Research and Evaluation Studies
Skill-Building Workshop 783 to be held in Lone Star B on Saturday, Nov 13, 10:55 AM to 12:25 PM
Sponsored by the
Presenter(s):
Jason Siegel, Claremont Graduate University, jason.siegel@cgu.edu
Eusebio Alvaro, Claremont Graduate University, eusebio.alvaro@cgu.edu
Abstract: This hands-on workshop instructs attendees in how to write effective items for collecting survey research data. Bad items are easy to write; writing good items is more challenging than most people realize. Writing effective survey items requires a complete understanding of the impact that item wording can have on a research effort. Only through adequate training can good survey items be discriminated from bad ones. This 90-minute workshop focuses specifically on Dillman’s (2007) principles of question writing. After a brief lecture, attendees will be asked to use their newly gained knowledge to critique items from selected national surveys.
Session Title: Collaborative Evaluations
Skill-Building Workshop 784 to be held in Lone Star C on Saturday, Nov 13, 10:55 AM to 12:25 PM
Sponsored by the Collaborative, Participatory & Empowerment Evaluation TIG
Presenter(s):
Liliana Rodriguez-Campos, University of South Florida, liliana@usf.edu
Rigoberto Rincones-Gomez, Maryland Distribution Council Inc, rrincones@mdc.org
Abstract: This highly interactive skill-building workshop is for evaluators who want to engage and succeed in collaborative evaluations. In clear and simple language, the presenter outlines key concepts and effective tools and methods for mastering the mechanics of collaboration in the evaluation environment. Specifically, the presenter will blend theoretical grounding with the application of the Model for Collaborative Evaluations (MCE) to real-life evaluations, with special emphasis on the factors that facilitate and inhibit stakeholders’ participation. The presenter shares her experience and insights on this subject in a precise, easy-to-understand fashion, so that participants can apply what they learn in this workshop immediately.
Session Title: A Foundation-to-Foundation Partnership: What Went Right, What Took Time – A Look at the Robert Wood Johnson Foundation and the Northwest Health Foundation Partnership, Partners Investing in Nursing’s Future
Demonstration Session 785 to be held in Lone Star D on Saturday, Nov 13, 10:55 AM to 12:25 PM
Sponsored by the Non-profit and Foundations Evaluation TIG
Presenter(s):
Ricardo Millett, Ricardo Millett and Associates, ricardo@ricardomilllet.com
Chantell Johnson, TCC Group, cjohnson@tccgrp.com
Abstract: Collaborations and partnerships among nonprofits are critical strategies in this economy and toward mission achievement. During the past ten years there has been a surge in research on the most effective measures for evaluating them. The Foundation Center and Grantcraft have published interesting and useful articles on the subject and have shared important learnings about the evaluation approaches being used; however, what seems to be missing are more candid examples from the philanthropic sector. What role is evaluation playing in funder-to-funder partnerships? What evaluation design approaches, measures, and indicators are used to assess and learn from partnerships? How well do these measures address funders’ information priorities? These are the kinds of questions that either remain unanswered or lack real-world examples. Review the ‘findings’ of this evaluation approach, designed and implemented by Ricardo Millett and Chantell Johnson of TCC Group, and explore its utility and applicability to other foundation-to-foundation partnerships.
Session Title: Using Rasch Measurement to Strengthen Evaluation Designs and Outcomes
Demonstration Session 786 to be held in Lone Star E on Saturday, Nov 13, 10:55 AM to 12:25 PM
Sponsored by the Quantitative Methods: Theory and Design TIG
Presenter(s):
Christine Fox, University of Toledo, chris.fox@utoledo.edu
Svetlana Beltyukova, University of Toledo, svetlana.beltyukova@utoledo.edu
Abstract: While most evaluation plans use raw ordinal survey data to assess attitudes, beliefs, or behaviors, the Rasch model can be used to determine the extent to which such survey responses yield meaningful linear measurements. The model also provides evidence of the functioning of rating scales and the suitability of an instrument for specific populations, and it facilitates a better understanding of how precisely different samples can be evaluated. Using a survey of high school teachers as an example of one evaluative instrument in a comprehensive evaluation for the Department of Education, we will demonstrate how a variety of Rasch diagnostics can aid in using typical rating scales to construct scientifically defensible measures. Specific focus will be on determining the appropriate number of rating scale categories, designing questions to target the appropriate sample, exploring dimensionality, and interpreting the construct. By the end of the demonstration, participants will be able to identify properties of measures, understand the limitations of working with ordinal data, and be familiar with the way in which the Rasch model overcomes these limitations to provide empirically defensible ways to build and test the quality of measures from survey data.
Session Title: Extended Learning: A Conversation Among Evaluators of the National Science Foundation (NSF) Extension Services Projects
Panel Session 787 to be held in Lone Star F on Saturday, Nov 13, 10:55 AM to 12:25 PM
Sponsored by the Cluster, Multi-site and Multi-level Evaluation TIG
Chair(s):
Beverly Farr, MPR Associates Inc, bfarr@mprinc.com
Abstract: This panel will include a group of evaluators from the National Science Foundation (NSF)-funded Gender in Science and Engineering (GSE) Extension Services grants. Extension Service projects present unique challenges to evaluation because they “extend” services across tiers or layers of service, and their most direct strategies are often far removed from the ultimate target outcomes. Starting from this basic dilemma, the panel will use a question-and-answer discussion format to delve into a range of issues that characterize the evaluation challenge. The panel members will pose questions to each other, discuss ideas and strategies for meeting the challenges, and emphasize the value of establishing a community of practice for those undertaking evaluation of multi-site and multi-level projects.
Roundtable Rotation I: Preparing Teacher Candidates for Parent Partnerships: An Evaluation of a Preservice Course in Teacher Education
Roundtable Presentation 788 to be held in MISSION A on Saturday, Nov 13, 10:55 AM to 12:25 PM
Sponsored by the Special Needs Populations TIG
Presenter(s):
Michael Wischnowski, St John Fisher College, mwischnowski@sjfc.edu
Marie Cianca, St John Fisher College, mcianca@sjfc.edu
Susan Hildenbrand, St John Fisher College, shildenbrand@sjfc.edu
Daniel Kelly, St John Fisher College, dkelly@sjfc.edu
Abstract: This session will describe an evaluation of a partnership between an undergraduate teacher education program and a not-for-profit advocacy organization for people with disabilities. The partnership’s purpose is to prepare future teachers to work collaboratively with parents of children with disabilities. Parents, trained in public speaking and advocacy strategies, become an integral part of an undergraduate course with the theme of educational collaborations—parent collaboration being a centerpiece. Teacher education faculty work with these trained parents to assist teacher candidates with a) parent communication etiquette; b) critical analysis of parent and teacher roles in education; c) research skills to address parent concerns; and d) establishing collaboration between both parties to improve student outcomes. A logic model was developed to describe the resources and intentions of the partnership. Roundtable facilitators will share the results of the partnership evaluation, including perspectives and outcomes of teacher candidates, parent participants, and faculty.
Roundtable Rotation II: Quality Evaluations for Educational Programs: Mixed Methods Adds Value Beyond Proficiency Testing Results
Roundtable Presentation 788 to be held in MISSION A on Saturday, Nov 13, 10:55 AM to 12:25 PM
Sponsored by the Special Needs Populations TIG
Presenter(s):
Paula Plonski, Praxis Research Inc, pplonski@windstream.net
Abstract: The environment of No Child Left Behind and its emphasis on testing have influenced educational evaluations, even for programs not specifically targeted at improving adequate yearly progress. This evaluation involved three southern elementary schools where most of the teachers and administrators participated in a program called Schools Attuned. The All Kinds of Minds Institute’s Schools Attuned program seeks to positively impact students, parents, teachers, school practices, and school culture by engaging teachers in professional development on understanding neurodevelopmental constructs and their practical classroom application. In addition to the collection and analysis of Measures of Academic Progress (MAP) student scores, the evaluation plan included classroom observations, teacher surveys, administrator interviews, and focus groups with teachers, parents, and students. In this mixed methods design, the qualitative component complemented and informed the quantitative analysis, providing insight into how differing levels of school-wide implementation impact outcomes.
Session Title: Cross-National Evaluation Policies: Where We've Been, Where We're Going, and What We Need for Quality Evaluation
Multipaper Session 789 to be held in MISSION B on Saturday, Nov 13, 10:55 AM to 12:25 PM
Sponsored by the
Chair(s):
Marie Gaarder, International Initiative for Impact Evaluation (3ie), mgaarder@3ieimpact.org
Discussant(s):
Jim Rugh, Independent Consultant, jimrugh@mindspring.com
Session Title: The Home Energy Audit: An Exercise in Complex Systems Thinking for Practitioners and Evaluators Alike
Multipaper Session 790 to be held in BOWIE A on Saturday, Nov 13, 10:55 AM to 12:25 PM
Sponsored by the Systems in Evaluation TIG and the Environmental Program Evaluation TIG
Chair(s):
Daniel Folkman, University of Wisconsin, Milwaukee, folkman@uwm.edu
Session Title: The REWA System of Transformative Evaluation: Founded on Pono (Truth), Ahua (Beauty) and Tika (Justice) – Evaluating Health and Well-being From Te Ao Maori / Indigenous World-view, Protocol, and Practice
Demonstration Session 791 to be held in BOWIE C on Saturday, Nov 13, 10:55 AM to 12:25 PM
Sponsored by the Indigenous Peoples in Evaluation TIG
Presenter(s):
Tania Wolfgramm, Pou Kapua Consulting, tania.wolfgramm@gmail.com
Wikuki Kingi, Pou Kapua Consulting, wikuki.kingi@gmail.com
Abstract: A world-class, innovative, integrated ‘Whanau Ora’ Health Care System in New Zealand aims to achieve positive Whanau Ora outcomes by supporting the delivery of high-quality health and social services to Maori whanau / families and high-needs populations. ‘Whanau Ora’ is manifest in whanau who are nurtured, healthy, engaged and knowledgeable, confident and productive, and who are on a journey to achieving self-determined success. This session demonstrates the REWA System of Transformative Evaluation, founded on Pono (Truth), Ahua (Beauty) and Tika (Justice), and supported by the core values of the Whanau Ora System itself, namely Whanaungatanga (Relationships), Manaakitanga (Support), Rangatiratanga (Sovereignty, Leadership), and Tikanga (Transactional Justice, Ethics, Protocols), as evidenced in the evaluation framework of the Whanau Ora System. Grounded in Maori and Indigenous ways of assessing and evaluating merit based on traditional values and cultural expressions, this dynamic evaluation generates continuous learning and ground-breaking transformation for all stakeholders.
Roundtable Rotation I: Resources to Guide Non-evaluators in the Design of Educational Program Evaluations
Roundtable Presentation 792 to be held in GOLIAD on Saturday, Nov 13, 10:55 AM to 12:25 PM
Sponsored by the Teaching of Evaluation TIG
Presenter(s):
Rick Axelson, University of Iowa, rick-axelson@uiowa.edu
Susan Lenoch, University of Iowa, susan-lenoch@uiowa.edu
Abstract: This session will discuss approaches and share resources for guiding non-evaluators through the process of designing educational program evaluations. As a starting point for the discussion, we will review a self-study guide recently developed by the Office of Consultation and Research in Medical Education at the University of Iowa. The guide has been distributed in workshops that walk participants through the evaluation design process. The guide and workshops feature a case study of an educational intervention that illustrates the recommended design process. Participants then apply these principles in designing evaluations for selected components or interventions in their own educational programs. After a brief discussion of this approach, roundtable participants will have the remainder of the session to share their successful practices and resources for teaching evaluation to non-evaluators.
Roundtable Rotation II: Who Do They Think We Are? Issues and Dilemmas Raised by Others' Perceptions of Evaluators and Evaluation
Roundtable Presentation 792 to be held in GOLIAD on Saturday, Nov 13, 10:55 AM to 12:25 PM
Sponsored by the Teaching of Evaluation TIG
Presenter(s):
Loretta Kelley, Kelley, Petterson and Associates, lkelley@kpacm.org
Philip Henning, James Madison University, henninph@jmu.edu
Abstract: An evaluator’s work is affected by clients’ views of the evaluator. A quality evaluation requires an honest exchange between the evaluator and the subjects of the evaluation. It requires trust on both sides and a shared belief that the evaluation performs an important formative and/or summative function. This roundtable describes several views clients and stakeholders may have of evaluators and evaluation, the issues that may arise with each, challenges in collecting data in these situations, and strategies for developing a more productive relationship with the subjects of our evaluations, along with discussion of how the AEA Guiding Principles apply in each situation. Participants will share their experiences, discuss strategies for dealing with issues and dilemmas they have faced, and benefit from the experiences of others. Views of evaluators to be presented include judge, friend of the “boss”, pipeline to the funder, necessary inconvenience, and partner/collaborator.
Roundtable Rotation I: Epistemological Distinctions and Values in the Evaluation Process: A Reflective Analysis on the Quality Standards of Truth, Beauty, and Justice Using Findings From an Actual Evaluation Study
Roundtable Presentation 793 to be held in SAN JACINTO on Saturday, Nov 13, 10:55 AM to 12:25 PM
Sponsored by the Theories of Evaluation TIG and the Teaching of Evaluation TIG
Presenter(s):
Sarah Wilkey, Oklahoma State University, sarah.wilkey@okstate.edu
Zarrina Azizova, Oklahoma State University, zarrina.azizova@okstate.edu
Zhanna Shatrova, Oklahoma State University, zhanna.shatrova@okstate.edu
Katye Perry, Oklahoma State University, katye.perry@okstate.edu
Abstract: The purpose of this session is to emphasize the pedagogical importance of epistemological discussions in evaluation courses in order to prepare students to think reflectively about issues of quality in evaluation practice. During this roundtable discussion, we will use as an example a completed evaluation of a staffing program in Family and Graduate Student Housing at Oklahoma State University. We will discuss how the process of formulating and completing each step of the evaluation, including finding a project (or a project ‘finding’ an evaluator), determining the evaluation methodology, interpreting the findings, and presenting the results, can differ depending on the evaluator’s epistemology. Further, we will discuss how epistemology affects the interpretation of the different standards presented by House (1980)—truth, beauty, and justice—throughout the evaluation process.
Roundtable Rotation II: Evaluation in Late Victorian Literature
Roundtable Presentation 793 to be held in SAN JACINTO on Saturday, Nov 13, 10:55 AM to 12:25 PM
Sponsored by the Theories of Evaluation TIG and the Teaching of Evaluation TIG
Presenter(s):
David D Williams, Brigham Young University, david_williams@byu.edu
Abstract: Evaluators are committed to ensuring quality through adherence to various formal evaluation standards, which have evolved from the social science disciplines. In contrast, what might the humanities, and the study of informal evaluation, contribute to evaluation theory, practice, and quality? This presentation examines evaluations portrayed in late-Victorian literature to identify informal approaches to establishing credibility. Through analyses of books by Dickens, Hardy, Chopin, and others, we learned that some literary characters’ criteria and decision methods lead to problematic evaluations that serve as foils for promoting the choices of other characters. These classic stories invite readers to learn from characters’ evaluation experiences and improve their own informal evaluations. In this presentation we share literary examples that lead us to conclude that understanding the informal evaluation lessons taught through literature could help formal evaluators extend stakeholders’ positive informal evaluations while countering their poor informal evaluation choices, thus improving formal evaluation quality through better informal evaluations.
Session Title: Making Results Relevant: Designing Evaluations Stakeholders Will Value, Understand, and Use
Think Tank Session 794 to be held in TRAVIS A on Saturday, Nov 13, 10:55 AM to 12:25 PM
Sponsored by the Evaluation Use TIG
Presenter(s):
Anita Drever, University of Wyoming, adrever@uwyo.edu
Discussant(s):
Paul St Roseman, Sakhu and Associates, pstroseman@sakhuandassociates.com
Javan Ridge, Colorado Springs School District 11, ridgejb@d11.org
Abstract: Evaluators aim to influence organizational and policy decisions. However, the currency of our trade—the technical report—rarely achieves that end in and of itself. This think tank will be a forum for evaluators to discuss strategies they have used to communicate the value and utility of their work to practitioners. The following questions will guide our discussion: How does one present data so that stakeholders not only interpret one’s findings correctly, but also see their value? What report formats or other products have proven most effective with stakeholders? What kinds of collaborative partnerships with clients have been most successful in helping stakeholders utilize evaluation products and use evaluation findings for program improvement or sustainability? We anticipate these questions will feed into a larger discussion of creative ways to improve the quality of our evaluations and help stakeholders get the most from our services.
Session Title: An Integrated Web-based Assessment, Planning, and Evaluation System for Strengthening Families Programs Across the Nation
Demonstration Session 795 to be held in TRAVIS B on Saturday, Nov 13, 10:55 AM to 12:25 PM
Sponsored by the Integrating Technology Into Evaluation TIG
Presenter(s):
Lauren Pugh, Mosaic Network Inc, lpugh@mosaic-network.com
Michael Bates, Mosaic Network Inc, mbates@mosaic-network.com
Abstract: In March 2010, the Center for the Study of Social Policy (CSSP) unveiled a newly revised online assessment and planning tool—GEMS for Strengthening Families—for the Strengthening Families National Network. The tool consists of two main components: a Self Assessment that helps early care and education and other child-serving professionals identify concrete and practical ways of incorporating the Strengthening Families model in their day-to-day work, and a Protective Factors Survey that helps parents identify protective factors known to reduce child abuse and neglect. In this demonstration, we will guide users through the new online tool, including how programs can complete the Self Assessment and create Action Plans, how family members can complete the Protective Factors Survey, and how all users can use the system for reporting and evaluation purposes.
Session Title: Construct Validity of Race and Its Impact on the Quality of Research and Evaluation
Demonstration Session 796 to be held in TRAVIS C on Saturday, Nov 13, 10:55 AM to 12:25 PM
Sponsored by the Multiethnic Issues in Evaluation TIG
Presenter(s):
Kelly Robertson, Western Michigan University, kelly.robertson@wmich.edu
Diane Rogers, Western Michigan University, diane.rogers@wmich.edu
Abstract: This demonstration will provide practitioners with awareness, understanding, and skills to address the construct validity of race and how it impacts research and evaluation. First we will discuss race as it relates to evaluation context, followed by an overview of the construct of race. Then we will examine how racism has changed over time and the state of racism today. The majority of the demonstration will focus on real-world examples of how constructs of race impact research and evaluation, and strategies practitioners can use to counter this impact. Both strengths and weaknesses of the suggested strategies will be presented, in addition to how they relate to current evaluation concepts and tools. Throughout the demonstration we will highlight the importance of the topic to practitioners and how addressing it can improve the quality of scientific research and evaluation and promote racial justice. Finally, we will conclude with next steps for personal and professional growth.
Session Title: Strategies and Tools to Evaluate the Comprehensive Picture of Health Policy
Multipaper Session 797 to be held in TRAVIS D on Saturday, Nov 13, 10:55 AM to 12:25 PM
Sponsored by the Health Evaluation TIG
Chair(s):
Jenica Huddleston, University of California, Berkeley, jenhud@berkeley.edu
Session Title: Mixed Methods and Multiple Measures in Quality Human Services Evaluation: Lessons Learned
Multipaper Session 798 to be held in INDEPENDENCE on Saturday, Nov 13, 10:55 AM to 12:25 PM
Sponsored by the Human Services Evaluation TIG
Chair(s):
Barry Cohen, Rainbow Research Inc, bcohen@rainbowresearch.org
Session Title: Application of Propensity Score Analysis in Assessing Outcomes
Multipaper Session 799 to be held in PRESIDIO A on Saturday, Nov 13, 10:55 AM to 12:25 PM
Sponsored by the Quantitative Methods: Theory and Design TIG
Chair(s):
MH Clark, Southern Illinois University, mhclark@siu.edu
Session Title: A Radically Different Approach to Evaluator Competencies
Multipaper Session 800 to be held in PRESIDIO B on Saturday, Nov 13, 10:55 AM to 12:25 PM
Sponsored by the Organizational Learning and Evaluation Capacity Building TIG
Chair(s):
Jane Davidson, Real Evaluation Ltd, jane@realevaluation.co.nz
Discussant(s):
Michael Scriven, Claremont Graduate University, mjscriv1@gmail.com
Rodney Hopson, Duquesne University, hopson@duq.edu
Session Title: Use of Administrative Data for Management and Policy Making
Multipaper Session 801 to be held in PRESIDIO C on Saturday, Nov 13, 10:55 AM to 12:25 PM
Sponsored by the Alcohol, Drug Abuse, and Mental Health TIG
Chair(s):
Stephen Magura, Western Michigan University, stephen.magura@wmich.edu
Roundtable Rotation I: Standardizing Literacy Data Analyses and Reporting Across Multiple Instruments and Grades
Roundtable Presentation 802 to be held in BONHAM A on Saturday, Nov 13, 10:55 AM to 12:25 PM
Sponsored by the Pre-K - 12 Educational Evaluation TIG
Presenter(s):
Ashley Kurth, City Year Inc, akurth@cityyear.org
Gretchen Biesecker, City Year Inc, gbiesecker@cityyear.org
Abstract: City Year unites more than 1,500 17- to 24-year-olds for a year of full-time service in over 19 urban school districts. In 2008, City Year established a more standardized model of school service (Whole School Whole Child) to address the academic, social, and emotional needs of children in their school environment. Working with school personnel to differentiate instruction using data, City Year volunteers tutor and mentor students across grades 3-9, and one focus of their work is literacy. A challenge to collecting and using student-level literacy performance data for formative and summative purposes is the diversity of assessments and benchmarks used across, and even within, districts. Districts, evaluators, and large-scale organizations, including AmeriCorps, face this challenge in addressing student needs and reporting outcomes. In this roundtable, we will share ways we have standardized data collection and reporting, and elicit feedback and discussion from others struggling to standardize data across sources.
Roundtable Rotation II: Overcoming Data Quality Challenges to Evaluation of School-based Programs
Roundtable Presentation 802 to be held in BONHAM A on Saturday, Nov 13, 10:55 AM to 12:25 PM
Sponsored by the Pre-K - 12 Educational Evaluation TIG
Presenter(s):
Lisa Garbrecht, EVALCORP, lgarbrecht@evalcorp.com
Shanelle Boyle, EVALCORP, sboyle@evalcorp.com
Tronie Rifkin, EVALCORP, trifkin@evalcorp.com
Mona Desai, EVALCORP, mdesai@evalcorp.com
Abstract: One of the biggest obstacles to evaluating school-based programs is obtaining quality data from schools and students. This roundtable will discuss challenges to collecting school and student data and how partnerships with programs and school districts can be developed and utilized to overcome those challenges. Presenters will share case examples of effective data collection and evaluation methods used with three diverse programs in California, including the Woodcraft Rangers on-site K-12 after-school program in Los Angeles County, the Murrieta Valley Unified School District Breakthrough Student Assistance Program in Riverside County, and the Project REACH off-site after-school program in San Diego County. In addition, attendees will have the opportunity to share their challenges and experiences evaluating school-based programs and engage in a discussion about cultivating partnerships and other effective strategies for providing quality evaluations for a range of school-based programs.
Session Title: Evaluating Family and Community Change
Multipaper Session 803 to be held in BONHAM B on Saturday, Nov 13, 10:55 AM to 12:25 PM
Sponsored by the Non-profit and Foundations Evaluation TIG
Chair(s):
Joanne Carman, University of North Carolina at Charlotte, jgcarman@uncc.edu
Session Title: Practical Issues in Educational Measurement and Assessment
Multipaper Session 804 to be held in BONHAM C on Saturday, Nov 13, 10:55 AM to 12:25 PM
Sponsored by the Pre-K - 12 Educational Evaluation TIG
Chair(s):
Wendy DuBow, National Center for Women and Information Technology, wendy.dubow@colorado.edu
Discussant(s):
Emily Lai, University of Iowa, emily-lai@uiowa.edu
Session Title: Quality Evaluation: Avoiding Hypocrisy by Formative Evaluation of Evaluation's Outcomes, Processes, and Costs
Think Tank Session 805 to be held in BONHAM D on Saturday, Nov 13, 10:55 AM to 12:25 PM
Sponsored by the Costs, Effectiveness, Benefits, and Economics TIG
Presenter(s):
Sarah Hornack, American University, sarah.hornack@gmail.com
Discussant(s):
Brian Yates, American University, brian.yates@me.com
Jose Hermida, American University, hermidaj@gmail.com
Abstract: This Think Tank addresses the Presidential Theme of Evaluation Quality by asking, “Is evaluation worth it?” More specifically, does evaluation achieve the outcomes desired, at what cost, and how does evaluation compare to other paths that achieve similar outcomes at lower cost? We begin with the common hypocrisy of evaluations not themselves being evaluated. Then, three questions of meta-evaluation are raised in break-out groups: a) how should we judge the outcomes of an evaluation? b) how, and whether, should we judge the process of being evaluated? and c) how, and whether, should we measure the costs of an evaluation? We reassemble the groups for summaries directed at answering d) how can we measure the cost-effectiveness or cost-benefit of evaluation? and finally e) how can one best conduct an evaluation within the constraints of limited resources while achieving the highest possible quality?
Session Title: Evaluating Climate Change: International Perspectives, Methods, and Estimation Techniques
Multipaper Session 806 to be held in BONHAM E on Saturday, Nov 13, 10:55 AM to 12:25 PM
Sponsored by the Environmental Program Evaluation TIG
Chair(s):
Kara Crohn, Research Into Action, kara.crohn@gmail.com
Session Title: Conceptualizing and Conducting Quality Peer Reviewed Portfolio Evaluations: Approaches and Lessons Learned From the Centers for Disease Control and Prevention (CDC)
Multipaper Session 807 to be held in Texas A on Saturday, Nov 13, 10:55 AM to 12:25 PM
Sponsored by the Government Evaluation TIG
Chair(s):
Sue Lin Yee, Centers for Disease Control and Prevention, sby9@cdc.gov
Discussant(s):
Sue Lin Yee, Centers for Disease Control and Prevention, sby9@cdc.gov
Session Title: Successfully Managing Evaluation Projects: Quality Solutions to Common Project Management Challenges
Demonstration Session 808 to be held in Texas B on Saturday, Nov 13, 10:55 AM to 12:25 PM
Sponsored by the Evaluation Managers and Supervisors TIG
Presenter(s):
Kathy Brennan, Innovation Network, kbrennan@innonet.org
Veena Pankaj, Innovation Network, vpankaj@innonet.org
Abstract: It takes more than strong evaluation skills to successfully manage an evaluation project. This session looks at the key management and consulting skills necessary for evaluation projects to succeed, including client management, budget management, contract management, and understanding the context of each evaluation engagement.
Session Title: Using Mixed Methods to Evaluate Program Implementation and Inter Agency Collaboration
Multipaper Session 809 to be held in Texas C on Saturday, Nov 13, 10:55 AM to 12:25 PM
Sponsored by the
Chair(s):
Sandra Bridwell, Cambridge College, sandra.bridwell@go.cambridgecollege.edu
Discussant(s):
Michele Tarsilla, Western Michigan University, michele.tarsilla@wmich.edu
Session Title: Evaluation of National Research and Development (R&D) Programs as a Tool for Increasing Efficiency of Public Finance
Multipaper Session 810 to be held in Texas D on Saturday, Nov 13, 10:55 AM to 12:25 PM
Sponsored by the Research, Technology, and Development Evaluation TIG
Chair(s):
Boojong Gill, Korea Institute of Science & Technology Evaluation and Planning (KISTEP), kbjok@kistep.re.kr
Session Title: Evaluators Thinking Evaluatively About Use: Tips for the Trade
Multipaper Session 811 to be held in Texas E on Saturday, Nov 13, 10:55 AM to 12:25 PM
Sponsored by the Evaluation Use TIG
Chair(s):
Helene Jennings, ICF Macro, helene.p.jennings@macrointernational.com
Session Title: Research on Participatory Evaluation
Multipaper Session 812 to be held in Texas F on Saturday, Nov 13, 10:55 AM to 12:25 PM
Sponsored by the Research on Evaluation TIG
Chair(s):
Michael Szanyi, Claremont Graduate University, michael.szanyi@cgu.edu
Session Title: Evaluating Advocacy Efforts in Cooperative Extension and Other Outreach Organizations
Skill-Building Workshop 813 to be held in CROCKETT A on Saturday, Nov 13, 10:55 AM to 12:25 PM
Sponsored by the Extension Education Evaluation TIG
Presenter(s):
Allison Nichols, West Virginia University Extension, ahnichols@mail.wvu.edu
Teresa McCoy, University of Maryland, tmccoy1@umd.edu
Florita Montgomery, West Virginia University Extension, florita.montgomery@mail.wvu.edu
Abstract: This workshop will focus on helping professionals learn skills in designing evaluation protocols for advocacy efforts. It will begin by defining advocacy and determining what kinds of work in Extension or other outreach efforts can be considered advocacy. The presenters will discuss how typical program evaluation and advocacy evaluation differ and where they are alike. Participants will then be asked to brainstorm, within small groups, the design of advocacy logic models focused on 1) teaching or capacity-building efforts; 2) service, such as network formation, relationship building, or organization; 3) policy-change efforts; and 4) research focused on understanding the impact of the effort. From the logic models, participants will be asked to suggest new and innovative methods for measuring the success of advocacy efforts.
Session Title: Experiencing Quality in Evaluation Training in Brazil and Ethiopia
Multipaper Session 814 to be held in CROCKETT B on Saturday, Nov 13, 10:55 AM to 12:25 PM
Sponsored by the Teaching of Evaluation TIG
Chair(s):
Thereza Penna Firme, Cesgranrio Foundation, therezapf@uol.com.br
Session Title: Quality and International Evaluation
Think Tank Session 815 to be held in CROCKETT C on Saturday, Nov 13, 10:55 AM to 12:25 PM
Sponsored by the International and Cross-cultural Evaluation TIG
Presenter(s):
Ross Conner, University of California, Irvine, rfconner@uci.edu
Alexey Kuzmin, Process Consulting Company, alexey@processconsulting.ru
Discussant(s):
Michael Bamberger, Independent Consultant, jmichaelbamberger@gmail.com
Tessie Catsambas, EnCompass LLC, tcatsambas@encompassworld.com
Thomaz Chianca, COMEA Evaluation Ltd, thomaz.chianca@gmail.com
J Bradley Cousins, University of Ottawa, bcousins@uottawa.ca
Abstract: In recent years, evaluation has spread around the globe, with evaluation activities and organizations underway in every region and most nations. There are references to ‘international evaluation’ in many documents, and there are groups and networks focused on this area. This increase in evaluation beyond and across national borders raises the question of evaluation quality in the international context. The session will begin with a brief historical review of the development of international evaluation and an overview of some salient aspects of it. The session will then shift to small-group discussions among attendees to consider answers to the following questions: 1. What makes an evaluator ‘international’? 2. What makes an evaluation ‘international’? 3. What determines quality of ‘international evaluation’ as opposed to evaluation in general? At the end, four discussants experienced in international evaluation will synthesize the comments from the small groups and share their thoughts.
Session Title: Development and Selection of Frameworks and Constructs for Disaster Preparedness and Recovery Evaluation
Multipaper Session 816 to be held in CROCKETT D on Saturday, Nov 13, 10:55 AM to 12:25 PM
Sponsored by the Disaster and Emergency Management Evaluation TIG
Chair(s):
Scott Aminov, Food For The Hungry, saminov@fh.org
Discussant(s):
Patricia Bolton, Battelle Memorial Institute, bolton@battelle.org
Session Title: Reflections of Emerging Professionals: The Culturally Responsive Path Ahead
Panel Session 817 to be held in SEGUIN B on Saturday, Nov 13, 10:55 AM to 12:25 PM
Sponsored by the Graduate Student and New Evaluator TIG
Chair(s):
Jill Jim, Independent Consultant, jilljim2003@hotmail.com
Discussant(s):
Pauline Brooks, Independent Consultant, pbrooks_3@hotmail.com
Abstract: The guiding principles of the American Evaluation Association call on evaluators to “understand and respect differences, such as differences in culture” and to “account for potential implications of these differences” when moving through the cycle of evaluation. Despite the recognized importance of culturally responsive evaluation, much of the field remains unsure about the path an evaluator must take to develop and employ these skills. In February 2010, four emerging evaluators from historically underrepresented minority groups completed a one-year training and skill-building fellowship utilizing culturally responsive evaluation as part of the Robert Wood Johnson Foundation Evaluation Fellows Program. At agencies across the country, these emerging evaluators sought to engage their work and surroundings through culturally sound methodology. This panel presentation offers insights into the experiences of these emerging evaluators, describes their successes and challenges in applying a cultural lens to their work, and offers suggestions for addressing culture in evaluative practice.
Session Title: Evaluation Capacity Building in International Contexts
Multipaper Session 818 to be held in REPUBLIC A on Saturday, Nov 13, 10:55 AM to 12:25 PM
Sponsored by the International and Cross-cultural Evaluation TIG
Session Title: Operational Research and Monitoring and Evaluation: Can We Forge a Partnership?
Panel Session 819 to be held in REPUBLIC B on Saturday, Nov 13, 10:55 AM to 12:25 PM
Sponsored by the Health Evaluation TIG
Chair(s):
Thomas Chapel, Centers for Disease Control and Prevention, tchapel@cdc.gov
Abstract: The term “operational research” (OR) is used with increasing frequency in public health. Although OR techniques such as mathematical modeling and simulation originated in military operations, and despite OR’s traditional use in the private sector, the Centers for Disease Control and Prevention (CDC), among other agencies, supports OR for improving disease prevention, control, and health promotion programs. CDC’s Division of HIV/AIDS Prevention recently established an Operational Research Team (ORT) in the Prevention Research Branch; the division also includes a Program Evaluation Branch (PEB). ORT’s mission is to improve the efficiency, effectiveness, and sustainability of evidence-based HIV prevention program activities. PEB’s mission is to evaluate the processes, outcomes, and impacts of HIV prevention programs, activities, and policies for improvement and accountability. The distinction between OR and program evaluation is not clear. We will discuss the development of OR and program evaluation, OR and evaluation studies in HIV prevention, and ways to differentiate the two disciplines.
Session Title: Empowerment Evaluations: Insights, Reflections, and Implications
Multipaper Session 820 to be held in REPUBLIC C on Saturday, Nov 13, 10:55 AM to 12:25 PM
Sponsored by the Collaborative, Participatory & Empowerment Evaluation TIG
Chair(s):
Candace Sibley, University of South Florida, csibley@health.usf.edu