Session Title: Program Theory and Theory-Driven Evaluation TIG Business Meeting and Panel: Improving Evaluation Quality by Improving Program Quality: A Theory-based/Theory-driven Perspective
Business Meeting with Panel Session 742 to be held in Lone Star A on Saturday, Nov 13, 10:00 AM to 10:45 AM
Sponsored by the Presidential Strand and the Program Theory and Theory-driven Evaluation TIG
TIG Leader(s):
John Gargani, Gargani + Company, john@gcoinc.com
Katrina Bledsoe, Walter R McDonald and Associates Inc, katrina.bledsoe@gmail.com
Chair(s):
Katrina Bledsoe, Walter R McDonald and Associates Inc, katrina.bledsoe@gmail.com
Discussant(s):
Michael Scriven, Claremont Graduate University, mjscriv1@gmail.com
David Fetterman, Fetterman & Associates, fettermanassociates@gmail.com
Charles Gasper, Missouri Foundation for Health, cgasper@mffh.org
Abstract: The principle that quality evaluation promotes better programs is well accepted. However, the other half of that equation—that quality programs promote better evaluation—is rarely considered. This panel will examine this missing half and suggest how evaluators can foster a virtuous circle in which program quality promotes evaluation quality, which in turn promotes program quality. By approaching this dynamic relationship from a distinctly theory-based/theory-driven perspective, the panel will address how the real-world problems of program design, execution, and funding provide concrete opportunities for evaluators to use program theory to improve programs while improving their own practice.
| |||
|
Session Title: Expanding Evaluation’s Utility and Quality Through System-Oriented Data Synthesis
Demonstration Session 743 to be held in Lone Star B on Saturday, Nov 13, 10:00 AM to 10:45 AM
Sponsored by the Systems in Evaluation TIG
Presenter(s):
Beverly A Parsons, InSites, bparsons@insites.org
Pat Jessup, InSites, pjessup@insites.org
Abstract: Many evaluations that use multiple data collection instruments report analyses separately by instrument or subscale. Such reports give the reader interesting facts but little or no synthesis across instruments to help the evaluation user understand what actions are likely to increase the value or merit of the evaluand. A system dynamics framework can be a powerful tool for synthesizing the analyses and making meaning of the findings. This session uses three evaluation examples to demonstrate how evaluators used an understanding of two types of system dynamics—organized and self-organizing—to synthesize data across multiple sources and identify levers for transformative change that support sustainability and scale-up. The examples involve the introduction of online learning in a multi-year professional development initiative for teachers, a whole-school change initiative to improve student achievement in low-performing schools, and a partnership supporting the prevention of child maltreatment.
Session Title: Racial and Ethnic Approaches to Community Health Across the United States (REACH US) Programs: Creating and Evaluating Community-based Coalitions
Panel Session 744 to be held in Lone Star C on Saturday, Nov 13, 10:00 AM to 10:45 AM
Sponsored by the Collaborative, Participatory & Empowerment Evaluation TIG
Chair(s):
Ada Wilkinson-Lee, University of Arizona, adaw@email.arizona.edu
Discussant(s):
Mari Wilhelm, University of Arizona, wilhelmm@ag.arizona.edu
Abstract: Racial and Ethnic Approaches to Community Health Across the U.S. (REACH US) programs are grounded in the socio-ecological framework to address racial health disparities. REACH US evaluators must develop evaluation plans based on the target communities’ needs and input while still capturing the outcomes of the socio-ecological model. This session identifies strategies, initial outcomes, and lessons learned in the effort to measure community-based coalitions and describes the process of forming a collaborative subgroup of grantees to identify core indicators of coalitions. Each presenter will offer a unique perspective on their successes and challenges with their target community and specific health disparity. These evaluators will also discuss the process of forming a subgroup of grantees that collaborates across REACH US programs to identify core indicators of community-based coalitions that can be standardized and psychometrically tested.
Session Title: Evaluation in Foundations: The State of the Art
Panel Session 745 to be held in Lone Star D on Saturday, Nov 13, 10:00 AM to 10:45 AM
Sponsored by the Non-profit and Foundations Evaluation TIG
Chair(s):
Richard McGahey, Ford Foundation, r.mcgahey@fordfound.org
Discussant(s):
Lester Baxter, Pew Charitable Trusts, l.baxter@pewtrusts.org
Mayur Patel, John S and James L Knight Foundation, patel@knightfoundation.org
Abstract: This session will present information from a benchmark survey and analysis of evaluation practices in leading major U.S. foundations, along with a presentation on performance measurement approaches, tools, and practices in foundations and nonprofits. The presenters are two of the leading experts in the field. Commentators will be three directors of evaluation from three major national foundations. Having two presentations will allow for maximum dialogue among the panelists and with the audience.
| |||
|
Session Title: Ten Steps to Making Evaluations Matter: Designing Evaluations to Exert Influence
Expert Lecture Session 746 to be held in Lone Star E on Saturday, Nov 13, 10:00 AM to 10:45 AM
Sponsored by the Quantitative Methods: Theory and Design TIG
Chair(s):
Melvin Mark, Pennsylvania State University, m5m@psu.edu
Presenter(s):
Sanjeev Sridharan, University of Toronto, sridharans@smh.toronto.on.ca
Abstract: Much of the discussion on evaluation design has focused on issues of causality. Far less attention has been paid to how design can be informed by considerations of pathways of influence. This session proposes ten steps to make evaluations matter, integrating state-of-the-art quantitative approaches with qualitative approaches. The ten steps are informed by program theory; learning frameworks and pathways of influence; evaluation design and learning; and spread and sustainability.
Session Title: Partner Roles in a Multi-site Evaluation: The Viewpoints and Experiences of the Cross-site Evaluator and the State Program Coordinator
Panel Session 747 to be held in Lone Star F on Saturday, Nov 13, 10:00 AM to 10:45 AM
Sponsored by the Cluster, Multi-site and Multi-level Evaluation TIG
Chair(s):
Kristin Everett, Western Michigan University, kristin.everett@wmich.edu
Abstract: A two-person panel will share viewpoints and experiences about their roles and responsibilities as the cross-site external evaluator and the program coordinator in a multi-site effort to evaluate a program to improve teacher quality. Session attendees will learn the pros and cons of the evaluation design as experienced in the multi-site evaluation. Additionally, the external evaluation team will describe how it aimed to build evaluation capacity at the local project level and provide evaluation technical assistance to local project directors. The panel will address practical dimensions applicable to other multi-site projects: planning, internal/external communication, evaluation capacity-building, data collection, data analysis and interpretation, technical assistance, reporting procedures, data use, and report preparation. Longitudinal aspects of this six-year cross-site evaluation will also be explored. Sample procedures, instruments, and other materials will be shared. This model can be applied across large or small sets of projects and geographic areas.
| |||
|
Roundtable: Translating Findings Into Client Action
Roundtable Presentation 748 to be held in Mission A on Saturday, Nov 13, 10:00 AM to 10:45 AM
Sponsored by the Independent Consulting TIG
Presenter(s):
Judith Russell, Independent Consultant, jkallickrussell@yahoo.com
Abstract: Do you ever feel that your clients don’t know what to do with the findings from your evaluation? Do you feel that your findings don’t fit within the larger organizational context? Do you want to learn more ways to help your clients translate your findings and recommendations into effective actions? In this roundtable, participants will share tools and methods that encourage clear linkages between an evaluation and concrete actions for improved client performance. The discussion will include program and systems evaluation approaches, with examples of research designs, recommendation frameworks, and client activities that have produced concrete, clear actions based on evaluation findings. Bring your experiences and ideas to the table for a dynamic discussion of a variety of approaches.
Session Title: Evaluation of Interventions and Assessments of Individuals With Disabilities
Multipaper Session 749 to be held in Mission B on Saturday, Nov 13, 10:00 AM to 10:45 AM
Sponsored by the Special Needs Populations TIG and the Pre-K - 12 Educational Evaluation TIG
Chair(s):
Julia Shaftel, University of Kansas, jshaftel@ku.edu
Session Title: Evaluations of Community Nonprofits and New Organizations or Developing Programs: Lessons From the Field
Skill-Building Workshop 750 to be held in Bowie A on Saturday, Nov 13, 10:00 AM to 10:45 AM
Sponsored by the Independent Consulting TIG and the Non-profit and Foundations Evaluation TIG
Presenter(s):
Gary Miron, Western Michigan University, gary.miron@wmich.edu
Kathryn Wilson, Western Michigan University, kathryn.a.wilson@wmich.edu
Michael Kiella, Western Michigan University, mike.kiella@charter.net
Abstract: This skill-building session reviews lessons learned from more than two dozen program evaluations of community nonprofit organizations and relatively new programs within public sector organizations. Specific topics include: (1) conducting evaluation when clients and stakeholders have limited understanding of the purpose and process of evaluation; (2) articulating program logic to guide the evaluation; (3) dealing with misleading information from the client; (4) understanding and addressing pressure from organizations that commission evaluations in order to help sell their program and attract grants; (5) creative means to address or compensate for lack of content expertise; (6) measuring impact; (7) designing and conducting evaluation when programs are still evolving or are in the midst of change; and (8) strategies for conducting evaluation on a shoestring budget.
Session Title: Slow Down, You Move Too Fast: Calibrating Evaluator Engagement to the Pace of Campaigners and Advocates When Developing Theories of Change
Panel Session 751 to be held in Bowie C on Saturday, Nov 13, 10:00 AM to 10:45 AM
Sponsored by the Advocacy and Policy Change TIG
Chair(s):
Mary Sue Smiaroski, Oxfam International, marysue.smiaroski@oxfaminternational.org
Abstract: Many development practitioners, especially those engaged in campaigning and advocacy, are activists with little time for theoretical discussions or in-depth evaluative processes. Yet articulating a theory of change (TOC) with adequate specificity and committing to testing it can assist practitioners in multiple ways: by helping frame real-time strategic reflection in the face of complexity and uncertainty; by making cause-and-effect assumptions explicit so they can be tested, allowing for course correction and better allocation of resources; and by better meeting accountability obligations and garnering external support. Gabrielle Watson and Laura Roper will draw from their extensive experience working with advocates and campaigners at Oxfam and other organizations to share how they have approached the challenge of engaging with high-octane activists to develop stronger evaluative practice. This will be a highly interactive session, and we look forward to learning from participants’ experiences and insights.
| |||
|
Roundtable: Increasing Access Through Openness? Evaluating Open Educational Resources (OER) in Himalayan Community Technology Centres
Roundtable Presentation 752 to be held in Goliad on Saturday, Nov 13, 10:00 AM to 10:45 AM
Sponsored by the Distance Ed. & Other Educational Technologies TIG
Presenter(s):
Tiffany Ivins, Brigham Young University, tiffanyivins@gmail.com
David D Williams, Brigham Young University, david_williams@byu.edu
Randy Davies, Brigham Young University, randy.davies@byu.edu
Shrutee Shrestha, Brigham Young University, shruti722@gmail.com
Abstract: Eliminating various forms of poverty is directly linked to improving opportunities for education in the developing world. Despite this, one-fifth of the world’s population is still denied access to quality educational opportunity (UNESCO, 2006). Effectively disseminating information in developing countries requires a continuous focus on removing obstacles that stand in the way of the right to education (Tomasevski, 2006). This is best achieved through a holistic approach focused on sustainable and context-sensitive literacy programming conducted by locals for locals, with particular regard to tailored content collection and dissemination (Chambers, 2000). Open Educational Resources (OER) offer an opportunity to reach more learners with localized content that is freely available yet inaccessible to those who need it most. This OER evaluation investigates Open Content for Development (OC4D), a nascent educational portal that bridges this knowledge gap through customized ICT tools aimed at reaching more rural poor learners through a cost-effective, sustainable approach.
Roundtable: Use, Ethics, and Not Giving Clients What They Ask For
Roundtable Presentation 753 to be held in San Jacinto on Saturday, Nov 13, 10:00 AM to 10:45 AM
Sponsored by the Evaluation Use TIG
Presenter(s):
Rachael Lawrence, University of Massachusetts, Amherst, rachaellawrence@ymail.com
Sharon Rallis, University of Massachusetts, sharallis@gmail.com
Abstract: In 2009, we were contracted to evaluate degrees of collaboration between two agencies functioning as state contractors to serve adolescents institutionalized in mental health facilities. One agency provides residential treatment for adolescents; the other provides schooling. The final evaluation product was to be a “metric [that the agencies could then use] for measuring the effectiveness of collaboration.” From observations and interviews, we saw that no shared definitions or practices of collaboration existed across sites. Because we believed the agencies could not measure what they could not define, we chose not to deliver the requested metric. What we reported instead provided clarifications and descriptions that stakeholders found more insightful, and thus ultimately more useful, than a set of manufactured and irrelevant metrics. This roundtable discusses and analyzes design, execution, decisions, and resulting use through a framework that draws on the AEA Guiding Principles for Evaluators and various theories of evaluation use.
Session Title: The Impact of Exogenous Factors in Classroom Evaluation
Multipaper Session 754 to be held in Travis A on Saturday, Nov 13, 10:00 AM to 10:45 AM
Sponsored by the Assessment in Higher Education TIG
Chair(s):
Susan Rogers, State University of New York at Albany, susan.rogers.edu@gmail.com
Discussant(s):
Rhoda Risner, United States Army, rhoda.risner@us.army.mil
Session Title: Data Dashboard Design for Quality Monitoring and Decision Making
Demonstration Session 755 to be held in Travis B on Saturday, Nov 13, 10:00 AM to 10:45 AM
Sponsored by the Integrating Technology Into Evaluation TIG
Presenter(s):
Veronica Smith, data2insight, veronicasmith@data2insight.com
Tarek Azzam, Claremont Graduate University, tarek.azzam@cgu.edu
Abstract: While the business intelligence industry has excelled at generating robust technologies that can handle huge repositories of data, little progress has been made in turning that data into knowledge. Data dashboards represent the most recent attempt at turning data into actionable decisions; however, they often do not live up to their potential. This demonstration will provide: 1) a historical overview of the evolution of the dashboard; 2) strengths and limitations of the dashboard as a communication, monitoring, and self-evaluation tool; 3) key dashboard and metric design principles and practices; and 4) real-world examples of dashboards built with different software packages. Participants will leave the demonstration with a clear understanding of appropriate dashboard applications, how this technology can tap the tremendous power of visual perception to communicate, and vetted resources for putting dashboards to work for their stakeholders.
Session Title: Strengthening the Learning Culture Within Organizations and Projects
Think Tank Session 756 to be held in Travis C on Saturday, Nov 13, 10:00 AM to 10:45 AM
Sponsored by the Organizational Learning and Evaluation Capacity Building TIG
Presenter(s):
David Scheie, Touchstone Center for Collaborative Inquiry, dscheie@touchstone-center.com
Nan Kari, Touchstone Center for Collaborative Inquiry, nkari@comcast.net
Discussant(s):
Jessica Shao, Independent Consultant, (415) 889-7084
Scott Hebert, Sustained Impact, shebert@sustainedimpact.com
Ross Velure Roholt, University of Minnesota, rossvr@umn.edu
Abstract: In this Think Tank, participants can explore the challenges and promising strategies that help shift organizational and project cultures to include more room for critical reflection and dialogue, thus strengthening ongoing learning and innovation. Drawing on experiences of evaluation projects in a variety of settings, facilitators will sketch some of the reasons for working to enlarge learning cultures and name pressures that tend to constrain free inquiry. Then small groups will engage in dialogue using the following prompts: What are the conditions in which learning cultures thrive, and how can those conditions be established? What specific practices have you seen open up habits of honest inquiry and reflection? How can a vibrant learning culture be disruptive, and what are the potential risks if evaluation embraces a learning culture approach? What benefits have you seen result from a robust organizational learning culture?
Session Title: Building Capacity in Community-Level Organizations
Multipaper Session 757 to be held in Travis D on Saturday, Nov 13, 10:00 AM to 10:45 AM
Sponsored by the Organizational Learning and Evaluation Capacity Building TIG
Chair(s):
Debbie Zorn, University of Cincinnati, debbie.zorn@uc.edu
Discussant(s):
Rebecca Gajda Woodland, University of Massachusetts, Amherst, rebecca.gajda@educ.umass.edu
Session Title: Evaluation in Medical Education
Multipaper Session 758 to be held in Independence on Saturday, Nov 13, 10:00 AM to 10:45 AM
Sponsored by the Assessment in Higher Education TIG
Chair(s):
Chris Lovato, University of British Columbia, chris.lovato@ubc.ca
Discussant(s):
Linda Lynch, United States Army, sugarboots2000@yahoo.com
Session Title: Guidelines for Independent Consultants/Evaluators Working With Universities: Complying With Federal Funding Source Requirements, Budgets, Contracts, and Other Unique Issues
Demonstration Session 759 to be held in Presidio A on Saturday, Nov 13, 10:00 AM to 10:45 AM
Sponsored by the Independent Consulting TIG and the Government Evaluation TIG
Presenter(s):
Mary Anne Sydlik, Western Michigan University, mary.sydlik@wmich.edu
Abstract: This demonstration will discuss questions an independent consultant may face when asked to evaluate federally funded, university-based programs. First, how do you determine evaluation expectations for the National Science Foundation, the National Institutes of Health, and the U.S. Department of Education? For example, what are the agency-specific evaluation requirements, and what information will need to be contributed to the PI’s annual and final reports? Second, how do agency-specific budget requirements affect what can be charged for the evaluation? Third, which documents does the submitting institution need for each kind of agency grant submission, and what is the timeframe? Information requirements for NSF FastLane submissions are quite different from those for grants.gov submissions. Fourth, guidelines and expectations for preparing and submitting a university-based evaluation project subcontract will be shared. Finally, ideas about how to build lasting relationships with university customers for future collaborative program evaluations will be discussed.
Session Title: Evaluation Capacity Building Through State Affiliates
Panel Session 760 to be held in Presidio B on Saturday, Nov 13, 10:00 AM to 10:45 AM
Sponsored by the Organizational Learning and Evaluation Capacity Building TIG
Chair(s):
Robert Shumer, University of Minnesota, rshumer@umn.edu
Abstract: From GPRA to NCLB, and everything in between, evaluation is demanded by more and more organizations and agencies. To meet that demand, we need to find new ways of teaching, recruiting, and providing professional development for the evaluation field. In this session we discuss the efforts of the Minnesota Evaluation Association, a state affiliate of AEA, to conduct outreach and build capacity for evaluation in all sectors of society. Panelists will discuss how the organization developed a strategic plan and then partnered with the Minnesota Campus Compact to hold regional meetings about evaluation practice. They will then discuss a state-wide study, conducted in partnership with the Minnesota Council of Nonprofits, of all colleges and nonprofits to discover who teaches and conducts evaluation. We conclude by discussing the impact of these efforts on the evaluation field.
| |||
|
Session Title: Evaluation Capacity in School Mental Health: Lessons From School Counseling
Multipaper Session 761 to be held in Presidio C on Saturday, Nov 13, 10:00 AM to 10:45 AM
Sponsored by the Alcohol, Drug Abuse, and Mental Health TIG
Chair(s):
Melissa Maras, University of Missouri, marasme@missouri.edu
Discussant(s):
Paul Flaspohler, Miami University, flaspopd@muohio.edu
Roundtable: Using Site Visits to Improve Programs
Roundtable Presentation 762 to be held in Bonham A on Saturday, Nov 13, 10:00 AM to 10:45 AM
Sponsored by the Non-profit and Foundations Evaluation TIG
Presenter(s):
Emily Hagstrom, Ciurczak & Co Inc, emily@ciurczak.net
Caroline Taggert, Ciurczak & Co Inc, caroline@ciurczak.net
Abstract: Formal site visit reports are an increasingly required component of evaluations. In addition to providing evidence of fidelity of implementation, site visits can be a source of useful feedback and recommendations for program staff. Providing site visit feedback to staff helps programs continuously improve their services and operations. Unannounced site visits, and site visits conducted for the benefit of the client rather than just to check on fidelity of implementation, have unique characteristics. For example, external evaluators can provide feedback on actual program processes prior to official visits from funding agencies. This roundtable presentation will discuss the benefits of conducting site visits and providing feedback to staff, as well as examples of programmatic changes that have resulted from site visits. The presentation will also provide tools and tips for conducting effective site visits. Participants will discuss ethical challenges and concerns that arise from negative observations, and how to address these issues.
Session Title: Control Groups and Cost Analysis: Innovative Approaches to College Access Program Evaluation
Multipaper Session 763 to be held in Bonham B on Saturday, Nov 13, 10:00 AM to 10:45 AM
Sponsored by the College Access Programs TIG
Chair(s):
Kurt Burkum, ACT, kurt.burkum@act.org
Session Title: Promoting Truth and Justice: Evaluation’s Role in Teacher Education Programs for Candidates From Underrepresented Populations
Multipaper Session 764 to be held in Bonham C on Saturday, Nov 13, 10:00 AM to 10:45 AM
Sponsored by the Pre-K - 12 Educational Evaluation TIG
Chair(s):
Morris Lai, University of Hawaii, Manoa, lai@hawaii.edu
Session Title: Evaluating Twenty-First Century Skills
Think Tank Session 765 to be held in Bonham D on Saturday, Nov 13, 10:00 AM to 10:45 AM
Sponsored by the Pre-K - 12 Educational Evaluation TIG
Presenter(s):
Sophia Mansori, Education Development Center, smansori@edc.org
Discussant(s):
Alyssa Na'im, Education Development Center, anaim@edc.org
Abstract: Twenty-first century skills—critical thinking, problem-solving, and creativity, as well as “soft skills” like communication and collaboration—have received increasing attention in recent years as more educational programs and initiatives have included them in curricula and goals. The lack of academic research on how 21st century skills are developed or assessed leaves evaluators to create their own ways of understanding these skills. Presenters will review existing frameworks for documenting and assessing such skills and share methods used in the evaluation of Adobe Youth Voices, an international youth media program that promotes 21st century skills. Participants are invited to share their experiences in evaluating 21st century skills, specifically: 1) How do we measure and assess 21st century skills? 2) How do we understand these skills in the larger context of education and learning? Through facilitated small group discussions, participants will explore successful approaches, strategies, and practices for their own evaluation work.
Session Title: Systems of Evaluation for Diverse National Portfolios of Research: Lessons From Russia and Finland
Multipaper Session 766 to be held in Bonham E on Saturday, Nov 13, 10:00 AM to 10:45 AM
Sponsored by the Research, Technology, and Development Evaluation TIG
Chair(s):
Yelena Thomas, Ministry of Research, Science and Technology, yelena.thomas@morst.govt.nz
Session Title: Demonstrating Results for Federally Funded Programs
Panel Session 767 to be held in Texas A on Saturday, Nov 13, 10:00 AM to 10:45 AM
Sponsored by the Government Evaluation TIG
Chair(s):
Michelle Kobyashi, National Research Center Inc, michelle@n-r-c.com
Abstract: When federal grants fund programs implemented by diverse grantees with diverse approaches, evaluation holds unique challenges. Here we demonstrate how to find the uniform threads among diverse grantees and present a case study in combining multiple evaluation techniques to assess performance. This evaluation work started with the development of common output tracking forms, along with evaluation toolkits and trainings for grantees. The tracking forms collected from grantees are synthesized each year and provide an annual picture of what the group is doing. Interest in broadening the evaluation from tracking outputs (“what we are doing”) to considering outcomes (“what is this changing?”) led to some innovative thinking about how to evaluate complex systems with a performance measurement taxonomy that examines outputs and outcomes within a framework of quality.
| |||
|
Session Title: Where Do Values Enter Into Evaluations?
Multipaper Session 768 to be held in Texas B on Saturday, Nov 13, 10:00 AM to 10:45 AM
Sponsored by the Qualitative Methods TIG
Chair(s):
Leslie Goodyear, National Science Foundation, lgoodyea@nsf.gov
Discussant(s):
Leslie Goodyear, National Science Foundation, lgoodyea@nsf.gov
Session Title: How to Get People to Read Your Research and Take Action on It
Panel Session 769 to be held in Texas C on Saturday, Nov 13, 10:00 AM to 10:45 AM
Sponsored by the
Chair(s):
Susan Parker, Clear Thinking Communications, susan@clearthinkingcommunications.com
Abstract: Evaluation and research reports can provide critical information to the field. But they may instead languish on a bookshelf, read by just a few people, for one simple reason: they are not written or presented in a clear and accessible way. Evaluators have already done the hard work of discovering useful findings. By simply shifting how they think about communicating their work, they can reach a much broader audience of policymakers and others who can use their valuable research. In this session, two communications and research experts will provide practical tips for evaluators to greatly expand the reach of their work. The panelists will provide tools and approaches that evaluators can use so that their reports have more impact on the field. Tools include storytelling, a linguist’s approach to writing clearly, and making the most of social media.
| |||
|
Session Title: Evaluating Contributions to Knowledge Translation for New Technologies or Medical Treatments
Multipaper Session 770 to be held in Texas D on Saturday, Nov 13, 10:00 AM to 10:45 AM
Sponsored by the Research, Technology, and Development Evaluation TIG
Chair(s):
John Reed, Innovologie LLC, jreed@innovologie.com
Session Title: Effectively Communicating Evaluation Results: Creative, Innovative, and Technological Ways to Share Evaluation Findings
Demonstration Session 771 to be held in Texas E on Saturday, Nov 13, 10:00 AM to 10:45 AM
Sponsored by the Evaluation Use TIG and the Integrating Technology Into Evaluation TIG
Presenter(s):
Tiffany Comer Cook, University of Wyoming, tcomer@uwyo.edu
Abstract: Effectively communicating evaluation results is fundamental to ensuring use of evaluation findings. Evaluations can only benefit a program if evaluators communicate their findings clearly and concisely. This demonstration will focus on several web and software applications that have served as communication tools between evaluators and clients. Specifically, the presenters will show attendees highlights of two interactive websites; fact sheets presenting the results of complicated statistical analyses in an easy-to-understand format; and two case management systems that allow users to input, store, and report data. The presenters will walk through these examples to demonstrate how evaluation findings can be communicated effectively and creatively.
Session Title: Quality Evaluation: Drivers and Objectives of the Renewed Canadian Federal Policy on Evaluation
Expert Lecture Session 772 to be held in Texas F on Saturday, Nov 13, 10:00 AM to 10:45 AM
Sponsored by the Government Evaluation TIG
Presenter(s):
Anne Routhier, Treasury Board of Canada, anne.routhier@tbs-sct.gc.ca
Abstract: On April 1, 2009, a renewed Canadian federal Policy on Evaluation (as approved by the Treasury Board of Canada), along with an accompanying Directive and Standard, came into effect. The objective of this policy, which applies to the majority of departments and agencies across the Government of Canada, is to create a comprehensive and reliable base of evaluation evidence that is used to support policy and program improvement, expenditure management, Cabinet decision making, and public reporting. In this presentation, the Senior Director of the Treasury Board of Canada Secretariat’s Centre of Excellence for Evaluation will provide participants with an overview of the drivers and objectives of the renewed policy, with an emphasis on how the policy suite has been designed specifically to address issues observed over the course of several audits and evaluations of the evaluation function.
Session Title: Supporting Evaluation Capacity Building Within the Cooperative Extension System to Impact the Lives of Children, Youth, and Families at Risk
Multipaper Session 773 to be held in Crockett A on Saturday, Nov 13, 10:00 AM to 10:45 AM
Sponsored by the Extension Education Evaluation TIG
Chair(s):
Daniel McDonald, University of Arizona, mcdonald@ag.arizona.edu
Discussant(s):
Roger Rennekamp, Oregon State University, roger.rennekamp@oregonstate.edu
Session Title: Leading the Horse to Water, Part II: Winning the Front-end Needs Assessment Tug-of-War in a Knowledge Management Program Initiative
Think Tank Session 774 to be held in Crockett B on Saturday, Nov 13, 10:00 AM to 10:45 AM
Sponsored by the Business and Industry TIG
Presenter(s):
Thomas Ward, United States Army, thomas.wardii@us.army.mil
Abstract: Quality evaluation of a program initiative is not only enhanced by an initial needs assessment; the needs assessment is fundamental for documenting program success, not to mention for ensuring the program is pointed in the right direction in the first place. “Knowledge management” suffers from being an overused buzzword: what do the organization and its leadership really hope to accomplish? A “knowledge needs assessment” answers that question and suggests priorities for effort. But how do you convince the organization and its leadership to invest time and effort in such an assessment? This think tank begins with a narrative from the perspective of a “KM champion” (who is not a CKO) within a military educational institution and identifies lessons learned in a “3 up / 3 down” format. The think tank will then break into small groups for participants to discuss their own experiences and share their ideas.
Session Title: Longitudinal Social Network Analysis: An Understanding of This Dynamic Network Approach
Expert Lecture Session 775 to be held in Crockett C on Saturday, Nov 13, 10:00 AM to 10:45 AM
Sponsored by the
Chair(s):
Kimberly Fredericks, The Sage Colleges, fredek1@sage.edu
Presenter(s):
Kimberly Fredericks, The Sage Colleges, fredek1@sage.edu
Abstract: In the field of social network analysis, the study of longitudinal networks has been at the forefront as researchers try to capture the dynamic nature of networks. To study such dynamic networks, actor-oriented stochastic models based on continuous-time Markov chains, as well as exponential random graph models, have been used. Dr. Kimberly Fredericks is an expert in social network analysis and its use in evaluation and has published on the topic in such journals as New Directions for Evaluation. In this presentation she will introduce both models as methods to describe and explain the development of interpersonal and inter-organizational networks. The models are applied to longitudinal data to uncover which micro mechanisms (i.e., individual choices) lead to which macro outcomes (i.e., network structures), and how and why these structures change over time. The theory and components of each model will be explored, as well as their analysis and application to program evaluation.
Session Title: Evaluation Challenges in Designing and Implementing a Program Evaluation: The Experience of the Centers for Disease Control and Prevention’s (CDC) Colorectal Cancer Control Program and Prevention IS Care (PIC)
Multipaper Session 776 to be held in Crockett D on Saturday, Nov 13, 10:00 AM to 10:45 AM
Sponsored by the Government Evaluation TIG
Chair(s):
Amy DeGroff, Centers for Disease Control and Prevention, adegroff@cdc.gov
Session Title: Exploring the Role of Software in Qualitative Analysis
Multipaper Session 777 to be held in Seguin B on Saturday, Nov 13, 10:00 AM to 10:45 AM
Sponsored by the Qualitative Methods TIG
Chair(s):
Janet Usinger, University of Nevada, Reno, usingerj@unr.edu
Discussant(s):
Janet Usinger, University of Nevada, Reno, usingerj@unr.edu
Session Title: Simulation Model of Evaluation Biases in Post-Conflict Zones
Demonstration Session 778 to be held in Republic A on Saturday, Nov 13, 10:00 AM to 10:45 AM
Sponsored by the International and Cross-cultural Evaluation TIG
Presenter(s):
Mamadou Sidibe, International Relief & Development, mamadou.sidibe@gmail.com
Abstract: This demonstration explores how optimization tools and techniques can be used to help set normative performance targets in post-conflict settings, where uncertainty and high risk are part of program implementation strategies. It stems from lessons learned in implementing the Community Stabilization Program (CSP) in Iraq by International Relief and Development (IRD) from May 2006 to September 2009. The CSP targeted more than 15 Iraqi governorates; it was designed to stabilize the country and set a path conducive to economic growth with justice and equity among Iraqis. The author uses a mean-variance framework expressed as a certainty equivalent model to derive normative performance targets. These normative measures are compared to the actual program targets to determine evaluation biases in post-conflict zones. Data from Baghdad Province are used in the empirical estimates.
Session Title: Healthy Aging and Health Screenings: Lessons Learned Through Participatory Evaluations
Multipaper Session 779 to be held in Republic B on Saturday, Nov 13, 10:00 AM to 10:45 AM
Sponsored by the Health Evaluation TIG
Chair(s):
Susan M Wolfe, Susan Wolfe and Associates LLC, susan.wolfe@susanwolfeandassociates.net
Discussant(s):
Michael Harnar, Claremont Graduate University, michaelharnar@gmail.com
Session Title: Contextual Influences on the Evaluator, the Evaluation, and the Evaluation Design
Multipaper Session 780 to be held in Republic C on Saturday, Nov 13, 10:00 AM to 10:45 AM
Sponsored by the Theories of Evaluation TIG
Chair(s):
James Griffith, Claremont Graduate University, james.griffith@cgu.edu