| Session Title: Intermediate Consulting Skills: A Self-Help Fair |
| Think Tank Session 102 to be held in Liberty Ballroom Section B on Wednesday, November 7, 4:30 PM to 6:00 PM |
| Sponsored by the Independent Consulting TIG |
| Presenter(s): |
| Robert Hoke, Independent Consultant, robert@roberthoke.com |
| Discussant(s): |
| Mariam Azin, Planning, Research and Evaluation Services Associates Inc, mazin@presassociates.com |
| Victoria Essenmacher, Social Program Evaluators and Consultants Inc, vessenmacher@specassociates.org |
| Maura Harrington, Lodestar Management/Research Inc, mharrington@lmresearch.com |
| James Luther, Luther Consulting LLC, jluther@lutherconsulting.com |
| Emmalou Norland, Institute for Learning Innovation, norland@ilinet.org |
| Geri Peak, Two Gems Consulting Services, geri@twogemsconsulting.com |
| Kathryn Race, Race and Associates Ltd, race_associates@msn.com |
| Dawn Hanson Smart, Clegg & Associates, dsmart@cleggassociates.com |
| Abstract: This skill-building workshop will allow experienced independent evaluation consultants to interact with colleagues with less experience in order to demonstrate and share some of their hard-earned lessons. A series of eight Topic Tables will be set up, each with an experienced Table Leader who is prepared to share information about one consulting topic they enjoy and do well. Every 15 minutes, participants will circulate to a different Topic Table with a different Table Leader. Topics include: "Developing Other Lines of Business"; "How Close is TOO Close? Client/Evaluator Relationships"; "Managing Sub-Contractors"; "Budget Development, Monitoring Cash Flow, and Tracking Expenditures"; "Putting Quality First, Putting Yourself Last"; "Web-based Surveys: How Design Influences Response and Costs"; "Time and Budget Management"; and "Tips for Long-Distance Evaluation." Table Leaders have more than three years' consulting experience. |
| Session Title: Evaluation and Learning in a Changing Landscape: How Changes to First 5 LA's Evaluation Framework are Integrated by Evaluation Contractors |
| Panel Session 103 to be held in Mencken Room on Wednesday, November 7, 4:30 PM to 6:00 PM |
| Sponsored by the Government Evaluation TIG |
| Chair(s): |
| Marc Davidson, First 5 Los Angeles, mdavidson@first5.org |
| Abstract: A panel will present the new and improved First 5 LA Evaluation Framework and the manner in which the revised evaluation principles contained therein have been tackled by two First 5 LA-funded evaluation contractors. A representative from First 5 LA will present the new framework, and the two evaluation contractors will discuss their learning and the impact on evaluation as they have weathered the ever-evolving First 5 LA evaluation framework over the years. First 5 LA was established through Proposition 10, passed by California voters in 1998, which created a 50-cent-per-pack tax on tobacco products that generates approximately $700 million a year to be invested in the healthy development of children from the prenatal period to age 5. First 5 LA is the organization founded in Los Angeles County to disseminate these funds. As the largest of California's 58 counties, Los Angeles receives the majority of Proposition 10 funds from the state. First 5 LA has devoted a significant proportion of its resources to promoting and supporting evaluation of funded programs. The panel will comprise a representative from First 5 LA and funded evaluation firms charged with conducting comprehensive evaluations of three First 5 LA initiatives. The issues to be explored are (a) the process of developing a county-wide vision for evaluation, and (b) realizing the core principles in a concrete manner for the purposes of real-life evaluation work. |
| |||
| |||
|
| Session Title: Community Focus |
| Multipaper Session 104 to be held in Edgar Allan Poe Room on Wednesday, November 7, 4:30 PM to 6:00 PM |
| Sponsored by the Cluster, Multi-site and Multi-level Evaluation TIG |
| Chair(s): |
| Martha Ann Carey, Azusa Pacific University, mcarey@apu.edu |
| Session Title: Awareness and Education: Did You Change Your Behavior This Week? |
| Think Tank Session 105 to be held in Carroll Room on Wednesday, November 7, 4:30 PM to 6:00 PM |
| Sponsored by the Health Evaluation TIG |
| Presenter(s): |
| Anna Kathryn Webb, Catholic Relief Services, awebb@crs.org |
| Discussant(s): |
| Jaime Dominguez, Catholic Relief Services, jdomingu@crs.org |
| Christy Lynch, Partners in Evaluation and Planning, colynch@verizon.net |
| Abstract: The purpose of the session is to discuss the effective measurement of behavior change in awareness and educational programs. In particular, Catholic Relief Services (CRS) seeks to determine the most effective and efficient means (quantitative and qualitative) to measure behavior change in its awareness and educational programs (e.g., Food Fast, a hunger awareness program for youth) in order to assess progress toward its goal of a world "transformed" by global solidarity, specifically, a world in which U.S. Catholics are in relationship with the poor, especially vulnerable groups in the 99 countries where CRS works. The key questions are what, how, when, and why to measure behavior change. The session will include: (a) a 15-minute overview of CRS behavior change indicators; (b) a 45-minute small group exercise on behavior change measurement, including a report out; and (c) a 30-minute plenary discussion on what has been learned. |
| Session Title: International Development Evaluation: Opportunities and Challenges for the Use of the Development Assistance Committee (DAC) Criteria |
| Think Tank Session 106 to be held in Pratt Room, Section A on Wednesday, November 7, 4:30 PM to 6:00 PM |
| Sponsored by the International and Cross-cultural Evaluation TIG |
| Presenter(s): |
| Paul Lamphear, Western Michigan University, paul.a.lamphear@pfizer.com |
| Tererai Trent, Heifer International, tererai.trent@heifer.org |
| Sheri Hudachek, Western Michigan University, sherihudachek@yahoo.com |
| Todd Harcek, Western Michigan University, todd.d.harcek@wmich.edu |
| Ryoh Sasaki, Western Michigan University, ryoh.sasaki@wmich.edu |
| Discussant(s): |
| Thomaz Chianca, Western Michigan University, thomaz.chianca@wmich.edu |
| Ronald Scott Visscher, Western Michigan University, ronald.s.visscher@wmich.edu |
| Krystin Martens, Western Michigan University, krystinm@etr.org |
| Michael Scriven, Western Michigan University, scriven@aol.com |
| Paul Clements, Western Michigan University, paul.clements@wmich.edu |
| Abstract: The Development Assistance Committee (DAC) Evaluation Criteria are probably the most commonly adopted criteria to evaluate development projects. Donor agencies and international non-governmental organizations have adopted these criteria in many parts of the world. The Interdisciplinary PhD in Evaluation at Western Michigan University has created a taskforce to review the DAC criteria and propose improvements to them. Some orienting questions include: Are the DAC criteria the right ones? Are common interpretations and uses of the criteria justified? What criteria, if any, are missing? Should each of the criteria be allocated the same weight and significance? This think-tank session aims to present the findings resulting from the taskforce's effort, get the reaction of two world-class experts in development evaluation, and engage other AEA members in a dialogue to identify ways to improve the quality of evaluation criteria in international development. |
| Session Title: Building Capacity to Strengthen the Evaluation of Safe Start Promising Approaches: An Evidence-based Approach |
| Panel Session 107 to be held in Pratt Room, Section B on Wednesday, November 7, 4:30 PM to 6:00 PM |
| Sponsored by the Alcohol, Drug Abuse, and Mental Health TIG |
| Chair(s): |
| Yvette Lamb, Association for the Study and Development of Community, ylamb@capablecommunity.com |
| Discussant(s): |
| Kristen Kracke, United States Department of Justice, kristen.kracke@usdoj.gov |
| Abstract: While 45 states are implementing evidence-based practices for mental health and substance abuse disorders, little is known about how to effectively implement and evaluate these practices in community settings. The proposed panel will discuss the unique role of the Safe Start Promising Approaches evaluation in building the capacity of sites implementing evidence-based practices to improve the implementation and evaluation of these practices. Safe Start Promising Approaches is the second phase of a multi-phase initiative funded by the Office of Juvenile Justice to identify the best program strategies for reducing the impact of children's exposure to violence. The processes and strategies used to launch the national evaluation, collect the needed data, and ensure successful implementation of interventions will be discussed. The learnings from this panel are likely to inform other cross-site evaluation efforts and thus improve our ability to monitor and evaluate new approaches to helping at-risk children and their families. |
| ||||||||||
| ||||||||||
| ||||||||||
|
| Roundtable Rotation I: Why Do Evaluators Use the Technology They Do, and Why Are They Not a Stronger Factor in Innovating New Technology for Use in Evaluation? |
| Roundtable Presentation 108 to be held in Douglas Boardroom on Wednesday, November 7, 4:30 PM to 6:00 PM |
| Presenter(s): |
| Brian Chantry, Brigham Young University, brian_chantry@byu.edu |
| David Williams, Brigham Young University, david_williams@byu.edu |
| Abstract: In the field of evaluation, technology is implemented in many different ways and to varying degrees. From pencil and paper measures to the use of online data collection software, evaluation is benefiting from technology. While we evaluators are implementing what is being made available to us, it is not apparent that we are a driving force to innovate new technology that will advance the field. Are we content with what we have? Do we feel we cannot be innovators because we are not sure where to begin or what is being done? Or is there a real reason for evaluators to keep their distance from the cutting edge that might impact our abilities to effectively carry out evaluations? This roundtable session will provide an opportunity for participants to engage in discussion on how evaluators are using technology to enhance evaluations and where we might go in the future. |
| Roundtable Rotation II: The Power of Technology: Using Wikis, Blogs, and Online Tools for Evaluation |
| Roundtable Presentation 108 to be held in Douglas Boardroom on Wednesday, November 7, 4:30 PM to 6:00 PM |
| Presenter(s): |
| Cary Johnson, Brigham Young University, cary_johnson@byu.edu |
| Stephen Hulme, Brigham Young University, byusnowboarder@yahoo.com |
| David Williams, Brigham Young University, david_williams@byu.edu |
| Abstract: Online collaboration tools such as blogs and wikis are not only good for collaboration but also for evaluation. Educators typically use these tools with learners for project and paper collaboration, but they are sometimes overlooked as effective evaluation tools. This roundtable will feature a discussion of how these new technologies can be used as tools in evaluation. In addition, participants will discuss how the technologies provide opportunities for students to engage in critical thinking and develop their own evaluation skills. |
| Session Title: Using Evaluative Processes in Foundations: Challenges and Solutions |
| Think Tank Session 109 to be held in Hopkins Room on Wednesday, November 7, 4:30 PM to 6:00 PM |
| Sponsored by the Organizational Learning and Evaluation Capacity Building TIG and the Non-profit and Foundations Evaluation TIG |
| Presenter(s): |
| Astrid Hendricks, The California Endowment, ahendricks@calendow.org |
| Discussant(s): |
| Bill Bickel, University of Pittsburgh, bickel@pitt.edu |
| Catherine Nelson, Independent Consultant, catawsumb@yahoo.com |
| Jennifer Iriti, University of Pittsburgh, iriti+@pitt.edu |
| Abstract: Foundations are uniquely positioned to test new ideas and to distill knowledge from their efforts to inform their own work, and that of the field. Arguably, foundations have the potential to be "centers of learning" if they are systematic, reflective, and transparent in their grant making and strategic operations. Evaluation has an important role to play in supporting learning and knowledge production. Yet the record is far from satisfactory (Kramer & Bickel, 2004). Numerous barriers exist to the use of evaluation in foundations (Bickel, Millett, & Nelson, 2002). Research on barriers (Leviton & Bickel, 2004) will be used to frame discussion of this session's core questions: What barriers to effective use of evaluation to support learning are most prevalent in the experiences of the participants? How are these being overcome? |
| Session Title: Evaluation of Multi-Country Teacher Training Programs and Curriculum Policies |
| Multipaper Session 110 to be held in Peale Room on Wednesday, November 7, 4:30 PM to 6:00 PM |
| Sponsored by the International and Cross-cultural Evaluation TIG |
| Chair(s): |
| Peter Fisch, European Commission, peter.fisch@ec.europa.eu |
| Session Title: Assessment in Higher Education TIG Business Meeting and Presentation: Evaluating Alaska Native-Serving/Native Hawaiian-Serving Institutions and Hispanic Serving Institutions of Higher Education |
| Business Meeting Session 111 to be held in Adams Room on Wednesday, November 7, 4:30 PM to 6:00 PM |
| Sponsored by the Assessment in Higher Education TIG |
| TIG Leader(s): |
| William Rickards, Alverno College, william.rickards@alverno.edu |
| Presenter(s): |
| Henry Doan, United States Department of Agriculture, hdoan@csrees.usda.gov |
| Saleia Afele-Faamuli, United States Department of Agriculture, sfaamuli@csrees.usda.gov |
| Irma Lawrence, United States Department of Agriculture, ilawrence@csrees.usda.gov |
| Discussant(s): |
| Deborah H Kwon, The Ohio State University, kwon.59@osu.edu |
| Abstract: The Cooperative State Research, Education, and Extension Service (CSREES) provides national leadership in identifying, developing, and managing programs that support university-based research, extension, and teaching activities in order to solve nationwide agricultural issues. CSREES provides funding to promote and strengthen the ability of Alaska Native-Serving/Native Hawaiian-Serving Institutions to carry out education, applied research, and related community development programs. The Agency also provides funding to promote and strengthen the ability of Hispanic Serving Institutions to carry out educational programs that attract outstanding students and produce graduates capable of enhancing the Nation's food and agricultural scientific and professional work force. This paper presents a plan to monitor and evaluate the two programs. The plan is based on logic models that delineate the linkages among program components leading to activities, outputs, and expected program outcomes. |
| Roundtable Rotation I: Evolution of a First Year Seminar: Evaluation for Organizational Learning |
| Roundtable Presentation 112 to be held in Jefferson Room on Wednesday, November 7, 4:30 PM to 6:00 PM |
| Presenter(s): |
| Karen M Reid, University of Nevada, Las Vegas, reidk2@unlv.nevada.edu |
| Peggy Perkins, University of Nevada, Las Vegas, peggy.perkins@unlv.edu |
| Amy Morris, University of Nevada, Las Vegas, amy.morris@unlv.nevada.edu |
| Abstract: First-year seminars (FYS) have become a mainstay of higher education, with over 90% of American colleges and universities offering them in some form. Current evidence indicates FYS participation can influence students' successful transition to college, academic performance, and a sizable array of college experiences known to relate to bachelor's degree completion. The purpose of this paper is to investigate issues associated with assessing the impact and outcomes of integrating current national FYS research at a large, urban university. How does adopting a research-based two-hour FYS contribute to students' perception of improvement in the skills and strategies necessary to succeed in college? How does conversion to a three-hour FYS contribute to students' perception of improvement in those skills and strategies? The objective was to assess the effectiveness of these changes in order to derive appropriate program theory changes from the evidence. |
| Roundtable Rotation II: Assessing Student Learning Outcomes: An Examination of a Process That Focuses Upon the Improvement of Teaching and Learning |
| Roundtable Presentation 112 to be held in Jefferson Room on Wednesday, November 7, 4:30 PM to 6:00 PM |
| Presenter(s): |
| Tanis Stewart, University of Nevada, Las Vegas, tanis.stewart@unlv.edu |
| Abstract: Processes implemented at a large university in the southwestern United States to assess student learning outcomes were evaluated in order to examine university assessment practice and determine how well academic units responded to established criteria. Data collected from academic units, including student learning outcomes, curriculum alignment, and assessment methodology were analyzed to evaluate process outcomes. Implications for theory-based process evaluation and learning outcome assessment models are discussed. |
| Session Title: New Evaluation Initiatives on Diabetes Prevention and Childhood Obesity: From the National to the School Level |
| Multipaper Session 113 to be held in Washington Room on Wednesday, November 7, 4:30 PM to 6:00 PM |
| Sponsored by the Health Evaluation TIG |
| Chair(s): |
| Eunice Rodriguez, Stanford University, er23@stanford.edu |
| Session Title: When Leadership Moves From I to We: Evaluating Collective Leadership Development Efforts |
| Multipaper Session 114 to be held in D'Alesandro Room on Wednesday, November 7, 4:30 PM to 6:00 PM |
| Sponsored by the Non-profit and Foundations Evaluation TIG |
| Chair(s): |
| Claire Reinelt, Leadership Learning Community, claire@leadershiplearning.org |
| Discussant(s): |
| Claire Reinelt, Leadership Learning Community, claire@leadershiplearning.org |
| Abstract: This session will focus on what the presenters have learned through evaluation about how collective leadership is fostered, and how collective leadership can build capacity to shift the culture and dynamics of teams, organizations, communities, and systems. This session will feature longitudinal case studies from three unique collective leadership initiatives including a large non-profit healthcare system, a municipal government, and a grant-funded regional economic development effort. Each presentation will focus on the evaluation approach, methods, lessons learned, challenges faced, and impact of each initiative. Then the case studies will be summarized to include what is common across these programs and findings, what is unique, any critical contextual factors, and what can be generalized to other collective leadership programs. It is our intention to finish with recommendations about what critical factors should be considered when designing and implementing an evaluation of collective leadership development. |
| Leadership in the City: How Individuals and Teams Impact a Community |
| Jessica Baltes, Center for Creative Leadership, baltesj@leaders.ccl.org |
| Twenty-three people representing a cross section of employees of various levels from multiple departments within a municipal government received individual leadership training by attending the Center for Creative Leadership's Developing the Strategic Leader open enrollment programs. In the spring of 2005, the municipal government and CCL began an initiative to engage the employees who had taken part in this individual training in a more in-depth group experience of strategic leadership concepts, frameworks, and skills, individualized to the organization's context and goals. The team's challenge was twofold: (1) for the individuals involved to continue their own personal growth as strategic leaders while supporting each other in that growth, and (2) to collectively leverage their learnings from the DSL program in a project of strategic significance for the City. This paper documents the efforts, outcomes, and impact of this team over a two-year period. |
| Improving the Health of the System: A Case Study of Collective Leadership Within Catholic Healthcare Partners |
| Tracy Enright Patterson, Center for Creative Leadership, pattersont@leaders.ccl.org |
| Jennifer Martineau, Center for Creative Leadership, martineauj@leaders.ccl.org |
| Since 2000, the Center for Creative Leadership has collaborated with Catholic Healthcare Partners (CHP), a large US-based, mission-oriented health system to implement a long-term, multi-cohort leadership development initiative for high-potential CHP executives. Unlike traditional programs that focus on individual leadership development, this initiative aims to create an impact on the organization by increasing understanding and focus on the overall system and the mission, fostering work across organizational boundaries, and enhancing the organization's capacity to tackle strategic, complex, and critical issues. Action learning projects play a key role in the process. This paper describes the design and implementation of the evaluation of this initiative including strategies and challenges in measuring organizational impact over time. The evaluation examines changes in participants' work scope and responsibility (e.g., promotions, transfers), level of commitment, and specific competencies as well as system level connections and changes resulting from this effort. |
| Using Social Network Analysis to Evaluate Collective Leadership and Collaboration |
| Emily Hoole, Center for Creative Leadership, hoolee@leaders.ccl.org |
| Kimberly Fredericks, Indiana State University, kfredericks@indstate.edu |
| The Piedmont Region's WIRED initiative is a grant from the federal Department of Labor to create collaborative efforts around regional economic development. One of the critical components of the initiative is a regional leadership development effort. The key goals of the leadership initiative are to develop the system's capacity for open dialogue, development of diverse horizontal collaborative networks, a focus on collective learning, and shared values and culture. All of this is meant to foster a collective ability to set direction, create alignment, and build commitment to compete as a region in the global economy. Social network analysis (SNA) is one of the evaluative tools utilized to assess changes in a system around collaborative networks over time. This presentation will focus on the use of SNA as an evaluative tool to assess such collaborative efforts and will address data collection, measurement issues, and the longitudinal analysis of such data. |
| Session Title: Learning From Alternative Models of Evaluation |
| Multipaper Session 115 to be held in Calhoun Room on Wednesday, November 7, 4:30 PM to 6:00 PM |
| Sponsored by the Quantitative Methods: Theory and Design TIG |
| Chair(s): |
| Sanjeev Sridharan, University of Edinburgh, sanjeev.sridharan@ed.ac.uk |
| Session Title: Providing Meaningful Evaluations for Prevention Projects in Indigenous Communities |
| Panel Session 116 to be held in McKeldon Room on Wednesday, November 7, 4:30 PM to 6:00 PM |
| Sponsored by the Indigenous Peoples in Evaluation TIG |
| Chair(s): |
| Richard Nichols, Colyer Nichols Inc Consulting, colyrnickl@cybermesa.com |
| Discussant(s): |
| Richard Nichols, Colyer Nichols Inc Consulting, colyrnickl@cybermesa.com |
| Abstract: Effective evaluations provide new learning both to programs and to their evaluators. This panel, presented by three experienced evaluators working in three very different indigenous communities, discusses their learnings and challenges in providing meaningful evaluation for violence prevention and substance abuse prevention programs, especially those funded by non-tribal agencies that may have predetermined evaluation requirements. Indigenous evaluators discuss their work with Maori in New Zealand and with American Indian Nations in the northwest and central United States. |
| |||
| |||
|
| Session Title: Evaluation of Organizations as Enterprises: Approaches, Appropriate Outcome Expectations, and Potential Indicators |
| Panel Session 117 to be held in Preston Room on Wednesday, November 7, 4:30 PM to 6:00 PM |
| Sponsored by the Organizational Learning and Evaluation Capacity Building TIG |
| Chair(s): |
| Thomas Chapel, Centers for Disease Control and Prevention, tchapel@cdc.gov |
| Abstract: The ultimate goal of CDC, healthy people in healthy communities through prevention, is realized through joint efforts of CDC, component programs, grantees, and networks of state and local partners that turn national vision into local reality. Increasingly, CDC organizational components (centers, divisions, and programs) are asked to demonstrate how their efforts contribute to the whole. This presents two challenges. First, CDC efforts are often infrastructural and hence removed from the frontline outcomes they are intended to empower. Second, CDC might interact with autonomous participants to achieve distal outcomes, making its role more about rallying these participants in consistent, common efforts than about achievement of final outcomes. Three CDC programs will discuss how they approach evaluation of their efforts. In certain cases, this entails identifying and measuring more proximal outcomes; in other cases, the key has been development of a clear logic model that can provide consistency among approaches of autonomous programs and grantees. |
| ||||||||||
| ||||||||||
|
| Session Title: Putting Context in Context With Examples in Strategic Planning and Measuring Fidelity |
| Multipaper Session 118 to be held in Schaefer Room on Wednesday, November 7, 4:30 PM to 6:00 PM |
| Sponsored by the AEA Conference Committee |
| Chair(s): |
| Cheri Levenson, Cherna Consulting, c.levenson@cox.net |
| Session Title: Evaluating the Teaching of Program Evaluation: Student and Teacher Assessments |
| Panel Session 119 to be held in Calvert Ballroom Salon B on Wednesday, November 7, 4:30 PM to 6:00 PM |
| Sponsored by the Teaching of Evaluation TIG |
| Chair(s): |
| Katherine McDonald, Portland State University, kmcdona@pdx.edu |
| Abstract: Program evaluation has grown significantly since the 1960s (Rossi, Lipsey, & Freeman, 2004). Accordingly, educating new program evaluators is an important aspect of promoting healthy growth of the field. However, we have conducted relatively few evaluations of the effectiveness of opportunities to learn about program evaluation (Trevisan, 2004). In this presentation, we will share findings from an evaluation of graduate students' learning in a program evaluation seminar and generate a vision for critical next steps. We assessed learning through student and teacher judgments of knowledge and skill demonstration. In some instances, we will draw comparisons between students and their peers not enrolled in the seminar and other evaluation novices in professional training opportunities (Stufflebeam & Wingate, 2005). Audience ideas and insights will be solicited throughout the presentation and implications for the field will be discussed interactively. |
| |||
| |||
| |||
| |||
|
| Session Title: Facilitating Fast-paced Learning: Developmental Evaluation for Complex Emergent Innovations |
| Demonstration Session 120 to be held in Calvert Ballroom Salon C on Wednesday, November 7, 4:30 PM to 6:00 PM |
| Sponsored by the Presidential Strand |
| Presenter(s): |
| Michael Quinn Patton, Utilization-Focused Evaluation, mqpatton@prodigy.net |
| Abstract: Innovations being implemented in complex environments under emergent and uncertain conditions present special challenges for evaluation. To be effective in responding to rapidly changing conditions, innovators need to be able to learn quickly. That means evaluators have to be able to gather relevant data rapidly and provide real-time feedback if the findings are to be useful. At the same time, the evaluator is inculcating evaluative thinking into the innovative process, which is its own challenge, because creative innovators are often more intuition-driven than data-driven. Using understandings from systems thinking and complexity science, this session will describe and give examples of an approach to evaluation, Developmental Evaluation (DE), that makes rapid feedback for learning and adaptation the centerpiece of the evaluative process. Learning in DE includes both substantive learning (findings use) and learning to think evaluatively (process use). |
| Session Title: Building Evaluation Capacity Within Organizations |
| Multipaper Session 121 to be held in Calvert Ballroom Salon E on Wednesday, November 7, 4:30 PM to 6:00 PM |
| Sponsored by the Extension Education Evaluation TIG |
| Chair(s): |
| Mary Arnold, Oregon State University, mary.arnold@oregonstate.edu |
| Session Title: When Does Evaluation Not Feel Like Evaluation? Embedding Evaluation Activities Into Programs |
| Panel Session 122 to be held in Fairmont Suite on Wednesday, November 7, 4:30 PM to 6:00 PM |
| Sponsored by the Pre-K - 12 Educational Evaluation TIG |
| Chair(s): |
| Leslie Goodyear, Education Development Center Inc, lgoodyear@edc.org |
| Discussant(s): |
| Sylvia James, National Science Foundation, sjames@nsf.gov |
| Abstract: Embedding evaluation within program activities is a way to encourage programs to engage in ongoing, continuous evaluation. These four presenters, evaluators of projects funded by the National Science Foundation's ITEST (Information Technology Experiences for Students and Teachers) program, will present the ways in which they have worked with projects to embed evaluation within project activities and the learnings, both programmatic and evaluative, that come from their experiences. The chair and discussant for the session will tie together these presentations with information about the ITEST program and the evaluation research work in which these presenters are involved. |
| ||||
| ||||
| ||||
|
| Session Title: Evaluating the Reading First Program: Best Practices and Lessons Learned |
| Panel Session 123 to be held in Federal Hill Suite on Wednesday, November 7, 4:30 PM to 6:00 PM |
| Sponsored by the Pre-K - 12 Educational Evaluation TIG |
| Chair(s): |
| Michael Long, Macro International Inc, michael.c.long@orcmacro.com |
| Discussant(s): |
| Michael Long, Macro International Inc, michael.c.long@orcmacro.com |
| Abstract: The Reading First program is a federal literacy grant that has had a tremendous impact on how reading is taught in elementary schools across the country. This panel will consist of four evaluators of state Reading First programs in Delaware, Indiana, Maryland, and Ohio. Each evaluator will discuss their work in their state, including the types of evaluation questions they are answering, the variables that they are measuring, and the instruments and methods they have found most effective. They will also address the methodological, practical, and political obstacles that they have encountered, and how they have addressed these challenges. |
| Session Title: Closing the Loop: Mapping Value to Inform Research Management | ||||||||||||||||||||||||
| Multipaper Session 124 to be held in Royale Board Room on Wednesday, November 7, 4:30 PM to 6:00 PM | ||||||||||||||||||||||||
| Sponsored by the Research, Technology, and Development Evaluation TIG | ||||||||||||||||||||||||
| Chair(s): | ||||||||||||||||||||||||
| Neville Reeve, European Commission, neville.reeve@ec.europa.eu | ||||||||||||||||||||||||
| Session Title: Practicing Culturally-Based Evaluation: Learnings From the Field | |||
| Panel Session 125 to be held in Royale Conference Foyer on Wednesday, November 7, 4:30 PM to 6:00 PM | |||
| Sponsored by the Multiethnic Issues in Evaluation TIG | |||
| Chair(s): | |||
| Cheryl Blanchette, Harder & Company Community Research, cblanchette@harderco.com | |||
| Abstract: While evaluators agree on the importance of culturally relevant evaluation, there is still a great need to share knowledge and to identify promising practices within the field. In this session, four practicing evaluators will describe specific strategies and lessons learned in this regard based on their experiences conducting evaluations in diverse communities throughout California. Specifically, the panel will: (1) describe their engagement in an intentional and collaborative learning process around culturally-based evaluation; (2) present a checklist approach for systematically thinking through opportunities to increase the cultural relevance of an evaluative undertaking; (3) discuss challenges in using Likert scales with diverse populations and alternative measurement approaches; and (4) present recommendations for improving the quality of translated data collection instruments. The aim of this presentation is to share promising practices and invite discussion with other practitioners in the field. | |||
| Session Title: Collaborative, Participatory and Empowerment Evaluation TIG Business Meeting |
| Business Meeting Session 126 to be held in Hanover Suite B on Wednesday, November 7, 4:30 PM to 6:00 PM |
| Sponsored by the Collaborative, Participatory & Empowerment Evaluation TIG |
| TIG Leader(s): |
| David Fetterman, Stanford University, profdavidf@yahoo.com |
| Liliana Rodriguez-Campos, University of South Florida, lrodriguez@coedu.usf.edu |
| Session Title: Assessing Randomized Control Trials and Alternatives | ||||||||||||||||||||||||
| Multipaper Session 127 to be held in Baltimore Theater on Wednesday, November 7, 4:30 PM to 6:00 PM | ||||||||||||||||||||||||
| Sponsored by the Quantitative Methods: Theory and Design TIG | ||||||||||||||||||||||||
| Chair(s): | ||||||||||||||||||||||||
| James Derzon, Pacific Institute for Research and Evaluation, jderzon@verizon.net | ||||||||||||||||||||||||
| Session Title: Crime, Violence and IRT/Rasch Measurement | |||||||
| Panel Session 128 to be held in International Room on Wednesday, November 7, 4:30 PM to 6:00 PM | |||||||
| Sponsored by the Quantitative Methods: Theory and Design TIG | |||||||
| Chair(s): | |||||||
| Susan Hutchinson, University of Northern Colorado, susan.hutchinson@unco.edu | |||||||
| Discussant(s): | |||||||
| Michael Dennis, Chestnut Health Systems, mdennis@chestnut.org | |||||||
| Abstract: This panel presents a variety of applications of the Rasch measurement model in the assessment of crime and violence (Global Appraisal of Individual Need, Dennis, 2003). Maps of crime and violence items and persons are presented that illustrate the symptom hierarchies on a linear, interval yardstick. Atypical cases are profiled, e.g., those who endorse very little crime and violence except some of the more severe instances such as prostitution, forgery, and rape. Differential item functioning for adults vs adolescents and for men vs women is presented. Each of these examples deals with methods to improve the reliability, validity, and fairness of the measures we use to evaluate programs. | |||||||
| Session Title: Evaluation Methodology in Educational Technology Contexts | |||||||||||||||
| Multipaper Session 129 to be held in Chesapeake Room on Wednesday, November 7, 4:30 PM to 6:00 PM | |||||||||||||||
| Sponsored by the Distance Ed. & Other Educational Technologies TIG | |||||||||||||||
| Chair(s): | |||||||||||||||
| Michael Coe, Northwest Regional Educational Laboratory, coem@nwrel.org | |||||||||||||||
| Session Title: International Perspectives on Evaluation Part 2: Institutionalizing Evaluation in Government |
| Think Tank Session 130 to be held in Versailles Room on Wednesday, November 7, 4:30 PM to 6:00 PM |
| Sponsored by the AEA Conference Committee |
| Presenter(s): |
| Ross Conner, University of California, Irvine, rfconner@uci.edu |
| Abstract: Evaluators from outside the US face some similar and many different issues. Two of the currently-salient 'different issues' will be the focus of two consecutive parts of an IOCE-sponsored think tank session. The second session will focus on new initiatives to institutionalize monitoring and evaluation (or M&E as it is known in many developing countries) in government, in a more integral way and at a higher level than typically occurs in the US. The discussion will focus on recent developments in South Africa and in Niger, West Africa, where, following the African Evaluation Association (AfrEA) 4th Conference there, Nigerien President Mamadou Tandja established a new Ministry of Monitoring and Evaluation (in the US context, this would be similar to establishing a new federal-level department). Representatives from AfrEA will describe and discuss the background and implications for institutionalizing evaluation throughout Africa. |
| Session Title: International Perspectives on Evaluation Part 1: Evaluating the Impact of Development Projects |
| Think Tank Session 130 to be held in Versailles Room on Wednesday, November 7, 4:30 PM to 6:00 PM |
| Sponsored by the AEA Conference Committee |
| Presenter(s): |
| Ross Conner, University of California, Irvine, rfconner@uci.edu |
| Abstract: Evaluators from outside the US face some similar and many different issues in their work. Two of the currently-salient 'different issues' will be the focus of two consecutive parts of an IOCE-sponsored think tank session. The first session will build on a discussion now underway in developing countries about the best ways to assess the impacts and outcomes of development projects. This moderated Q&A session will feature three discussants from different stakeholder groups with a perspective on this issue: indigenous evaluators working in developing countries; donors and others from outside the developing world who support development projects; and evaluators from outside these countries who do international development work and bridge the other two stakeholder groups. There will be no formal presentations; instead, the three discussants will answer questions about the issue posed by the moderator for about 30 minutes. The final 15 minutes will involve questions from the floor. |