Session Title: Learning Practical Knowledge Through the Study of Cases
Panel Session 336 to be held in International Ballroom A on Thursday, November 8, 11:15 AM to 12:45 PM
Sponsored by the Presidential Strand
Chair(s):
Jody Fitzpatrick, University of Colorado, Denver, jody.fitzpatrick@cudenver.edu
Discussant(s):
Tysza Gandha, University of Illinois at Urbana-Champaign, tgandha2@uiuc.edu
Holli Burgon, University of Illinois at Urbana-Champaign, inquireevaluate@gmail.com
Jody Fitzpatrick, University of Colorado, Denver, jody.fitzpatrick@cudenver.edu
Abstract: The panel will discuss practical knowledge and its role in learning and enhancing the practice of evaluation. Cases on ethical dilemmas faced by evaluators will be used to illustrate how students and evaluators can gain practical knowledge of how evaluators handle ethical issues. Case studies have long been a tool for learning. Panelists and discussants will debate their role in increasing practical knowledge and the manner in which they may do so. One discussant will contrast her case studies on practice with those on ethics. Two student discussants will comment on the value of the cases to them in illuminating evaluation practice.
Session Title: Practicing What we Preach: Exploring the Transformative Potential of Evaluation Processes
Multipaper Session 337 to be held in International Ballroom B on Thursday, November 8, 11:15 AM to 12:45 PM
Sponsored by the Multiethnic Issues in Evaluation TIG
Chair(s):
Tanya Brown, Duquesne University, jaderunner98@gmail.com
Rodney Hopson, Duquesne University, hopson@duq.edu
Discussant(s):
Karen Kirkhart, Syracuse University, kirkhart@syr.edu
Stafford Hood, Arizona State University, stafford.hood@asu.edu
Abstract: How do our practices change once we acknowledge that learning within evaluation is dynamic and multi-directional? This question becomes even more pressing when we align it with dominant concerns of social justice and social change in evaluation practice, and the multiple learning capacities within the field. This panel, composed of students of the AEA/DU Graduate Education Diversity Internship, provides accounts of dynamic learning processes that take place over the course of an evaluation. Each paper discusses how the evaluator navigated the evaluation process, with special focus on one of the following: (1) attending to the interpersonal processes between evaluator and stakeholders; (2) utilizing theories of practice that uniquely address the concerns of the evaluation context; and (3) considering how the learning processes of a particular program inform and map onto the evaluation process and the evaluator's development. The discussants lift up the presenters' learning experiences to proffer further lessons on the evaluation process and its transformative potential.
Planting Collaborative Growth: Coalition Building as a Key Element of the Evaluation Process
Nia Davis, University of New Orleans, nkdavis@hlkn.tamu.edu
Since 1991, the United States Department of Justice has implemented Operation Weed and Seed (OWS) in sites across the country with the aim of simultaneously reducing crime and bolstering community development. OWS New Orleans has adopted a coalition structure, composed of representatives of organizations or community groups with a vested interest in the residential area designated as the community of focus. Unique to the coalition structure is the integration of the research and evaluation contacts, who collaborate with community representatives on a regular basis. Critical, then, was attunement to interpersonal relationships among coalition members, trust building, and the development of the community through the OWS initiative. This presentation highlights the evaluation activities employed to build cohesion and commitment to community development among coalition participants. The presenter also parallels this process with her own development as an evaluator.
An Analysis of Organizational Capacity and Research Inquiries: Incorporating Cultural Competence in Evaluation Research Agendas
Milton Ortega, Portland State University, mao@pdx.edu
The literature on cultural competency in evaluation research has grown considerably over the last decade. However, relatively little has been done to implement coherent evaluation practices in accordance with cultural competency. This lack of attention to organizational capacity may be further echoed in the failure of some program evaluations to place cultural competency centrally in research agendas. The pursuit of cultural competence in evaluation research is further constrained by a lack of organizational and methodological approaches. This paper examines the organizational learning capacities of a research evaluation organization in its attempts to incorporate cultural competency in its own evaluation projects. The purpose of this analysis is to provide organizations with recommendations for preparing for research inquiries that promote cultural competency, in the hope that better understanding may be attained.
Illuminating Community Meanings: Utilization of a Narrative Framework to Document Community Change
Josephine Sirineo, University of Michigan, jsirineo@umich.edu
The success of an evaluation is largely dependent on gathering information that accurately depicts a context under investigation. How people make sense of their environment (Weick, 1995) and how they choose to communicate their experiences to others are important concepts for evaluators to recognize throughout the evaluation process. Tobin (2005) notes how storytelling can be an integral component in program evaluation when it is used as a primary or secondary data gathering technique. This paper will present a framework that documents a process of applying the Most Significant Change methodology to a national, multi-site, cluster evaluation. The MSC technique is a systematic process for recording, collecting and analyzing stories around specific themes (Davies and Dart, 2005). Preliminary findings show that storytelling can demonstrate the dynamic nature of learning occurring at different levels (individual, group, and organization) and with varying intensities.
Evaluation of Non-Traditional Approaches for Preventing High School Dropout
Roderick Harris, University of Pittsburgh, rlh1914@yahoo.com
Experiential education is the process of actively engaging students in an authentic experience that will have benefits and consequences. Students make discoveries and experiment with knowledge themselves instead of hearing or reading about the experiences of others. Students also reflect on their experiences, thus developing new skills, new attitudes, and new theories or ways of thinking (Kraft & Sakofs, 1988). This presentation discusses the candid field experience of a novice evaluator who used an experiential approach to learn practical program evaluation within the context of a high school dropout prevention organization, Communities in Schools (CIS). Since 1985, CIS has aimed to help young people successfully transition out of high school and build on their potential. Building on the tenets of experiential education and the aims of CIS, the presenter will outline participatory methods used to evaluate an in-school program and two alternative learning academies.
Session Title: Incorporating Technological Innovations in Data Collection
Multipaper Session 338 to be held in International Ballroom C on Thursday, November 8, 11:15 AM to 12:45 PM
Sponsored by the Qualitative Methods TIG
Chair(s):
Sandra Mathison, University of British Columbia, sandra.mathison@ubc.ca
Discussant(s):
Sandra Mathison, University of British Columbia, sandra.mathison@ubc.ca
Session Title: Models of Evaluation Use and Influence in Social and Educational Services
Multipaper Session 339 to be held in International Ballroom D on Thursday, November 8, 11:15 AM to 12:45 PM
Sponsored by the Evaluation Use TIG
Chair(s):
Dennis Affholter, Affholter and Associates, thedpa@yahoo.com
Session Title: Evaluation Training: Developing Professionals
Multipaper Session 340 to be held in International Ballroom E on Thursday, November 8, 11:15 AM to 12:45 PM
Sponsored by the Graduate Student and New Evaluator TIG
Chair(s):
Chris Coryn, Western Michigan University, christian.coryn@wmich.edu
Session Title: Evaluation Capacity Building Unplugged
Think Tank Session 341 to be held in Liberty Ballroom Section A on Thursday, November 8, 11:15 AM to 12:45 PM
Sponsored by the Organizational Learning and Evaluation Capacity Building TIG
Presenter(s):
Hallie Preskill, Claremont Graduate University, hallie.preskill@cgu.edu
Shanelle Boyle, Claremont Graduate University, shanelle.boyle@gmail.com
Abstract: In spite of the many evaluation capacity building (ECB) efforts that are underway around the world, there is little empirical research that guides evaluators in their design and implementation of such activities. For example, few have written about the linkages between ECB and adult and workplace learning theory, or the theories and practices of organizational learning. In addition, few have offered a typology of strategies and their appropriate uses, or provided evaluation data on the effectiveness of various capacity building strategies. In this session, participants will: 1) be engaged in developing a logic model of evaluation capacity building, and 2) be asked to review and critique a draft of a new evaluation capacity building conceptual framework. Our hope is that participants will leave the session with new insights about ECB and practical ideas for maximizing the ways in which they work to develop others' evaluation capacity.
Session Title: Professional Status for Evaluators: Canadian and American Views
Panel Session 342 to be held in Liberty Ballroom Section B on Thursday, November 8, 11:15 AM to 12:45 PM
Sponsored by the AEA Conference Committee
Chair(s):
Gerald Halpern, Fair Findings Inc, gerald@fairfindings.com
Abstract: The American Evaluation Association does not award professional designations in evaluation; the Canadian Evaluation Society is seriously considering developing and implementing such a system. This session describes why the Canadian Evaluation Society is moving in this direction and how it would expect to achieve professional status for evaluators in the near future. Professional designations in Canada would have implications for practice in the United States and elsewhere. The process being followed in Canada may have utility for professional associations of evaluators in other countries. The Canadian experience will be examined and critiqued by two American evaluators experienced with the issue. Discussion from non-panel participants will be encouraged, and significant time will be reserved for this purpose.
Session Title: Exploring the Implications of the Administration on Aging's Performance Outcomes Measures Project for Evaluators
Panel Session 343 to be held in Mencken Room on Thursday, November 8, 11:15 AM to 12:45 PM
Sponsored by the Government Evaluation TIG
Chair(s):
Patricia Yee, Vital Research, LLC, patyee@vitalresearch.com
Discussant(s):
Melanie Hwalek, Social Program Evaluators and Consultants Inc, mhwalek@specassociates.org
Abstract: This panel will investigate the federal government's framework for measuring outcomes of social services for the aging and what it means for local evaluators. The first presenter will provide an historical overview and summary of the current research on the Administration on Aging's (AoA) core set of performance measures for state and community programs on aging operating under the Older Americans Act. Then, two presenters will discuss their own evaluations of programs in aging: (1) a utilization-focused evaluation in senior affordable service-enriched housing and (2) a parenting grandparent caregivers program. In addition to describing their evaluations, the two presenters will examine the extent to which their outcomes relate to the Performance Outcome Measures Project (POMP) of AoA. The discussant will facilitate audience feedback about ways that local evaluators could link up with the performance measures AoA is using to build systems of accountability for social programs in aging.
Session Title: Evaluation in Education
Multipaper Session 344 to be held in Edgar Allen Poe Room on Thursday, November 8, 11:15 AM to 12:45 PM
Sponsored by the Cluster, Multi-site and Multi-level Evaluation TIG
Chair(s):
Rene Lavinghouze, Centers for Disease Control and Prevention, shl3@cdc.gov
Session Title: Foundation Policy Change Efforts: Internal and External Evaluation Strategies
Multipaper Session 345 to be held in Carroll Room on Thursday, November 8, 11:15 AM to 12:45 PM
Sponsored by the Advocacy and Policy Change TIG
Chair(s):
Claire Brindis, University of California, San Francisco, claire.brindis@ucsf.edu
Session Title: Conducting Large Scale Evaluations of Federal Cancer Control Programs
Panel Session 346 to be held in Pratt Room, Section A on Thursday, November 8, 11:15 AM to 12:45 PM
Sponsored by the Health Evaluation TIG
Chair(s):
Lenora Johnson, National Institutes of Health, johnslen@mail.nih.gov
Abstract: With decreasing funding, there is a greater need for federal programs to demonstrate their effectiveness. To aid in conducting federal evaluations, the government has instituted programs like HHS's 1% Evaluation Set-Aside program, which offsets the cost of evaluation and provides opportunities for capacity-building. However, challenges remain. GAO documents several barriers in its case study report on the assessment of information dissemination, including: variations in local-level program implementation; assessing the impact of multi-media programs; observing delayed outcomes; reliance on self-report; and accounting for all factors contributing to behavior change. Despite these barriers, agencies are expected to implement rigorous evaluations that deliver reliable, timely, and useful results. In this session, three examples of ongoing, large-scale evaluations of federal cancer control programs will be presented. Authors will share methodologies and findings, including challenges and strategies to address them. A discussion of how best to meet the challenges associated with conducting these evaluations will follow.
Session Title: Evaluating the Effectiveness of Community Prevention Coalitions: An Interim Report on the Evaluation of the Drug-free Communities Support Program
Panel Session 347 to be held in Pratt Room, Section B on Thursday, November 8, 11:15 AM to 12:45 PM
Sponsored by the Alcohol, Drug Abuse, and Mental Health TIG
Chair(s):
David Chavis, Association for the Study and Development of Community, dchavis@capablecommunity.com
Discussant(s):
Kenneth Shapiro, Office of National Drug Control Policy, kshapiro@ondcp.eop.gov
Abstract: While the use of coalitions to prevent disease and promote health has been popular for many years, the evaluations of such initiatives are extremely challenging. The proposed panel will discuss how the program evaluation of the Drug-Free Communities Support Program addresses some of the unique challenges of evaluating community prevention coalitions through the use of a typology that captures how coalitions develop or mature over time and an innovative statistical technique to compare communities with and without Drug-Free Community Coalitions. In addition to using innovative methodology, this panel will present preliminary findings about the capacities and characteristics of coalitions that are making positive public health changes in their community. These preliminary findings significantly improve our field's understanding of how coalitions can facilitate community change to reduce substance abuse. Finally, the panel will conclude with a discussion of how this evaluation has addressed the challenges associated with using self-reported data.
Roundtable Rotation I: A Time Sequencing Evaluation Technique for Exercise Evaluation
Roundtable Presentation 348 to be held in Douglas Boardroom on Thursday, November 8, 11:15 AM to 12:45 PM
Presenter(s):
Lisle Hites, Tulane University, lhites@uab.edu
Abstract: Evaluation of drills and exercises typically consists of developing exercise objectives into checklists of measurable and observable items or events. However, such evaluation techniques are ill suited to gathering and quantifying less predictable facets of exercise outcomes and effectiveness. By focusing exclusively on pre-identified exercise objectives, many aspects of response effectiveness data may be overlooked. This presentation will describe an exercise evaluation technique that used time sequencing to assess the effectiveness of emergency response in a series of multidisciplinary simulated avian influenza outbreaks. Through use of this technique, assessment of this rich data set resulted in the identification of many different and unexpected insights into emergency response effectiveness.
Roundtable Rotation II: Linking Monitoring, Evaluation and Internal Audit in International Emergency Response to Increase Effectiveness
Roundtable Presentation 348 to be held in Douglas Boardroom on Thursday, November 8, 11:15 AM to 12:45 PM
Presenter(s):
Jason Ackerman, Catholic Relief Services, jackerma@crs.org
Carlisle Levine, Catholic Relief Services, clevine@crs.org
Stuart Belle, World Vision International, stuart_belle@wvi.org
Alex Causton, Catholic Relief Services, acauston@crspk.org
Abstract: The international NGO community's ability to leverage the organizational learning generated by monitoring and evaluating emergency responses, from the Rwanda genocide to the Pakistan earthquake, is mixed. The roundtable presenters suggest that collaboration between NGO internal audit and M&E practitioners will increase the likelihood that M&E recommendations lead to long-term, positive emergency response outcomes. Between the M&E and internal audit functions, a broad array of skill sets, knowledge, and capabilities is available, all of which can be more effectively deployed before, during, and after an emergency response in order to increase intervention effectiveness. The roundtable discussion will use the international NGO response to the Pakistan earthquake to highlight M&E successes and challenges associated with improving emergency response outcomes. Those highlights include: internal audit's compliance authority reinforces M&E recommendations; collaborate, don't duplicate; and, to be mutually successful, M&E and internal audit need a common vocabulary and understanding in their assessment approaches.
Session Title: Exchange Outcome Assessment Linkage System (E-GOALS): A United States Department of State Web-Based Approach to Assessing the Performance of International Educational Programs
Panel Session 349 to be held in Hopkins Room on Thursday, November 8, 11:15 AM to 12:45 PM
Sponsored by the Integrating Technology Into Evaluation TIG
Chair(s):
Cheryl Cook, United States Department of State, cookcl@state.gov
Abstract: The Bureau of Educational and Cultural Affairs' (ECA) Office of Policy and Evaluation within the U.S. Department of State is tasked with assessing the performance of its exchange programs. The office has developed a logic model that organizes these diverse programmatic activities and objectives into nine measurable outcomes. These outcomes are then measured using performance indicators through customized surveys delivered via an online system called E-GOALS. The system has the capacity to deliver web-based surveys in multiple languages. The panel will share its learning experiences in combining performance measurement and system delivery. Specifically, the panel will: (1) provide a summary of the E-GOALS survey system and its development; (2) outline the nine bureau-level performance outcomes; (3) discuss the construction of the nine indicators that are based on the outcomes; (4) demonstrate examples of our pre, post, and follow-up templates; and (5) highlight critical features, e.g., the multi-language database.
Session Title: Building a Framework for Public Diplomacy Evaluations: Lessons Learned and Best Practices in Public Diplomacy Evaluation
Panel Session 350 to be held in Peale Room on Thursday, November 8, 11:15 AM to 12:45 PM
Sponsored by the International and Cross-cultural Evaluation TIG
Chair(s):
Melinda Crowley, United States Department of State, crowleyml@state.gov
Discussant(s):
Norma Fleischman, United States Department of State, fleischmanns@state.gov
Abstract: This panel discusses three pilot evaluation studies launched in FY'06 by the newly created Public Diplomacy Evaluation Office (PDEO), U.S. Department of State. PDEO combines the evaluation staffs of the Bureau of Educational and Cultural Affairs (ECA), the Bureau of International Information Programs (IIP), and the Office of Policy, Planning and Resources in the Office of the Under Secretary for Public Diplomacy and Public Affairs (R/PPR). PDEO promotes opportunities for organizational learning, and a nimble structure to transform recommendations into actionable program improvements. The panel focuses on three projects implemented through PDEO. The first presentation involves the American Corners program, a partnership between U.S. Embassies and foreign host institutions, usually public or university libraries. The second presentation involves the Strategic Media Outreach Performance Assessment (SMOPA), one of several projects that measure and assess the effectiveness of U.S. Embassy public diplomacy. The third presentation focuses on the Mission Activity Tracker (MAT), a global tool for tracking public diplomacy outreach at U.S. Embassies.
Session Title: Macro-level and Micro-level Methodologies for Evaluating Education System Functioning in Afghanistan
Multipaper Session 351 to be held in Adams Room on Thursday, November 8, 11:15 AM to 12:45 PM
Sponsored by the International and Cross-cultural Evaluation TIG
Chair(s):
Edward Kissam, JBS International Inc, ekissam@jbsinternational.com
Discussant(s):
Roger Rasnake, JBS International Inc, rrasnake@jbsinternational.com
Jo Ann Intili, JBS International Inc, jintili@jbsinternational.com
Abstract: This session examines the evaluation research toolkit necessary to effectively track initiatives for strengthening education systems in developing countries, using Afghanistan as a case study. The presentations draw on panelists' analyses of two macro-level datasets (the 2005 Afghanistan National School Survey and the 2003 NRVA), on their experience conducting micro-level community case studies in remote rural areas of the country, and on their ongoing ethnographic research in a demonstration cluster school initiative. The presentations will show that the micro-level research provides crucial supplementation to the national assessment using UNESCO's EFA framework in order to guide effective education system reform. Recommendations will be presented regarding the types of capacity-building needed to assure reliable research, data collection, and analysis.
Challenges in Interpreting National Survey Data on Education: Moving From Summary Tabulation to Practical Action
Craig Naumann, JBS International Inc, cnaumann@jbsinternational.com
Shannon Williams, JBS International Inc, swilliams@jbsinternational.com
Edward Kissam, JBS International Inc, ekissam@jbsinternational.com
The presenters collaborated in detailed analyses of data from Afghanistan's Ministry of Education's 2005 National School Survey. These differed from previous analyses in that efforts were made to clean a dataset generated with limited resources under difficult data collection conditions, and to cross-tabulate key variables rather than simply generating national-level indicators of system status. The presenters will describe key findings from these analyses and their implications for assessing Afghanistan's progress in rebuilding an education system devastated by years of conflict. The discussion will include strategies to monitor and respond to student dropout, and teacher training initiatives to respond to dramatic variations from province to province in teacher qualifications, school size, and range of instruction provided. We will also present the team's recommendations for improved school survey design and practical issues to be addressed in strengthening the applied research capacity to reliably monitor national progress in education system reconstruction.
From Ritual Flowchart to Complexities of Real-world Action: Understanding the Local Community Context of School Functioning as an Element of Formative Evaluation
Mohammad Javad Ahmadi, Creative Associates International Inc, mohammadj@af.caii.com
Bianca Murray, JBS International Inc, bmurray@jbsinternational.com
Afghanistan's centralized command-and-control education system faces challenges in its efforts to effectively impact local instruction and student outcomes in a rural country with little infrastructure. Decentralization has been an important strand in the strategic planning of both international donors and the Ministry of Education. However, implementation of this macro-level strategy is problematic due to a lack of information on variations in local conditions. This results in reliance on 'cookie-cutter models' for school administrator and teacher training and for overall systemic change. The presenters describe their formative evaluation research in support of the first phase of a national initiative to strengthen local schools and the quality of instruction via parallel training for school management teams and teachers. Evidence is presented that attention to variations in local conditions, to local problem-solving strategies, to local perspectives on educational objectives, and to different sorts of local resources contributes significantly to the design of promising decentralization initiatives.
Capacity-Building Challenges, Requirements, and Strategies for Strengthening National Education Systems' Evaluation Research Capacity
Trish Hernandez, JBS International Inc, thernandez@jbsinternational.com
Shannon Williams, JBS International Inc, swilliams@jbsinternational.com
Craig Naumann, JBS International Inc, cnaumann@jbsinternational.com
The authors describe and discuss the practical challenges inherent in efforts to develop the technical capacity of Afghanistan's Ministry of Education to conduct the applied research needed to effectively monitor progress in education system reconstruction and evaluate ongoing strategic initiatives. The discussion includes attention to the specific challenges and types of solutions needed to address problems at each stage in the evaluation research process: developing a baseline profile of system status, systematically identifying research priorities, formulating efficient and viable research strategies, creating workable sampling approaches in the absence of adequate sampling frames, generating and piloting study instruments, collecting, managing, cleaning, and analyzing study data, and reporting findings to decision-makers.
Roundtable Rotation I: Authentic Demand and Sustainable Community Change: Testing a Theory and Making the Case
Roundtable Presentation 352 to be held in Jefferson Room on Thursday, November 8, 11:15 AM to 12:45 PM
Presenter(s):
Audrey Jordan, Annie E Casey Foundation, ajordan@aecf.org
Mary Achatz, Westat, achatzm1@westat.com
Thomas Kelly, Annie E Casey Foundation, tkelly@aecf.org
Abstract: Making Connections is a shared effort by the Annie E. Casey Foundation, residents, organizational and systems partners, employers, and others to achieve measurable and sustainable improvements in the life chances of children and families in tough neighborhoods in 10 mid-size cities. The effort began in late 1999 to early 2000 with a broad theory that articulated the field's best thinking about what it would take to do this. Each community then adapted the elements of this theory to develop and sequence strategies in ways that built on local history and context and addressed local needs and priorities. This approach is yielding an increasingly robust and testable theory of community mobilization for action and results. This roundtable will focus on what we are calling authentic demand: an emergent area of learning about how resident leadership, civic engagement, community organizing, and social network strategies, individually and in combination, contribute to the development of new partnerships with local government, funders, service providers, schools, and businesses that work to improve outcomes for families and children.
Roundtable Rotation II: Maximizing Learning From Evaluation Findings for Diverse Stakeholders in a Community Capacity-building Initiative
Roundtable Presentation 352 to be held in Jefferson Room on Thursday, November 8, 11:15 AM to 12:45 PM
Presenter(s):
Liz Maker, Alameda County Public Health Department, liz.maker@acgov.org
Mia Luluquisen, Alameda County Public Health Department, mia.luluquisen@acgov.org
Tammy Lee, Alameda County Public Health Department, tammy.lee@acgov.org
Kim Gilhuly, University of California, inertiate@yahoo.com
Abstract: Evaluators working in community capacity-building (CCB) initiatives face the challenge of meeting the multiple interests of stakeholders involved in implementing these complex projects. CCB interventions strive to improve a community's health and wellbeing by strengthening residents' leadership skills and relationships with policy makers. CCB interventions also require multi-level strategies aimed at changing individual behaviors, group relationships, social environments, and power structures. When conducting evaluations in CCB initiatives, evaluators must balance the competing interests of a wide range of stakeholders, including community residents, organizers, funders, and decision-makers. For example, residents and organizers may be particularly interested in "telling people's stories" about neighborhood change. Decision-makers may want to focus on measurable changes in health and social outcomes. This roundtable will allow for sharing experiences and lessons learned in Oakland, California, on balancing the interests of stakeholders, followed by a dialogue about maximizing learning from evaluation findings with diverse stakeholders in CCB evaluations.
| Session Title: Evaluating a State Comprehensive Cancer Control Program: Planning, Implementation and Initial Results |
| Panel Session 353 to be held in Washington Room on Thursday, November 8, 11:15 AM to 12:45 PM |
| Sponsored by the Health Evaluation TIG |
| Chair(s): |
| Lisa Stephens, National Cancer Institute, stephens.lisa@mayo.edu |
| Abstract: The increase in collaboratives to address large-scale chronic health issues warrants improved evaluation methods. Effective collaborations have been shown to include key components regarding leadership, capacity building, synergy, partnership, and member satisfaction. Because measurable outcomes are often long-term, formative evaluations that measure collaborative functioning can inform stakeholders on areas requiring improvement in order to reach long-term goals. This presentation will discuss the Minnesota Cancer Alliance's (MCA) internal evaluation of its first two years of partnership. The presentation will include background on comprehensive cancer control in Minnesota, the use of an internal, volunteer committee structure for developing and guiding evaluation activities, the utility of theoretical underpinnings, formative evaluation results, and sharing recommendations with stakeholders. |
| Session Title: The Contribution of Evaluation to Building the Capacity of Indigenous, Not for Profit Organizations in New Zealand: Implementation of the Child, Youth and Family Provider Development Fund |
| Multipaper Session 354 to be held in D'Alesandro Room on Thursday, November 8, 11:15 AM to 12:45 PM |
| Sponsored by the Non-profit and Foundations Evaluation TIG |
| Discussant(s): |
| Kate McKegg, The Knowledge Institute Ltd, kate.mckegg@xtra.co.nz |
| Abstract: In 2000, the New Zealand government allocated funding to the Department of Child, Youth and Family Services to work with iwi (tribal) and Maori organizations to strengthen their capacity to (1) deliver government programmes and services, and (2) develop their own programmes and services to meet local needs. These papers provide background on the evidence-based development processes and shared 'learnings' that have informed and built a community of practice between the funder and the evaluators to support the ongoing implementation of this capacity building fund. The papers focus on evaluation utility and on how evaluation and evaluation methods have contributed to the policy process, fund administration, and training development and delivery, and discuss the critical attributes and factors that have supported this relationship from 2001 to the present day. |
| Taking the Time and Building the Relationship: The Approach Taken to the Design and Implementation of the Iwi and Maori Provider Workforce and Development Fund Evaluation |
| Nan Wehipeihana, Research Evaluation Consultancy Ltd, nanw@clear.net.nz |
| The paper explores the importance of strong, trusting and respectful relationships within evaluation. It describes how working with the evaluation sponsor over a period of six months, to gain an in-depth understanding of the aims, intent and operation of the fund prior to the development of an evaluation design, built trust and confidence in the evaluators. It shows how, on the strength of that relationship, funding was made available for the collective development of the evaluation approach and design, involving all eight members of the evaluation team in a series of workshops and planning meetings; how the evaluators reciprocated by providing feedback and data, in advance of final reports, to support decision-making on the ongoing management and implementation of the fund; and how the relationship (and the quality of the evaluation outputs) has supported the ongoing involvement of the evaluators from 2001 to 2007, and an invitation to continue that involvement until 2009. |
| Utilizing Evaluation in the Ongoing Implementation of the Iwi Maori Provider Development Fund |
| Sonya Cameron, Department of Child, Youth and Family Services, sonya.cameron006@cyf.govt.nz |
| This paper discusses how evaluation has contributed to the implementation and strategic direction of the IMPDF: how the literature review and evaluation framework have been utilized since the evaluation, and how the evaluation findings contributed to changes in the funding application process, to the nature of support provided to provider organizations, and to the range of development activities that could be funded. One of the most significant contributions has been the development of an organizational capacity self-assessment tool by the evaluators. The application of that tool has greatly enhanced the ability of providers to identify their own needs and plan their own development; its value lies both in its use as an assessment tool and as a capacity building activity in its own right. The paper concludes by discussing the potential contribution of the evaluation to building the capacity of the wider social services and voluntary sector in New Zealand. |
| The Contribution of Evaluation to Building the Capacity of Iwi and Maori Social Service, Not-For-Profit Provider Organizations |
| Miri Rawiri, Department of Child, Youth and Family Services, miri.rawiri004@cyf.govt.nz |
| Provider self-determination and sustainable development were key messages that arose out of the Iwi and Maori Provider Workforce Development Fund (IMPDF) evaluation. This paper explores how the development of an organizational capacity self-assessment tool (and process) has supported providers to be self-determining and how the capacity building activities have contributed to organizational sustainability. It describes how providers have engaged with the process, the benefits and capacity gains they report to date (after only two years of use) and how they have used the self-assessment tool, information and processes - in ways unimagined by the evaluators - to support the ongoing operation of their organizations. |
| Building an Evidence Base to Support the Sustainability of Iwi and Maori Social Service Provider Organizations and the Development of Cultural Practice Models |
| Fiona Cram, Katoa Ltd, fionac@katoa.net.nz |
| The paper explores the contribution of the various evaluation outputs (evaluation reports, two literature reviews, the organizational capacity self-assessment tool and processes, and programme logic training) to building the organizational capacity of iwi and Maori social service providers. The paper then explores how these capacity building activities have assisted providers to better articulate the cultural basis from which they work: their values, rationale and ways of working. As a result, iwi and Maori providers are better able to document and describe why they do what they do and how it contributes to outcomes, which helps to bridge the knowledge gap around cultural practice models among some funders. |
| Session Title: Evaluation Reports: Reframing the Concept for the Real World |
| Panel Session 355 to be held in Calhoun Room on Thursday, November 8, 11:15 AM to 12:45 PM |
| Sponsored by the Non-profit and Foundations Evaluation TIG |
| Chair(s): |
| Zoe Clayson, Abundantia Consulting, zoeclay@abundantia.net |
| Discussant(s): |
| Gale Berkowitz, Packard Foundation, gberkowitz@packard.org |
| Abstract: Foundations use evaluation reporting for a variety of purposes, e.g. resource allocation, internal reflection, external communication, and responding to Board of Directors requests. This panel will explore, from the foundation's and the evaluators' perspectives, the expectations for reporting and the approaches found most useful for decision makers. Each of the panelists will explore this topic from their particular area of expertise. Patricia Patrizi will focus on how foundations have traditionally used reports for internal reflection, the status of the field now, and thoughts regarding future directions. John Nash will discuss the role of diagrammatic reporting in helping foundations move along the continuum from desired change to strategy. Zoe Clayson brings further depth to the conversation by presenting a web-based approach moving from strategy to management, non-profit grantee implementation, and communications. Finally, Gale Berkowitz will discuss the three presentations from the perspective of the needs of today's foundation decision makers. |
| Session Title: Reflections and Recommendations Concerning Culturally Competent Evaluation |
| Think Tank Session 356 to be held in McKeldon Room on Thursday, November 8, 11:15 AM to 12:45 PM |
| Sponsored by the Indigenous Peoples in Evaluation TIG |
| Presenter(s): |
| Arthur Hernandez, University of Texas, San Antonio, art.hernandez@utsa.edu |
| Julie Desjarlais, Turtle Mountain Community College, jdesjarlais@tm.edu |
| Heyda Martinez, SUAGM, heyd_martinez@yahoo.com |
| Ana Marie Pazos-Rego, University of Miami, apazosrego@aol.com |
| Iris Prettypaint, University of Montana, iris.prettypaint@mso.umt.edu |
| Delia J Valles-Rosales, New Mexico State University, dvalles@nmsu.edu |
| Elizabeth Yellowbird, University of North Dakota, elizabeth.demaray@und.nodak.edu |
| JoAnn W L Yuen, University of Hawaii, Manoa, joyuen@hawaii.edu |
| Abstract: The session will provide an opportunity for session leaders and facilitators to present progress resulting from participation in the National Science Foundation - QEM (Quality Education for Minorities Network) Broadening Participation Initiative for Minority Serving Institution faculty, and for attendees to learn from and contribute to these ongoing efforts. Each panelist will briefly discuss the resulting development and operationalization of concepts and activities related to Culturally Competent Evaluation in teaching, research and service, focusing on diverse communities, evaluators in training, and particular institutional applications and implications (key question). Groups will be formed around specific academic and professional applications, and attendees will discuss, elaborate, and make suggestions related to their own interests and expertise. The panel will collect and organize the proceedings, which will be emailed to all involved, in an effort to facilitate self-examination and further development of the evaluation skills and implementation strategies of all participants and attendees. |
| Session Title: Reflective Inquiry Into Learning Through Evaluation Practice |
| Panel Session 357 to be held in Preston Room on Thursday, November 8, 11:15 AM to 12:45 PM |
| Sponsored by the Organizational Learning and Evaluation Capacity Building TIG |
| Chair(s): |
| Daniel Folkman, University of Wisconsin, Milwaukee, folkman@uwm.edu |
| Abstract: The theme for this year's AEA conference is evaluation and learning. The proposed panel presentation will provide three examples of evaluation practice that employ a participatory action research (PAR) approach to program development and assessment. The panel members are developing a framework to assess the immediate and long-term learnings that evolve from their PAR strategies and will share preliminary findings from longitudinal case studies that are being compiled as part of a larger study. The panel session will encourage discussion and contributions from the audience aimed at eliciting concrete examples of how evaluation practitioners recognize and/or assess the learning, flowing from their evaluation practice, that occurs among themselves and program stakeholders. |
| Session Title: Sharing, Defining Ethics, and Reflections on Training |
| Multipaper Session 358 to be held in Schaefer Room on Thursday, November 8, 11:15 AM to 12:45 PM |
| Sponsored by the AEA Conference Committee |
| Chair(s): |
| Cheri Levenson, Cherna Consulting, c.levenson@cox.net |
| Session Title: State and Local Public Health Emergency Preparedness: Evaluation at the Centers for Disease Control and Prevention Expands Focus on Capacities to Include Outcomes |
| Panel Session 359 to be held in Calvert Ballroom Salon B on Thursday, November 8, 11:15 AM to 12:45 PM |
| Sponsored by the Disaster and Emergency Management Evaluation TIG |
| Chair(s): |
| Craig Thomas, Centers for Disease Control and Prevention, cht2@cdc.gov |
| Discussant(s): |
| Edward Liebow, Battelle Centers for Public Health Research and Evaluation, liebowe@battelle.org |
| Abstract: State and local preparedness for public health emergencies is supported by the Centers for Disease Control and Prevention's (CDC) Division of State and Local Readiness. The need for enhanced preparedness was substantially underscored by the 9/11 attacks and an anthrax release through the US postal system the next month. Panelists from the CDC and Battelle will trace the evolution of public health preparedness evaluation since 2001 and discuss emerging research topics. Panelists will review the history of preparedness measurement and evaluation; identify central evaluation questions of interest concerning accountability, preparedness, and program effectiveness; discuss evidence issues; and explore with session attendees how evaluation findings can be fed back into performance improvement by state and local health agencies responsible for preparedness and response. Disclaimer: The findings and conclusions in this panel are those of the authors and do not necessarily represent the views of the Centers for Disease Control and Prevention. |
| Session Title: Evaluation Contracts: Considerations, Clauses, and Concerns |
| Demonstration Session 360 to be held in Calvert Ballroom Salon C on Thursday, November 8, 11:15 AM to 12:45 PM |
| Sponsored by the Independent Consulting TIG |
| Presenter(s): |
| Kristin Huff, Independent Consultant, khuff@iyi.org |
| Abstract: This workshop will address evaluation contract issues from the perspectives of the consultant, the purchaser, and management. Through sample contracts, participants will learn about important contractual considerations such as deliverables, timelines, confidentiality clauses, rights to use/ownership, budget, client and evaluator responsibilities, protocol, and more. In addition, the workshop will include some discussion about how to negotiate contracts, as well as contract addenda. Participants will receive materials as examples of the items discussed. Samples will include independent consultant contracts; contracts developed by purchasers; and management contracts for those hiring multiple evaluators. Participants are encouraged to bring topics for discussion during the question and answer session at the end. |
| Session Title: Locating Evidence of Research-based Extension Education Programs |
| Think Tank Session 361 to be held in Calvert Ballroom Salon E on Thursday, November 8, 11:15 AM to 12:45 PM |
| Sponsored by the Extension Education Evaluation TIG |
| Presenter(s): |
| Heather Boyd, Virginia Tech, hboyd@vt.edu |
| Discussant(s): |
| Bart Hewitt, United States Department of Agriculture, bhewitt@csrees.usda.gov |
| Dawn Gundermann, University of Wisconsin, dmgundermann@wisc.edu |
| Abstract: Providing research-based educational programs is a major focus, mission and application of the nationwide extension system. Yet, providing evidence of a research base and its contribution to the program has been elusive and difficult to achieve for extension workers and evaluators. Some of this difficulty stems from how different extension contexts define and use research to inform programming. To demonstrate alignment with national extension goals, it is critical that extension evaluators and educators identify ways in which research informs and shapes outreach educational products. Participants in this Think Tank will explore definitions of research, ways in which research is used and incorporated into programming, indicators that research is being used in extension educational programs and the role of evaluators in defining the research and education context in extension programs. Depending on the wishes of the participants, the group may also explore how 1862, 1890 and 1994 institutions approach this issue. |
| Session Title: Evaluating Teacher Professional Development |
| Multipaper Session 362 to be held in Fairmont Suite on Thursday, November 8, 11:15 AM to 12:45 PM |
| Sponsored by the Pre-K - 12 Educational Evaluation TIG |
| Chair(s): |
| Rabia Hos, University of Rochester, rabiahos@yahoo.com |
| Session Title: Evaluating Schools and Processes Within Schools |
| Multipaper Session 363 to be held in Federal Hill Suite on Thursday, November 8, 11:15 AM to 12:45 PM |
| Sponsored by the Pre-K - 12 Educational Evaluation TIG |
| Chair(s): |
| Paul Lorton Jr, University of San Francisco, lorton@usfca.edu |
| Session Title: Research Evaluation of the Upcoming European Union’s Framework Programme |
| Multipaper Session 364 to be held in Royale Board Room on Thursday, November 8, 11:15 AM to 12:45 PM |
| Sponsored by the Research, Technology, and Development Evaluation TIG |
| Chair(s): |
| Peter Fisch, European Commission, peter.fisch@ec.europa.eu |
| Session Title: Evaluating the Cultural Competence of Substance Abuse and Mental Health Services: Policy, Technology, and Practice |
| Panel Session 365 to be held in Royale Conference Foyer on Thursday, November 8, 11:15 AM to 12:45 PM |
| Sponsored by the Alcohol, Drug Abuse, and Mental Health TIG |
| Chair(s): |
| James Herrell, United States Department of Health and Human Services, jim.herrell@samhsa.hhs.gov |
| Abstract: Although cultural competence is widely considered essential to the delivery of effective substance abuse and mental health services, empirical support for this belief is modest, limited in part by inconsistent definitions of the concept and the lack of tested approaches to measuring cultural competence. This panel will describe advances in operationalizing and evaluating organizational and individual cultural competence, and linking it to client outcomes. Panelists will present key findings from the literature, discuss the emphasis of one federal agency, the Substance Abuse and Mental Health Services Administration (SAMHSA), on culturally competent service delivery, describe advances in technologies for defining and evaluating cultural competence cross-sectionally and longitudinally, and discuss the evaluation of a culturally adapted evidence-based practice employed by a SAMHSA grantee. Panelists will engage the audience in a discussion of using the evaluation of cultural competence to improve services (and maybe even to increase chances of receiving grants). |
| Session Title: Empowerment Evaluations: Insights, Reflections, and Implications |
| Multipaper Session 366 to be held in Hanover Suite B on Thursday, November 8, 11:15 AM to 12:45 PM |
| Sponsored by the Collaborative, Participatory & Empowerment Evaluation TIG |
| Chair(s): |
| Brian Marriott, Calgary Health Region, brian.marriott@calgaryhealthregion.ca |
| Session Title: Quantitative Methods: Theory and Design TIG Business Meeting and Presentation - Theory Soup for the Quantitative Soul |
| Business Meeting Session 367 to be held in Baltimore Theater on Thursday, November 8, 11:15 AM to 12:45 PM |
| Sponsored by the Quantitative Methods: Theory and Design TIG |
| TIG Leader(s): |
| Patrick McKnight, George Mason University, pem@alumni.nd.edu |
| George Julnes, Utah State University, gjulnes@cc.usu.edu |
| Fred Newman, Florida International University, newmanf@fiu.edu |
| Karen Given Larwin, Gannon University, kgiven@kent.edu |
| Dale Berger, Claremont Graduate University, dale.berger@cgu.edu |
| Presenter(s): |
| Melvin Mark, Pennsylvania State University, m5m@psu.edu |
| Discussant(s): |
| William Trochim, Cornell University, wmt1@cornell.edu |
| Abstract: Evaluation theory, in general, has a focus different from, but complementary to, that of most evaluators who write about quantitative methods. Methodologists, for example, discuss new and optimal ways of estimating the counterfactual, while evaluation theorists discuss whether, when and why evaluators should try to estimate the counterfactual. Methodologists debate alternative models for data analysis, while evaluation theorists instead debate the alternative ends toward which an evaluation's data and findings might be put. The two lines of thinking might profit from more intersection. Several examples are sketched in support of this assertion. |
| Session Title: Quality Counts: Becoming Bilingual in Quality Improvement and Evaluation in Human Services and Health Care Settings |
| Panel Session 368 to be held in International Room on Thursday, November 8, 11:15 AM to 12:45 PM |
| Sponsored by the Human Services Evaluation TIG |
| Chair(s): |
| James Sass, LA's BEST After School Enrichment Program, jim.sass@lausd.net |
| Abstract: The emphasis on accreditation in human services and health care settings has driven the institutionalization of Quality Improvement departments. Some evaluators might view these as internal evaluation departments. In this session, presenters offer illustrations of the parallel developments of the evaluation and quality improvement traditions, their integration, what they can learn from one another, and models of implementation that have shown signs of learning and improvement at both the individual and organizational levels. |
| Session Title: Mainstreaming and Supporting Needs Assessment in a Large Organization |
| Panel Session 369 to be held in Chesapeake Room on Thursday, November 8, 11:15 AM to 12:45 PM |
| Sponsored by the Needs Assessment TIG |
| Chair(s): |
| Maurya West Meiers, World Bank, mwestmeiers@worldbank.org |
| Abstract: The panelists will discuss their experiences putting into place a system and tools to enable teams in a large organization to build needs assessment into their training and technical assistance projects. The session will focus on how to structure a process of learning about needs assessments, how to provide the resources necessary to implement needs assessment broadly within an organization, and how to communicate the benefits of needs assessment to a variety of stakeholders. Tools, methods, and practical examples will be highlighted. |
| Session Title: Multi-year Evaluation of the Arts Education Reform Efforts in South Carolina |
| Multipaper Session 370 to be held in Versailles Room on Thursday, November 8, 11:15 AM to 12:45 PM |
| Sponsored by the Evaluating the Arts and Culture TIG |
| Chair(s): |
| Ching Ching Yap, University of South Carolina, ccyap@gwm.sc.edu |
| Discussant(s): |
| Ken May, South Carolina Arts Commission, mayken@arts.state.sc.us |
| Abstract: The multi-year Arts Education Research Project seeks to track the progress and evaluate the effects of arts education reform efforts in various schools that received assistance from the Arts in Basic Curriculum (ABC) Project. These schools were committed to developing arts programs based on the ABC blueprint, which rests on the belief that the arts are an indispensable part of a complete education because quality education in the arts significantly adds to the learning potential of all students. The annual objectives of the Arts Education Research Project varied among (a) documenting arts instruction, (b) determining the effects of increased, modified, or integrated arts instruction, and (c) identifying potential influences that promote, inhibit, or sustain changes in schools that implemented arts reform. This session will include three papers that discuss several key findings of the project and the ongoing effort of developing instruments that measure arts integration. |
| Summary of Five-Year Evaluation in Arts Education Reform Effort |
| Ching Ching Yap, University of South Carolina, ccyap@gwm.sc.edu |
| In the first years, the evaluation of arts education reform efforts for the Arts Education Research Project was based on (a) observations of arts classes (music, visual arts) and general education classes (ELA, Science, and Math), (b) surveys of teachers, parents, and students, and (c) interviews with teachers and administrators. This paper highlights the major findings, including the challenges encountered by teachers and schools in implementing arts reform. The evaluators recommended that schools and stakeholders consider (a) leadership and advocacy, (b) realistic and endorsed expectations, (c) mutual respect and appreciation across disciplines, (d) resources, and (e) communication and feedback when implementing and evaluating arts reform efforts. Finally, the evaluators recommended using student arts achievement results and developing arts integration evaluation tools, in addition to observations and interviews, to investigate the effects of arts reform efforts. |
| Implications of Arts Programming Characteristics on Student Achievement |
| Leigh D'Amico, University of South Carolina, kale_leigh@yahoo.com |
| Pu Peng, University of South Carolina, lemonpu@yahoo.com |
| The objective of this evaluation project was to compare arts programming and implementation strategies for ABC schools with disparate arts and non-arts achievement levels. Although the majority of the ABC schools demonstrated success in increasing communication among non-arts and arts teachers, enhancing the curriculum using arts-based strategies, and improving student arts and non-arts achievement, a small percentage of schools did not realize their student achievement goals. This presentation will include details regarding the evaluation strategies employed and the findings of this evaluation project. In general, the evaluators identified (a) teacher quality, (b) support of arts programming by non-arts teachers and administrators, (c) level of arts integration, and (d) arts-based extracurricular opportunities as arts programming areas that may affect student arts and non-arts achievement. By increasing awareness of the areas that impact arts program implementation, schools can address opportunities and challenges in their arts efforts that allow students to reach maximum potential. |
| Developing Arts Integration Evaluation Tools |
| Christine Fisher, Winthrop University, fisherc@winthrop.edu |
| Varied levels of arts integration effort were observed at ABC schools because each school's individualized five-year arts strategic plan was written to address its unique needs with regard to school environment, budgeting, and student population characteristics. In an effort to clarify definitions for, and identify levels of, arts integration efforts, the ABC Project initiated a task force to develop evaluation instruments for communicating with teachers and administrators. The first instrument, the Essential Elements for Arts Infusion Programming Survey, was designed to inform schools of missing elements needed for arts infusion after applying the Opportunity-to-Learn Standards. The second, the Arts Infusion Continuum, was developed to inform schools regarding best practice in arts integration efforts. This paper will present the development process for these two instruments and the initial validation study conducted. |