Evaluation 2008



Session Title: Three Essential C's of Data Collection: Collaboration, Coordination, and Communication in a Multi-Site Evaluation
Panel Session 361 to be held in Mineral Hall Section D on Thursday, Nov 6, 3:35 PM to 4:20 PM
Sponsored by the Cluster, Multi-site and Multi-level Evaluation TIG
Chair(s):
Manolya Tanyu,  Learning Point Associates,  manolya.tanyu@learningpt.org
Discussant(s):
Mary Nistler,  Learning Point Associates,  mary.nistler@learningpt.org
Abstract: In today's public education system, clients expect large, multi-site evaluations to provide guidance for urgent management and policy decisions. The quality of data largely depends on comprehensive coordination, communication, and collaboration efforts during data collection. In this two-part panel, we discuss the process of implementing a multi-site audit of English Language Arts curricula and instruction in 12 districts comprising 314 schools. The project was managed by one educational consulting organization that subcontracted with two other organizations for their content-area expertise. Panelists represent team members who were intricately involved in the planning and implementation phases of the audit. Our presentations will focus on: (1) building and improving the evaluation capacities of a cross-disciplinary staff with an array of evaluation experience, and (2) creating structures and systems to ensure high-quality data. We will discuss strategies that worked and did not work, and how the project has evolved based on lessons learned.
Building Evaluation Capacity in a Large Scale Project
Katie Dahlke,  Learning Point Associates,  katie.dahlke@learningpt.org
Brenna O'Brien,  Learning Point Associates,  brenna.o'brien@learningpt.org
The project discussed in this panel involved a partnership of three educational consulting organizations. Thus, the data collection team comprised cross-disciplinary staff, contractors, and temporary employees. Although the project grew stronger with the contributions of each partner, training all staff to implement a standardized, high-quality data collection process was necessary. This presentation describes procedures followed during the planning phase of the evaluation project. These included an all-staff kick-off meeting; training of staff on multiple data collection instruments and methods; creation of a collaborative team structure in which each member had an identified set of roles and expectations; creation of an intercommunication web system; and development of policies, procedures, and guidelines for data collection. The presenters will give an overview of these activities and discuss challenges, strategies that worked, and lessons learned.
Implementing an Open and Dynamic Data Collection System
Manolya Tanyu,  Learning Point Associates,  manolya.tanyu@learningpt.org
Christina Bonney,  Learning Point Associates,  christina.bonney@learningpt.org
One of the major challenges the project posed was that data collection was carried out by a large and diverse group of staff, stationed in different states around the country under different workload arrangements (full-time, part-time, temporary), who traveled long distances to multiple sites for data collection. Additionally, data collection involved working with 11 rural districts, 1 urban district, and 134 schools. Scheduling various methods of data collection (e.g., observation, interviews, document review) by different groups, coordinating a team structure, and utilizing a centralized data management system required extensive communication, coordination, and collaboration within and between internal organizational staff, partnering organizations, and the client. This presentation provides an overview of the team structure, the data collection procedures, and the strategies and tools used to monitor the process. The presentation will also cover the experiences and lessons learned as implementation proceeded, and the improvements made based on those lessons.

