
Session Title: Partner Roles in a Multi-site Evaluation: The Viewpoints and Experiences of the Cross-site Evaluator and the State Program Coordinator
Panel Session 747 to be held in Lone Star F on Saturday, Nov 13, 10:00 AM to 10:45 AM
Sponsored by the Cluster, Multi-site and Multi-level Evaluation TIG
Chair(s):
Kristin Everett, Western Michigan University, kristin.everett@wmich.edu
Abstract: A two-person panel will share viewpoints and experiences about their roles and responsibilities as the cross-site external evaluator and the state program coordinator in a multi-site evaluation of a program to improve teacher quality. Session attendees will learn the pros and cons of the evaluation design as experienced in the multi-site evaluation. Additionally, the external evaluation team will describe how it aimed to build evaluation capacity at the local project level and provide evaluation technical assistance to local project directors. The panel will address practical dimensions applicable to other multi-site projects: planning, internal/external communication, evaluation capacity-building, data collection, data analysis and interpretation, technical assistance, reporting procedures, data use, and report preparation. Longitudinal aspects of this six-year cross-site evaluation will also be explored. Sample procedures, instruments, and other materials will be shared. This model can be applied across large or small sets of projects and geographic areas.
Evaluating a Statewide, Multi-site Program: The External Evaluator’s Role
Kristin Everett, Western Michigan University, kristin.everett@wmich.edu
In a multi-site evaluation, the cross-site evaluator plays an important role, working closely with both program coordinators and individual grantees. Attendees at this session will hear about the evaluative activities used in a cross-site evaluation, as well as the pros and cons of each and “lessons learned” along the way. In this presentation, a member of the external evaluation team will review the evaluation model used for this cross-site initiative and the role of evaluators in it. Topics covered will include the external evaluator’s role in RFP development and proposal review/selection of grantees, technical assistance to program coordinators and individual projects, evaluation capacity-building, data collection, “mining” data from individual project reports, analysis of longitudinal trends, site visits (including observations and interviews), and data reporting.
Evaluating a Statewide, Multi-site Program: The Program Coordinator’s Role
Donna Hamilton, Michigan Department of Education, hamiltond3@michigan.gov
The program coordinator will provide her perspective on the roles and responsibilities of the various players in a cross-site evaluation. She will address making evaluation information timely, relevant, and useful; using cross-site evaluations in state-level decision making; and using evaluation results to modify programs. The coordinator has many roles and responsibilities in facilitating the statewide initiative and its evaluation. By learning about these roles, attendees will be better equipped to work with coordinators. The coordinator and evaluator organize activities to minimize duplication of effort and increase efficiency. The coordinator solicits assistance from evaluators in RFP development; RFP announcement and technical assistance sessions; proposal review; periodic grantee review sessions; coordinated site visits; preparation of evaluation findings for targeted audiences; and collection of selected evaluation data. These discussion topics will give attendees ideas for additional types of evaluative activities that may be helpful when working with a multi-site program.
