Evaluation 2008



Session Title: Process Use: Methods and Measures
Multipaper Session 542 to be held in Room 105 in the Convention Center on Friday, Nov 7, 9:15 AM to 10:45 AM
Sponsored by the Evaluation Use TIG
Chair(s):
Sandra Ortega,  National Data Evaluation Center,  ortegas@ndec.us
Promoting the Use of Evaluation Information: Early Reading First in Chicago Charter Schools
Presenter(s):
Tania Rempert,  University of Illinois Urbana-Champaign,  trempert@uiuc.edu
Lizanne DeStefano,  University of Illinois Urbana-Champaign,  destefan@ad.uiuc.edu
William Teale,  University of Illinois Chicago,  wteale@uic.edu
Abstract: This presentation is organized around the alignment of evaluation questions, measurement tools, and reporting strategies. Comments from program staff highlight the methods, strategies, and tools that have been most useful to them: observations, teacher portfolios, assessment data, and a professional development questionnaire. The evaluation is longitudinal, designed to capture changes that occur over time and across settings. Formatively, the evaluation routinely monitors progress toward specific objectives, assesses the quality of implementation, and gauges the short-term impacts of the project. This information is reported to program staff regularly at project management meetings so that it can be used to guide program development and improvement. Summatively, the evaluation employs a quasi-experimental pre-test, post-test, non-equivalent comparison group design to document the impact of the project on teaching, learning, and families. Finally, the evaluation is designed to provide information on effective strategies for replicating the program in other settings.
Initial Results from a Planned Approach to Process Use: Using Evaluation to Influence Public School Policy and the Classroom Practice of Formative Assessment
Presenter(s):
Debra Heath,  Albuquerque Public Schools,  heath_d@aps.edu
Nancy Carrillo,  Albuquerque Public Schools,  carrillo_n@aps.edu
River Dunavin,  Albuquerque Public Schools,  dunavin_r@aps.edu
Abstract: This paper describes how evaluators in an urban school district employed strategies to promote evaluation use, foster organizational alignment, and influence evaluation policy. Findings from an evaluation of district-wide formative assessment practices suggest that teachers underutilized formative assessment techniques and that district support systems offered inconsistent formative assessment definitions, strategies, and resources. We wanted to do more than provide the standard written report and set of presentations; we wanted to help optimize the district’s support and practice of proven formative assessment strategies. We used a collaborative evaluation approach with the intention of solidifying commitment to formative assessment, increasing communication between stakeholders, creating shared understandings regarding formative assessment, and fostering collaborations between departments. We outline and critique our methods for achieving process use, and we describe the techniques and tools we used to measure process use, including group processes, questionnaires, observations, and interviews. Finally, we share lessons learned.
Evaluation as Change Agent: The Precarious See-Saw Between Helping and Judging and How it Affects Process Use
Presenter(s):
Gale Mentzer,  University of Toledo,  gmentze@utnet.utoledo.edu
Abstract: This paper tracks the precarious path of program improvement that began with a required program evaluation for a federally funded project. It illustrates the complexity of working with “reluctant” stakeholders through the example of the evolution of a summer program designed to introduce secondary school students to teaching. A mixed methods evaluation was used to measure stated outcomes and to discover whether other, unanticipated factors were affecting program goal attainment. Applying the evaluation model to the program not only revealed that students’ misconceptions about teaching were being corrected but also uncovered a major unanticipated factor: the program instructors (university faculty) held misconceptions about the goals of the program. This paper shows how, over the course of four years, group process techniques were used to engage the stakeholders in the evaluation in order to implement the programmatic changes indicated by the evaluation results.

