

Session Title: Complexity and Clustering
Multipaper Session 756 to be held in Sebastian Section I2 on Saturday, Nov 14, 10:55 AM to 11:40 AM
Sponsored by the Cluster, Multi-site and Multi-level Evaluation TIG
Chair(s):
Rene Lavinghouze,  Centers for Disease Control and Prevention, rlavinghouze@cdc.gov
Exploratory Cluster Evaluation of Variability and Commonality of the Implementation and Impact of Ohio Mathematics and Science Partnership Projects
Presenter(s):
Lucy Seabrook, Seabrook Evaluation + Consulting LLC, lucy@seabrookevaluation.com
Hsin-Ling Hung, University of Cincinnati, hunghg@ucmail.uc.edu
Sarah Woodruff, Miami University of Ohio, woodrusb@muohio.edu
Debbie Zorn, University of Cincinnati, zorndl@ucmail.uc.edu
Mary Marx, University of Cincinnati, mary.marx@uc.edu
Abstract: USDOE Mathematics and Science Partnership funds support 15 three-year partnerships between Ohio high-need schools/districts and faculty at institutions of higher education. Although the projects share the common goals of increasing teacher content knowledge, improving teaching practices, and improving student performance, they vary considerably in program design and delivery. To identify characteristics of effective professional development for teachers of mathematics and science, an exploratory cluster evaluation approach was used to group the projects into distinct modalities. Analyses were based on data from a self-reported program characteristics survey developed by the evaluation team to assess variability across projects in (a) partnerships, (b) strategies to target participants, (c) school/district leadership involvement, (d) curriculum content, (e) delivery of professional development, and (f) local evaluation activities and findings. This presentation will describe the cluster approach and the implications of its findings. Applications of the approach to evaluations of similar programs will also be discussed.
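
The abstract does not specify the clustering algorithm used. As a minimal illustrative sketch only, not the evaluation team's actual procedure, the Python fragment below groups 15 projects into candidate modalities by applying agglomerative (Ward) clustering to standardized scores on the six surveyed characteristics. The synthetic data and all variable names are assumptions for illustration.

    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster

    # Hypothetical feature matrix: one row per MSP project, one column per
    # survey-derived characteristic -- (a) partnerships, (b) participant
    # targeting, (c) leadership involvement, (d) curriculum content,
    # (e) PD delivery, (f) local evaluation -- coded numerically.
    rng = np.random.default_rng(0)            # synthetic stand-in data
    X = rng.normal(size=(15, 6))              # 15 projects x 6 characteristics

    # Standardize so no single characteristic dominates the distance metric.
    X = (X - X.mean(axis=0)) / X.std(axis=0)

    # Agglomerative (Ward) clustering on Euclidean distances between projects.
    Z = linkage(X, method="ward")

    # Cut the dendrogram into, say, three candidate modalities.
    modalities = fcluster(Z, t=3, criterion="maxclust")
    for project, cluster in enumerate(modalities, start=1):
        print(f"Project {project:2d} -> modality {cluster}")

In practice the number of modalities would be chosen by inspecting the dendrogram and by substantive judgment about which groupings are programmatically meaningful, rather than fixed in advance.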
Complex Adaptive Systems: Evaluation as Dynamic Human and Information Systems in a Formative, Collaborative, Statewide, Multi-site Context
Presenter(s):
A Rae Clementz, University of Illinois at Urbana-Champaign, clementz@illinois.edu
Abstract: This presentation uses the lens of complexity theory and complex adaptive systems to analyze and critique a recent evaluation of a statewide, multi-site induction and mentoring pilot program. Each site presented unique programmatic and contextual challenges and features, while the larger political and collaborative organizational contexts requested data that would support convergence and cross-site comparisons. This session will offer our understanding of how the individual actors and the dynamic, often overlapping groups, including the evaluation team, organized and operated to achieve both singular and collective goals in a complex, adaptive environment. Similarly, we will examine how the information systems created by and for these different individuals, groups, and goals affected the human and group dynamics, as well as the progress of the evaluation.
