|
A Mixed Methods Approach to Measurement for Multi-site Evaluation
|
| Presenter(s):
|
| Carlos Bravo, Evaluation, Management & Training Associates Inc, cbravo@emt.org
|
| Fred Springer, Evaluation, Management & Training Associates Inc, fred@emt.org
|
| Abstract:
School climate has become a focal concept for promoting safer and more productive learning environments, one that the National School Climate Center describes as “patterns of … experiences … reflect(ing) norms, goals, values, interpersonal relationships, teaching, learning and leadership practices, and organizational structures.” Given this multi-dimensional complexity, measuring school climate is a challenge for evaluators. This presentation summarizes a comprehensive review of all school climate measures used in state school surveys regularly administered in the United States, as well as the most prominent research and performance monitoring instruments developed specifically to measure school climate. Survey items are mapped onto a comprehensive model of school climate domains and dimensions, the relative focus on those domains and dimensions is identified, and similarities and differences in measurement perspective, format, and psychometric approach are profiled. Implications for the definition of school climate, and for measurement development and use by evaluators and policymakers, are discussed.
|
|
Using Mixed Methods to Examine and Interpret the Impact of a National Science Foundation (NSF) Geosciences Multi-Institution, Multi-site Teacher Professional Development Program
|
| Presenter(s):
|
| Susan Henderson, WestEd, shender@wested.org
|
| Dan Mello, WestEd, dmello@wested.org
|
| Abstract:
This paper examines the impact of a multi-site teacher professional development program funded by the Geosciences Directorate of the National Science Foundation. Using a mixed methods design, this evaluation of the Transforming Earth Systems Science Education (TESSE) Program examines the program’s impact on teacher content and pedagogical knowledge, as well as teachers’ self-reported knowledge and perceived comfort in teaching various aspects of earth system science. Using focused case studies combined with multiple regression and pre-test/post-test gains, this paper also presents differences in the extent of program impact based on the implementation of the professional development at four distinctly different institutions of higher education. This study offers practical insight to evaluators assisting clients in providing scientifically based evidence for program scale-up at sites with distinct missions, distributions of resources, and connections to K-12 institutions.
|