
Session Title: The Challenges and Opportunities of Evaluating Mathematics and Science Partnership (MSP) Projects: Michigan's Design, Strategies and Instruments
Panel Session 795 to be held in Fairmont Suite on Saturday, November 10, 12:10 PM to 1:40 PM
Sponsored by the Pre-K - 12 Educational Evaluation TIG
Chair(s):
Shannan McNair,  Oakland University,  mcnair@oakland.edu
Abstract: The Michigan Department of Education (MDE) Mathematics and Science Partnership (MSP) programs aim to strengthen K-12 mathematics and science education in high-need school districts. Each funded program includes an evaluation model that uses scientifically based research methods to evaluate whether program objectives were attained. Each participant will present their evaluation design, strategies, instruments, and findings. Dialogue about current evaluation expectations and the challenges these present in school settings will be facilitated. The objective of each MSP project is to achieve a measurable increase in academic performance among students instructed by teachers who participated in an MSP project. Evaluation findings from the first cohort and preliminary findings from the second cohort will be discussed.
Western Michigan University: Looking at the Michigan Mathematics Rural Initiative and Muskegon Middle School Mathematics Improvement Projects
Sandy Madden,  Western Michigan University,  sandra.madden@wmich.edu
A major focus of three Michigan Mathematics and Science Partnership projects--two serving urban schools and one serving rural schools--has been developing the mathematics content knowledge of middle and high school teachers to help them improve the teaching and learning of mathematics in their classrooms. An important element of the evaluation has been gathering information about the program's impact on teacher content knowledge through project-developed content tests, an existing measure of pedagogical content knowledge, teacher self-report, evaluator observations of lessons in participating teachers' classrooms, interviews, student surveys, and MEAP trend data. A discussion of the professional development interventions, a key component of which has been school-based learning communities, will be followed by a presentation of data on effects on teacher content knowledge and student content knowledge. An analysis of the strengths and limitations of the various methodologies will also be presented.
Oakland Schools Math Science Partnership Project
Wendy Tackett,  iEval,  wendolyn@mac.com
Valerie Mills,  Oakland Schools,  valerie.mills@oakland.k12.mi.us
Project Purpose: The purpose of MERC is to provide a comprehensive professional development experience for teachers and administrators focused on good mathematics instruction. Teachers participate in curriculum-based workshops, content-focused classes, coaching in the classroom with expert teachers, and lesson-planning meetings. Principals participate in a course designed to improve their leadership skills while focusing on math content. This two-tiered approach helps create a collaborative school environment focused on the pursuit of academic success for students.
Methods of Evaluation: The evaluation likewise takes a multi-tiered approach. Teacher pedagogical content knowledge is determined through the testing and retesting of teachers; teachers' use of instructional techniques is determined through classroom observations; perceptions of changes and impact are determined through teacher and administrator interviews; and student academic growth is determined through the state standardized test. Data are disaggregated by variables such as building climate and length of participation in the project.
Key Findings: Teachers are creating lessons with more substantive student-student interaction, appropriate pacing for all student levels, and adequate reflection time. It was also determined that building climate may not have an impact on teacher content knowledge, but it does have a significant impact on instructional practices and classroom management.
Sustained Professional Development and Achievement: Washtenaw Intermediate School District
Frederica Frost,  Wayne County Research Educational Service Agency,  frostf@resa.net
Naomi Norman,  Washtenaw Intermediate School District,  nnorman@wash.k12.mi.us
Lesson Study was evaluated over a two-year period with the same participants each year. Only six teachers participated consistently, but the results appear meaningful. The amount of participation time seemed to make a difference: not all teachers were involved at the same level of effort, and the number of hours of participation varied widely. For the first year, the evaluation indicated some reluctance to repeat the experience, particularly among those who actually developed and taught the lesson; they seemed to feel that the amount of time they had to commit was excessive. Further, only limited differences could be detected between the Lesson Study and non-Lesson Study groups: the additional time spent did not appear to result in greater mathematical content learning or greater knowledge of math pedagogy. Following the second year of Lesson Study, participating teachers were compared to their non-Lesson Study colleagues on mathematical content knowledge, pedagogical knowledge, and classroom practice. The design replicated that of Elizabeth Fennema's group in Cognitively Guided Instruction, in which teachers were found to continue their growth during the year following their professional training. Although Lesson Study was the only professional development offered to the Washtenaw teachers that year, they continued to improve in both mathematical content and instructional practice. Their non-participating colleagues improved their practice as well, but performed less well overall in content knowledge.
The Role of Statewide Evaluators for Math Science Partnership (MSP) Projects
Dennis W Rudy,  Lakehouse Evaluation Inc,  drudy@lakehouse.org
Shannan McNair,  Oakland University,  mcnair@oakland.edu
Each funded program includes an evaluation model that uses scientifically based research methods to evaluate whether program objectives were attained. A team of evaluation consultants, contracted by MDE, provides technical support and consultation on evaluation to MDE/MSP staff and to individual projects. Their tasks include, but are not limited to, meeting with project staff in large groups, by cohort group, and individually; collecting summary information on program implementation and evaluation; reviewing the evaluation section of all grant proposals and making recommendations specific to evaluation; reviewing calls for proposals; providing technical assistance on preparing proposals; and suggesting criteria. In addition, the Michigan evaluation team conducts site visits to each program, attends U.S. Department of Education meetings, participates in conference calls, and reviews yearly and final reports before they are submitted. One aspect these consultants have added to their list of tasks is exploring ways in which the impact of the partnership on STEM faculty can be identified and measured. This appears to be a question of national importance, but one not routinely included in MSP project evaluation plans. Questions regarding this and other aspects of MSP evaluation will be raised for discussion among all session participants: What is the role of statewide evaluators in other states? How is the relationship between the evaluators and the state department of education team negotiated? Does partnership at all levels emerge from such projects?