
Session Title: Documenting Math Science Partnership Projects in New York State
Multipaper Session 760, to be held in the Fairmont Suite on Saturday, November 10, 10:30 AM to 12:00 PM
Sponsored by the Pre-K - 12 Educational Evaluation TIG
Chair(s):
Dianna Newman,  University at Albany,  dnewman@uamail.albany.edu
Abstract: The US Department of Education Math Science Partnership (MSP) program seeks to bridge the gap between current and expected content, pedagogy, and student outcomes in math and science education. As federal and state priorities have shifted and funding has increased, the evaluation component of this initiative has become increasingly important. There is a need not only for local project evaluation but also for documented success of evaluation methods and findings related to cross-project and statewide efforts. This multi-paper session will present evaluation methodologies proven successful in documenting local, cross-project, and statewide MSP programs, along with their initial findings. Evaluators of three separate programs will present common cross-site methods and findings related to drill-down variables and comparative studies of the impact of professional development and subsequent implementation. In addition, the state evaluator of these three projects will address how these data provide lessons learned and support the broader MSP initiative.
The Role of Professional Development: How to Document it and What Works
Mary J Pattison,  University at Albany,  mpattison@uamail.albany.edu
Dianna Newman,  University at Albany,  dnewman@uamail.albany.edu
The purpose of this paper is to aggregate data and findings on the impact of professional development on classroom mathematics instruction within and across two large MSP projects. The evaluations use a logic model approach: if the goal of MSP projects is to change student outcomes, they must first change teachers' knowledge of math and math pedagogy, their affect toward mathematics and math instruction, and their subsequent mathematics instructional practices. Using a series of drill-down analyses, these evaluations have initiated a series of quasi-experimental studies that address the role of the type and frequency of professional development in subsequent teacher and classroom activities. The paper will provide an overview of the variables under study and the designs used to examine variations in professional development. In addition, it will address barriers and facilitators to implementing large-scale longitudinal professional development studies, as well as lessons learned in the process that can transfer to similar projects.
Multiple Avenues to Documenting Student Achievement: Results From Two Large Scale Math Science Partnership Grants
Leigh Mountain,  University at Albany,  lmountain@uamail.albany.edu
This presentation will describe strategies and initial findings related to the impact of two large-scale MSP projects on students' math achievement and affect. Using a drill-down strategy, both projects assessed student-related outcomes via a series of data sources ranging from statewide tests to individual student classroom performance. These data were used to support a series of longitudinal cohort-based analyses that traced students across grades, teachers, and instructional modes. The projects also tracked the types of professional development offered that might have contributed to these changes, as well as important teacher and building characteristics. The paper will present the methods and initial outcomes associated with the projects and will address barriers and facilitators to implementing the studies. Common findings across the two projects, as well as lessons learned that could transfer to similar large-scale programs, will also be presented.
High Quality Local Evaluation of Federal Projects: At Long Last, Necessary!
Kathleen Toms,  Research Works Inc,  ktoms@researchworks.org
The purpose of this paper is to present information related to the Mathematics and Science Partnership (MSP) project for NYC Region 4. The project includes stakeholders from 106 buildings (K-12) and utilizes an interrupted time-series, single-group, quasi-experimental design to measure the impact of activities carried out each project year, with professional development as a mediating factor. Research Works, Inc. serves as the local evaluator, measuring effects with a high level of rigor and meeting local formative needs within the context of the required state and federal MSP evaluation. The paper poses the question: if a rigorous local evaluation identifies effective practice but the federal funder never sees this evidence, does the evaluation still have a funder-related purpose?
Promoting High Quality Evaluation of Math Science Partnership (MSP) Projects
Amy Germuth,  Compass Consulting Group,  agermuth@mindspring.com
The MSP competitive grant program encourages partnerships between institutions of higher education and local schools to collaboratively engage in professional development activities that increase the subject matter knowledge and teaching skills of mathematics and science teachers. Since 2005, Compass Consulting Group, LLC has provided technical assistance to partnerships in NY and conducted a statewide evaluation of these partnerships. As part of this panel, Compass will report on lessons learned from providing evaluation technical assistance, reviewing local evaluations, and conducting a statewide evaluation. Much of the presentation will focus on implications for promoting rigorous evaluations of future MSP partnerships, both for program officers during the RFP process and for potential grantees during the proposal and development stages. Although this study is specific to the MSP program, we suggest that the findings are relevant for funders and developers of all projects that are competitively awarded and require evaluation.