Evaluation 2008

Session Title: Frameworks for Evaluating Teacher Professional Development: Implementation, Outcomes, and Impact
Multipaper Session 337 to be held in Room 111 in the Convention Center on Thursday, Nov 6, 1:40 PM to 3:10 PM
Sponsored by the Pre-K - 12 Educational Evaluation TIG
Chair(s):
Sheryl Gowen,  Georgia State University,  sgowen@gsu.edu
Quantifying the Qualitative: A Test Analysis of Coaching Staff Journals to Boost Understanding of Teacher Professional Development Program Status
Presenter(s):
Keith Murray,  MA Henry Consulting LLC,  keithsmurray@mahenryconsulting.com
Martha Henry,  MA Henry Consulting LLC,  mahenry@mahenryconsulting.com
Abstract: Quantifying the qualitative presents a frequent challenge to evaluators, given the need to convert raw program materials into supportable process and summative evidence. Evaluations of teacher professional development programs usually focus on quantitative, evaluator-provided instruments, such as tests, surveys, and observation reports, to understand program effectiveness and progress toward objectives. Evaluation of program-derived products offers a more qualitative source of program information. These texts include teacher products, such as portfolios and lesson plans, and program staff products, such as report narratives and journals. To test qualitative methods, evaluators of an NSF-funded grades 6-8 math-science partnership examined the journals of program staff coaches working with teachers, schools, and districts for evidence of partnership dynamics, program challenges, problem solving, and teacher engagement. The paper reports the results of this analysis and its contribution to the program evaluation, and discusses the application of such techniques to extant data sources in K-12 teacher professional development programs.
Evaluating the Impact of Professional Development on Teachers' Practice
Presenter(s):
Patricia Moore Shaffer,  College of William and Mary,  pmshaf@wm.edu
Jan Rozzelle,  College of William and Mary,  mjrozz@wm.edu
Abstract: The School-University Research Network (SURN) of the College of William and Mary has provided professional development in research-based content literacy assessment, standards, and strategies to three successive cohorts of middle school core content teachers. The professional development model features a summer workshop for school teams, follow-up workshops during the academic year, school-based classroom observations and coaching, collaborative lesson planning, and peer mentoring. Using a mixed-method evaluation design and varied data collection strategies, including classroom observation, interviews, surveys, and document analysis, SURN staff have sought to assess the impact of this program on teachers' practice and student achievement. During this paper presentation, SURN staff will discuss the evaluation design, data collection and analysis, and significant findings in response to its research questions.
Using a Multi-Faceted Approach to Evaluating a Statewide Professional Development Program from Conceptualization Through Implementation
Presenter(s):
Jacqueline Stillisano,  Texas A&M University,  jstillisano@tamu.edu
Hersh Waxman,  Texas A&M University,  hwaxman@tamu.edu
Karin Sparks,  Texas A&M University,  karinsparks@tamu.edu
Brooke Kandel-Cisco,  Texas A&M University,  bkandel@tamu.edu
Sue Wedde,  Texas A&M University,  swedde@tamu.edu
Abstract: This study showcases an evaluation of a statewide professional development project in Texas and provides a model of a broad-based approach to program evaluation. Using the Accountability, Effectiveness, Impact, Organizational Context, and Unanticipated Outcomes (AEIOU) evaluation framework (Simonson, 1997; Sorensen & Sweeney, 1997), the evaluators examined the content, process, and context variables of the curriculum materials developed, the workshops where the curriculum was introduced, and the implementation of the curriculum. Both qualitative and quantitative methods were employed, and multiple sources were used for data collection, including focus groups, questionnaires, surveys, and observations (different observation tools were developed for different contexts). The evaluation process has had, and continues to have, many facets over an extended time period, from the formative evaluations that contributed to an iterative design and development process to the summative evaluations of the materials, workshops, and curriculum implementation.
Using Systems Approaches to Evaluate Practice Outcomes of Teacher Professional Development
Presenter(s):
Janice Noga,  Pathfinder Evaluation and Consulting,  jan.noga@stanfordalumni.org
Abstract: The purpose of this paper presentation is to discuss the use of systems approaches to evaluate transfer of training to classroom practice among teachers participating in a statewide professional development program in literacy instruction. While it is tempting to view changes in classroom practice as a logical outcome of changes in knowledge and skills, the reality is far more complex. In evaluating any professional development effort that seeks to change teachers’ practice, one must keep in mind that classroom practice, while an important influence on student achievement, is but one element of a much more complex system that is the school or district. This presentation will focus on the methodological challenges faced in documenting practice outcomes and describe the development and use of a systems framework to assess the potential for program learning to impact classroom practices of participating teachers.
