|
Challenges to the Utility of Evaluation of Early Elementary Tutoring and Learning Support Services
|
| Presenter(s):
|
| Magdalena Rood, Third Coast R&D Inc, mrood@thirdcoastresearch.com
|
| Cindy Roberts-Gray, Third Coast R&D Inc, croberts@thirdcoastresearch.com
|
| Abstract:
A key problem for agencies providing tutoring and learning support services for young children in school is to demonstrate improved child performance in the classroom. As experienced by two agencies under contract with the Austin Independent School District in Austin, Texas, in-house measures consistently, year after year, demonstrated improvements, but these results often were not replicated on district measures. Although both program and school measures are diagnostic in nature, the mismatch between measures appears to be due to philosophical differences among test developers about factors such as age-appropriate skills and standards, testing method, and test administrator role. The proposed paper will review findings from early literacy tutoring programs conducted from 2004 through 2009 and will discuss the implications of the misalignment between learning support and classroom measures for demonstrating convincing results for children.
|
|
Evaluation of Reading Intervention Effectiveness Using Growth Models
|
| Presenter(s):
|
| Tammiee Dickenson, University of South Carolina, tsdicken@mailbox.sc.edu
|
| Jennifer Young, South Carolina Department of Education, youngjey@gmail.com
|
| Abstract:
This study compared achievement growth of students who received supplemental reading intervention services with that of students who did not receive services. Students were selected for intervention based on academic need. To study intervention effectiveness, the differential growth rate of students who received services was of interest. The sample consisted of approximately 1,500 students in 43 schools that participated in the South Carolina Reading First (SCRF) Initiative during three consecutive years. The Stanford Reading First assessment was administered to SCRF students in grades 1-3 in the fall and spring of each school year. A three-level hierarchical linear model was used to model growth in achievement for students who participated in all three years, and a quadratic term was included to account for change in growth rate over time. Comparisons were made according to whether students received intervention, with three intervention types compared for grade 1. Results indicate significant gains when intervention is provided in the early grades.
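As a minimal sketch only (our illustration, not the authors' exact specification), a three-level growth model with a quadratic time term and an intervention indicator as a student-level predictor of growth might be written as:

\begin{align*}
\text{Level 1 (occasions): } & Y_{tij} = \pi_{0ij} + \pi_{1ij}\,\mathrm{Time}_{tij} + \pi_{2ij}\,\mathrm{Time}_{tij}^{2} + e_{tij} \\
\text{Level 2 (students): } & \pi_{0ij} = \beta_{00j} + \beta_{01j}\,\mathrm{Intervention}_{ij} + r_{0ij}, \qquad
  \pi_{1ij} = \beta_{10j} + \beta_{11j}\,\mathrm{Intervention}_{ij} + r_{1ij} \\
\text{Level 3 (schools): } & \beta_{00j} = \gamma_{000} + u_{00j}, \qquad \beta_{10j} = \gamma_{100} + u_{10j}
\end{align*}

where $Y_{tij}$ is the reading score at occasion $t$ for student $i$ in school $j$. In a specification of this form, the coefficient on the Intervention-by-Time term ($\beta_{11j}$) carries the differential growth rate of interest.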
|
|
Impact of READ 180 on At-Risk Middle School Students’ Literacy Outcomes
|
| Presenter(s):
|
| Margaret Gheen, Montgomery County Public Schools, mhgheen@hotmail.com
|
| Shahpar Modarresi, Montgomery County Public Schools, shahpar_modarresi@mcpsmd.org
|
| Abstract:
This evaluation examines literacy achievement of middle school students with a history of low performance who were or were not enrolled in READ 180, a program designed to accelerate reading achievement. End-of-year literacy scores on the Measures of Academic Progress-Reading (MAP-R) and the Maryland School Assessment (MSA) in reading were compared among three groups: students enrolled in classes implementing READ 180 with higher fidelity, students enrolled in classes implementing it with lower fidelity, and students not enrolled in the program. Findings across grade levels, literacy measures, and statistical methods yielded small and sometimes inconsistent patterns of differences among groups. Overall, however, READ 180 students had slightly higher end-of-year reading scores than nonparticipants. The greatest differences among groups were observed in Grade 6: students enrolled in READ 180 classes scheduled for 90 minutes daily and implemented with higher program fidelity demonstrated the highest end-of-year performance.
|
|
Evaluating Professional Development Training in Early Literacy: Alternatives for Measuring Participant Use of New Skills Post-Training Through Action Plans and Levels of Use Indicators
|
| Presenter(s):
|
| Ann Zukoski, Rainbow Research Inc, azukoski@rainbowresearch.org
|
| Joanne Knapp-Philo, National Head Start Family Literacy Center, joanne.knapp-philo@sonoma.edu
|
| Kim Stice, National Head Start Family Literacy Center, kim.stice@sonoma.edu
|
| Abstract:
A key challenge of evaluating professional development training is measuring how participants use and apply new knowledge and skills on the job post-training (Guskey, 2000; Phillips & Stone, 2002). Research shows that adoption of new skills is not likely to be universal or complete, and participants need time to reflect on and adapt new concepts to their own context. Therefore, gathering and analyzing information about whether new practices are used and how well they are used are essential evaluation activities for formative and summative purposes. Action plans and measures of levels of use represent important mechanisms for measuring participants’ use of new knowledge and skills, the degree of implementation, and the quality of implementation. In this presentation, we will share examples of action plans and applications of levels of use indicators to assess a national training program for early childhood educators. Challenges and opportunities will be discussed.
|