Assessing the Quality of Early Care for Preschool-Aged Children in an Urban Setting

Presenter(s):

Robert Fischer, Case Western Reserve University, fischer@case.edu
Donna Bryant, University of North Carolina at Chapel Hill, bryant@mail.fpg.unc.edu
Ellen Peisner-Feinberg, University of North Carolina at Chapel Hill, peisnerf@mail.fpg.unc.edu
Liane Grayson, University of South Dakota, liane.grayson@gmail.com

Abstract:
This paper reports on a study of the quality of care in early care and education programs in the Cleveland, OH area, including center-based and family child care settings. Observational and interview data were collected from a sample of 177 classrooms serving 3- to 5-year-olds, chosen from a stratified random sample of child care centers. Data were collected by trained observers using two standardized instruments, the Early Childhood Environment Rating Scale-Revised and the Caregiver Interaction Scale. The quality of care in family child care homes was examined through a review of extant data from a previous study and of administrative quality data used to guide technical assistance and assess provider performance. Findings speak to the overall quality of care, to differences by setting type, and to the factors most related to the level of quality. The paper also addresses how quality data can be used to inform practice and policy.

Achieving the Dual Purpose of Accountability and Improvement: A Case of Evaluating Palm Beach County's Quality Improvement System

Presenter(s):

Xuejin Lu, Children's Services Council of Palm Beach County, kim.lu@cscpbc.org
Lance Till, Children's Services Council of Palm Beach County, lance.till@cscpbc.org
Karen Brandi, Children's Services Council of Palm Beach County, karen.brandi@cscpbc.org
Jeff Goodman, Children's Services Council of Palm Beach County, jeff.goodman@cscpbc.org

Abstract:
Evaluation assesses the value or merit of a program or other entity (Stufflebeam, 2007), and it serves not only accountability but also improvement. Our proposal draws on one stream of work from the evaluation of Palm Beach County's Quality Improvement System (QIS) to demonstrate that this dual purpose can be achieved by asking a series of interrelated evaluation questions. The methodologies used for data collection and analysis for specific evaluation questions will be illustrated, and evaluation findings will be presented and discussed. This proposal has both methodological and substantive implications.

Lessons Learned From an Evaluation of the Kansas Early Childhood Comprehensive Systems Plan

Presenter(s):

Greg Welch, University of Kansas, gww33@ku.edu
Jackie Counts, University of Kansas, jcounts@ku.edu
Jessica Oeth, University of Kansas, jessoeth@ku.edu
Chris Niileksela, University of Kansas, chrisn@ku.edu
Rebecca Gillam, University of Kansas, rgillam@ku.edu
Karin Chang-Rios, University of Kansas, kcr@ku.edu

Abstract:
This presentation will focus on lessons learned from an ongoing evaluation of the Kansas Early Childhood Comprehensive Systems (KECCS) plan. Specific attention will be placed on the Structural Equation Modeling (SEM) approach used to model school readiness from a variety of data sources. This approach is considered innovative in this area, as many states have 'models' of school readiness that have not been validated with SEM.