Session Title: Local Evaluator Involvement and Impact in Michigan 21st Century Community Learning Centers After-school Programs
Multipaper Session 840 to be held in Panzacola Section H3 on Saturday, Nov 14, 1:40 PM to 3:10 PM
Sponsored by the Evaluation Use TIG
Chair(s):
Laurie A Van Egeren, Michigan State University, vanegere@msu.edu
Abstract: Evaluation in government-funded programs may target performance monitoring without addressing continuous improvement planning in local programs. Most programs benefit considerably from the involvement of an evaluator who can facilitate program improvement, but relationships and agreements between programs and evaluators vary substantially. In the Michigan evaluation of the federally funded 21st Century Community Learning Centers after-school initiative, grantees are required to hire local evaluators in addition to participating in the state evaluation. This session examines the uses and benefits of local evaluators in 35 organizations operating 229 after-school sites. The session will address questions such as: What types of administrators and programs are most likely to use evaluators? To what extent does local evaluator involvement result in better data reporting and interpretation? How does program quality differ in sites with high vs. low evaluation use? Finally, a local evaluator will discuss the integration of state and local data in program improvement.
Who Uses Local Evaluators? Links to Site and Staff Characteristics
Heng-Chieh Wu, Michigan State University, wuhengch@msu.edu
Laurie A Van Egeren, Michigan State University, vanegere@msu.edu
Nai-Kuan Yang, Michigan State University, yangnaik@msu.edu
Chun-Lung Lee, Michigan State University, leechunl@msu.edu
Celeste Sturdevant Reed, Michigan State University, csreed@msu.edu
In after-school programs, "evaluator use" can take many forms, including assisting with program improvement efforts, collecting data to assess outcomes, and developing reports for funders and stakeholders. Many after-school programs have no evaluator at all, and those that do vary widely in how they use them. In the Michigan 21st Century Community Learning Centers programs, contracts with local evaluators are a condition of funding; even so, not all programs use their evaluators equally. This paper examines site and staff predictors of several forms of evaluator use: assisting with program improvement, providing school outcomes data, developing reports, writing grants, collecting data to meet state requirements, and collecting local data. Results from 229 sites suggest that the type of operating organization (schools rather than community-based organizations), years of operation, and perceptions of the importance of using data for program improvement were linked to greater evaluator use.
Improving Reporting Quality Through Local Evaluator Involvement
Celeste Sturdevant Reed, Michigan State University, csreed@msu.edu
Megan Platte, Michigan State University, plattmeg@msu.edu
Beth Prince, Michigan State University, princeem@msu.edu
Laura Bates, Michigan State University, bateslau@msu.edu
Laurie A Van Egeren, Michigan State University, vanegere@msu.edu
Over the last several years, the MSU State-Wide Evaluation Team has consistently invested in training and technical assistance for grantees, their staff, and local evaluators to improve the understanding and use of data for program improvement. This paper draws on data from the 2007-2008 Annual Report Forms, Web-based documents that all grantees and sites must complete as a contractual requirement for funding. We hypothesized that local evaluator involvement would improve both the availability of data (i.e., all relevant data were collected) and users' clarity (i.e., users understood the data presented in the charts). Independent variables included the role the local evaluator played in the reporting process, how often the local evaluator met with various stakeholders, and grantee cohort. The results were mixed; we will suggest other factors that may have influenced them.
Sites With High vs. Low Evaluation Use: Differences in Program Quality
Laurie A Van Egeren, Michigan State University, vanegere@msu.edu
Heng-Chieh Wu, Michigan State University, wuhengch@msu.edu
Nai-Kuan Yang, Michigan State University, yangnaik@msu.edu
Chun-Lung Lee, Michigan State University, leechunl@msu.edu
Celeste Sturdevant Reed, Michigan State University, csreed@msu.edu
In 2007, the Michigan 21st Century Community Learning Centers program established clear guidelines for the use of local evaluators, emphasizing that their primary role was to assist in improving program quality. However, these guidelines have been implemented inconsistently. This paper examines differences in program quality between sites with high levels of evaluation use and those with low levels. A constellation of variables was used to characterize sites as "high" or "low," including evaluator involvement in program quality assessments and annual reporting to the state, the regularity of evaluator meetings with program staff, supervisor ratings of evaluator value across several areas of use, and the completeness of data submitted to the state. The top and bottom 10% of programs were then compared on program quality indicators, including student, parent, and staff perceptions of the program and student outcomes. Results suggest that greater evaluation use helps put processes in place that promote program quality.
The Local Evaluator Speaks: Using State and Local Data for Continuous Improvement
Wendy Tackett, iEval, wendolyn@mac.com
It is generally assumed that a local evaluator can substantially influence changes leading to program improvement. iEval, serving as local evaluator to six school districts in Michigan that are implementing 21st Century Community Learning Centers programs, has seen evaluation findings lead to programmatic improvements, including increases in student academic achievement, improvements in student behavior, and greater buy-in from the school and community. Dr. Wendy Tackett will discuss the types of local data collected, how the data are analyzed, general evaluation findings drawn from state and local data, and how changes made in response to evaluation findings affected subsequent program years.