
Session Title: Data Feedback Loops in Educational Settings: Intervention Process and Student Progress Monitoring Data Visualization Tools and Procedures
Multipaper Session 203 to be held in California A on Thursday, Nov 3, 8:00 AM to 9:30 AM
Sponsored by the Pre-K - 12 Educational Evaluation TIG
Chair(s):
Charles Reichardt, University of Denver, creichar@du.edu
Discussant(s):
Charles Reichardt, University of Denver, creichar@du.edu
Abstract: This session demonstrates the management, implementation, and evaluation of a promising high school intervention ("graduation coaching") for students at risk of academic failure, one with the potential to be implemented widely by the national GEAR UP community and by other schools and school systems to promote retention and graduation. The discussion illustrates how careful project planning strategies and well-designed, innovative formative evaluation tools can aid and enhance implementation success and sustainability. We discuss intervention models; data gathering, analysis, and utilization techniques; and new visualization tools for monitoring student educational progress in middle and high school. Among these are site planning and continuous improvement process (CIP) models; intervention tools such as data-based student diagnostic and referral methods, computerized daily activity and intervention logs, and case notes; and dashboards that use readily available school student information system data to visualize student performance histories and recent progress in core studies and individual academic subject areas and to diagnose strengths and weaknesses.
The Continuous Improvement Process (CIP) Model Applied to Educational Intervention Projects
Shelly Carpenter, Western Michigan University, shelly.carpenter@wmich.edu
Pamela Zeller, Western Michigan University, pamela.zeller@wmich.edu
The Continuous Improvement Process (CIP) model is based on three major components: legal, financial, and compliance aspects; intervention implementation and program activities; and formative and summative evaluation. Applied to large or small educational intervention projects, these components are not discrete elements; rather, they function interdependently at all phases of a project and provide the framework for overall project management. Key to the model's success is feedback from formative evaluation of the process itself and input from all stakeholder groups, which allow the three components to be assessed continually and the feedback used to make improvements that ensure reliability, efficiency, and project effectiveness. Five assessment tools are particularly useful as aids in managing these components: the Site Plan, Technical Reports, Activity and Intervention Reports, Daily Logs, and Case Notes. This presentation demonstrates these and other assessment tools used in CIP, illustrating how they provide structure and guidance for strategic management and successful evaluation.
Formative Evaluation Tools for Assessing School Intervention Processes
Pamela Zeller, Western Michigan University, pamela.zeller@wmich.edu
Shelly Carpenter, Western Michigan University, shelly.carpenter@wmich.edu
Formative evaluation tools developed for the GEAR UP/RTI project to determine the effectiveness of graduation coaching in a rural high school are presented and generalized for use in assessing typical school-based interventions. Project goals included identifying at-risk students in need of coaching, improving academic performance, lowering the dropout rate, and promoting graduation. Evaluation instruments included Student Referral Forms; Student Plans assessing each student's academic history, strengths, and weaknesses; Daily Activity Log databases recording student intervention information (see the sketch below); coach Student Case Notes; and Dashboard computer visualizations of student academic progress. These instruments provided valuable information about the impact of the services on students and were useful for continuous formative guidance of the project. Particular attention was given to factors that distinguish urban from rural implementation of the graduation coaching program. Bi-monthly meetings with the project team, including the coach, identified developing problems and barriers and fine-tuned the coaching intervention in this rural setting.
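As a purely illustrative aside, a computerized daily activity log of the kind mentioned above could be modeled as a simple structured record. The abstract does not specify the project's actual database schema, so every field name and value in this Python sketch is a hypothetical assumption.

    from dataclasses import dataclass
    from datetime import date

    @dataclass
    class DailyLogEntry:
        """One coach-student contact; a hypothetical record layout,
        not the project's actual schema."""
        student_id: str
        entry_date: date
        activity: str              # e.g., "tutoring", "parent contact"
        minutes: int               # duration of the contact
        referral_source: str = ""  # who referred the student, if anyone
        case_notes: str = ""       # free-text narrative written by the coach

    # Example record
    entry = DailyLogEntry("S1042", date(2011, 10, 14), "tutoring", 45,
                          referral_source="teacher",
                          case_notes="Reviewed algebra homework; missed quiz.")
    print(entry)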
Modeling and Visualizing Student Performance Data: Academics and Behaviors
Warren Lacefield, Western Michigan University, warren.lacefield@wmich.edu
Brooks Applegate, Western Michigan University, brooks.applegate@wmich.edu
This study describes new data visualization tools for data-driven decision making in educational environments. The methodology involves a well-defined, data-based diagnostic identification and selection procedure for choosing students at risk of academic failure for appropriate academic support services. Dashboards displaying longitudinal performance trajectories across the middle and high school years, disaggregated by subject and combined with behavior, attendance, and other information, can serve diagnostic functions by displaying history and progress not only in aggregate core studies but also in math, science, language arts, and history/social studies. They also serve formative and summative evaluation functions by allowing predictions to be compared directly to outcomes. We present these dashboards in the context of the graduation coaching intervention, where students classified as at risk formed an initial caseload and outcome projections were updated and provided to the coach quarterly, together with weekly electronic gradebook data, allowing individualized intervention adjustment and caseload refocusing.
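To make the idea concrete, here is a minimal sketch, in Python with matplotlib, of the kind of longitudinal, subject-disaggregated trajectory display the abstract describes. The data, the 2.0 at-risk cutoff, and the grade-8/9 boundary marker are all hypothetical; the presenters' actual dashboards are not reproduced here.

    import matplotlib.pyplot as plt

    # Hypothetical term GPAs (4.0 scale) for one student, grades 6-10.
    # A real dashboard would pull these from the school's student
    # information system.
    grades = [6, 7, 8, 9, 10]
    subjects = {
        "Math":           [3.1, 2.8, 2.4, 2.0, 1.8],
        "Science":        [3.0, 2.9, 2.7, 2.5, 2.3],
        "Language Arts":  [3.4, 3.3, 3.2, 3.1, 3.0],
        "Social Studies": [3.2, 3.0, 3.1, 2.9, 2.8],
    }

    fig, ax = plt.subplots(figsize=(7, 4))
    for name, gpa in subjects.items():
        ax.plot(grades, gpa, marker="o", label=name)

    # Illustrative cutoff only, not the study's diagnostic criterion.
    ax.axhline(2.0, linestyle="--", color="gray", label="at-risk threshold")
    # Middle school / high school transition.
    ax.axvline(8.5, linestyle=":", color="black")

    ax.set_xlabel("Grade level")
    ax.set_ylabel("Term GPA (4.0 scale)")
    ax.set_title("Subject-area trajectories, hypothetical student")
    ax.set_xticks(grades)
    ax.set_ylim(0, 4)
    ax.legend(loc="lower left", fontsize=8)
    plt.tight_layout()
    plt.show()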
Analytical Basis for Modeling of Student Performance Data: Validity, Automation, Updating, and Interactive Evaluation Processes
Brooks Applegate, Western Michigan University, brooks.applegate@wmich.edu
Warren Lacefield, Western Michigan University, warren.lacefield@wmich.edu
New visualization tools and algorithmic procedures use readily available school information system data and "dashboards" for visualizing student performance histories and recent progress to identify which students appear to be at substantial academic risk, what some of those risks are, and who appears likely to benefit from specific academic support service interventions. This paper examines the theoretical and analytic basis for employing these procedures. Key research questions include: Can humans make valid predictions about high school performance based on visualizations of middle school data? Can pattern-matching algorithms match human judgments? Can humans discriminate and diagnose well enough to guide individualized intervention planning? Can this process be automated? Most importantly, do such models have strong predictive validity and growth measurement and evaluation potential when applied to actual outcome data with and without intervention efforts (e.g., using continuous or discontinuous regression models)?
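The closing question can be illustrated with a discontinuous (interrupted) regression of the general kind the abstract names. The sketch below is an assumption about the analytic form, not the authors' actual model: it fits a level shift and a slope change at a hypothetical intervention point by ordinary least squares, using made-up quarterly GPA data.

    import numpy as np

    # Hypothetical quarterly GPAs for one student; graduation coaching
    # is assumed to begin at quarter 6.
    t = np.arange(12, dtype=float)
    start = 6
    gpa = np.array([2.6, 2.5, 2.3, 2.2, 2.0, 1.9,   # pre-intervention decline
                    2.1, 2.2, 2.4, 2.5, 2.6, 2.8])  # post-intervention recovery

    post = (t >= start).astype(float)   # level-shift indicator
    t_post = post * (t - start)         # slope-change term

    # Model: gpa = b0 + b1*t + b2*post + b3*t_post + error.
    # b2 estimates the immediate discontinuity at the intervention;
    # b3 estimates how much the trajectory's slope changes afterward.
    X = np.column_stack([np.ones_like(t), t, post, t_post])
    beta, *_ = np.linalg.lstsq(X, gpa, rcond=None)

    b0, b1, b2, b3 = beta
    print(f"pre-intervention slope:      {b1:+.3f} GPA per quarter")
    print(f"level shift at intervention: {b2:+.3f}")
    print(f"slope change (post - pre):   {b3:+.3f} GPA per quarter")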
