Session Title: The Gaining Early Awareness and Readiness for Undergraduate Programs (GEAR UP) Experience: Exploring the Promise of Multi-site Evaluation
Panel Session 533 to be held in Texas F on Friday, Nov 12, 9:15 AM to 10:45 AM
Sponsored by the College Access Programs TIG
Chair(s):

Yvette Lamb, Academy for Educational Development, ylamb@aed.org

Discussant(s):

Melissa Panagides-Busch, Academy for Educational Development, mbusch@aed.org

Abstract:
Gaining Early Awareness and Readiness for Undergraduate Programs (GEAR UP) is a federally funded program that provides services to prepare low-income middle and high school students to enter and succeed in postsecondary education. Academy for Educational Development (AED) is working with multiple partnerships and a state agency to conduct external evaluations of GEAR UP programs.
Data from across six school districts and 40 schools will be used in a multisite analysis to better understand the feasibility of various approaches to program evaluation.
The session begins with our conceptualization of the multisite evaluation of the GEAR UP program. The second paper presents an overview of GEAR UP programs managed by three agencies in three states. The final paper presents our process for data collection and analysis, including the feedback we received from our clients while pursuing the multisite evaluation. The session ends with a discussion of key design questions.
Overview of the GEAR UP Multi-site Evaluation

Yvette Lamb, Academy for Educational Development, ylamb@aed.org

This paper presents our conceptualization of the GEAR UP multisite evaluation. Given that the programs across sites share similar goals and provide similar types of services, and given that we are conducting formative evaluation, we propose cluster evaluation as the overall evaluation approach. The paper begins with an overview of multisite evaluation, following Straw and Herrell's (2002) framework of multisite evaluations. It then discusses why cluster evaluation is the most suitable evaluation approach, given the nature of the intervention the GEAR UP program provides. It also discusses what types of questions multisite evaluations might be able to address, as well as the challenges of answering particular types of questions. The presenter brings her expertise in conducting evaluations of college access and community programs.
GEAR UP Programs: Similarities and Differences Across Sites

David Jochelson, Academy for Educational Development, djochels@aed.org

Susanna Kung, Academy for Educational Development, skung@aed.org

Arati Singh, Academy for Educational Development, asingh@aed.org

Three presenters will each present one of the three GEAR UP programs on which we will be conducting the multisite analysis. We present how the GEAR UP programs implemented by the three agencies differ in the scope of services, specific outcome measures, populations served, and the contexts in which the services are provided. We also present the site-specific evaluation questions, data collection and analysis schedules, and forms of reporting for each site-specific evaluation, so that the audience will have a concrete picture of the types of data we are collecting. The three evaluators who work on the site-specific evaluations, along with the grantees (if they are able to participate), will present this section of the session to provide a comprehensive picture of the evaluation work, including the nature of the collaboration between evaluators and program managers, evaluation use, and information needs at each site.
Designs for Cluster and Multi-site Evaluation for the GEAR UP Program

Mika Yamashita, Academy for Educational Development, myamashita@aed.org

The final paper discusses our plans for the cluster evaluation, which we are currently designing. At this point, we plan to use data analysis techniques drawn from theory-based evaluation (Davidson, 2000) to develop descriptions of how different contexts shape the implementation of GEAR UP programs. We will also discuss several strategies for analyzing the outcome data collected from the three sites for the site-specific evaluations. We will report on the involvement of our clients in identifying evaluation questions, and on our data analysis scheduling in relation to the site-specific evaluation work. Finally, we present how we plan to report findings to sites with different evaluation and information needs. The presenter brings her expertise in qualitative analysis.
Davidson, E. J. (2000). Ascertaining causality in theory-based evaluation. New Directions for Evaluation, 87, 17-26.