|
Increasing College Access for Underrepresented Youth: Developing a Comprehensive Evaluation of a Summer Bridge Program
|
| Presenter(s):
|
| Brianna Kennedy,
University of Southern California,
blkenned@usc.edu
|
| Abstract:
Out-of-School Time (OST) programs have blossomed in the field of education as ways to supplement students' time in school. While these programs vary in mission and outcomes, the vast majority lack formal evaluation and thus cannot account for the effects of their activities. This paper discusses the efforts of one OST program, hosted by a large research university, to create a formal evaluation process that will guide its own development and serve as a template for others. The formal evaluation included the use of a logic model in program planning and implementation, and the use of verifiable measurement methods.
|
|
Evaluating College Access Program Effects: A Dosage Model and Perspective
|
| Presenter(s):
|
| Gary Skolits,
University of Tennessee, Knoxville,
gskolits@utk.edu
|
| Abstract:
Evaluating college access “project effects” offers special challenges for the evaluator. This paper presentation addresses the unique challenges of using a “dosage” model to determine the project effects of a six-year GEAR UP partnership project that served a cohort (class of 2006) for six years. The paper and presentation will address:
1. Unique data requirements for dosage analysis
2. Methodological challenges of developing dosage indices
3. Consolidation of a wide array of college access initiatives into a dosage construct
4. Separating college access project interventions/effects from other school-based initiatives
5. Longitudinal (multi-year) challenges of dosage
6. Applying dosage to the analysis of project outcomes
7. Dosage analysis strengths and limitations
This paper/presentation is relevant to evaluators investigating college access programs and other school projects that: a) cover a wide array of interventions, b) are offered alongside other school improvement initiatives, and c) are not amenable to the establishment of a meaningful comparison group.
|
|
The Detroit Area Pre-college Engineering Program (DAPCEP) National Science Foundation (NSF) Information Technology Experiences for Students and Teachers (ITEST) Project: Embedding Evaluation in Program Experiences
|
| Presenter(s):
|
| Shannan McNair,
Oakland University,
mcnair@oakland.edu
|
| Margaret Tucker,
Detroit Area Pre-College Engineering Program,
mtucker@dapcep.org
|
| Jason Lee,
Detroit Area Pre-College Engineering Program,
jdlee@dapcep.org
|
| Karla Korpela,
Michigan Technological University,
kokorpel@mtu.edu
|
| Abstract:
The NSF Information Technology Experiences for Students and Teachers (ITEST) program invests in informal education programs for middle and high school students that are intended to stimulate interest in high technology fields, as well as professional development for teachers that emphasizes IT-intensive STEM subject areas. This presentation will discuss the opportunities and challenges experienced in determining the impact of the DAPCEP (NSF) ITEST program for students and their parents. Students complete pre- and post-program surveys of technology knowledge, skills, and attitudes; take pre- and post-tests of key concepts each semester; and participate in focus group interviews. Parents complete pre- and post-program surveys of technology knowledge, skills, and attitudes; complete a mid-program survey; and participate in focus group interviews. Evaluation findings for the first cohort of 120 students will be discussed, along with plans to track students through college.
|
|
Evaluating Scholarship Programs: Models, Methods, and Illustrative Findings
|
| Presenter(s):
|
| Gary Miron,
Western Michigan University,
gary.miron@wmich.edu
|
| Abstract:
This paper is based on two innovative evaluations of scholarship programs conducted by the author. Large differences exist in the design and intent of the two scholarship programs that were evaluated. For this reason, differing models and methods were used for each of the evaluations. While the session will not focus on findings, some illustrative findings will be discussed since they exemplify the methods selected for the evaluations.
|