|
Lessons Learned in our own Backyard: Evaluation in a University Setting
|
| Presenter(s):
|
| Cidhinnia M Torres Campos,
Crafton Hills College,
cidhinnia@yahoo.com
|
| Beatriz Ornelas,
California State University, Los Angeles,
ornelasbeatriz@yahoo.com
|
| Abstract:
Few reports have distilled the implications of the growing effort to evaluate college intervention programs. This paper outlines lessons drawn from the experience of implementers and evaluators of a targeted intervention for Latino freshmen. For two years, program staff, faculty, and researchers have collaborated to evaluate and learn from the results of a program on an urban commuter campus that screened over a thousand students in order to identify those at high risk and increase their academic success. The presentation highlights how evaluation efforts contribute to creating learning communities. Eight lessons reflect several themes: the timing and function of evaluation, the value and timing of feedback, the role of intervention research, the importance of openly communicating findings, and the uses of evaluation results by program and university staff. The presentation offers practical advice on implementing and effectively evaluating college-level interventions and examines the process of learning from program evaluation in a university context.
|
|
Combating the Decline: A Report on Attraction, Retention and Learning Evaluation Data From Higher Education Computing Science Classrooms Using Emerging Technologies
|
| Presenter(s):
|
| Jamie Cromack,
Microsoft Research, External Research and Programs,
jamiecr@microsoft.com
|
| Abstract:
By the year 2014, employment in computer-related industries is expected to grow between 28 and 68 percent, yet an alarming decline in the number of incoming freshmen choosing to major in computing science (CS) and computer engineering (CE) bodes ill for American graduates. The questions of attraction, retention, and learning in CS courses may be addressed by using cutting-edge technologies and approaches to draw students into CS and CS-related classrooms, keep them there, and improve their learning. Evaluation data from a growing body of research on these innovative CS and CS-related courses show strong potential, but more evidence is needed to support this assertion. This report analyzes over 30 research studies of CS and CS-related courses that used advanced technologies as pedagogical tools, describes the mixed-method evaluation designs employed, assesses their suitability for each setting, and makes recommendations for further research.
|