
Session Title: Assessment in Community Colleges
Multipaper Session 840 to be held in Salinas on Saturday, Nov 5, 8:00 AM to 9:30 AM
Sponsored by the Assessment in Higher Education TIG
Chair(s):
Rhoda Risner, United States Army, rhoda.risner@us.army.mil
Critical Indicators of a Successful QuickStart to College Program in Community Colleges: A Theory-Driven Evaluation Approach
Presenter(s):
Xin Liang, University of Akron, liang@uakron.edu
Abstract: This paper presents the development of an evaluation design built on the principles of theory-driven evaluation in a QuickStart to College project. The goal of the program was to offer low-wage, unemployed, and undereducated adults a free academic and student support services course to help them succeed in college and become employable. Three stages of evaluation questions were asked to test the causal, intervention, and action hypotheses of the QuickStart to College program theory. The reciprocal process of data collection and cross-validation among stages provided timely information on how the planned intervention was actually implemented and allowed evaluators to check whether an unsuccessful intervention was due to implementation failure or program design failure. In turn, the cross-validation of the assumptions provided evidence to refine the program theory and helped generate six critical indicators for a successful QuickStart to College model scalable to community colleges across the state.
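
A minimal sketch of this staged diagnostic logic, assuming hypothetical evaluation questions and evidence flags (none of the names below come from the program itself): the key step is combining intervention-stage fidelity evidence with action-stage outcome evidence to separate implementation failure from design failure.

    from dataclasses import dataclass

    @dataclass
    class Stage:
        name: str         # "causal", "intervention", or "action"
        question: str     # evaluation question tested at this stage
        supported: bool   # did the collected data support the hypothesis?

    def diagnose(delivered_as_planned: bool, outcomes_achieved: bool) -> str:
        """Classify an unsuccessful intervention under the program theory."""
        if outcomes_achieved:
            return "program theory supported"
        if not delivered_as_planned:
            return "implementation failure: intervention not delivered as planned"
        return "design failure: delivered as planned, but the theory did not hold"

    stages = [
        Stage("causal", "Do academic and support barriers explain low college success?", True),
        Stage("intervention", "Was the free course delivered as designed?", True),
        Stage("action", "Did participants succeed in college and become employable?", False),
    ]

    print(diagnose(stages[1].supported, stages[2].supported))
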
Meeting Students' Informational Needs: A Data Analysis Tool to Assess The Quality of Information on Community College Websites
Presenter(s):
Megan Brown, American Institutes for Research, mbrown@air.org
Jonathan Margolin, American Institutes for Research, jmargolin@air.org
Shazia Miller, American Institutes for Research, smiller@air.org
James Rosenbaum, Northwestern University, j-rosenbaum@northwestern.edu
Abstract: This paper presentation describes the processes for creating a rubric and evaluating websites to assess how well community college websites inform prospective students. The rubric was used to evaluate the extent to which websites answer basic questions of prospective students who are selecting a program of study. The rubric items and criteria captured the needs and values of the students, and the approach captured website usability in terms of informational needs. The rubric can serve as a capacity-building tool for colleges to judge the usefulness of their websites or other resources to their target constituency, and this approach can be applied in a wide variety of evaluations. In the presentation we will detail the rubric development and evaluation processes, discuss how we represented student values, and share recommendations for similar efforts. Although the presentation focuses primarily on methodology, we will also present findings and webpage examples of best practices in informing prospective students.
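
As one illustration of how such a rubric might be operationalized (the items, weights, and scores below are invented, not the study's rubric), a site can be scored item by item and rolled up into a weighted informational-usability rating:

    # Hypothetical rubric: questions a prospective student might need answered,
    # weighted by assumed importance; each item is scored 0-4 per website.
    RUBRIC_WEIGHTS = {
        "program cost clearly stated": 3,
        "time to completion listed": 2,
        "job placement outcomes shown": 3,
        "advising contact easy to find": 1,
    }

    def score_site(item_scores):
        """Weighted mean of 0-4 item scores; unanswered items score 0."""
        total = sum(RUBRIC_WEIGHTS.values())
        earned = sum(w * item_scores.get(item, 0) for item, w in RUBRIC_WEIGHTS.items())
        return earned / total

    example_site = {"program cost clearly stated": 4,
                    "time to completion listed": 2,
                    "job placement outcomes shown": 1}
    print(f"informational usability: {score_site(example_site):.2f} / 4")
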
Measuring Fidelity of Implementation in Community College Research
Presenter(s):
Oscar Cerna, MDRC Research Organization, oscar.cerna@mdrc.org
Alissa Gardenhire-Crooks, MDRC Research Organization, alissa.gardenhire@mdrc.org
Phoebe Richman, MDRC Research Organization, phoebe.richman@mdrc.org
Abstract: As a research organization, MDRC has developed standardized procedures for measuring the fidelity of implementation of community college programs. Steps to measure objectives, intended outcomes, program activities, operational feasibility, and stakeholder experiences are part of MDRC's framework for conducting implementation fidelity research at community colleges. This paper highlights the application of implementation fidelity procedures in two distinct evaluation studies: a small-scale pilot study and a large-scale, multi-site random assignment demonstration. Qualitative measures for intended short- and long-term outcomes, program activities, term-to-term progress, and program scale and intensity for both studies are discussed. Procedures for involving community college stakeholders in discussions about measuring program components and evaluation activities are also highlighted. The paper will further inform how measuring fidelity of implementation can be used to evaluate community college programs and will provide college practitioners with useful evaluation tools.
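
A simple sketch of one way a fidelity summary like this can be computed (the components and threshold below are invented for illustration and are not MDRC's framework):

    # Hypothetical planned components for one site, marked True if site
    # visits and stakeholder interviews confirmed they were operating.
    observed = {
        "learning communities formed": True,
        "enhanced advising each term": True,
        "tutoring offered": False,
        "stipends disbursed on schedule": True,
    }

    def fidelity_rate(components):
        """Share of planned program components observed in operation."""
        return sum(components.values()) / len(components)

    rate = fidelity_rate(observed)
    print(f"implementation fidelity: {rate:.0%} ({'adequate' if rate >= 0.75 else 'low'})")
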
A Formative Evaluation of a Dual Admission Program Between a Community College and Elite State University
Presenter(s):
Jason Taylor, University of Illinois at Urbana-Champaign, taylor26@illinois.edu
Abstract: This paper presents findings from a utilization-focused evaluation of a new dual admission and dual enrollment program designed by a community college and an elite state university. The paper presents a theoretical framework, evaluation approach, evaluation methods, and evaluation findings. Using a utilization-focused approach, the evaluation sought to determine the program goals and components from multiple stakeholder perspectives. Drawing on semi-structured interviews with program faculty, staff, and administrators at both institutions, the findings reveal a variety of program goals and components, with mixed perceptions of the potential value of program components for students and for the institutions. Similarly, a number of unexpected consequences have created a new set of challenges and opportunities for the institutions.