Surveys: A Tool for Building a Case Study?

Presenter(s):

Natalya Gnedko, Chicago Public Schools, ngnedko@cps.k12.il.us

Denise Roseland, University of Minnesota, rose0613@umn.edu

Abstract:
This paper presents the findings and experiences of an internal evaluation team that used a survey to develop a collective case study. The choice of a survey as a tool for building a case study arose when program planners asked for “stories” from in-school instructional coaches about their successful and unsuccessful experiences working with teachers, but were unable to dedicate the resources required for interviews. In response, the evaluation team developed a survey composed mostly of open-ended questions, designed to guide coaches through recounting their experiences in a way that would help each create a “story.” To analyze the responses, the evaluation team applied a framework developed by external evaluators, thus beginning efforts to validate that framework. The survey and resulting case study were part of the larger evaluation of the district’s in-school instructional coaching program.

Are Surveys Enough? A Case Study in Employing User Tests and Focus Groups to Improve Website Evaluation

Presenter(s):

Michael Porter, College Center for Library Automation, mporter@cclaflorida.org

Dawn Aguero, College Center for Library Automation, daguero@cclaflorida.org

Aimee Reist, College Center for Library Automation, areist@cclaflorida.org

Abstract:
The web-based Library Information Network for Community Colleges (LINCCWeb) is the library-resource search tool used by nearly 1,000,000 students, faculty, and staff at 80 libraries across Florida’s 28 community and state colleges. The resource is provided by the College Center for Library Automation (CCLA) in Tallahassee. Historically, CCLA has had difficulty obtaining in-depth feedback from students and faculty: annual surveys of these users have yielded useful information, but not the in-depth, qualitative insight desired. Targeting these users, CCLA conducted focus groups and user tests at seven campuses across five colleges throughout Florida. These evaluations proved extremely valuable, giving CCLA a new, in-depth understanding of user needs. This session will share lessons learned on marketing, recruiting, providing incentives, the logistics of conducting sessions, and the analysis and reporting of focus group and user test results. It will be especially valuable for others conducting low-budget, non-profit evaluation.