
Session Title: Online Surveys in Evaluation: Issues and Lessons Learned From Four Evaluation Projects
Panel Session 707 to be held in Panzacola Section G2 on Saturday, Nov 14, 9:15 AM to 10:45 AM
Sponsored by the Integrating Technology Into Evaluation TIG
Chair(s):
Hsin-Ling Hung, University of Cincinnati, hunghg@ucmail.uc.edu
Abstract: The use of technology in evaluation, for example, online (web-based or e-mail) surveys for data collection, has become increasingly widespread in evaluation activities due to many advantages, including time and cost efficiencies, ease of survey administration, and convenience for respondents. However, researchers and evaluators are likely to encounter methodological issues (e.g., sampling, response rates, instrumentation) and data collection challenges when conducting online surveys. This panel presentation addresses these issues and challenges through discussion of both the theoretical foundations and practical perspectives on online surveys. Linking the ideal to the practical world of conducting online surveys in the field of evaluation will be the focus of this presentation. Challenges encountered, possible solutions, and lessons learned in real-life evaluations utilizing online surveys will be discussed.
Handling Technological Tools to Have Meaningful Evaluation Data
Imelda Castaneda-Emenaker, University of Cincinnati, castania@ucmail.uc.edu
This presentation focuses on the use of an online survey in the evaluation of Client K's teacher professional development program integrating literacy across the curriculum. No names or other usual forms of student identification were used on the online survey because the client was concerned about non-response. The main challenge lay in matching the students' responses with their teachers'. The college system provided a database of the 6,000 students taught by the teachers in the program. Because teachers were handling several courses and students were taking different courses, the list contained duplicate names of teachers and students and duplicate email addresses. The email addresses were used as passwords to enter the survey. In the end, only a few students participated in the online survey. The dilemma over identification issues, and the evaluator's naiveté in assuming the database would easily resolve the matching concerns, led to data that were not very meaningful for the evaluation.
Context-Driven Use of Online Surveys in Educational Program Evaluations
Janet Matulis, University of Cincinnati, janet.matulis@uc.edu
This presentation discusses how the context of an educational program helps determine whether data collection is best accomplished via an online survey, a paper-and-pencil survey, or some combination of the two. Although the tendency is for technology-based survey administration to be considered the ideal, this methodology may not be the most appropriate choice for an educational program evaluation. When online surveys are deemed appropriate for an evaluation, the context of the educational program can also dictate whether data integrity would best be attained through survey administration via email and/or a website link. The rationale and process for administering a student survey in both online and paper formats as part of the evaluation of an art museum's K-12 school-based programs will be discussed.
Online Surveys of International Participants and Using Technology in International Collaboration: Challenges and Lessons Learned
Hsin-Ling Hung, University of Cincinnati, hunghg@ucmail.uc.edu
This presentation will be based on an electronic Delphi study of educational program evaluation (EPE) in the Asia-Pacific region conducted by a team of researchers in the U.S. and Taiwan. Unlike the other examples in this panel presentation, this study used both e-mail and a commercial web-based survey service for data collection. Challenges associated with commercial web-based survey services, as well as online survey issues specific to the Delphi method and needs assessment, will be discussed. General methodological considerations for survey research will also be introduced. In addition, the use of technology in international research collaboration will be discussed. The presentation will be of interest to evaluators and researchers interested in online surveys as well as in the application of technology to research and international research collaboration.
Challenges and Lessons Learned From a Community-Based Initiative Evaluation Project
Mary Marx, University of Cincinnati, mary.marx@uc.edu
This presentation is based on a community health project conducting community-based participatory research focused on the nutrition and physical activity of minority adults and youth. An initial challenge related to instrument development, specifically the difficulty of convincing the community agency of the need to shorten the instrument. As experienced in this project, clients often want to take full advantage of a data collection opportunity, insisting on an extremely long and cumbersome online survey to gain what they perceive to be high value for the money they spend on an evaluation. A technical issue involving the survey software's handling of missing values was also encountered. A third issue involved data analysis, particularly alternating back and forth between different software packages. When sharing the data with a research consultant working with the community agency, care was required in recoding where needed to ensure that the calculations would be accurate.
