
Session Title: Value and Valuing in New Learning Environments: Web 2.0 and Real-time Simulation
Multipaper Session 639 to be held in Redondo on Friday, Nov 4, 10:45 AM to 11:30 AM
Sponsored by the Distance Ed. & Other Educational Technologies TIG
Chair(s):
Talbot Bielefeldt, International Society for Technology in Education, talbot@iste.org
Valuing in Developmental Evaluation
Presenter(s):
Chi Yan Lam, Queen's University, chi.lam@queensu.ca
Abstract: Typically, the values underpinning an evaluation become clarified and resolved as decisions are made about goals, methods, processes, priorities, and reporting. However, in evaluations where the goal is to engage in continuous, rapid cycles of learning about a program-in-action and then adapting that program based on what is being learned, the values underpinning decisions are likely to be both less obvious and less stable. These are conditions typical of developmental evaluation. This paper describes a developmental evaluation where the goal was to incorporate Web 2.0 technology into an instructional program as a strategy to overcome contextual constraints. It examines the conditions and processes through which valuing about the program and the evaluation took place. Findings suggest that developmental evaluation may place unique demands on the evaluator to help make explicit the evolving nature of stakeholder values, and to provide a degree of domain-specific expertise in support of the learning cycles.
Addressing Values That Show Up in Evaluation Methodologies and Analyses: External Evaluation in a Live Diagnostic Simulation
Presenter(s):
Debra C Piecka, Wheeling Jesuit University, dpiecka@cet.edu
Manetta Calinger, Wheeling Jesuit University, mcalinger@cet.edu
Laurie Ruberg, Wheeling Jesuit University, lruberg@cet.edu
Abstract: This presentation discusses how values show up in evaluation methodologies and analyses, drawing on two external evaluations. The more qualitative evaluation provided information about the value of videoconferencing programs, while the quantitative evaluation provided information about the simulation as an instructional intervention. During the first external evaluation of a five-year grant to design, develop, and evaluate a live diagnostic simulation for secondary science students, the summative evaluation reflected evaluation methodologies and analyses different from those the developers had foreseen. Although the data instruments included detailed teacher and student pre- and post-simulation activity surveys, the report provided extensive narrative analyses but little quantitative review. To address this gap, the program put out to bid and awarded a second, more specific external evaluation contract that called for survey-related descriptive statistical analysis alongside qualitative analysis. This work adds to the field by sharing observations about the values embedded in the methodologies featured.
