Evaluation 2009



Session Title: Instrumentation Strategies for "Digitally" Interactive Populations
Multipaper Session 319 to be held in Wekiwa 3 on Thursday, Nov 12, 1:40 PM to 3:10 PM
Sponsored by the Distance Ed. & Other Educational Technologies TIG
Chair(s):
Theresa Murphrey,  Texas A&M University, t-murphrey@tamu.edu
Involvement in Virtual Schooling: The Validation of an Instrument to Measure Parental Involvement Mechanisms
Presenter(s):
Feng Liu, University of Florida, martinlf@ufl.edu
Erik Black, University of Florida, erikwblack@gmail.com
James Algina, University of Florida, algina@ufl.edu
Abstract: Parental involvement has been recognized as an important factor in students' achievement in traditional school settings. The lack of research on the effect of parental involvement on students' achievement in virtual schooling is due, in part, to the dearth of a valid and reliable instrument to measure this construct. In this study, a Parental Involvement Mechanisms Model comprising four factors (parent reinforcement, parent modeling, parent encouragement, and parent instruction) and 51 indicators/items was analyzed using confirmatory factor analysis and exploratory factor analysis with data collected from 938 participants (parents) in virtual schooling. The results show that the instrument is, overall, a valid measure. The strong relationships among the four factors provide evidence that the items intended to measure them may overlap to some degree. Suggestions are given for modifying the instrument for future research.
Incorporating eeLearning Into the Teaching of Evaluation
Presenter(s):
Theresa Murphrey, Texas A&M University, t-murphrey@tamu.edu
Abstract: Teaching courses online has become a common practice. In fact, universities are reporting increasingly higher numbers of courses being delivered online. Teaching evaluation online offers unique challenges and opportunities for engaging students and encouraging constructive dialog that can improve comprehension of the subject matter. Fortunately, technologies are allowing experiential, electronic learning (eeLearning, as coined by Trevitte & Eskow, 2007) as never before. This session will present a literature review of technologies currently available to assist in teaching evaluation and a first-hand account of using eeLearning technologies to encourage engagement and teamwork. The use of both asynchronous (e.g., discussion boards, email, Jing™) and synchronous (e.g., Centra™, chat, and phone) technologies will be discussed in the context of both individual and group assignments.
The Unfolding Model: Using Test Validity to Guide Professional Program Evaluation Studies in Distance Education and E-learning
Presenter(s):
Valerie Ruhe, University of British Columbia, valerie.ruhe@ubc.ca
Bruno D Zumbo, University of British Columbia, bruno.zumbo@ubc.ca
Abstract: In this paper, we present Ruhe and Zumbo's (2008) Unfolding Model, a new model of program evaluation based on Messick's (1989) model of test validity. Based on scientific evidence, relevance, cost-benefit, underlying values and unintended consequences, our model unfolds to reveal practical tools and strategies for diverse technology-based contexts. Our approach brings rich, theoretical insights from test validity into program evaluation and distance education. The model is also a practical tool to guide scientific evaluation studies of innovative instructional programs. With this model, we are responding to recurring calls for a more professional approach to the evaluation of innovative instructional programs. Our model has been tested on four authentic post-secondary courses in diverse subject areas, and is especially helpful for novice evaluators. Finally, its adaptive and dynamic quality ensures its relevance for Web 2.0 and beyond.
