An Evaluation of Wiki as a Tool for Building Communal Constructivism in a Graduate-Level Course

Presenter(s):

Kathleen D Kelsey, Oklahoma State University, kathleen.kelsey@okstate.edu

Hong Lin, Oklahoma State University, hong.lin@okstate.edu

Tanya Franke, Oklahoma State University, tanya.franke@okstate.edu

Abstract:
Wikis have been praised as tools that enhance learning and collaborative writing in educational environments and move learners toward a state of communal constructivism (Holmes et al., 2001). Many pedagogical claims have been made about the benefits of using wikis; these claims, however, have rarely been tested empirically. This study used a three-year longitudinal cohort survey design (Creswell, 2008) to test the pedagogical claims made for wikis, including the theory of communal constructivism, when a wiki was implemented as a writing tool to create an online textbook in a graduate-level course. Holmes et al.'s (2001) assertions were not substantiated by our findings: the overall survey mean was 2.33 on a four-point scale, indicating that learners were unsure whether the wiki writing experience affected their knowledge construction or critical thinking skills. Instructors must encourage and reward students for collaborating and must stress learner responsibility when using wikis as collaborative writing tools.

Evaluating Learners and Building Evaluation Capacity in an Online Community Learning Model

Presenter(s):

Cindy Beckett, Independent Consultant, cbevaluate@aol.com

Abstract:
Evaluating learners in an online environment where a community learning model is implemented through peer and professional mentor interaction presents unique evaluation opportunities. In this format, evaluating learners' process and progress can be challenging. In some cases, the degree to which these elements can be evaluated depends on information gleaned from site activity and on the evaluation capacity built in, for example, the ability to collect data on the community website. Challenges arise when evaluators are faced with web-based programs that did not consider an evaluation approach or tools from the outset and therefore lack capacity built into their online programs. The following example, an evaluation of a non-profit organization, illustrates the benefits and challenges of evaluating web-based learning in this context and offers some solutions and insight into evaluating distance learning in a community learning model.

Evaluating Supplemental Educational Services: A Randomized Control Trial

Presenter(s):

S Marshall Perry, Dowling College, smperry@gmail.com

Abstract:
This paper concerns an evaluation of an online, individualized supplemental educational services program aimed at improving middle school reading performance for students who are below grade level. The diverse study sample consisted of nearly 400 students from 15 schools in three states. Study measures included two standardized assessments and a student survey administered three times over a school year. The paper examines a randomized controlled trial designed to determine the relationship between student involvement in the program and changes in academic achievement, attitudes, and behaviors. By the mid-tests, the treatment group significantly outperformed the control group by nearly three-quarters of a grade level. Students with lower pre-program achievement tended to experience greater growth. The treatment and control groups did not always differ significantly on attitudinal and behavioral measures, but changes in those measures were correlated with academic growth. The paper also highlights methodological and logistical challenges and implications for future evaluation research.

Evaluation of the Multi-Phase Release of Florida's Community College Library-Resource Website: A Mixed Methods Approach

Presenter(s):

Michael Porter, College Center for Library Automation, mporter@cclaflorida.org

Dawn Aguero, College Center for Library Automation, daguero@cclaflorida.org

Barry Harvey, College Center for Library Automation, bharvey@cclaflorida.org

Aimee Reist, College Center for Library Automation, areist@cclaflorida.org

Abstract:
The web-based Library Information Network for Community Colleges (LINCCWeb) is the library-resource search tool used by nearly 1,000,000 students, faculty, and staff at 80 libraries of Florida's 28 community and state colleges. In 2008, LINCCWeb version 2.0 was released to library staff while still in development ('beta') by Florida's College Center for Library Automation (CCLA). A second beta release, for staff, faculty, and students, occurred in early 2009. During these periods, staff critiqued the website via online forums and the CCLA help desk. Students and faculty participated in focus groups and user testing, and also had the opportunity to complete a LINCCWeb version 2.0 user-satisfaction survey. A small team of CCLA staff analyzed all of the qualitative and quantitative data, identifying important trends and themes for CCLA's web developers. This paper presents the team's analysis approach and discusses lessons learned for conducting an extensive website evaluation.