|
Course Evaluation Redesign Process, Outcomes, & Consequences: How One School Simplified Its Tool to Find Out It Wasn't That Simple
|
| Presenter(s):
|
| Tanya Ostrogorsky, Oregon Health & Science University, ostrogor@ohsu.edu
|
| Abstract:
This presentation focuses on outcomes experienced during the redesign of the Oregon Health & Science University School of Nursing (OHSU SON) end-of-term course and teaching effectiveness evaluations. The presentation will focus on fostering stakeholder engagement in the redesign process, increasing the utility of results, reducing respondent burden, incorporating evaluations into our online learning management system (LMS), and determining the criteria by which courses are referred for curriculum committee review.
Additionally, the presenter will discuss how the concept of 'simplify' was used to guide the redesign process and how that touchstone word helped maintain focus on the goals of the redesign, but also how that guiding principle unintentionally made other aspects of the evaluation more complex. Other issues to be discussed include how an increasing reliance on team teaching and/or heavy guest lecturer participation in courses has affected the evaluation process, as well as the challenges of integrating evaluations into an open-source LMS.
|
|
The Value of Employing Descriptive Performance Levels in the Learning Assessment of Army Commanders
|
| Presenter(s):
|
| Linda Lynch, Instructional Systems Specialist, Quality Assurance Office, linda.l.lynch@us.army.mil
|
| Abstract:
The purpose of the Army School for Command Preparation (SCP) Tactical Commander Development Program is to instruct new Brigade and Battalion Commanders in leadership and tactical skills prior to taking over a new command. In the past, a pre-/post-course survey using nominal 1-5 item responses demonstrated that significant learning had taken place for each learning objective. Changing the item responses to progressive performance descriptions has presented new opportunities and valuable outcomes for SCP, faculty, and students. For SCP, significant learning is now specifically tied to the performances associated with each learning objective. Faculty can assess learning nuances within each class, and students are clear about the progressive performance levels associated with their new role in command.
|
|
Qualitative Feedback in Online Course Evaluations: A standardized analysis of written comments
|
| Presenter(s):
|
| David Nelson, Purdue University, nelson8@purdue.edu
|
| Abstract:
Despite significant research on the role of gender in student course evaluations, analyses of qualitative student feedback occupy a minor place in the literature. This paper explores the influence of gender on the type of qualitative feedback that instructors receive. Using a rubric, written comments on over 15,000 student course evaluations, conducted exclusively online at a large research university, were analyzed to determine if male or female instructors were more likely to receive comments that were personalized or unrelated to course instruction. Comments were also examined to determine if the gender of the student influenced the likelihood of such feedback.
|
|
Problems with Multi-purpose Postsecondary Course Evaluations
|
| Presenter(s):
|
| Stanley Varnhagen, University of Alberta, stanley.varnhagen@ualberta.ca
|
| Jason Daniels, University of Alberta, jason.daniels@ualberta.ca
|
| Brad Arkison, University of Alberta, brad.arkison@ualberta.ca
|
| Abstract:
In order for course evaluations in postsecondary education to be appropriately valued, the process and instruments need to effectively support different evaluation goals. At our post-secondary institution, a single, end-of-course evaluation is conducted, with the results serving at least three distinct functions: assisting with promotion and tenure decisions, providing the instructor with formative feedback, and providing information to students to assist in course and section selection. Of these three uses, the most prevalent is informing promotion and tenure decisions. This summative emphasis, including instituting evaluation procedures (e.g., timing, questions) best suited for this purpose, has implications for the evaluation's broader usefulness. Specifically, this summative focus negatively affects the usefulness of the formative feedback to the course instructor. In this paper we present a case study of how summative and formative uses of course evaluation data are affected by a one-size-fits-all approach, and suggest an alternative approach.
|