
Session Title: Program-Level Evaluation in Medical and Health Science Education: Policies, Practices, and Designs
Multipaper Session 312 to be held in Mineral Hall Section A on Thursday, Nov 6, 1:40 PM to 3:10 PM
Sponsored by the Assessment in Higher Education TIG
Chair(s):
Susan Boser,  Indiana University of Pennsylvania,  sboser@iup.edu
Discussant(s):
France Gagnon,  University of British Columbia,  fgagnon@medd.med.ubc.ca
Developing and Evaluating an Innovative Clinical Education Simulation Curriculum for a Nursing Program
Presenter(s):
Mary Piontek,  University of Michigan Ann Arbor,  mpiontek@umich.edu
Abstract: This paper discusses my collaboration as evaluation consultant with the University of Michigan’s School of Nursing on developing, implementing, and evaluating a curricular revision process for the School’s Initiative for Excellence in Clinical Education, Practice, and Research. The curricular revision includes the development of a series of simulation modules and teaching materials to support deeper integration of clinical skills and knowledge into the BSN degree program. The opportunity to document the curricular development, evaluation, and educational research processes is unique in that an evaluator has been involved from day one of the development process. The paper will describe the process of facilitating the curricular revision and how the use of educational and professional standards shaped that process; it will focus on the evaluation design for assessing the quality of the simulation modules and the impact of the curricular revision on student learning and development of clinical skills and knowledge.
Nursing Curriculum Evaluation Using National League for Nursing Accrediting Commission Standards and Criteria
Presenter(s):
Vicki Schug,  College of St Catherine,  vlschug@stkate.edu
Abstract: Systematic curricular evaluation is essential to assure integrity of baccalaureate nursing programs. This session will provide participants with information about an innovative approach to curricular evaluation using the accreditation standards and criteria required for nursing programs by the National League for Nursing Accrediting Commission (NLNAC). This holistic approach has provided a more comprehensive perspective for nursing faculty and administrators. A conceptual framework will be presented that illustrates the dynamic relationship of curricular content, context, and conduct. It is suggested that curricular content cannot be isolated and must be examined in light of the milieu or context of curricular delivery as well as the conduct or implementation of curriculum. The NLNAC standards of Mission and Governance (I), Faculty (II), and Students (III) relate to the "context" of curricular enactment. Standard IV, Curriculum and Instruction, pertains to the curricular "content." The "conduct" of curricular delivery is addressed through NLNAC standards of Resources (V), Integrity (VI), and Educational Effectiveness (VII). Results of this “best-practice” curricular evaluation approach will be shared along with recommendations and implications for the future.
Studying Curriculum Implementation and Development Through Sustained Evaluation Inquiry: Using Focus Groups to Study Innovations in an Undergraduate Nursing Program
Presenter(s):
William Rickards,  Alverno College,  william.rickards@alverno.edu
Abstract: The undergraduate nursing program at a Midwestern college introduced innovations in its curriculum. After three years, when the first cohort of students was ready to graduate, a program of focus groups was begun that would engage each of the graduating cohorts over the next four semesters. The results of these groups and the subsequent deliberations of the nursing faculty provide a picture of program implementation, described in relation to student learning outcomes, student experience, and faculty practices. The processes for conducting and analyzing the focus groups, as well as the approaches to reporting results to faculty and deliberating on the student experiences, are described in this paper. Across the four groups, students emphasized different aspects of their experiences, with implications for how the nursing curriculum was developing during this period. The group data, in conjunction with curriculum and NCLEX performance data, provide insights into how program innovations proceed.
Challenges Evaluating Comparability: Lessons Learned Evaluating a Fully Distributed Medical Undergraduate Program
Presenter(s):
Chris Lovato,  University of British Columbia,  chris.lovato@ubc.ca
Caroline Murphy,  University of British Columbia,  caroline.murphy@ubc.ca
France Gagnon,  University of British Columbia,  fgagnon@medd.med.ubc.ca
Angela Towle,  University of British Columbia,  atowle@medd.med.ubc.ca
Abstract: The University of British Columbia (UBC) implemented a fully distributed undergraduate medical education program at three separate geographic sites in 2004. Comparability of educational experiences is essential to the success of the program for purposes of accreditation, educational outcomes, and program policy and decision making. Evaluation of comparability is complex and requires more than a simple analysis of test performance across sites. How should comparability be operationalized to evaluate program success? What methods are most appropriate for evaluating comparability? This paper will describe an approach to evaluating comparability of UBC’s medical undergraduate program, including methodological challenges and reflections regarding the operational definition of comparability. Findings regarding students’ educational experiences and how they relate to performance will be presented. We will discuss our perceptions regarding the utility and accuracy of the methods used and provide recommendations regarding further development of this area.
Evaluating the Graduate Education in Biomedical Sciences Program: A Mixed Method Study
Presenter(s):
Laurie A Clayton,  Higher Education,  lclayton@rochester.rr.com
Abstract: National initiatives, global competition, and public health needs require scientists who have interdisciplinary training to meet these 21st-century challenges. In an effort to offer a broad environment from which to enhance the interdisciplinary training of its doctoral students, the University of Rochester School of Medicine and Dentistry developed the Graduate Education in the Biomedical Sciences Program (GEBS). The purpose of this study was to evaluate the GEBS program and the interests and experiences of its graduate students. Implementation of the mixed method design was sequential, beginning with document analysis followed by a combination of quantitative (survey) and qualitative (focus group and interview) methods. Evaluation results moderately supported the GEBS program’s theoretical framework and its interdisciplinary orientation and objectives. Doctoral students draw on a combination of experiences, decision-making skills, and the GEBS in selecting their Ph.D. program.