|
Teaching Evaluation Skills in Trinidad and Tobago: Obstacles and Solutions
|
| Presenter(s):
|
| Lindsay Nichols,
Loyola University, Chicago,
lnicho2@luc.edu
|
| Lisa Sandberg,
Loyola University, Chicago,
lsandbe@luc.edu
|
| Aisha Leverett,
Loyola University, Chicago,
jlevere@luc.edu
|
| Linda Heath,
Loyola University, Chicago,
lheath@luc.edu
|
| Abstract:
Evaluation research is often exported from the U.S. around the globe, but students rarely have the opportunity to conduct international evaluations during their training. This gap is unfortunate, as working on foreign soil teaches the importance of cultural awareness and sensitivity and requires the evaluator to work harder to understand the systems and assumptions of the host country. With funding from the Graduate School of Loyola University Chicago, we were able to conduct an evaluation of the Student Advisory Services at the University of the West Indies, St. Augustine Campus, as part of a graduate course in Evaluation Research. Working collaboratively with faculty and staff at UWI and depending heavily on "on-site experts," the authors completed the evaluation within one academic term. The background, logistics, and lessons learned are discussed.
|
|
Growing New Buds on the Evaluation Tree: Undergraduate Students' Interest in Program Evaluation
|
| Presenter(s):
|
| John LaVelle,
Claremont Graduate University,
john.lavelle@cgu.edu
|
| Abstract:
Currently, few people seek a career in program evaluation, even though the demand for evaluators exceeds the supply. For the profession to grow, there must be a consistent flow of trained evaluators entering the field each year. Undergraduate students are a potential pool of future evaluators, but little is known about their interest in pursuing a career in program evaluation. The purpose of this study was to collect preliminary data on undergraduate students' interest in the field of program evaluation.
The researcher collected data from 89 undergraduate students at a Midwestern university. Participants were asked to describe program evaluation (PE), read a description of PE, indicate their familiarity with and interest in PE as a career, and respond to semantic differentials assessing their global attitude toward PE. Their responses may guide our efforts to grow the evaluation profession.
|
|
Program Evaluation to Guide Training for State-wide Federally Funded College Access Initiative: The Experience of First-time Evaluators
|
| Presenter(s):
|
| Karyl Askew,
University of North Carolina, Chapel Hill,
karyls@email.unc.edu
|
| Bridget Weller,
University of North Carolina, Chapel Hill,
bweller@email.unc.edu
|
| Tangie Gray Fleming,
University of North Carolina, Chapel Hill,
tangie_gray@unc.edu
|
| Abstract:
Graduate student courses can both advance the field of evaluation and provide a quality service to organizations. As part of an introductory graduate-level evaluation training course offered at the University of North Carolina at Chapel Hill, a three-member team was commissioned to conduct an evaluation to inform the training of district-level administrators of a multi-million-dollar, statewide, federally funded college access initiative. The sample included 20 school district-level and three state-level directors serving over 6,000 aspiring first-generation college attendees. Presenters will highlight the benefits and challenges of learning the fundamentals of evaluation as part of a graduate course. The purpose of this presentation is to share reflections on how course assignments, classroom discussions, instructor mentoring, and a real-world pilot study facilitated 1) the delivery of an evaluation product that had both utility and influence for the client and 2) the professional development of first-time evaluators.
|
|
Integrating Client Education With the Evaluation Process
|
| Presenter(s):
|
| Christopher L Vowels,
Kansas State University,
cvowels@ksu.edu
|
| Jason Brunner,
Kansas State University,
jbrunner@ksu.edu
|
| Abstract:
Evaluation offices are commonly approached by clients with varying levels of understanding of the evaluation process and related research methodologies. To promote client understanding, we propose the Evaluate-and-Educate (E2) method, which builds client training into the evaluation process while recognizing that evaluations are often constrained by limited resources and time. By integrating the client's level of understanding into the evaluation plan, opportunities arise for instruction on correctly using graphical information, accurately and effectively interpreting statistical test results, and better understanding the effects of different research designs. Thus, the client not only receives the requested evaluation services but is also educated as a functional part of the evaluation process. In this respect, the education is innately linked to the evaluation rather than seen as an additional request or burden. Likewise, increased client understanding makes future, more rigorous evaluations more likely.
|