Evaluation 2008


Contact emails are provided for one-to-one contact only and may not be used for mass emailing or group solicitations.

Session Title: Innovative Approaches to Teaching Evaluation in a University Setting
Multipaper Session 299 to be held in Capitol Ballroom Section 6 on Thursday, Nov 6, 1:40 PM to 3:10 PM
Sponsored by the Teaching of Evaluation TIG
Chair(s):
SaraJoy Pond,  Brigham Young University,  sarajoypond@gmail.com
Beyond Problem-Based Experiential Learning: An Applied Research Training Practicum
Presenter(s):
Meghan Lowery,  Southern Illinois University at Carbondale,  meghanlowery@gmail.com
Theresa Castilla,  Southern Illinois University at Carbondale,  castilla@siu.edu
Joel Nadler,  Southern Illinois University at Carbondale,  jnadler@siu.edu
Abstract: Evaluation practice in applied settings relies on solid training programs, and experiential, hands-on learning is an essential component of such training. Applied research training practicums are a unique addition to any graduate-level curriculum, yet very few such programs exist. The Applied Psychology program at Southern Illinois University Carbondale (SIUC) is one of the few programs in the country to offer a student-run practicum. A literature-based comparison reveals different approaches to and techniques used in evaluation training, such as the case study method, re-enactment of past evaluations, and a problem-based learning approach. These methods will be compared to the training practicum offered by SIUC’s Applied Research Consultants (ARC). ARC provides two full years of practical, experience-based training. Challenges, necessary support, and the unique learning experiences that ARC offers will be discussed, as well as the benefits of such training when entering the job market.
Infusing Evaluation Concepts and Skills Across a Professional Curriculum: Evidence-Based Practice as a Theme for a Research Class
Presenter(s):
Kathleen Bolland,  University of Alabama,  kbolland@sw.ua.edu
Abstract: The push for evidence-based practice (EBP) in many fields has an unintended benefit: it provides a way for additional evaluation content to be incorporated into professional curricula. Although the concept is sometimes controversial, and there is some resistance to putting it into practice, there may be more acceptance of EBP than of previous iterations such as “the scientist-practitioner model.” Acceptance of EBP opens the door for infusion of evaluation content into courses where otherwise it has been absent or tangential. In this presentation, I will discuss (a) how I have designed a research course around the theme of evidence-based practice; (b) ways I have highlighted the relationship of EBP, and hence, evaluation, to other curricular areas such as policy and human behavior in the social environment; and (c) further work we evaluators can do to increase appropriate coverage of evaluation concepts and principles across curricula and in textbooks.
Consulting Within the University: Varied Roles, Valuable Training
Presenter(s):
Mark Hansen,  University of California Los Angeles,  hansen.mark@gmail.com
Janet Lee,  University of California Los Angeles,  jslee9@ucla.edu
Abstract: The Social Research Methodology Evaluation Group provides technical assistance and evaluation services to organizations in the community and supports the professional development of novice evaluators. The group is housed within a school of education and is closely linked to a graduate program in research methodology and program evaluation. We are presently working with several university-based outreach programs that seek to improve the college-readiness of youth in the region. Despite a shared purpose, these programs differ in their service approach and the sophistication of their evaluation efforts. Thus, our work has provided an opportunity to assist these programs in a wide variety of capacities: facilitating conversations with stakeholders, developing logic models, focusing evaluation questions, developing measurement tools, enhancing data infrastructure, and building capacity for analysis and communication of findings. This paper will describe these diverse roles and discuss the corresponding value of this setting for training graduate students in evaluation.
Reflections on Evaluation Training by Apprenticeship: Perspectives of Faculty and Graduates
Presenter(s):
Annelise Carleton-Hug,  Trillium Associates,  annelise@trilliumassociates.com
Joan LaFrance,  Mekinak Consulting,  joanlafrance1@msn.com
Abstract: To build evaluation capacity, the Center for Learning and Teaching in the West, a consortium of five universities, numerous community and tribal colleges, and public school districts in Montana, Colorado and Oregon, developed a unique approach for evaluation of this NSF-funded program. Under the guidance of experienced evaluators, doctoral students in science and mathematics education participated in an evaluation apprenticeship involving all aspects of program evaluation, including theory development, design of evaluation plans, data collection, analysis, and reporting. The hands-on experience proved to be instructive on many levels and influenced the opinions of campus leadership toward evaluation. The paper presents the perspectives of university faculty involved as project leaders, as well as reflections of the graduate students who participated in the apprenticeship. The discussion shares the strengths and challenges of implementing this type of apprenticeship training for future education leaders.

