Testing an Unusual Model of Professional Development: The Self-Directed Study Group

Presenter(s):

Nancy Carrillo, Apex Education, nancy@apexeducation.org

Abstract:
A local AEA affiliate sponsored a professional development study group in which participants learned about organizational network theory and analysis. Two coordinators, neither of them experts in the field, created a simple syllabus comprising four pedagogical elements: individual learning (reading and practicing network analysis on freeware), monthly group discussions, online dialogue and resource sharing, and a jointly created final presentation to AEA affiliate members. A non-participating evaluator worked with the coordinators to examine the success of the professional development in terms of participants' perceived linkages between the course and real-life practice, satisfaction with both the content and the process of the experience, and the acquisition of knowledge and skills. Evaluation methods included a post-course survey, monthly monitoring of attendance and participation, and an iterative reflective exercise in which participants examined their own learning over the course of eight months. Findings suggest dos and don'ts for similar professional development exercises.


Valuing Evaluation Through New Media Literacy: Using AEA365 Blog to Prepare Next Generation Evaluators

Presenter(s):

Sheila Robinson Kohn, University of Rochester, sbkohn@rochester.rr.com

Kankana Mukhopadhyay, University of Rochester, kankana.m@gmail.com

Abstract:
This paper illustrates and critically analyzes the potential of using a blog, AEA365 Tip-a-Day by and for Evaluators, as a teaching and learning tool for educating evaluators in a university certificate program designed to prepare the next generation of evaluators. The blog is used to encourage students to engage in unique ways of interacting, learning, and thinking about evaluation theory and practice. Grounded in empirical evidence gathered from a diverse group of doctoral students over the past three semesters, our paper systematically documents the role of new media literacy, specifically AEA365, in the pedagogical practice of teaching evaluation. In addition, we offer a perspective on how the blog bridges the theory-practice gap and, as such, has enriched our own understanding, as evaluators-cum-instructors, of how best to support emerging evaluators in embracing values and valuing in evaluation practice.


Evaluation Faculty and Doctoral Students' Perspectives on Using a Portfolio as a Comprehensive Exam Option

Presenter(s):

Jennifer Morrow, University of Tennessee, jamorrow@utk.edu

Gary Skolits, University of Tennessee, gskolits@utk.edu

Jason Black, University of Tennessee, jblack21@utk.edu

Susanne Kaesbauer, University of Tennessee, Knoxville, skaesbau@utk.edu

Thelma Woodard, University of Tennessee, Knoxville, twoodar2@utk.edu

Abstract:
In this paper presentation we will discuss using a portfolio as an option for the comprehensive exam in an Evaluation, Statistics, and Measurement (ESM) graduate program. Two years ago, while revising our graduate program curriculum, we decided to offer a portfolio as one of three comprehensive exam options for our doctoral students (the other two being the traditional four questions and a major area paper). As faculty, we felt that a portfolio would best represent students' body of work during their graduate careers. We will discuss our reasons for creating the portfolio option, the detailed guidelines we created for students completing their portfolios, and students' perspectives on this comprehensive exam option. Lastly, we will discuss how other faculty can implement this option in their graduate programs and how best to evaluate students' completed portfolios.


Service-Learning Methods in Community-Based Participatory Evaluation: Implications of Service-Learning on Workforce Diversity, Student Capacity Building, and Community Support

Presenter(s):

Tabia Akintobi, Morehouse School of Medicine, takintobi@msm.edu

Donoria Evans, Morehouse School of Medicine, devans@msm.edu

Nastassia Laster, Morehouse School of Medicine, nlaster@msm.edu

Ijeoma Azonobi, Centers for Disease Control and Prevention, iazonobi@msm.edu

Marcus Dumas, Georgia Perimeter College, marcus.dumas@gpc.edu

Debran Jacobs, Morehouse School of Medicine, djacobs@msm.edu

William Moore, ICF Macro, wmoore@icfi.com

Lailaa Ragins, Morehouse School of Medicine, lragins@msm.edu

Abstract:
Broadening the expertise and diversity of the evaluation workforce is a priority for the public health graduate education program at Morehouse School of Medicine. Unique aspects of the program evaluation course include service-learning, community-based participatory evaluation projects that focus on student capacity and skill building; community-focused participatory principles and practice; evaluation tailoring for special populations and underserved communities; and the application of evaluation principles to broaden culturally responsive and appropriate approaches in practice and in the literature. Associated course participation outcomes include the application of evaluation skills in course projects and practicum experiences, peer-to-peer technical assistance, facilitation of evaluation in extramural projects, and participation in trainings and workshops sponsored by evaluation organizations. Students contributed to additional organizational evaluations for a United States Army base's Child and Youth Services Division, Morehouse College's Public Health Sciences Institute, and the 2010 Project Imhotep Internship. This paper discusses the program evaluation course methodology, student engagement, evaluation capacity processes and outcomes, and implications for workforce development.