|
Lessons Learned From Planning and Conducting Site Visits
|
| Presenter(s):
|
| Mika Yamashita, Academy for Educational Development, myamashita@aed.org
|
| Abstract:
Site visits are a common feature of many program evaluations, yet relatively little has been written about them. This paper presents our experiences and lessons learned from planning and implementing site visits for several formative evaluation projects. The projects we evaluated were education support programs provided by intermediary organizations and schools. We discuss the strategies we used to establish shared norms and expectations within a site visit team and to communicate more effectively with sites about data collection requests. We also present our post-site-visit reporting strategies. Building upon Lawrenz and colleagues' work (2002, 2003), the paper aims to expand our understanding of the strategies and steps evaluators can take to improve site visit planning and implementation.
|
|
A Recipe for Success: Lessons Learned for Using Qualitative Methods Across Project Teams
|
| Presenter(s):
|
| Nicole Leacock, Washington University in St. Louis, nleacock@wustl.edu
|
| Stephanie Herbers, Washington University in St. Louis, sherbers@wustl.edu
|
| Nancy Mueller, Washington University in St. Louis, nmueller@wustl.edu
|
| Virginia Houmes, Washington University in St. Louis, vhoumes@wustl.edu
|
| Lana Wald, Washington University in St. Louis, lwald@wustl.edu
|
| Eric Ndichu, Washington University in St. Louis, endichu@gwbmail.wustl.edu
|
| Abstract:
Collecting qualitative data adds important context to an evaluation, but it also requires an investment of staff time, skills, and other resources. Evaluation centers must manage their staff and resources strategically in order to implement qualitative data collection and analysis efficiently while maintaining quality. This is especially true when several project teams are conducting multiple evaluations with different needs, goals, and timelines. Drawing on our experience as an evaluation center that consistently employs qualitative methods as part of a mixed-methods approach, this presentation will outline the qualitative data collection and analysis processes used across multiple projects within our center. We will also present lessons learned about building project team capacity for data collection, analysis, and dissemination of results.
|
|
Applications of Credibility Techniques to Promote Trustworthiness of Findings in a Qualitative Program Evaluation: A Demonstration
|
| Presenter(s):
|
| John Hitchcock, Ohio University, hitchcoc@ohio.edu
|
| Jerry Johnson, Ohio University, jerry.johnson.ohiou@gmail.com
|
| Bonnie Prince, Ohio University, bonnielprince@aol.com
|
| Abstract:
The purpose of this paper is to demonstrate how a series of credibility techniques was applied to a qualitatively oriented program evaluation. The intervention of interest was designed to promote culturally aware and responsive pedagogy among K-12 teacher candidates pursuing degrees in higher education, and the authors serve as external program evaluators. The evaluation objectives were to offer a series of formative inputs and then transition into summative findings. The nature and scope of program delivery suggested that a qualitative case-study design would yield the most informative findings. To address potential researcher (i.e., evaluator) bias as a validity threat, a series of qualitative techniques was employed to strengthen the design, including triangulation, member checks, negative-case analysis and sampling, and referential adequacy, among others. Because these techniques are established in the literature, the focus will be on demonstrating their application to promote evaluation quality.
|