
Session Title: Dealing With Technical Challenges in Mixed Methods Evaluation
Multipaper Session 230 to be held in Texas C on Thursday, Nov 11, 9:15 AM to 10:45 AM
Sponsored by the
Chair(s):
Virginia Dick,  University of Georgia, vdick@cviog.uga.edu
Discussant(s):
Susan Labin,  Independent Consultant, susan@susanlabin.com
Using Mixed Methods to Understand Evaluation Influence: Challenges and Opportunities
Presenter(s):
Sarah Appleton-Dyer, University of Auckland, sk.appleton@auckland.ac.nz
Janet Clinton, University of Auckland, j.clinton@auckland.ac.nz
Rob McNeill, University of Auckland, r.mcneill@auckland.ac.nz
Abstract: Mixed methods is receiving increased attention in the literature, with much discussion surrounding the capacity to mix paradigms and the recognition of pragmatism as an alternative paradigm. The literature also acknowledges our need to understand more about mixed methods in practice. For example, there is no one accepted framework to guide data analysis and integration. While this offers many benefits and opportunities, it also presents some challenges. This paper seeks to contribute to an understanding of some of these challenges and opportunities by presenting a mixed methods study that aims to understand evaluation influence within population health partnerships in New Zealand. The study is part of a doctoral thesis, and the paper will draw on a range of experiences to highlight the theoretical and practical challenges and opportunities experienced so far. Specifically, the paper will identify challenges and opportunities relating to study development, design, implementation, and the initial stages of analysis.
A Mixed Methods Toolkit for Evaluating Translational Science Education Programs
Presenter(s):
Julie Rainwater, University of California, Davis, julie.rainwater@ucdmc.ucdavis.edu
Stuart Henderson, University of California, Davis, stuart.henderson@ucdmc.ucdavis.edu
Abstract: Translational research and the training of translational researchers have generated significant attention in the past ten years. This attention has led to the emergence of a number of training programs specifically devoted to recruiting and training translational researchers. Translational training programs' emphasis on team science, focus on interdisciplinary research, and acceptance of a range of career trajectories all present unique challenges to program evaluation and the development of evaluation metrics. This presentation describes the evaluation of pre-doctoral and postdoctoral Clinical & Translational Science training programs and the Howard Hughes Medical Institute's Integrating Medicine into Basic Science training program at the University of California, Davis. We will share the mixed methods evaluation toolkit that we developed for these programs and describe some of the challenges of evaluating translational training programs.
The Tale of Two Mixed Methods Projects
Presenter(s):
Jori Hall, University of Georgia, jorihall@uga.edu
Katherine Ryan, University of Illinois at Urbana-Champaign, k-ryan6@illinois.edu
Abstract: While current and historical definitions of evaluation quality (House, 1980) provide an important foundation, we believe quality is best understood in terms of the inquiry process itself. Making aspects of the inquiry process transparent enables judgments about how suitable an approach is in relation to the purpose of the inquiry and the context within which it is embedded. With this in mind, this presentation explores the quality of two mixed methods projects by examining different approaches: a project conducted by a sole researcher in one context and a collaborative mixed methods evaluation in another. Using these projects, we further consider mixed methods decision points during the evaluation design and implementation process. These include the sequence of data collection; the priority given to qualitative and quantitative data collection and analysis; and integration, or where the mixing occurred (Creswell, 2003).