
Session Title: Examining the Mixing in Mixed Methods Evaluation
Multipaper Session 390 to be held in Texas C on Thursday, Nov 11, 4:30 PM to 6:00 PM
Chair(s):
Jori Hall,  University of Georgia, jorihall@uga.edu
Discussant(s):
Mika Yamashita,  Academy for Educational Development, myamashita@aed.org
Improving Public Awareness Campaign Evaluation Using Mixed Methods Design
Presenter(s):
Mary Kay Falconer, Ounce of Prevention Fund of Florida, mfalconer@ounce.org
W Douglas Evans, George Washington University, wdevans@gwu.edu
Abstract: This paper documents a triangulation mixed methods design in an evaluation of a statewide campaign to prevent child abuse and neglect. The methods include an online survey using a web-based panel of parents (quantitative) and five parent focus groups (qualitative). The online survey used an experimental design with study participants randomized into campaign and control groups. In the analysis, convergence and divergence in reactions to campaign stimuli (public service announcements and parent resource material) across methods are of interest. In addition, this design relies on data collected in the qualitative method to expand the explanation of the reactions to the campaign stimuli. This mixed methods application is an excellent illustration of how to improve quality in research when evaluating public awareness campaigns.
What’s the Right Mix? Lessons Learned Using a Mixed Methods Evaluation Approach
Presenter(s):
Nicole Leacock, Washington University in St Louis, nleacock@wustl.edu
Virginia Houmes, Washington University in St Louis, vhoumes@wustl.edu
Nancy Mueller, Washington University in St Louis, nmueller@wustl.edu
Gina Banks, Washington University in St Louis, gbanks@wustl.edu
Amy Stringer-Hessel, Missouri Foundation for Health, astringerhessel@mffh.org
Cheryl Kelly, Saint Louis University, kellycm@slu.edu
Abstract: In 2007, the Missouri Foundation for Health funded a comprehensive evaluation of a multi-site obesity prevention initiative across the state. Currently, 35 grantees are implementing physical activity and healthy eating programs. As external evaluators, we developed measures and methods to capture not only the breadth of grantees’ activities but also key details of the implementation process. The evaluation involved a mixed-methods approach, including a web-based quantitative data collection system to capture the breadth of program activities and a series of qualitative interviews to capture details about program context. This approach enabled us to strike a balance between a manageable and an informative evaluation. This presentation will describe the benefits of a mixed-methods approach, how each method contributed to our evaluation, and lessons learned during the evaluation process. It will also present strategies for triangulating quantitative and qualitative data to convey a comprehensive picture of the initiative to key stakeholders.
Triangulation in Evaluation Practice
Presenter(s):
Hongling Sun, University of Illinois at Urbana-Champaign, hsun7@illinois.edu
Nora Gannon, University of Illinois at Urbana-Champaign, ngannon2@illinois.edu
Jennifer Greene, University of Illinois at Urbana-Champaign, jcgreene@illinois.edu
Abstract: Triangulation, which aims to draw stronger inferences through the convergence of data from multiple methods, is the most popular form of and rationale for mixed methods (Fidel, 2008). However, evaluators know that the practice of triangulation rarely results in convergence; more often than not, inconsistencies and even contradictions emerge (Mathison, 1988). How evaluators effectively deal with those inconsistent or contradictory findings, however, is still not clear. In an empirical review of educational evaluation studies, we found that triangulation continues to be the most popular stated purpose of mixed methods, consistent with previous claims. In our review, we focused specifically on how evaluators with a triangulation intent actually attended to contradictory findings. Our results provide a snapshot of current practices of triangulation in mixed methods evaluation. As strong inferences are a critical measure of evaluation quality, understanding how evaluators engage with contradictory findings can improve the practice of triangulation.