Session Title: Perspectives on Credible Evidence in Mixed Methods Evaluation Theory and Practice

Panel Session 451, to be held in Avalon A on Thursday, Nov 3, 4:30 PM to 6:00 PM

Sponsored by the Mixed Methods Evaluation TIG

Chair(s):
Mika Yamashita, World Bank, myamashita@worldbank.org

Discussant(s):
Donna Mertens, Gallaudet University, donna.mertens@gallaudet.edu

Abstract:
In the current milieu, scarce resources are accompanied by a greater demand for accountability and credible evidence. Over the past 25 years, the evaluation field has become increasingly pluralistic: evaluators embrace different ideologies, value stances, and methods preferences. Mixed methods, as a mode of inquiry, has played a prominent role in developing theory and practice that consider the multiple perspectives of the evaluation community. Panelists who have published on mixed methods provide insights into maximizing the value and integrity of design options. Johnson provides an overarching umbrella for mixed methods work through the meta-paradigm of dialectical pragmatism. The next two papers take on two of many practical issues: Caracelli addresses qualitative data in evidence review systems, and Collins and Onwuegbuzie discuss maximizing interpretive consistency through sampling strategies in mixed methods studies. Hesse-Biber provides broad-based practical advice for working with mixed methods. Finally, Mertens, as panel discussant, uses a transformative lens to discuss facets of credible evidence.

How Might Dialectical Pragmatism Examine the Issue of Credible Evidence?

R Burke Johnson, University of South Alabama, bjohnson@usouthal.edu

"Dialectical pragmatism" is a meta-paradigm that combines ontological pluralism with a dialectical and purposively value-packed pragmatism. It asks users to examine multiple paradigmatic stances carefully and thoughtfully when considering issues of knowledge, methods, theory, policy, and practice. I will apply this "mixed philosophy" to the broad issue examined by the panel. My application will ask questions such as these: What do different stakeholders mean by credible evidence? What is the role of power in determining what is labeled "credible evidence"? How can the aims of federal/national policy/theory be combined with the aims of local communities and practice? How can multiple political and epistemological standpoints be concurrently considered in the debate over credible evidence? How does one warrant claims in a multi-paradigmatic, multi-disciplinary, multi-standpoint, multi-stakeholder environment? The "answers" surround the use of the age old philosophical approach called dialecticalism combined with practical and ethical thinking.

Credible Evidence in Systematic Review Systems Viewed Through a Mixed-Method Lens

Valerie J Caracelli, United States Government Accountability Office, caracelliv@gao.gov
Leslie Cooksy, University of Delaware, ljcooksy@udel.edu

Over the past decade, several public and private efforts have been launched to summarize available effectiveness research on social interventions and to help managers and policymakers identify and adopt effective practices. Patterned after evidence-based practice models in medicine, these review system initiatives are intended to provide credible evidence on what works. In synthesizing evidence, these review systems complete a meta-evaluation step to judge study quality and primarily include experimental designs for review. The synthesis of qualitative studies is another burgeoning area of interest for systematic reviews. This paper will examine the adequacy of traditional quality review criteria through a mixed methods lens. Drawing on, among other sources, several federally supported evidence review systems discussed by GAO (GAO-10-30), the paper will consider how qualitative data are included, if at all, in such reviews. The potential of qualitative data to illuminate context, address intervention fidelity, and add value to the interpretation of findings will also be addressed.

Establishing Interpretive Consistency When Mixing Approaches: Role of Sampling Designs

Kathleen M T Collins, University of Arkansas, kxc01@uark.edu
Anthony Onwuegbuzie, Sam Houston State University, tonyonwuegbuzie@aol.com

Decisions involved in devising a sampling design (selecting sampling schemes and sample sizes) affect various stages of the mixed research process. Further, sampling decisions bear on five quality criteria inherent in the process of mixing approaches. Representation refers to the degree to which researchers obtain credible data comprising descriptive accounts and numbers. Legitimation refers to the extent to which researchers' conclusions and inferences are trustworthy and transferable. Integration reflects the degree to which researchers' inferences are combined into credible meta-inferences. Politics refers to the extent to which researchers' conclusions and inferences are viewed as trustworthy by stakeholders. Ethics refers to the degree to which those conclusions and inferences reflect an unbiased and socially ethical perspective. In this presentation, we will discuss the concept of Interpretive Consistency, the degree of consistency between researchers' conclusions and inferences and the selected sampling designs, and we will offer strategies for maintaining Interpretive Consistency within a mixed inquiry.

What Counts as Credible Evidence in Mixed Methods Evaluation Research?

Sharlene Hesse-Biber, Boston College, sharlene.hesse-biber@bc.edu

This paper examines what counts as credible evidence in mixed methods evaluation research. Does the use of two methods enhance the overall credibility of mixed methods evidence? Is mixed methods praxis an inherently synergistic evaluation method?
The paper highlights the impact of a researcher's standpoint: the values and attitudes researchers bring to the evaluation process that can determine the questions they ask and the methods, analyses, and interpretations they privilege. The paper provides specific methodological and methods case study examples that demonstrate research strategies for enhancing the credibility claims of mixed methods evaluation research. It examines how evaluation researchers and practitioners can deploy "strong objectivity" and "holistic reflexivity" as validity tools to enhance awareness of the power and authority relations within the evaluation process, as well as ways evaluation projects can tend to "difference" with a commitment to social change and social justice in evaluation outcomes.