Session Title: Advancing Equity in Science, Technology, Engineering and Mathematics (STEM) Education Program Evaluation: A "Tapas" of Field Experiences and Lessons Learned

Panel Session 227 to be held in Suwannee 16 on Thursday, Nov 12, 9:15 AM to 10:45 AM

Sponsored by the Research on Evaluation TIG

Chair(s):
Jennifer Greene, University of Illinois at Urbana-Champaign, jcgreene@illinois.edu

Abstract:
This panel offers a selected menu of "tapas" - small, tasty dishes prepared to delight the palate - from an ongoing research project on evaluation theory and practice. The research project is pursuing an "educative, values-engaged" (EVEN) approach to evaluating science, technology, engineering, and mathematics (STEM) education programs. Distinctively, the EVEN evaluation approach is anchored in a fundamental commitment to equity in STEM opportunity and accomplishment. Our recent work has involved field testing EVEN concepts and practice guidelines in various STEM education contexts, including middle school and high school mathematics and nanotechnology programs for high school students. The presentations in this panel sample the issues, challenges, and insights encountered in our field testing, including issues of evaluator authority, the meanings of "being educative" and the contributions of program theory to this purpose, the complementary ideas of school-based evaluation, and the central role of recruitment in diversity-oriented STEM education programs.

Who Made You God?

Jori Hall, University of Georgia, jorihall@uga.edu
Jeehae Ahn, University of Illinois at Urbana-Champaign, jahn1@illinois.edu

The premise of "values-engagement" in the EVEN approach to STEM evaluation raises many interesting questions and implications for the role of the evaluator. Prescribing or advocating certain values through evaluation could be viewed as evaluators asserting god-like authority, invoking such questions as, "Who made you God?" Taking on this question, this presentation explores key issues surrounding the EVEN evaluator's role in engaging values to advance equity in STEM educational program evaluation. These include: the EVEN position of prescribing and promoting a particular set of values; the rationale for this stance; and the challenges and contradictions inherent in it. Examples drawn from field experiences further illustrate these issues, including a specific initiative taken purposefully by the EVEN evaluators to enact the values-engagement ideal in one evaluation, as well as critical reflections on "missed opportunities" for just such engagement in another context.

On Being Educative Through Program Theory

Jeremiah Johnson, University of Illinois at Urbana-Champaign, jjohns62@illinois.edu
Jennifer Greene, University of Illinois at Urbana-Champaign, jcgreene@illinois.edu

The EVEN approach to STEM education evaluation is fundamentally positioned within the educative tradition of evaluation (following in the giant footsteps of Lee Cronbach and Carol Weiss). In the EVEN approach, evaluation serves to facilitate practitioners' (primarily teachers') learning and critical reflection about their own practice. In our field tests of EVEN ideas, we have experimented liberally with program theory as a powerful strategy for fulfilling our educative ambitions. Our work has included eliciting the program theories of administrators, teachers and other program staff, and students; representing these program theories in narrative and schematic forms; and using these theories in dialogic feedback sessions with key stakeholders. Most of our theories have been descriptive rather than causal or normative. This presentation shares examples from our fieldwork and reflects on the value of descriptive program theory for an evaluation practice intended to be educative.

Educative Values-Engaged School-Based Internal Evaluation

Maurice Samuels, University of Illinois at Urbana-Champaign, msamuels@illinois.edu
Jeremiah Johnson, University of Illinois at Urbana-Champaign, jjohns62@illinois.edu

There are school-based evaluation approaches that attend to equity issues (e.g., Rallis & MacMullen, 2000; Ryan, 2005); however, none makes an explicit value commitment to developing and promoting equitable program policies and practices. The authors discuss how two distinct strands of evaluation can contribute to school improvement. Specifically, they illustrate how the "educative values-engaged" (EVEN) evaluation approach can be used with School-Based Evaluation (SBE) to attend to issues of equity and student achievement. The authors begin with a brief description of EVEN's commitment to equity and the aspects of equity this approach attends to. Next, they examine how SBE has been aimed at improving teaching and student learning. Finally, they discuss the benefits and challenges of implementing an educative values-engaged school-based evaluation in an education setting that is experiencing tension around student achievement with respect to race, class, and gender.

Context in Science, Technology, Engineering and Mathematics (STEM) Education Evaluation: How are Students Recruited?

Amarachuku Enyia, University of Illinois at Urbana-Champaign, aenyia@illinois.edu

This paper explores the role of student recruitment in understanding the effectiveness of STEM educational programs. It rests on the notion that the activities of evaluators help to constitute our understandings of the program being evaluated and, in turn, the very nature of the program itself. It also highlights a commitment to equity in the evaluation context. Recruitment for STEM educational programs is a critical part of determining the beneficiaries of a program: it is essential to understanding who is served and to aligning the program so that it best serves its participants. By approaching recruitment from a culturally and contextually relevant perspective and highlighting its challenges, evaluators are better positioned to make informed analyses of the merits and effectiveness of the program.