Session Title: Mainstreaming Evaluation in Diverse Organizational Contexts
Panel Session 272 to be held in Texas E on Thursday, Nov 11, 10:55 AM to 12:25 PM
Sponsored by the Evaluation Use TIG, the Organizational Learning and Evaluation Capacity Building TIG, and the Research on Evaluation TIG
Chair(s):
Gloria Sweida-DeMania, Claremont Graduate University, gloria.sweida-demania@cgu.edu
Abstract: Jim Sanders defined mainstreaming evaluation as "…making evaluation a part of the work ethic, the culture, and the job responsibilities of stakeholders at all levels of the organization" (2009, p. 1). Organizational efforts toward mainstreaming evaluation represent the ultimate level of evaluation use. The presenters in this panel will describe methods of embedding evaluation in a variety of contexts. Sam Held, program evaluator for the Oak Ridge Institute for Science and Education, will discuss a backward design approach. Ellen Iverson and Randahl Kirkendall, evaluators at Carleton College’s Science Education Resource Center (SERC), will focus on their efforts to mainstream evaluation within their organization and to assist other organizations in doing the same. Rachel Muthoni, evaluator for the Pan African Bean Research Alliance (PABRA), will describe mainstreaming in the agricultural development context. Amy Gullickson will discuss her dissertation research, presenting on four organizations that exemplify evaluation mainstreaming. Reference: Sanders, J. R. (2009, May). Mainstreaming evaluation. Keynote address at the Fourteenth Annual Michigan Association of Evaluators Annual Conference, Lansing, MI.
Mainstreaming Evaluation: Practices and Innovations
Amy Gullickson, Western Michigan University, amy.m.gullickson@wmich.edu
Amy Gullickson conducted her dissertation research on National Science Foundation Advanced Technological Education Centers that are mainstreaming evaluation. Four centers were chosen for site visits based on survey responses, input from the NSF Program Officers, and initial interviews with the Principal Investigators. Interviews with staff and key stakeholders from each center focused on evaluation practices, innovations, and the resulting benefits. Ms. Gullickson also explored the organizational characteristics and personnel traits that made mainstreaming initially possible and ultimately sustainable. In this panel presentation, she will discuss the literature basis for mainstreaming evaluation and present her research findings.
Mainstreaming Evaluation Into Faculty Professional Development Programs
Randahl Kirkendall, Carleton College, rkirkend@carleton.edu
Ellen R Iverson, Carleton College, eiverson@carleton.edu
Drawing from their experiences working at Carleton College’s Science Education Resource Center (SERC), evaluators Ellen Iverson and Randahl Kirkendall will demonstrate the utility of integrating evaluation components into an organization’s operations. This session will outline strategies that evaluators can use to work with website designers and workshop facilitators, specifically for incorporating evaluation activities into the workflow of professional development and education programs. SERC supports educators across a broad range of disciplines and at all educational levels through web resources and workshops, funded primarily through National Science Foundation grants. This comprehensive, dynamic mixed-methods approach to combining evaluation and operations has kept SERC current, efficient, and responsive.
Mainstreaming Evaluation in Agricultural Research and Development
Rachel Muthoni, International Center for Tropical Agriculture, r.muthoni@cgiar.org
In her work for the Pan African Bean Research Alliance (PABRA), Rachel Muthoni promotes the use of evaluation in agricultural research to improve the management of agricultural programs toward the goal of reducing poverty. PABRA implements a dynamic mix of monitoring and evaluation systems across institutional levels, following the impact pathway from target beneficiary to the network and alliance. This systematic project management system includes internal and external evaluation processes, multi-stakeholder platforms, and participatory evaluation approaches. The evaluation framework also provides an approach for working with partners and managing the complexity that characterizes the delivery of benefits needed to alleviate poverty. Muthoni’s presentation will showcase the complexity of evaluation utilization in this context, focus on results from applying evaluation at the individual and project levels, and identify the major factors guiding mainstreaming in this agricultural development context.
Working Backwards: Using the Evaluation Report to Write Evaluation Questions
Sam Held, Oak Ridge Institute for Science and Education, sam.held@orau.org
In his capacity as a program evaluator, Sam Held has been mainstreaming evaluation in the U.S. Department of Energy's Office of Workforce Development for Teachers and Scientists. He led an effort that reversed the usual order of working from goals to methods to an evaluation report: his evaluation team instead worked backwards from the evaluation report to better facilitate use and influence. Together with managers from multiple STEM research internship programs, outreach and recruitment programs, and the National Science Bowl©, the team developed an outline of chapters for reports and communication pieces. Each chapter addressed a common theme for every program; these themes were dubbed the Six Critical Indicators of Success (SCIoS). Mr. Held will describe SCIoS and the process of involving users in developing the framework, with special attention to facilitating evaluation use through this interaction. Limitations, implications, and suggestions for future research will be presented.