
Session Title: Collaborative Evaluations: Successes, Challenges, and Lessons Learned
Multipaper Session 228 to be held in La Jolla on Thursday, Nov 3, 8:00 AM to 9:30 AM
Sponsored by the Collaborative, Participatory & Empowerment Evaluation TIG
Chair(s):
Liliana Rodriguez-Campos, University of South Florida, liliana@usf.edu
Engaging the Community in Education and Evaluation: Using Collaborative Evaluation to Facilitate Community Member Focus Groups
Presenter(s):
Aarti Bellara, University of South Florida, abellara@usf.edu
Liliana Rodriguez-Campos, University of South Florida, liliana@usf.edu
Michael Berson, University of South Florida, berson@usf.edu
Abstract: In an effort to assess whether the views of the community were aligned with the goals of a large county-wide school program on civic engagement, community member focus groups were conducted. The various views within the community and the role the community plays in school initiatives are important considerations when implementing school-based programs, and the chances of a program being sustainable and successful are often increased when the ideals of the community and the program are aligned. To successfully convene the focus groups and meet this evaluation goal, a collaborative evaluation approach was used. This paper addresses how the focus groups informed the evaluation, the lessons learned in organizing and conducting focus groups, and the perceived impact school programs have on the surrounding communities.
Avoiding the Dusty Shelf: Promoting Use through Stakeholder Collaboration and Multi-Step Dissemination Strategies
Presenter(s):
Katie A Gregory, Michigan State University, katieanngregory@gmail.com
Adrienne E Adams, Michigan State University, adamsadr@gmail.com
Deborah McPeek, Turning Point Inc, dmcpeek@turningpointmacomb.org
Nkiru A Nnawulezi, Michigan State University, nkirunnawulezi@gmail.com
Chris M Sullivan, Michigan State University, sulliv22@msu.edu
Deb Bybee, Michigan State University, deborah.bybee@gmail.com
Echo A Rivera, Michigan State University, echorivera@gmail.com
Katherine A Cloutier, Michigan State University, kcloutier28@gmail.com
Abstract: Many evaluators have demonstrated the utility of a collaborative process when working with stakeholders to develop evaluations that best fit the organization's needs and improve current practices. This paper describes a case study of collaboration between a university-based research team and a non-profit organization interested in examining the extent to which the agency's practices reflected its empowerment philosophy. Adhering to the basic tenets of collaborative evaluation, we conducted a study with a domestic violence shelter that focused on defining and measuring empowering practice. Using a utilization-evaluation approach, we sought to answer the following question: "Are we doing what we think we're doing, and is it making a difference?" Agency stakeholders and evaluators worked together to articulate the program theory and study design, and to interpret and disseminate findings to agency staff. This presentation will describe the collaborative process and methods used to plan for and promote the use of the evaluation findings.
Is the Cost of That Program Worth It? Teachers' Perceived Worth of a New Teaching Program in an Effective Charter School Through the Use of Collaborative Evaluation
Presenter(s):
Claudia Guerere, University of South Florida, cguerere@mail.usf.edu
Janet Mahowski, University of South Florida, mahowskiJ@pcsb.org
Paige James, University of South Florida, paigejames80@yahoo.com
Abstract: Implementing any new program incurs costs, and deciding to spend money on a new teaching program could mean less money for students. This study used collaborative evaluation to assess teachers' perceived effectiveness of a reading program in a Florida school. Though the program was already available to teachers through the district, this was the first time it was implemented in-house. Emphasis was placed on determining whether the teaching strategies presented were new to the participants, thereby establishing the program's worth. A survey was developed and administered to assess how useful the teachers found the training session. Results from the survey led to two focus groups that gathered specific details about the program's perceived effectiveness, teachers' predictions of future use of the strategies in their classrooms, and barriers to implementation. Results demonstrated that teachers were already familiar with many of the strategies and that, if the program were kept, a more advanced class would be beneficial.
Working Collaboratively With Multiple Partners to Determine Whether Interventions can Lead to Reduction in Health Disparities
Presenter(s):
Liz Maker, Alameda County Public Health Department, liz.maker@acgov.org
Mia Luluquisen, Alameda County Public Health Department, mia.luluquisen@acgov.org
Abstract: Evaluators at the Alameda County Public Health Department (ACPHD) have developed a participatory evaluation model for working with community partners to determine whether their interventions are reducing health disparities. We developed this model through our eight-year involvement with an initiative to reduce violence and improve community well-being by addressing social determinants of health in two Oakland neighborhoods. Our evaluation role began when we helped program planners determine intermediate outcomes (such as improved relationships among neighbors or community involvement) along the pathway toward reduction in health disparities. To complement secondary data sources, we designed qualitative and quantitative data collection tools to assess neighborhood conditions and priorities for change at three time points. A key role has been working closely with diverse stakeholders, including neighborhood residents, program staff, and government officials, to track how the interventions are changing individuals and neighborhoods, and how these changes can lead to an eventual reduction in health inequities.
Valuing Collaboration from the Bottom-Up: A Formative Evaluation of Science Inquiry in Middle School Classrooms
Presenter(s):
Merlande Petit-Bois, University of South Florida, mpetitbo@usf.edu
Teresa Chavez, University of South Florida, chavez@usf.edu
Robert Dedrick, University of South Florida, dedrick@usf.edu
Robert Potter, University of South Florida, potter@cas.usf.edu
Abstract: The Leadership in Middle School Science (LIMSS) project's purpose is to increase inquiry and leadership in middle school science. This paper focuses on the collaborative nature of a formative evaluation of the LIMSS project at all levels, from the participants to the leadership team. Taking an eclectic approach (Fitzpatrick, Sanders, & Worthen, 2004), we evaluated the extent to which this program helped science teachers become more effective in their pedagogical knowledge and as leaders. Data were collected from surveys, focus groups, and Blackboard discussion questions. Multiple perspectives from the teachers, teacher leaders, principal investigators, and evaluators were used to gain an understanding of the program's effectiveness and to identify areas for improvement. Using feedback from all levels regarding the program's needs has been valuable in improving the LIMSS project. Fitzpatrick, J. L., Sanders, J. R., & Worthen, B. R. (2004). Program evaluation: Alternative approaches and practical guidelines (3rd ed.). Boston: Pearson Education.