
Session Title: Deliverables as a Tool to Promote and Support Organizational Learning: Client-centered Strategies for Data Collection and Reporting
Panel Session 676 to be held in Schaefer Room on Friday, November 9, 4:30 PM to 6:00 PM
Sponsored by the Organizational Learning and Evaluation Capacity Building TIG
Chair(s):
Debbie Zorn,  University of Cincinnati,  debbie.zorn@uc.edu
Abstract: Every program evaluation is expected to have some kind of deliverable. Yet, why write a technical report that ends up on someone's bookshelf rather than being used to make a meaningful contribution to organizational learning and program improvement? How do we as evaluators meet the accountability and reporting needs of our clients while also ensuring that information provided is usable and appropriate for its intended audience? This panel will discuss participatory, collaborative approaches to the planning and design of project deliverables used by the University of Cincinnati Evaluation Services Center (UCESC) that take into account clients' needs for information, accountability, learning, and dissemination. The panelists will share the processes they used in negotiating a design for deliverables that met the unique program context and constraints of the five different projects represented by this group and describe how these approaches contributed to program and organizational learning.
Old Habits Die Hard: Introducing New Approaches to an Established Client
Imelda Castañeda-Emenaker,  University of Cincinnati,  castania@ucmail.uc.edu
Imelda Castañeda-Emenaker will discuss the benefits and challenges of having an established relationship with a client. While an established relationship brings continued opportunities for evaluation work, it can also be complicated by old habits and expectations for how evaluation data are collected and reported. Clients get used to a certain way of doing things and bring these habits of mind to successive projects, so attempts to introduce new approaches are often met with resistance. One such example is the use of capacity-building as a focus for data collection and reporting. She will explain how this focus was introduced in a statewide, multi-site educational intervention project and describe how evaluators worked with project staff to embed evaluation activities into their daily operations and move to a reporting process that emphasized continuous improvement rather than summative review.
A Complex Balancing Act: Reporting Across Multiple Years, Sites, and Program Models for Statewide Professional Development in Literacy Instruction
Janice Noga,  Pathfinder Evaluation and Consulting,  jan.noga@stanfordalumni.org
Jan Noga will address the challenges of meeting the diverse information needs of Ohio's State Institutes for Reading Instruction (SIRI). To improve the quality of classroom reading instruction, the Ohio Department of Education (ODE) developed SIRI to provide professional development in reading instruction for classroom teachers. Since its inception in 1999, SIRI has served an estimated 45,000 teachers. Ms. Noga will describe how evaluators worked with ODE staff to design an integrated approach to data collection and reporting that used a cyclical feedback system to continually inform process while also assessing success in attaining expected outcomes. She will describe how a flexible, evolving design that included formative review and frequent reporting allowed evaluators to assess the need for and effectiveness of mid-course corrections as the SIRI design evolved, as well as to provide a subsequent summative assessment of the effectiveness and impact of SIRI overall.
Using Professional Development Standards as a Foundation for Program Evaluation and Program Improvement
Stacey Farber,  University of Cincinnati,  stacey.farber@cchmc.org
Stacey Farber will discuss how national standards for quality professional development and theory on the relationship between teacher training and student learning were used as a foundation for evaluation and continuous improvement of the Ohio Writing Institute Network for Success (OhioWINS), a state-supported, multi-site, professional development program for K-12 teachers of writing. She will describe how the National Staff Development Council (NSDC) Standards for Staff Development (2001) were used as a framework to evaluate the quality of OhioWINS and to provide research-based recommendations for program improvement to policy makers and program implementers. She will also illustrate how the Guskey and Sparks model of the relationship between professional development and student learning was adapted into a program and evaluation logic model. This model was then used to better conceptualize the goals of the program and enhance the design of the evaluation itself for future years.
Community Based Weed and Seed Projects: Using Progress Reports to Promote Continuous Improvement and Improve Project Sustainability
Nancy Rogers,  University of Cincinnati,  rogersne@ucmail.uc.edu
Nancy Rogers will discuss the value of evaluation progress reports as a useful tool for improving data collection activities, discussing continuous improvement processes, and guiding strategic planning discussions when working with loosely organized community members and organizations that volunteer their time and resources to the Weed and Seed project. She will explain how working collaboratively to complete the evaluation progress report reveals gaps in program planning and challenges with data collection. These gaps and challenges are examined and improvements are planned. She will explain how regular reference to these reports at quarterly meetings contributes to committee focus on goals and increased interest in data collection activities for demonstrating changes in the community. Finally, she will describe how the Weed and Seed Steering Committee has benefited from discussions that result from regular review of evaluation progress reports and has consequently focused on developing resources for program sustainability as a project goal.
Building the Educational Community into a Multi-Methods Evaluation of the Cincinnati Art Museum's School Program
Jan Matulis,  University of Cincinnati,  matulij@ucmail.uc.edu
Jan Matulis will discuss the importance of building the local educational community into a multi-methods evaluation of the Cincinnati Art Museum's school program through the Success Project. She will also discuss how educational community members were involved in the planning and implementation of this evaluation, leading to a system of deliverables focused on the Art Museum's standards-based school programs. The benefits and challenges of involving a wide range of both educational community members and data collection methods in this evaluation of an informal education provider will also be discussed. As a result of these efforts, an evaluation framework has been developed that monitors and informs the Art Museum's progress toward providing programs that meet the standards-based curriculum needs of schools in the region and toward increasing awareness of, and participation in, the Museum's school programs.