
Session Title: Enhancing Organizational Learning With Technology: Implications of Diversity, Improving Response Rates, and Increasing Evaluation Capacity
Multipaper Session 135 to be held in Wekiwa 7 on Wednesday, Nov 11, 4:30 PM to 6:00 PM
Sponsored by the Integrating Technology Into Evaluation TIG
Chair(s):
Rebecca Culyba, Emory University, rculyba@emory.edu
Discussant(s):
Nathan Balasubramanian, Centennial Board of Cooperative Educational Services, nbala@cboces.org
Improving Response Rates for Multiple Populations: Lessons From the Field!
Presenter(s):
Barbara Packer-Muti, Nova Southeastern University, packerb@nova.edu
Jennifer Reeves, Nova Southeastern University, jennreev@nova.edu
Candace Lacey, Nova Southeastern University, lacey@nova.edu
Abstract: Low participation rates are widely recognized in the literature as a factor that can adversely affect the strength of survey research data. This presentation describes the techniques used to improve participation rates in a large, university-wide assessment of institutional engagement. The assessment, conducted annually for the past two years, involves web-based questionnaires with invitations sent to 28,000 students; 4,600 employees, including faculty, staff, and administration; and 50,000 alumni. Discussion focuses on the techniques used to increase response rates, the successes and failures encountered along the way, and three years of response-rate data for each set of constituents.
Using Web-based Technologies to Increase Evaluation Capacity in Organizations Providing Child and Youth Mental Health Services in Ontario
Presenter(s):
Purnima Sundar, Children's Hospital of Eastern Ontario, psundar@cheo.on.ca
Susan Kasprzak, Children's Hospital of Eastern Ontario, skasprzak@cheo.on.ca
Evangeline Danseco, Children's Hospital of Eastern Ontario, edanseco@cheo.on.ca
Tanya Witteveen, Children's Hospital of Eastern Ontario, twitteveen@cheo.on.ca
Heather Woltman, Children's Hospital of Eastern Ontario, hwoltman@cheo.on.ca
Abstract: Given today's climate of economic uncertainty and fiscal restraint, organizations providing child and youth mental health services must do so with limited resources. Within this context, service providers face added pressure to deliver evidence-based programs and demonstrate program effectiveness. The Provincial Centre of Excellence for Child and Youth Mental Health at CHEO works with organizations to meet these demands by building capacity in program evaluation. While personal instruction and mentoring are important ways of providing support, face-to-face consultations are not always cost-effective. In this presentation we describe the use of interactive technology and computer-based learning as an alternative or complement to face-to-face delivery of evaluation information and training. We discuss the process of developing these tools and share findings from our evaluation of their effectiveness in enhancing the evaluation-related supports we offer to providers of child and youth mental health services.
Finding Common Ground: Implications of Diversity in Data Collection in a Multi-Site, Initiative Evaluation
Presenter(s):
Virginia Houmes, Washington University in Saint Louis, vhoumes@wustl.edu
Nancy Mueller, Washington University in Saint Louis, nmueller@wustl.edu
Amy Stringer Hessel, Missouri Foundation for Health, astringerhessel@mffh.org
Cheryl Kelly, Saint Louis University, kellycm@slu.edu
Jessi Erickson, Saint Louis University, ericksjl@slu.edu
Tanya Montgomery, Washington University in Saint Louis, tmontgomery@wustl.edu
Doug Luke, Washington University in Saint Louis, dluke@wustl.edu
Abstract: In 2007, a Missouri health foundation funded a comprehensive evaluation of a multi-site initiative focused on obesity prevention in Missouri. Projects funded through the initiative share common goals, but vary in focus, setting, and capacity. This can create challenges when evaluating progress, including capturing data on initiative goals; detecting broad, cross-site effects; and ensuring valid results. In conducting the initiative evaluation, we identified a core data set to be collected from each of the projects that captures common indicators while allowing for the diversity of the projects. We developed a web-based data collection system that enables centralized submission and access to the core data by grantees, evaluators, and the foundation. We also provided training on the system that was specifically tailored to each project's diverse needs and capacity. From our process, evaluators will learn the strengths and challenges of implementing a web-based data collection system within a multi-site, initiative-level evaluation.