Evaluation 2008


Contact emails are provided for one-to-one contact only and may not be used for mass emailing or group solicitations.

Session Title: Evaluation Practice and Policy in Social Service Systems
Multipaper Session 266 to be held in Mineral Hall Section A on Thursday, Nov 6, 10:55 AM to 12:25 PM
Sponsored by the Social Work TIG
Chair(s):
Brian Pagkos,  Community Connections of New York,  pagkos@hotmail.com
Increasing Engagement in the Evaluation of Social Service Interventions: Incorporating Empowerment Evaluation, Process Evaluation and Process Use in Design
Presenter(s):
Carol Lewis,  University of Texas,  carolmlewis@gmail.com
Megan Scarborough,  University of Texas,  megan@mail.utexas.edu
Amy Pierce,  LifeWorks Teen Parent Services,  amy.pierce@lifeworksweb.net
Robin Rosell,  People's Community Clinic,  robinr@austinpcc.org
Peg Gavin,  LifeWorks,  peg.gavin@lifeworksweb.org
Abstract: The application of empowerment evaluation (Fetterman, 2004), process evaluation (Weiss, 1998), and process use (Patton, 1987) is examined in the evaluation of social service interventions. The case in point is the LifeWorks Adolescent Family Life Demonstration Project, a community-based partnership that provides intensive case management to pregnant and parenting adolescents. Recognizing the need to balance practicality and design, researchers took an empowerment approach, making the evaluation a collaborative effort and including case managers in data collection. The research design and other successful strategies for engaging staff in the evaluation are outlined. Process use outcomes, particularly those associated with the program’s mental health component, are discussed, as are limitations. This is a practical and promising approach for evaluators of social service interventions, particularly interventions that rely on a patchwork of community-based partnerships and funding streams and that serve vulnerable, highly transient populations.
An Evaluation of Applicant Selection for Master of Social Work Scholarships Among Staff at New York City’s Children’s Services
Presenter(s):
Bonnie McCoy Williams,  New York City Children's Services,  bonnie.mccoy-williams@dfa.state.ny.us
Henry Ilian,  New York City Children's Services,  henry.ilian@dfa.state.ny.us
Heide Gersh Rosner,  New York City Children's Services,  heide-gersh@dfa.state.ny.us
Abstract: In 2007, Children’s Services, New York City’s child welfare agency, adopted new procedures to improve its selection process for awarding scholarships to agency staff for graduate study leading to the MSW degree. These included formal training for raters of applicant essays and workshops for prospective applicants. Because the writing quality of applicant essays was often poor, which limited the number of scholarships that could be awarded, the bulk of the workshops was devoted to essay writing. This evaluation compares years with and without formal rater training and applicant workshops. The expectations were that rater training would make ratings more consistent, with fewer exceptionally lenient and exceptionally strict raters, and that applicant workshops would produce better-quality essays, as evidenced by higher essay scores. The evaluation uses Rasch measurement to test these assumptions.
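(The abstract does not specify which Rasch formulation is used; as a rough sketch, one common choice for rater-mediated essay scores is a many-facet Rasch model, in which rater severity is estimated alongside essay quality. The facets shown here are illustrative assumptions, not the authors' stated model:

\log\!\left(\frac{P_{njk}}{P_{nj(k-1)}}\right) = B_n - C_j - F_k

where P_{njk} is the probability that rater j awards applicant n's essay rating category k, B_n is the essay quality of applicant n, C_j is the severity of rater j, and F_k is the difficulty of the step from category k-1 to k. Under such a model, "more consistent ratings" would appear as a narrower spread of the C_j estimates in the years with rater training.)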
Preventing Child Abuse and Neglect in Kentucky: A Longitudinal Research Evaluation
Presenter(s):
Ramona Stone,  University of Louisville,  ramona.stone@louisville.edu
Gerard Barber,  University of Louisville,  gmbarb01@gwise.louisville.edu
Abstract: This paper compares and contrasts two groups of clients served through the Community Collaboration for Children program in Kentucky between July 2006 and June 2008: one group received Intensive In-Home services, with the goal of keeping children with their parents, and the other received Supervised Visitation services, with the goal of reunification. Socio-demographic information, family functioning, child outcomes, and program participation are collected quarterly for each family active in the program. Family functioning is measured using a modified version of the North Carolina Family Assessment Scale; administrative data will be merged with our datasets by social security number. In addition to traditional statistical techniques, data collected at intake, quarterly while active, and at closure will be analyzed using longitudinal methods (individual growth models) to explain the differences that program participation makes in client outcomes while controlling for family characteristics and the effect of time.
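(The abstract names individual growth models but does not give their form; a minimal two-level sketch, assuming a linear time trend and a group indicator for service type — both assumptions for illustration — is:

Level 1:  Y_{ti} = \pi_{0i} + \pi_{1i}\,\mathrm{TIME}_{ti} + \varepsilon_{ti}
Level 2:  \pi_{0i} = \gamma_{00} + \gamma_{01}\,\mathrm{GROUP}_i + \gamma_{02}\,X_i + \zeta_{0i}
          \pi_{1i} = \gamma_{10} + \gamma_{11}\,\mathrm{GROUP}_i + \zeta_{1i}

where Y_{ti} is a family functioning or child outcome for family i at assessment t (intake, quarterly, closure), GROUP_i distinguishes Intensive In-Home from Supervised Visitation, and X_i stands for family characteristics. In this sketch, \gamma_{11} captures the difference in outcome trajectories attributable to service type while controlling for family characteristics and time.)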
Educating Evaluators: Strategies for Blending the Social Work Curriculum
Presenter(s):
Maureen Rubin,  University of Texas San Antonio,  maureen.rubin@utsa.edu
Jolyn Mikow,  University of Texas San Antonio,  jolyn.mikow@utsa.edu
Goutham Menon,  University of Texas San Antonio,  goutham.menon@utsa.edu
Abstract: Graduate students trained in social work face multiple roles and responsibilities as they gain experience in the field after earning their master’s degree. With this in mind, and given the ever-growing need for accountability in the social service field, a newly accredited master’s program in Texas has developed a curriculum that blends field placement with a research protocol that students develop and implement during their course of study. Through this blend of research and field placement, students are exposed to real-life situations in the social service field and are trained to think about evaluation research at every step of their educational, clinical, and administrative roles. This paper lays out the process by which field agencies were involved in developing the program evaluation course and highlights what has worked in setting up this educational collaboration.
