|
Evaluating Individually-tailored Services: A Proposed Strategy
|
| Presenter(s):
|
| Roger Boothroyd,
University of South Florida,
boothroy@fmhi.usf.edu
|
| Steven Banks,
University of South Florida,
tbosteve@aol.com
|
| Abstract:
Evaluators often encounter and report on the problems associated with heterogeneity among clients needing services, the services provided in response to their needs, and the potential outcomes resulting from these services (Goldenberg, 1978; Gordon, Powell, & Rockwood, 1999). A common response of evaluators is to identify and use multiple measures that span the range of variability in clients, services, and outcomes. Multivariate analyses or latent class models are often performed on the full set of outcome measures. The primary problem with this strategy is that many recipients are assessed on outcomes that their service plans were never designed to influence. In this presentation we will describe several approaches that have been developed and used to deal with this issue. Additionally, we will describe a procedure we developed, the maximum individualized change score method, summarize a simulation supporting the value of its use, and highlight the contexts in which this method has strengths over other frequently used approaches.
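The idea behind the method can be illustrated with a minimal sketch. This is not the authors' published algorithm; it assumes a reading of "maximum individualized change score" in which each client is scored only on the outcomes their own service plan targeted, taking the largest standardized pre-post change as that client's summary. All names, clients, and values below are hypothetical.

```python
# Illustrative sketch only: score each client on just the outcomes
# their service plan was designed to influence, then summarize with
# the maximum standardized change. Untargeted outcomes are excluded,
# addressing the problem of assessing recipients on measures their
# plans never aimed at.

def max_individualized_change(pre, post, targeted, sd):
    """pre/post: dicts mapping outcome name -> score for one client.
    targeted: set of outcome names this client's plan targets (assumed input).
    sd: dict mapping outcome name -> pooled SD used to standardize change."""
    changes = [(post[o] - pre[o]) / sd[o] for o in targeted]
    return max(changes)

# Hypothetical client whose plan targets only employment and housing;
# the untargeted "symptoms" outcome does not enter the score.
pre  = {"employment": 2.0, "housing": 3.0, "symptoms": 5.0}
post = {"employment": 4.0, "housing": 3.5, "symptoms": 5.0}
sd   = {"employment": 1.0, "housing": 0.5, "symptoms": 2.0}

print(max_individualized_change(pre, post, {"employment", "housing"}, sd))  # 2.0
```

Under this reading, two clients with entirely different targeted outcomes still yield comparable summary scores, which is what allows analysis across a heterogeneous caseload.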
|
|
Evaluating Programs to Reduce Child Abuse and Maltreatment: The Abilene Replication of the Family Connections Program
|
| Presenter(s):
|
| Darryl Jinkerson,
Abilene Christian University,
darryl.jinkerson@coba.acu.edu
|
| David Cory,
New Horizons Family Connections,
dcory@sbcglobal.net
|
| Abstract:
Family Connections is a community-based project created to reduce the incidence of child abuse and maltreatment by providing services that address risk factors and link families to multiple service agencies and community organizations. The services provided help families acknowledge their areas of weakness, embrace their strengths, and act on their own behalf to improve their situations.
The current intervention is a replication of an earlier study, but the evaluation approach is unique to this situation. The approach uses a variety of qualitative and quantitative data collection methods and includes a five-month baseline period. The evaluation design is based on a “dashboard” containing 12 outputs, 8 short-term goals (7 of which are mandated by state contractual requirements), and 6 long-term goals (all of which are mandated by state contractual requirements). The evaluation dashboard is supported by a logic model that displays the program inputs and processes.
|
| |