Assessing the Capacity of Non-profit Community-Based Organizations to Conduct Program Evaluation

Presenter(s):

Neil Vincent, DePaul University, nvincen2@depaul.edu

Reginald Richardson, Northwestern University, r-richardson2@northwestern.edu

Abstract:
Community-based non-profit organizations (CBOs) now operate under the expectation that they measure program effectiveness. However, little is known about the capacity of CBOs to conduct program evaluation. This paper presents a mixed-methods study exploring how CBOs in a large metropolitan area conceptualize and implement program evaluation, as well as the barriers they face and the resources needed to improve their efforts. It presents results from survey data collected from 134 CBOs and from in-depth interviews with staff at 15 of these organizations. Implications for program evaluators who consult with CBOs are discussed.


Helping Human Services Programs Succeed: Challenges for the Internal Evaluator

Presenter(s):

Robbie Brunger, Ounce of Prevention Fund of Florida, rbrunger@ounce.org

Abstract:
The classic critique of internal evaluators is that they know more, and care more, about their programs than an external evaluator would; this closeness presents them with two special challenges. The experience of a funding agency for local human services programs in Florida suggests that the first challenge is to improve the likelihood that those programs will succeed. That process begins with the program's design and approach to data collection, and it continues with the need to monitor results and supply program staff with information they can use for program improvement. The second challenge arises when writing an evaluation report: avoiding the perception of 'carrying water' for the program. Best practices that can help ensure a high degree of credibility for reports include an organizational structure that supports independent inquiry and a systematic documentation process that substantiates all statements of fact and conclusions.


Challenges and Successes! Working With an Entire County to Collect Outcomes Evaluation Data From Human Services Programs

Presenter(s):

Peggy Polinsky, Parents Anonymous® Inc., ppolinsky@parentsanonymous.org

Abstract:
Conducting evaluation activities at multiple sites across a large geographical area, and with multiple human services program types, including anger management, adoption services, counseling, home visitation, and parenting, requires an evaluation approach that takes 'context' into account on many levels while keeping evaluation activities consistent across all sites. Parents Anonymous® Inc. is in its fourth year of working with the Riverside County, California, Department of Public Social Services (DPSS) to ensure that providers collect and submit the evaluation data mandated by CAPIT and PSSF funding. This presentation will document the challenges and successes of setting up a web-based evaluation system with geographically distant providers from multiple human services program types, helping providers understand the necessity and value of evaluation data, working with providers and DPSS to determine appropriate outcome measures, and holding frequent discussions with DPSS and providers about data issues and interpretations.


Mandated Data Collection as Catalyst for Program Learning

Presenter(s):

Lois Thiessen Love, Uhlich Children's Advantage Network, lovel@ucanchicago.org

Abstract:
The extensive demands on human service organizations to produce data for funding, regulation, and accreditation requirements are often viewed as program interruptions rather than as aids to program learning and improvement. The evaluator's challenge is to help programs balance these demands with opportunities for learning and improvement. A survey of human service evaluators will provide case examples of the relative success of evaluator strategies for making use of mandated data collection. Drawing on these case examples and a systems-analytic framework, this presentation will propose contextual factors that support program learning from mandated data collection, along with practical strategies for the human service evaluator working with programs and organizations.