

Session Title: Evaluating Social Service Programs for Government and Foundations
Multipaper Session 245 to be held in Lone Star D on Thursday, Nov 11, 10:55 AM to 12:25 PM
Sponsored by the Non-profit and Foundations Evaluation TIG
Chair(s):
Beth Stevens,  Mathematica Policy Research, bstevens@mathematica-mpr.com
Foundation Requests for Rigorous Evaluation and the Response of Their Community-based Grantees
Presenter(s):
Beth Stevens, Mathematica Policy Research, bstevens@mathematica-mpr.com
Daniel Finkelstein, Mathematica Policy Research, dfinkelstein@mathematica-mpr.com
Jung Kim, Mathematica Policy Research, jkim@mathematica-mpr.com
Michaella Morzuch, Mathematica Policy Research, mmorzuch@mathematica-mpr.com
Cicely Thomas, Mathematica Policy Research, cthomas@mathematica-mpr.com
Abstract: Foundations, government agencies, and other funders of community-based health and social service programs are increasingly asking grantees to provide evidence that their programs work, or at least are making progress toward their goals. But do such programs have the capacity to provide such evidence? Can such organizations, often underfunded and overburdened, generate the rigorous and credible evidence that is now expected? As part of the evaluation of the Local Funding Partners Program (LFP) of the Robert Wood Johnson Foundation (RWJF), we surveyed the eighty-six community-based social service agency grantees to gather information on their experience in using evaluation to build evidence that their programs work. This paper reports on whether the projects commissioned evaluations, the forms of technical assistance they received to support them, the types of evaluation carried out, and how the evaluation results were used.
Evaluating Programs Funded by Government and Delivered by Nonprofits: A Grounded Model for More Accurate and Useful Evaluation of Contracted Social Services
Presenter(s):
Christopher Horne, University of Tennessee, Chattanooga, christopher-horne@utc.edu
Abstract: To better understand the increasingly common evaluation context of nonprofit social service programs provided under government contract, 30 in-depth interviews with a broad range of government and nonprofit administrators were analyzed, following the grounded theory approach, to identify factors that affect respondents' program outcomes. Analysis revealed that some of the most important factors are specific to purchase-of-service contracting and would not typically be captured in conventional program logic models. The most important of these factors can be categorized as components of either the formal or emergent government-nonprofit relationship. If evaluators are to contribute to the improvement and accountability of contracted social services, we should broaden program models to include these key factors, which may otherwise be overlooked. This paper presents one such model and accompanying recommendations, grounded in data, that evaluators may use to better pursue program improvement, accountability, and social betterment goals in evaluations of contracted social service programs.
Successes and Challenges of a Nonprofit Organization’s Effort to Improve Evaluation Quality by Adopting a Client Information System (CIS)
Presenter(s):
Adrienne Adams, Michigan State University, adamsadr@msu.edu
Nidal Karim, Michigan State University, buenalocura@gmail.com
Sue Coats, Turning Point Inc
Sallay Barrie, Michigan State University, sallayb08@gmail.com
Nkiru Nnawulezi, Michigan State University, nkirunnawulezi@gmail.com
Cris Sullivan, Michigan State University, sulliv22@msu.edu
Katie Gregory, Michigan State University, katieanngregory@gmail.com
Abstract: This paper presents findings from pre-, post-, and 1-year follow-up interviews conducted with the staff of a large non-profit organization that implemented a CIS to improve the quality of the information it collects and uses to meet its extensive internal and external evaluation needs. In this presentation, we will describe staff expectations going into the implementation process and how the outcomes matched or diverged from that vision. We will also discuss the impact of the CIS on how staff do their work and on their perceptions of the utility of the information collected. Finally, we will share the challenges posed by contextual forces such as limited resources and staff turnover, as well as important lessons learned that could benefit other non-profit organizations considering or actively implementing their own CIS.