|
Framing an Evaluation of a Nonprofit Community Based Organization: A Comparison of Two Delphi Studies
|
| Presenter(s):
|
| Monica Geist,
University of Northern Colorado,
monicageist@comcast.net
|
| Abstract:
Among the many challenges of evaluating nonprofit community-based organizations, two can be lessened by using the Delphi method. The first is working within small evaluation budgets. The second is bringing stakeholders together to discuss and reach agreement on the organization's goals and concerns. The Delphi method is a set of iterative questionnaires, with controlled feedback, that allows participants to give their opinions and interact with other participants without having to attend a face-to-face meeting. By using the Delphi method, evaluators can frame the evaluation before the first site visit. This paper compares two versions of the Delphi method administered to a nonprofit organization that runs a two-year self-sufficiency program for teen mothers: a paper-and-pencil version administered via postal mail, and a real-time computer version administered via the web.
|
|
Resistance to Learning From Evaluation in the Context of Non-profit Organizations
|
| Presenter(s):
|
| Luba Botcheva,
The Children's Health Council,
lbotcheva@chconline.org
|
| Julie Slay,
The Children's Health Council,
jslay@chconline.org
|
| Lynne Huffman,
The Children's Health Council,
lhuffman@chconline.org
|
| Abstract:
The presentation will focus on non-profit organizations and will address resistance to learning from evaluation at three levels: service providers, program, and organization. Learning is conceptualized as a form of change that can be resisted as often as it is embraced. This is especially true in the context of evaluation, where "learners," those who are expected to use evaluation findings and consider changing, may not be motivated to learn and may be forced to change by external factors. Learners may regard knowledge as threatening; their own perceived level of competence may be challenged by new information. How can evaluators create an environment that facilitates learning and that provides a safe opportunity to change?
Several case studies of summative and formative evaluation will be presented, drawn from our experience as internal and external evaluators for non-profit organizations. Various techniques that we used to work with resistance to learning will be discussed.
|
|
Development and Use of a Comprehensive Measure of Nonprofit Organizational Capacity
|
| Presenter(s):
|
| Sheridan Green,
JVA Consulting LLC,
sheridan@jvaconsulting.com
|
| Robin Leake,
JVA Consulting LLC,
robin@jvaconsulting.com
|
| Veronica Gardner,
JVA Consulting LLC,
v@veronicagardner.com
|
| Abstract:
JVA Consulting, LLC developed a capacity-building program for community- and faith-based organizations with funding from the U.S. Department of Health and Human Services. The program was designed to increase organizational effectiveness in many areas of capacity, including leadership, organizational development, programming, funding, and community engagement. The study purposes were to validate an assessment tool and to evaluate organizations' capacity gains. Research questions included: 1) Were there significant gains in organizational effectiveness? and 2) Were there differences in gains based on the type or amount of capacity building provided? Data from the first three years indicate that organizations demonstrated significant gains in organizational effectiveness. Organizations receiving individualized technical assistance showed significantly greater gains than those attending workshops only. Results revealed that the amount of technical assistance was a significant predictor of gains achieved by participants. Cronbach's alpha for the instrument was .97. These results and current study analyses will be presented.
|
|
Evaluating Technical Assistance Services Provided to Grantees of Federal Agencies: Approaches of and Lessons From the MayaTech Model
|
| Presenter(s):
|
| Kimberly Jeffries Leonard,
The MayaTech Corporation,
kjleonard@mayatech.com
|
| Mesfin S Mulatu,
The MayaTech Corporation,
mmulatu@mayatech.com
|
| James Bridgers,
The MayaTech Corporation,
jbridgers@mayatech.com
|
| Darren Fulmore,
The MayaTech Corporation,
dfulmore@mayatech.com
|
| Wilhelmena Lee-Ougo,
The MayaTech Corporation,
wlee-ougo@mayatech.com
|
| Abstract:
Technical assistance (TA) contracts are increasingly becoming critical mechanisms through which federal agencies accomplish their objectives efficiently and cost-effectively. Evaluation of these services is important to establish effectiveness and contractual accountability. The MayaTech Corporation has developed a model, the Targeted Evaluation Assistance Model (TEAM), to evaluate its technical assistance services to grantees of federal agencies. TEAM is a multimethod, multistage evaluation model that approaches TA as a tripartite (federal agency, grantees, and contractors/subcontractors) interactive process. Evaluation takes place in three stages of the TA process: 1) identification and prioritization of TA needs; 2) design and delivery of TA; and 3) output and impact of TA. Multiple electronic tracking systems are used to record interactions between the parties; qualitative and quantitative approaches are used to collect pertinent data from multiple sources appropriate for each stage of TA. Lessons learned from MayaTech's TA contract with SAMHSA's Division of Systems Improvement will be presented.
|