|
An Alternative Approach for Narrative Documentation of Extension Programs: Tales of an Internal Evaluator
|
| Presenter(s):
|
| Michael Duttweiler,
Cornell University,
mwd1@cornell.edu
|
| Abstract:
The utility of narrative documentation for program evaluation and accountability, and as a tool for organizational learning, is well established. In particular, story-based approaches have become widely accepted in both evaluation and organization development settings. Yet many organizations struggle to achieve intentional approaches to story collection that are both valued by practitioners and responsive to organizational learning and accountability needs. This paper describes a pilot test, in a statewide extension education system, of the Most Significant Change Technique as described by Davies and Dart (2005). The test revealed organizational culture, unit size and structure, and perceived value relative to existing processes as strong influences on acceptance of and commitment to the approach. Because the pilot test was led by an internal evaluator, the case also provides an opportunity to explore how the evaluator's role shaped the evaluation foci and methods employed.
|
|
Characteristics Associated With Increasing the Response Rates of
Web-based Surveys
|
| Presenter(s):
|
| Thomas Archer,
The Ohio State University,
archer.3@osu.edu
|
| Abstract:
Having a respectable response rate is critical to generalizing the results of any survey, and web-based surveys present their own unique set of issues. This research identified web deployment and questionnaire characteristics associated with increased response rates, based on a systematic evaluation of over 120 web-based surveys administered to a variety of audiences over three and a half years. Fourteen web deployment characteristics and nine web-based questionnaire characteristics were correlated with response rates. The resultant correlations prompted three recommendations: [1] Allow more time between the second and third contacts; [2] Convince potential respondents of the potential benefit of accessing the questionnaire; and [3] Do not be overly concerned about the length or detail of the questionnaire, since getting people to the questionnaire's web site matters more for increasing response rates.
|
|
A Stakeholder Valuation Approach to Evaluating a Program's Public Benefits: The University of Minnesota Extension's Master Gardener Program
|
| Presenter(s):
|
| Tom Bartholomay,
University of Minnesota,
barth020@umn.edu
|
| Abstract:
This presentation will focus on the process used by the University of Minnesota Extension to evaluate the public benefits of its Master Gardener program. With increasing interest in the public benefit of Extension programs, this expert opinion approach is useful not only for identifying a program's public benefits, but also for assessing the degree to which those benefits are valued across primary stakeholder groups. The results can be used to identify the most valued and robust attributes of a program, which can then inform program planning, effectively describe the program to counties, and define future outcome-based evaluations.
This presentation will describe each step in the evaluation process, how the data were reported, the results of the Master Gardener evaluation, and how the information was used by the program.
|
|
Improving the Content of Penn State Cooperative Extension ArcView Geographic Information System Workshops Through Analysis of Participant Evaluations
|
| Presenter(s):
|
| Stewart Bruce,
Pennsylvania State University,
stew@psu.edu
|
| Abstract:
In 2006 the Penn State Cooperative Extension Geospatial Technology Program delivered fifteen introductory and intermediate ArcView GIS workshops to a total of 184 participants. A post-workshop evaluation survey asked each participant to list one or two things they had learned and to describe how they planned to use this information. Additional quantitative questions addressed how the program would affect their work and the quality of the workshop itself, and an open-ended question invited comments and suggestions. A follow-up evaluation was conducted by telephone with a representative sample to gauge how participants were actually using the ArcView software and whether there had been any changes in how they were using the information they learned. The results of this evaluation will be presented, along with how the evaluation has modified the format of future ArcView workshops to emphasize areas of greater interest.
|
|