Evaluation 2008


Contact emails are provided for one-to-one contact only and may not be used for mass emailing or group solicitations.

Session Title: Evaluation Methodologies Shaping Extension Education
Multipaper Session 218 to be held in Centennial Section G on Thursday, Nov 6, 9:15 AM to 10:45 AM
Sponsored by the Extension Education Evaluation TIG
Chair(s):
Nancy Franz,  Virginia Polytechnic Institute and State University,  nfranz@vt.edu
What Lies Beneath: Casting Stakeholders’ Perceptions in Rural Development Projects Evaluation, Through Q Methodology
Presenter(s):
Virginia Gravina,  University of the Republic,  virginia@fagro.edu.uy
Pedro De Hegedüs,  University of the Republic,  phegedus@adinet.com.uy
Abstract: This paper’s objective is to provide a specific example of an evaluation context using Q methodology as a key tool in development project evaluation. A development project for family farmers, intended to train them and provide production and organizational skills, was carried out in the province of San Luis, Argentina. Led by INTA, the National Institute of Agricultural Technology, a government institution, the project had been operating for 10 years, so an evaluation was needed for decision making. Q methodology was used to evaluate stakeholders’ perceptions of the project. Five distinct ways in which the beneficiaries perceived the project emerged; all of them were imbued with the project’s key topics, and none focused on a particular one. This reinforced the idea that development projects not only work toward their intended goals but are also a source of unexpected effects that should likewise be evaluated.
Evaluation of In-service Training Programs Using Retrospective Methods: Problems and Alternatives in Establishing Accuracy
Presenter(s):
Koralalage Jayaratne,  North Carolina State University,  jay_jayaratne@ncsu.edu
Lisa Guion,  North Carolina State University,  lisa_guion@ncsu.edu
Abstract: Due to tightening budgets, extension managers are demanding outcome data to prove the effectiveness of in-service training programs. This demand can be met only if programs are evaluated systematically. Therefore, extension specialists ask for easy, valid, and reliable methods to evaluate extension in-service training programs. Although the pre- and post-test evaluation approach is relatively valid, trainers are reluctant to use it because of practical challenges in matching pre- and post-surveys. Some participants are not comfortable revealing their identity and leave the pre- and post-surveys without any identifiers. The retrospective pre- and post-method is an alternative evaluation approach to this problem. This paper describes how the retrospective method was used to evaluate a statewide extension in-service training program, presents results, and shares problems and alternatives in establishing accuracy.
Utilizing Social Network Analysis in Extension: Exploring Extension's Reach
Presenter(s):
Tom Bartholomay,  University of Minnesota Extension,  barth020@umn.edu
Scott Chazdon,  University of Minnesota Extension,  schazdon@umn.edu
Mary Marczak,  University of Minnesota,  marcz001@umn.edu
Abstract: This presentation will describe the University of Minnesota Extension’s evaluation of its outreach networks using social network analysis (SNA). The presentation will include descriptions of the process, the value of SNA mapping and statistical procedures, what was learned from the data, and how the results support other evaluation goals within University of Minnesota Extension. The purpose of this evaluation was to assess the direction and strength of Extension’s networks in Minnesota, to monitor changes in network patterns over time, and to inform strategic decision-making regarding robust and neglected network zones. Findings will be used at the project, program, center, and Extension-wide levels. Although Extension educators deliver their services through a variety of educational programs, they also deliver research-based information through their social networks and relationships, enabling organizations throughout the state and elsewhere to achieve significantly greater success in meeting their objectives. SNA provides one way to measure these important networks.
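To make the SNA idea concrete, here is a small Python sketch of one basic SNA statistic, degree centrality, applied to invented outreach ties. The educator and partner names are hypothetical, not from the Minnesota evaluation, which used its own data and procedures.

```python
from collections import defaultdict

# Hypothetical outreach ties: (extension educator, partner organization).
ties = [
    ("educator_1", "county_health_dept"),
    ("educator_1", "farm_bureau"),
    ("educator_2", "farm_bureau"),
    ("educator_2", "school_district"),
    ("educator_3", "farm_bureau"),
]

# Degree centrality: each node's number of distinct ties, normalized by
# the number of other nodes. High scores mark robust network zones;
# low scores flag neglected ones.
def degree_centrality(edges):
    neighbors = defaultdict(set)
    for a, b in edges:
        neighbors[a].add(b)
        neighbors[b].add(a)
    n = len(neighbors)
    return {node: len(adj) / (n - 1) for node, adj in neighbors.items()}

scores = degree_centrality(ties)
print(max(scores, key=scores.get))  # the best-connected hub in the network
```

Rerunning such a measure on ties collected at different times is one simple way to monitor changes in network patterns, as the evaluation intends; production analyses would typically use a dedicated SNA package rather than this hand-rolled version.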
Methodological Rigor and its Relationship to Evaluation Use Within Extension
Presenter(s):
Marc Braverman,  Oregon State University,  marc.braverman@oregonstate.edu
Mary Arnold,  Oregon State University,  mary.arnold@oregonstate.edu
Abstract: This presentation explores the role and influence of methodological rigor in outcome evaluations of Extension programs. Rigor is a characteristic of evaluation quality, consisting of elements such as valid measurement strategies, a strong evaluation design, sufficient sample size and statistical power, and appropriate data analyses. Skilled evaluators are keenly aware of its importance, but the same is not always true for Extension administrators, decision-makers, and other stakeholders, who sometimes head straight for the evaluation’s conclusions without considering the strength of evidence behind them. The presentation discusses questions about how evaluations may be correctly or incorrectly interpreted: Which organizational relationships and structures within Extension promote the planning of high-quality evaluations? Who within Extension ensures that rigor is appropriately considered in the decision-making process? What are the implications of misjudging rigor for the long-term quality of Extension programs? The presentation provides recommendations on the evaluator’s role in ensuring rigor within state Extension services.
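One of the rigor elements the abstract names, sufficient sample size and power, can be illustrated with the standard normal-approximation formula for comparing two group means. The function and all numbers below are illustrative assumptions, not part of the presentation.

```python
from math import ceil
from statistics import NormalDist

def sample_size_per_group(effect, sigma=1.0, alpha=0.05, power=0.8):
    """Participants needed per group to detect a mean difference of
    `effect` (same units as sigma) with a two-sided z-test:
    n = 2 * ((z_{alpha/2} + z_beta) * sigma / effect)^2."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)  # two-sided critical value
    z_beta = z.inv_cdf(power)           # quantile for desired power
    n = 2 * ((z_alpha + z_beta) * sigma / effect) ** 2
    return ceil(n)

# A medium effect of 0.5 SD at 80% power and alpha = .05 requires
# roughly 63 participants per group.
print(sample_size_per_group(0.5))
```

The point the abstract makes follows directly: an outcome evaluation run with far fewer participants than such a calculation suggests can still produce a conclusions section, which is exactly why readers need to weigh the strength of evidence rather than skip to the findings.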

