In a 90-minute Roundtable session, the first rotation uses the first 45 minutes and the second rotation uses the last 45 minutes.

Roundtable Rotation I: Triangulation of Observation Data in a Non-optimal Sample Program Evaluation

Roundtable Presentation 316 to be held in Suwannee 19 on Thursday, Nov 12, 1:40 PM to 3:10 PM

Sponsored by the Extension Education Evaluation TIG and the Pre-K - 12 Educational Evaluation TIG

Presenter(s):

Marlene Hurley, Empire State College, marlene.hurley@gmail.com

Fernando Padro, Cambridge College, fpadro@msn.com; fernando.padro@cambridgecollege.ed

Abstract:
This presentation focuses on creating credible sources of data when evaluating program implementation effectiveness under circumstances in which the given institutional sample is not representative. The first part of the discussion centers on the decision-making process behind choosing a mixed-methods, multiple-case-study design rather than a meta-analysis. The second part focuses on observation through different instruments that allow for the creation of both qualitative and quantitative data, providing evidence of the extent of implementation and the effectiveness of the implemented model. Instruments used to triangulate findings included a time-lapse analysis, an inventory assessment based on characteristics of the implemented model, and traditional student surveys. In addition, the instruments were supported by interviews of key individuals to capture the perspectives of everyone involved in the program under review, in order to determine personal observations of program success and how instructor perceptions compare to student participant views.

Roundtable Rotation II: How to Design an Evaluation Plan When There Is a Small Number of Participants?

Presenter(s):

Amy Meier, University of Nevada, Reno, meiera@unce.unr.edu

Marilyn Smith, University of Nevada, Reno, smithm@unce.unr.edu

Janet Usinger, University of Nevada, Reno, usingerj@unr.edu

Abstract:
Program accountability via evaluation has become increasingly important in the current economic situation. However, certain program designs and circumstances make evaluation difficult. Programs in rural communities have few participants from whom to collect evaluation data, and programs designed to be intensive and to target a small group of participants over a long period of time also result in low participant numbers.
This session will begin with an introduction to the evaluation design of Bootstraps, an intensive program in a rural community. The evaluation design includes both qualitative and quantitative methods. Although the evaluation shows significant behavioral change, the small number of participants remains an issue.
This session will explore ideas and practices for evaluating programs with small numbers of participants. Suggestions on how to use qualitative or quantitative methods to produce reputable program evaluations when participant numbers are small will be shared.