In a 90-minute Roundtable session, the first rotation uses the first 45 minutes and the second rotation uses the last 45 minutes.

Roundtable Rotation I:
What Can Research on Evaluation Do for You? Benefits of Practitioner-based Research on Evaluation (ROE)

Roundtable Presentation 462 to be held in Conference Room 12 on Thursday, Nov 3, 4:30 PM to 6:00 PM

Sponsored by the Research on Evaluation TIG

Presenter(s):

Matthew Galen, Claremont Graduate University, matthew.galen@cgu.edu

Silvana Bialosiewicz, Claremont Graduate University, silvana@cgu.edu

Abstract:
Developing a body of research to describe and inform program evaluation practice is vital to the advancement of our field. This roundtable explores the ways in which evaluation practitioners can incorporate Research on Evaluation (ROE) into their existing projects to contribute to the field's collective knowledge. There are several potential benefits of conducting practitioner-based ROE studies. Practitioner-based ROE studies may: (1) answer research questions that are highly relevant and practical; (2) improve knowledge-sharing of lessons learned in individual evaluation projects; (3) increase evaluators' influence in shaping program and policy decisions; and (4) enhance evaluation's visibility and credibility as a professional field.
The roundtable will offer tips, frameworks, and examples for designing and implementing ROE studies within existing evaluation projects. We will also facilitate a conversation about potential issues and challenges in conducting ROE and how to overcome them.

Roundtable Rotation II:
Evaluation After the Facts: Tips and Alternative Designs

Roundtable Presentation 462 to be held in Conference Room 12 on Thursday, Nov 3, 4:30 PM to 6:00 PM

Sponsored by the Research on Evaluation TIG

Presenter(s):

Julien Kouame, Western Michigan University, j5kouame@wmich.edu

Fatma Ayyad, Western Michigan University, f4ayyad@wmich.edu

Abstract:
This paper provides tips and alternative designs for evaluating in situations where: (1) the evaluator has no baseline data; (2) the only data available come after the program has been fully implemented; and (3) there are no defined criteria by which individuals were assigned to treatment and comparison groups, yet the client requires a quasi-experimental design.