Using Citation Analysis to Study Evaluation Influence: Strengths and Limitations of the Methodology

Presenter(s):

Lija Greenseid, Professional Data Analysts Inc, lija@pdastats.com

Abstract:
Citation analysis is an accepted, albeit sometimes controversial, methodology for assessing the impact of scientific research and researchers. Identifying highly cited scientific papers offers clues about which theories and scientists are influential within a field. This presentation explores the strengths and limitations of using citation analysis to assess the influence of program evaluations and evaluators. It provides an overview of how to conduct citation analyses and shares findings from a citation analysis study of 246 evaluation reports and instruments produced by four National Science Foundation-funded program evaluations. The presentation is intended for scholars of evaluation use and influence, as well as practicing evaluators, who are interested in learning about citation analysis, exploring methods for studying evaluation influence, and hearing which type of evaluation product was most highly cited by the field (it was neither a weighty cumulative report nor an AEA conference presentation!).


Reviewing Systematic Reviews: Meta-analysis of What Works Clearinghouse Computer-Assisted Interventions

Presenter(s):

Andrei Streke, Mathematica Policy Research Inc, astreke@mathematica-mpr.com

Abstract:
The What Works Clearinghouse (WWC) offers reviews of evidence on broad topics in education, identifies interventions shown by rigorous research to be effective, and develops targeted reviews of interventions. This paper systematically reviews research on the achievement outcomes of computer-assisted interventions that have met WWC evidence standards (with or without reservations). Computer-assisted learning programs have become increasingly popular as an alternative to traditional teacher-led instruction for improving student performance across subject areas. The paper systematically reviews (1) computer-assisted programs featured in intervention reports across WWC topic areas, and (2) computer-assisted programs within the Beginning Reading and Adolescent Literacy topic areas. This work updates previous work by the author, incorporates new and updated WWC intervention reports (released since January 2008), and investigates which program and student characteristics are associated with the most positive outcomes.


Integrating Mental Model Approach in Research on Evaluation Practice

Presenter(s):

Jie Zhang, Syracuse University, jzhang08@syr.edu

Abstract:
Program evaluation, crossing many disciplines and fields, is one of the most fully developed and important parts of the broader evaluation field. Yet there is a scarcity of rigorous and systematic research on evaluation practice. In response to repeated appeals for more empirical studies of evaluation practice, this research aims to contribute to the empirical knowledge base needed to explain the nature of evaluation. Grounded in mental model research, the study proposes a conceptual framework for examining constructs such as reasoning, problem solving, and mental models, which interact with one another and influence evaluation practice. Structural equation modeling will be used to investigate the hypothesized relationships among these constructs. A better understanding of these relationships can support more efficient evaluation theory building, deepen understanding of evaluation practice, and inform the design of evaluation training and instruction.