December 3, 2003

The Honorable Rod Paige
Secretary of Education
Department of Education
400 Maryland Avenue, SW
Washington, DC 20202

Dear Mr. Secretary:

The American Educational Research Association (AERA) is pleased to have the opportunity to offer comments regarding “Scientifically Based Evaluation Methods” as outlined in the Federal Register of November 4. Founded in 1916, AERA is a national association of approximately 22,000 researchers dedicated to the advancement of sound science in education and the widest possible dissemination and use of this knowledge in policy and practice. As part of our mission, we publish peer-reviewed journals and books, convene an annual meeting based on merit review of proposals for papers and panels, and undertake other programs and policies to promote the infrastructure of education research.

AERA has applauded the efforts of the Department of Education to call attention to the importance of advancing the scientific quality of education research, including greater use of rigorous experimental methods involving randomized control groups. There have been too few such studies in education research, and we need many more. However, we have also emphasized the overarching point that scientific methods must be selected in light of the problem and questions driving the research, and that the selected methods must be used rigorously. Our position, specified by AERA Council in January 2003 (see attached), closely tracks that formulated in the National Research Council study, Scientific Research in Education.

We urge you to modify the language for a “Proposed Priority” to be used for “any appropriate programs in the Department of Education” in FY 2004 or later. While we appreciate the value of experimental designs as an evaluation method, we believe that a judgment of “best,” as specified in the proposed language, does not adequately account for other methods of evaluation that might be as or more appropriate depending on the specific education program. We are concerned that the proposed priorities for application of scientifically based evaluation methods (1) invoke an uncommonly narrow definition of evaluation as used in the government and in the field, and (2) make no reference to the standards for scientifically valid education evaluation adopted in the legislation creating the Institute of Education Sciences (IES).

It is customary to discuss “evaluation” in terms of assessment of worth, of which program impact is but one dimension; causality is seldom the only objective. The often-cited Tennessee class size experiment, for example, provided a test of a hypothesis regarding the impact of class size on the educational achievement of students, and conclusively demonstrated the advantages of smaller classes in this regard. But this evaluation has been more useful for initiating the knowledge development process than for concluding it: The Tennessee class size intervention clearly did not itself cause learning to occur, but permitted or encouraged it through processes that still must be explored. It is clear, as demonstrated by the troubled efforts of California to implement class size reduction, that there are other essential ingredients, such as an adequate number of classrooms and an adequate number of qualified teachers, that were not variables in the controlled experiment. It is precisely to avoid such problems that evaluation studies are charged with addressing issues of cost-benefit and implementation feasibility.
Often, in formative evaluation, it is also customary to make recommendations for program modifications during the evaluation itself.

We recommend that the language and scope of the proposed regulations for evaluation studies be changed to align more closely with the evaluation standards provided by Congress in its most recent policy determinations about scientifically based research in the legislation creating the IES. Just over a year ago, Congress specified the following definition:

(19) SCIENTIFICALLY VALID EDUCATION EVALUATION. – The term “scientifically valid education evaluation” means an evaluation that --

(A) adheres to the highest possible standards of quality with respect to research design and statistical analysis;

(B) provides an adequate description of the programs evaluated and, to the extent possible, examines the relationship between program implementation and program impacts;

(C) provides an analysis of the results achieved by the program with respect to its projected effects;

(D) employs experimental designs using random assignment, when feasible, and other research methodologies that allow for the strongest possible causal inferences when random assignment is not feasible; and

(E) may study program implementation through a combination of scientifically valid and reliable methods.

This standard should be adopted as the one by which the government would consider assigning priority points. It calls attention to the need for more rigorous methodologies in the context of evaluation’s function to assess and inform. In addition, it illuminates the relationship between program implementation and program impact.

There is one aspect of the proposed priority that is not found in the IES definition of scientifically valid education evaluation that we believe would be a significant and beneficial addition: the award of priority points for evaluation designs making use of a qualified, independent, third-party evaluator. We believe this is an especially appropriate criterion in the evaluation of education programs at all levels of government.

We believe that making modifications of the type outlined above would allow the Department to accomplish its goal of setting a priority for evaluation of programs consonant with scientific assessment of the highest quality. It would also, in our view, better position the agency with respect to broader issues of science policy. We are concerned that any Federal agency would use its rulemaking authority to state or imply that a specific scientific method is best. We see this judgment as much more dynamic and conditional and, in any event, more appropriately left to merit review, the scrutiny of the scientific community, or other bodies better positioned to make such judgments (e.g., the National Academy of Sciences or the Office of Science and Technology Policy).

We hope that these comments help the Department of Education in attaining its goal of using scientifically based evaluation strategies to determine the effectiveness of programs and projects. Please let us know if we can be of further help.

Sincerely,

Felice J. Levine, Ph.D.
Executive Director
Resolution on the Essential Elements of Scientifically-based Research

The Council of the American Educational Research Association reaffirms its commitment to improving the quality of educational research. It reasserts that there are multiple components of quality research, including well-specified theory, sound problem formulation, reliance on appropriate research designs and methods, and integrity in the conduct of research and the communication of research findings. A fundamental premise of scientific inquiry is that research questions should guide the selection of inquiry methods.

Council recognizes randomized trials among the sound methodologies to be used in the conduct of educational research and commends increased attention to their use, as is particularly appropriate to intervention and evaluation studies. However, the Council of the Association expresses dismay that the Department of Education, through its public statements and programs of funding, is devoting singular attention to this one tool of science, jeopardizing a broader range of problems best addressed through other scientific methods.

The Council urges the Department of Education to expand its current conception of scientifically-based research. Further, the Council directs its staff and officers to take steps with the Department of Education and other associations, organizations, and agencies to achieve a broader understanding of the range of scientific methodologies essential to quality research.

Adopted by unanimous resolution on January 26, 2003.