Session Title: Evaluating High-Risk, High-Reward Research Programs: Challenges and Approaches

Panel Session 569 to be held in Wekiwa 6 on Friday, Nov 13, 1:40 PM to 3:10 PM

Sponsored by the Research, Technology, and Development Evaluation TIG

Chair(s):
Stephanie Philogene, National Institutes of Health, philoges@od.nih.gov

Discussant(s):
Gretchen Jordan, Sandia National Laboratories, gbjorda@sandia.gov

Abstract:
Experts believe that targeted programs and policies to support high-risk, high-reward research are needed to preserve U.S. leadership in science and technology. Programs that fund such research take many forms. Some support carefully crafted technology-driven activities with hard milestones, while others select highly creative individuals and fund them to pursue unfettered blue-sky research. In most cases, such research has a long incubation period, and in any case it should not be conflated with "high quality" research that receives mainstream exposure and citations. Given these and other complications, how can such programs properly be evaluated? In this panel, we showcase different high-risk, high-reward research programs supported by the US government, explain how each is being evaluated, and examine whether each evaluation is well suited to the program and its underlying philosophy. We look at four programs: the NIH Director's Pioneer Award, NSF's Small Grants for Exploratory Research, DARPA, and NIST's Technology Innovation Program.

Evaluating the National Institute of Standards and Technology's Technology Innovation Program

Stephen Campbell, National Institute of Standards and Technology, stephen.campbell@nist.gov

The America Creating Opportunities to Meaningfully Promote Excellence in Technology, Education, and Science (COMPETES) Act was enacted on August 9, 2007, to invest in innovation through research and development and to improve the competitiveness of the United States. The COMPETES Act established the Technology Innovation Program (TIP) to assist U.S. businesses, institutions of higher education, and other organizations in supporting, promoting, and accelerating innovation in the United States through high-risk, high-reward research in areas of critical national need. In this talk, Steve will discuss how the process of establishing areas of critical national need both shaped and was shaped by the evaluation of funding efforts in these areas, as well as the fielding of a customer survey for TIP's initial competition and the new program's data collection efforts. He will also highlight the importance of making data available to the larger research community.

Evaluating the National Institutes of Health's Director's Pioneer Award Program

Mary Beth Hughes, Science and Technology Policy Institute, mhughes@ida.org

The National Institutes of Health Director's Pioneer Award (NDPA) was initiated in Fiscal Year 2004 to support individual investigators who display the creativity and talent to pursue high-risk, potentially high-payoff ideas in the biomedical and behavioral sciences, and to fund new research directions that are not supported by other NIH mechanisms. The Science and Technology Policy Institute (STPI) has been asked by NIH to perform an outcome evaluation of this program. The evaluation's study design is based on two questions: (1) Are NDPA awardees conducting pioneering research with NDPA funds? and (2) What are the "spill-over" effects of the NDPA program on the Pioneers, their labs and universities, NIH, and the biomedical community? In this presentation, we discuss the NDPA program, the design of the outcome evaluation (based on in-depth interviews with the awardees and an expert panel review), the challenges associated with such an evaluation, and some preliminary results.

Evaluating the Department of Defense's Defense Advanced Research Projects Agency

Richard Van Atta, Science and Technology Policy Institute, rvanatta@ida.org

Dr. Van Atta, Core Research Staff at the Science and Technology Policy Institute (STPI), has served as an official in the Department of Defense, where he was Assistant Deputy Under Secretary for Dual Use and Commercial Programs, and as a researcher at the Institute for Defense Analyses (IDA), conducting technology policy research centered on the development and implementation of emerging technologies to meet national security needs. He has conducted studies for DARPA on its past research programs and their implementation, including Transformation and Transition: DARPA's Role in Fostering an Emerging Revolution in Military Affairs (IDA, 2003), and was invited to write the introductory chapter, "Fifty Years of Innovation and Discovery," for DARPA's 50th anniversary publication. Dr. Van Atta thus brings both a background in assessing the technical accomplishments and research management practices of DARPA, and of the DoD more broadly, and experience in conceiving and assessing high-risk, high-payoff concepts for security applications.

Evaluating the National Science Foundation's Small Grants for Exploratory Research Program

Caroline Wagner, SRI International, caroline.wagner@sri.com

This talk will highlight findings from a recently completed evaluation of the 15-year history of NSF's Small Grants for Exploratory Research (SGER) program. The evaluation drew on a survey of and interviews with NSF staff and Principal Investigators. The results revealed that the grant process has been overwhelmingly successful in meeting the goals set out for the program. A small number of grants have led to spectacular results, while the majority have not led to transformations, and some have "failed" in the sense of not producing publishable results. According to NSF staff, the SGER mechanism was highly valued as an alternative to the peer review system. In addition to enabling the funding of iconoclastic ideas that might be denied or overlooked by reviewers, the mechanism also gave junior faculty, minorities, and women opportunities to establish a track record and become more competitive later on; many have used the boost to good effect.