
Session Title: Analysis and Evaluation of Research Portfolios Using Quantitative Science Metrics: Theory
Panel Session 271 to be held in Texas D on Thursday, Nov 11, 10:55 AM to 12:25 PM
Sponsored by the Research, Technology, and Development Evaluation TIG
Chair(s):
Israel Lederhendler, National Institutes of Health, lederhei@od.nih.gov
Discussant(s):
Gretchen Jordan, Sandia National Laboratories, gbjorda@sandia.gov
Abstract: This panel addresses the theoretical gaps in, and opportunities for, understanding scientific research using portfolios as the unit of analysis. Increasing attention is being given to analyzing, managing, and evaluating scientific investment through a quantitative portfolio approach, yet many of these analyses seem to lack theoretical grounds for using particular methods to answer particular questions. The presentations are intended to raise the question of how portfolio analysis, research evaluation, and scientometrics complement one another.
How Can Portfolio Analysis Assist Government Research Agencies to Make Wise Research Investments?
Robin Wagner, National Institutes of Health, wagnerr2@od.nih.gov
Matthew Eblen, National Institutes of Health, eblenm@od.nih.gov
Over the last several decades, government agencies have increasingly been required to demonstrate the value and impact of the research projects they fund. Agencies have used a variety of research evaluation methods and tools to assess individual research projects and programs. However, it has proven challenging to assess the entire portfolio funded by a research agency, due in part to the diversity of program areas (e.g., how can one balance investments in cancer versus HIV/AIDS?). In addition, it has been difficult to define, predict, and measure the risks and opportunities associated with research projects. We will examine the theory of portfolio analysis, which has its origins in finance, and evaluate the degree to which financial concepts can be adapted to portfolios of research projects. We will conclude with suggestions for the data, tools, and metrics that need to be developed to advance the field of portfolio analysis for research.
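For illustration only (not drawn from the presentation): a minimal sketch of the mean-variance view of a portfolio from classical financial theory, the kind of concept the authors ask whether it can be adapted to research projects. All program names, payoffs, and covariances below are invented.

    # Illustrative sketch only: the classical mean-variance view of a portfolio.
    # Program names and numbers are fabricated for the example.
    import numpy as np

    projects = ["cancer_program", "hiv_aids_program", "basic_science_program"]
    expected_return = np.array([0.06, 0.08, 0.05])    # assumed payoff per unit invested
    covariance = np.array([[0.010, 0.002, 0.001],     # assumed co-variation of payoffs
                           [0.002, 0.020, 0.003],
                           [0.001, 0.003, 0.008]])

    weights = np.array([0.4, 0.3, 0.3])               # share of budget per program

    portfolio_return = weights @ expected_return
    portfolio_risk = np.sqrt(weights @ covariance @ weights)
    print(f"expected portfolio return: {portfolio_return:.3f}")
    print(f"portfolio risk (std dev):  {portfolio_risk:.3f}")

The open question the talk raises is whether analogues of "expected return" and "covariance" can be defined and measured at all for research projects.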
Limits of Portfolio Analysis to Address Evaluation Questions
Brian Zuckerman, Science and Technology Policy Institute, bzuckerm@ida.org
As part of a feasibility study for an outcome evaluation of the National Cancer Institute-funded Childhood Cancer Survivor Study (CCSS), the study analyzed the portfolio of NIH-funded childhood cancer survivorship research. After completing descriptive analyses of the portfolio itself, the analysis compared publication output and journal quality between the CCSS and the remainder of the portfolio. While these analyses produced quantitative answers, the key finding was that, given the nature of the portfolio, the comparison (between a single award mechanism supporting many research studies on a large cohort of patients and multiple individual studies on smaller cohorts) was not analytically meaningful. This paper will summarize the study results in the context of an analysis of the limits of portfolio-analytic methods for addressing evaluation questions.
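Purely as an illustration of the kind of quantitative comparison described above (all numbers are invented, not from the study):

    # Illustrative sketch only: publications per award and a journal-quality proxy
    # (e.g., impact factor) for the CCSS versus the rest of the portfolio.
    from statistics import mean

    ccss_pubs_per_award = [120]                 # one large cohort award (fabricated)
    other_pubs_per_award = [3, 5, 2, 8, 4, 6]   # many smaller studies (fabricated)

    ccss_journal_if = [4.1, 5.3, 3.8]           # fabricated impact factors
    other_journal_if = [2.9, 3.5, 4.0, 2.2]

    print("pubs/award  CCSS:", mean(ccss_pubs_per_award),
          " others:", mean(other_pubs_per_award))
    print("mean IF     CCSS:", round(mean(ccss_journal_if), 2),
          " others:", round(mean(other_journal_if), 2))
    # The abstract's point: even when such numbers can be computed, comparing one
    # cohort mechanism against many small studies may not be analytically meaningful.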
Reinventing Portfolio Analysis at the National Institutes of Health: Explorations in the Structure and Evolution of Biomedical Research
Israel Lederhendler, National Institutes of Health, lederhei@od.nih.gov
Kirk Baker, National Institutes of Health, bakerk@od.nih.gov
Archna Bhandari, National Institutes of Health, bhandara@od.nih.gov
Carole Christian, National Institutes of Health, cchristi@od.nih.gov
Recent investigations initiated by the NIH Portfolio Analysis Group are defining questions that blur the lines between portfolio analysis and research evaluation. The questions being asked are fundamental to understanding science but have been addressed only with difficulty using traditional evaluation approaches. The combination of new methods, including scientometrics, text mining, concept matching, and visualization, has stimulated several computationally based studies: (1) detection of emerging areas of biomedical interest through analysis of grant content over time, (2) identification of causal relationships through knowledge discovery across grants and publications, (3) portfolio comparisons between organizations exploring the recent evolution of topics within a research field such as breast cancer research, and (4) development of methods for knowledge gap analysis by examining the knowledge structures of opposing theories. These studies address fundamental issues about science; they also provide an evidence base for tackling important research evaluations in the policy, organizational, and programmatic domains.
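As a toy illustration, not the NIH group's actual method: the simplest form of "analysis of grant content over time" is counting how often a term appears in grant text per fiscal year. All grants and terms below are fabricated.

    # Illustrative sketch only: track a term's frequency across fiscal years
    # in (fabricated) grant abstracts to flag a possibly emerging topic.
    from collections import Counter, defaultdict

    grants = [  # (fiscal_year, abstract text) -- fabricated examples
        (2007, "mapping tumor suppressor pathways in breast cancer"),
        (2008, "microRNA regulation in breast cancer metastasis"),
        (2009, "microRNA biomarkers for early detection of breast cancer"),
        (2009, "microRNA profiling of the tumor microenvironment"),
    ]

    term_by_year = defaultdict(Counter)
    for year, abstract in grants:
        term_by_year[year].update(abstract.lower().split())

    for year in sorted(term_by_year):
        print(year, "microrna mentions:", term_by_year[year]["microrna"])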
Intersections Among Scientometrics, Science Portfolio Analysis, and Research Evaluation: Does Complex Systems Science Offer Workable Theory?
Caroline Wagner, Science-Metrix Corp, caroline.wagner@science-metrix.com
Research evaluation suffers in comparison with other types of evaluation (financial, medical) from the lack of quantitative techniques grounded in theory. Data about research dynamics and outcomes have been difficult to acquire and even more difficult to evaluate. The emphasis on counting inputs, outputs, and outcomes, with little attention to the dynamics of knowledge creation, is limiting. The ability of analysts to understand the dynamics of research as it unfolds and evolves is becoming more important. Without monitoring tools based on an understanding of scientific knowledge creation, a portfolio analysis cannot track progress and is relegated to tracking products. Recent developments in complex adaptive systems theory, applied specifically within network analysis and pioneered by physicists and biologists, may profitably be explored to help fill the need for a theoretical basis for quantitative tools. This paper addresses that possibility by discussing complex systems theory in a research context.
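For readers unfamiliar with the network-analytic tools the paper points to, a minimal sketch of one common measure, degree centrality, in a toy collaboration network. The networkx library is assumed, and the network itself is invented.

    # Illustrative sketch only: degree centrality in a small, fabricated
    # collaboration network, the sort of measure network analysis provides.
    import networkx as nx

    g = nx.Graph()
    g.add_edges_from([
        ("lab_A", "lab_B"), ("lab_A", "lab_C"),
        ("lab_B", "lab_C"), ("lab_C", "lab_D"),
    ])

    # Fraction of possible partners each node is connected to.
    centrality = nx.degree_centrality(g)
    for lab, score in sorted(centrality.items(), key=lambda kv: -kv[1]):
        print(lab, round(score, 2))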
