Evaluation 2008


Session Title: Meeting Challenges of Evaluating and Sustaining Research Centers and Institutes
Multipaper Session 713 to be held in Room 112 in the Convention Center on Friday, Nov 7, 4:30 PM to 6:00 PM
Sponsored by the Research, Technology, and Development Evaluation TIG
Chair(s):
Brian Zuckerman,  Science and Technology Policy Institute,  bzuckerm@ida.org
A "Meta-Evaluation" of Collaborative Research Programs: Definitions, Program Designs, and Methods
Presenter(s):
Christina Viola Srivastava,  Science and Technology Policy Institute,  cviola@ida.org
P Craig Boardman,  Science and Technology Policy Institute,  pboardma@ida.org
Brian Zuckerman,  Science and Technology Policy Institute,  bzuckerm@ida.org
Abstract: Many federal-level program solicitations in science and technology focus on promoting research collaboration, especially multidisciplinary and cross-institution collaboration. Examples include a variety of Centers-style programs with different features as well as a growing number of more traditional research and training grants. While considerable research effort has been devoted to understanding research collaboration as a phenomenon, the implications of those findings for program logic, management, and evaluation have not yet been fully explored. This paper will employ a “meta-evaluation” approach to begin addressing this gap. Using a comparative approach, it will draw upon several recent evaluations of research programs at the National Institutes of Health and the National Science Foundation to identify the varied practical approaches to promoting “collaboration” and the variation in measured outcomes. Implications for evaluation methodology are discussed.
Predictors of Cooperative Research Center Post-Graduation Survival and Success: An Update
Presenter(s):
Lindsey McGowen,  North Carolina State University,  lindseycm@hotmail.com
Denis Gray,  North Carolina State University,  denis_gray@ncsu.edu
Abstract: Industry/University Cooperative Research Centers (I/UCRCs) are supported by funding from NSF but, like other center programs, are expected to achieve self-sufficiency after a fixed term (ten years). However, there is little evidence about the extent to which government-funded programs are able to make this transition. This study attempts to identify the factors that predict center sustainability after centers have graduated from NSF funding. Archival data and qualitative interviews with Center Directors and outside Evaluators are used to explore program sustainability of I/UCRCs post-graduation from initial grant support. The study examines environmental, organizational, program, and individual level constructs to predict center status, fidelity to the I/UCRC program model, and sustainability in terms of continued infrastructure, program activities, and outcomes. The results will be used to inform the transition process for Centers currently funded under the I/UCRC program as well as to test the applicability of program sustainability theory developed in other content areas to the case of cross-sector cooperative research programs.
From Evaluation Framework to Results: Innovative Approaches Piloted With the Interim Evaluation of the Regional Centers of Excellence for Biodefense and Emerging Infectious Diseases Research (RCE) Program
Presenter(s):
Kathleen M Quinlan,  Concept Systems Inc,  kquinlan@conceptsystems.com
Abstract: This paper focuses on four key dimensions of the conduct of an interim, descriptive evaluation of the National Institute of Allergy and Infectious Diseases’ Regional Centers of Excellence (RCE) for Biodefense and Emerging Infectious Diseases Research Program. The paper will highlight 1) participation, especially in co-authoring an evaluation framework of success factors and defining the major elements of the interim evaluation; 2) responding to time and resource constraints by identifying simple measures, extensively mining existing data, and engaging the ten funded regional research Centers in completing a highly structured Information Request; 3) being deliberate about the unit of analysis, so that the evaluation addressed the Program as a whole while providing sufficient Center-level analyses to inform improvement; and 4) blending qualitative and quantitative data. Innovative approaches developed in this project to address common challenges of this type of evaluation inform the emerging field of evaluation of large-scale biomedical research initiatives.
