Session Title: Progress Reporting for US Federal Grant Awards: Templates, Guidance, and Data Standards to Support Effective Program Evaluation
Panel Session 731 to be held in Malibu on Friday, Nov 4, 2:50 PM to 4:20 PM
Sponsored by the Research, Technology, and Development Evaluation TIG
Chair(s):
Laurel Haak, Discovery Logic, laurel.haak@thomsonreuters.com
Discussant(s):
Laurel Haak, Discovery Logic, laurel.haak@thomsonreuters.com
Abstract: Funding organizations involved in research and technology development take many approaches to evaluating program effectiveness and mission impact. A program evaluation tool common across US federal agencies is the grant progress report, prepared by program participants, usually annually and at minimum at project close-out. This session will explore the reporting guidance provided to program participants, prospective longitudinal collection of progress report data, and efforts to create data standards that support cross-agency reporting. A discussant will offer a perspective on the effectiveness of each approach and moderate a discussion of opportunities for creating a shared set of data elements to support program evaluation.
Using the Logic Model Process to Guide Data Collection on Outcomes and Metrics
Helena Davis, National Institutes of Health, helena.davis@nih.gov
Grantees and community partners in the Partnerships for Environmental Public Health (PEPH) programs of the National Institute of Environmental Health Sciences (NIEHS) have identified the lack of standardized evaluation tools and metrics as one of the biggest challenges facing the PEPH program. In response, the NIEHS PEPH team developed an evaluation metrics manual. The manual uses a logic model approach to guide grantees in identifying and measuring their project activities, outputs, and impacts. Partners can bring a wide range of values to a project; logic models and clear metrics help ensure those values are discussed explicitly. The manual identifies potential activities, outputs, and impacts and provides example metrics for each, across five thematic areas: partnerships, leveraging, products and dissemination, education and training, and capacity building. This presentation will focus on capacity building.
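
To make the logic model approach concrete, the sketch below shows one way a grantee might record an activity together with its outputs, impacts, and example metrics for the capacity-building theme. This is a minimal illustration only: the class, field names, and example values are hypothetical and are not drawn from the PEPH evaluation metrics manual.

    from dataclasses import dataclass

    @dataclass
    class LogicModelElement:
        """One row of a logic model: what was done, what it produced,
        what changed, and how each is measured."""
        activity: str            # e.g., a training workshop
        outputs: list[str]       # countable products of the activity
        impacts: list[str]       # longer-term changes attributed to it
        metrics: dict[str, str]  # metric name -> how it is measured

    # Hypothetical capacity-building entry; names and values are
    # illustrative, not taken from the PEPH manual.
    capacity_building = LogicModelElement(
        activity="Community health-monitoring workshop",
        outputs=["training curriculum", "trained community monitors"],
        impacts=["partners can collect and interpret local exposure data"],
        metrics={
            "monitors trained": "count of workshop completers",
            "skills retained": "follow-up survey at 6 and 12 months",
        },
    )

Recording each activity in a uniform structure like this is what lets grantees with very different projects report against the same set of metrics.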
Evaluating Collaboration and Team Science in the National Cancer Institute's Physical Sciences-Oncology Consortium
Larry Nagahara, National Institutes of Health, larry.nagahara@nih.gov
The Physical Sciences-Oncology Centers (PS-OCs) program was established by the National Cancer Institute to unite the fields of physical sciences and cancer biology by creating trans-disciplinary teams and supporting infrastructure. Ultimately, the success of the program will be measured by the generation of new knowledge and new fields of study that better explain the physical and chemical forces shaping the emergence and behavior of cancer. To support a prospective program evaluation, PS-OC program staff have implemented a comprehensive biannual progress report and are currently developing a data model, database, and reporting user interface to mine these data. The progress report collects information on a range of activities, including curriculum development, training, research methods, collaborations, scientific progress, new projects, and publications. This presentation will cover how the progress report was developed, challenges to implementation, and the evaluation opportunities that arise when structured data are collected from a program's outset.
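
The abstract describes structured, prospective collection of the same reporting categories each period. The sketch below shows one plausible shape for such a record; it is an assumption-laden illustration, not the PS-OC program's actual data model, and every field name here is hypothetical.

    from dataclasses import dataclass, field
    from datetime import date

    @dataclass
    class Collaboration:
        partner_institution: str
        start_date: date
        is_transdisciplinary: bool  # spans physical science and cancer biology

    @dataclass
    class ProgressReport:
        """One reporting-period record. Collecting identical structured
        fields each period is what makes later longitudinal mining possible."""
        center_id: str
        period_start: date
        period_end: date
        trainees_supported: int
        new_methods: list[str] = field(default_factory=list)
        collaborations: list[Collaboration] = field(default_factory=list)
        publication_dois: list[str] = field(default_factory=list)

    # Hypothetical usage: one center's report for the first half of 2011.
    report = ProgressReport(
        center_id="PSOC-01",  # invented identifier
        period_start=date(2011, 1, 1),
        period_end=date(2011, 6, 30),
        trainees_supported=12,
    )
    report.collaborations.append(
        Collaboration("Example University", date(2011, 3, 1), True)
    )

Because every report shares this schema, a database of such records can answer questions across centers and across reporting periods rather than requiring manual re-coding of narrative reports.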
Creating a Shared Core Set of Reporting Elements
David Baker, Consortia Advancing Standards in Research Administration Information, dbaker@casrai.org
While one might envision a single grant progress report used across all federal agencies, such a report raises problems for programs with different goals and different audiences. An alternative approach can deliver the same efficiencies while avoiding a one-size-fits-all model: defining and implementing standards and core reporting elements for progress data lets program participants use a single data model to generate and combine global and program-specific reporting elements. This presentation will review the current status of efforts to create a set of reporting standards, identify the stakeholders, describe the data fields and ontologies included, and discuss approaches to encourage continued development and use of the standards by the community.
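
One way to read the core-plus-extensions idea is sketched below: a small set of shared ("global") elements any agency could aggregate, paired with an agency- or program-specific extension block. The field names are invented for illustration and do not represent CASRAI's published standard.

    from dataclasses import dataclass, field
    from typing import Any

    @dataclass
    class CoreProgressElements:
        """Shared elements every agency collects identically; names
        here are illustrative, not a published standard."""
        award_id: str
        reporting_period: str
        personnel_count: int
        publications: list[str] = field(default_factory=list)

    @dataclass
    class AgencyReport:
        """Core elements plus an agency-specific extension block, so a
        single data model serves both cross-agency comparison and
        program-specific evaluation needs."""
        core: CoreProgressElements
        extensions: dict[str, Any] = field(default_factory=dict)

    # Hypothetical usage: global elements travel across agencies,
    # while program-specific fields stay in `extensions`.
    report = AgencyReport(
        core=CoreProgressElements(
            award_id="R01-XX-000000",  # invented identifier
            reporting_period="2011",
            personnel_count=4,
            publications=["doi:10.0000/example"],  # placeholder DOI
        ),
        extensions={"clinical_trial_enrollment": 120},
    )

The design choice this illustrates is the one the abstract argues for: standardize the core so data can be combined across agencies, and leave the extensions free so no program is forced into a one-size-fits-all report.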