Session Title: Evaluation Use for Strategic Management in Environmental and Public Health Organizations
Multipaper Session 332 to be held in Room 112 in the Convention Center on Thursday, Nov 6, 1:40 PM to 3:10 PM
Sponsored by the Research, Technology, and Development Evaluation TIG and the Environmental Program Evaluation TIG
Chair(s):
Dale Pahl, United States Environmental Protection Agency, pahl.dale@epa.gov
Abstract: Today, federal organizations responsible for environmental and health research face unique challenges when evaluating progress and impact. These challenges exist because solutions for many of today's environmental and health problems require knowledge about complex systems and their dynamic interactions with human activities, and traditional evaluation approaches and metrics are difficult to adapt to such "systems" programs. The four presentations in this session explore evaluation policy, best practices, innovation, and lessons learned for the strategic management and operation (e.g., program effectiveness, efficiency, and contributions to outcomes) of environmental and health programs.
Performance Measurement for Research Improvement
Phillip Juengst, United States Environmental Protection Agency, juengst.phillip@epa.gov
This presentation focuses on the challenge of developing meaningful performance measures for research. Measuring ultimate outcomes is especially difficult, yet focusing on the output of research publications ignores the relevance and quality of the research. To address these challenges, the Environmental Protection Agency (EPA) expanded its program evaluations to provide systematic ratings of performance. These ratings, based on similar methodologies at other agencies, provide a consistent measure of long-term outcomes and a useful tool for tracking the success of program improvement strategies. EPA also developed a balanced suite of measures, including bibliometric and decision-document analyses as well as partner surveys, to assess performance across the logic model framework. Recently, EPA engaged the National Academy of Sciences and other federal agencies in a broader dialog about how best to measure the efficiency of research; that discussion is leading to new directions in research efficiency measurement and evaluation.
A Conceptual Model for the Capability, Effectiveness, and Efficiency of Laboratories at the United States Environmental Protection Agency
Andrea Cherepy, United States Environmental Protection Agency, cherepy.andrea@epa.gov
Michael Kenyon, United States Environmental Protection Agency, kenyon.michael@epa.gov
This presentation provides an overview of a life-cycle model for laboratory facilities and describes its relevance for understanding effective and efficient laboratory functions at EPA. During the past twenty years, the growing importance of laboratory facilities in all sectors of the United States, coupled with laws enacted by Congress and Executive Orders issued by the President, has stimulated development of a conceptual model for laboratory facilities and the programs they support. This model is consistent with emerging demands for sustainable laboratory facilities. At EPA, scientific laboratories play a strong role in developing knowledge to inform decisions about environmental problems and to support agency programs. Understanding the relationship among laboratory capability, effectiveness, and efficiency is important for the strategic management of laboratories and for the management and evaluation of the scientific functions and research programs that laboratories sustain.
Evaluating a Research Program Through Independent Expert Review
Lorelei Kowalski, United States Environmental Protection Agency, kowalski.lorelei@epa.gov
This presentation will provide an update on an approach presented at AEA in 2005 for conducting regular evaluations of the EPA Office of Research and Development's (ORD) research programs. ORD's federal advisory committee, the Board of Scientific Counselors (BOSC), is an independent expert panel that began conducting retrospective and prospective reviews of ORD's research programs in 2004 to evaluate their quality, relevance, and effectiveness. The intent was to continue these reviews on an approximately four- to five-year cycle and for ORD to use the resulting recommendations to help plan, implement, and strengthen its research programs, as well as to respond to the increased focus across the government on outcomes and accountability. This presentation will address the lessons learned from implementing the BOSC review process over the past four years and how the process has been adapted to incorporate new concepts, such as systematic ratings of performance and efficiency measures.
NCI Corporate Evaluation: Converging Approaches to Maximize Utility
James Corrigan, National Institutes of Health, corrigan@mail.nih.gov
Kevin D Wright, National Institutes of Health, wright@mail.nih.gov
Lawrence S Solomon, National Institutes of Health, solomonl@mail.nih.gov
The National Cancer Institute (NCI) has been building a framework to support systematic evaluation and assessment of its programs. This framework has evolved over time and includes evaluation policies and procedures as well as enhanced evaluation capacity. The evaluation policies and procedures were developed with input from key stakeholders, such as NCI leadership, program directors, and evaluation experts, to increase evaluation utility. Evaluation tools, resources, and information networks were developed to enhance evaluation capacity and to increase the quantity and quality of evaluations. Taken together, these activities represent a multi-pronged corporate evaluation model that has strengthened, and continues to strengthen, the evaluation process and evaluation utilization. This presentation will provide examples of what a corporate evaluation policy might look like, how an evaluation policy can be operationalized, and what can be done to enhance the number and quality of evaluations consistent with a corporate evaluation policy. Corporate evaluation at NCI continues to evolve, with significant opportunities for further development.