Session Title: Using Logic Models to Evaluate Research and Technology Diffusion Results: Two Cases
Panel Session 797 to be held in Royale Board Room on Saturday, November 10, 12:10 PM to 1:40 PM
Sponsored by the Research, Technology, and Development Evaluation TIG
Chair(s):
Jeff Dowd,  United States Department of Energy,  jeff.dowd@ee.doe.gov
Abstract: Two major challenges in evaluating research, technology development, and deployment programs are summarizing research progress for large, diverse research programs in a handful of measures, and obtaining credible impact estimates for deployment activities, such as training, that only indirectly influence the adoption of a technology or practice. This panel will present results from several related metrics and evaluation projects at the U.S. Department of Energy that have tackled both challenges. We will present the logic models and then show how they were used. In both cases, logic modeling has been useful for making the "magic in the middle" explicit. One case is an exercise to improve metrics for research and technology development programs. The other is a training program for industrial energy-efficient technologies which, because it is grounded in the social science theory of diffusion, applies to the diffusion of anything, not just technology. Both cases provide generic templates for others to follow.
Cutting Edge Logic Models for Research and Technology Programs
Gretchen Jordan,  Sandia National Laboratories,  gbjorda@sandia.gov
During the past year we have been working to develop logic models that U.S. Department of Energy program managers, and the stakeholders who make investment decisions, find useful for describing their programs and intended outcomes. From the investors' point of view, this means the logic models must link the progress of many projects to intermediate points of research progress that in turn link to longer-term outcomes. From the program managers' point of view, it means the logic models have to build on technology roadmaps, Stage-Gate questions and answers, Gantt charts, and systems integration analysis. For both audiences, these logic models suggest the most important intermediate performance measures to include in strategic, multi-year, and annual plans.
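As an illustration of the structure described above (not drawn from the presentation itself; the project names, milestones, and rollup rule are hypothetical), a minimal sketch in Python of how a logic model can roll the progress of individual projects up to intermediate research milestones and then to a longer-term outcome:

    # Hypothetical sketch of a research program logic model: projects feed
    # intermediate research milestones, which feed a longer-term outcome.
    program_logic = {
        "long_term_outcome": "Commercially adopted energy-saving technology",
        "intermediate_milestones": {
            "Prototype validated at pilot scale": ["Project A", "Project B"],
            "Unit cost target demonstrated": ["Project C"],
        },
    }

    # An investor-facing rollup: a milestone is "on track" only if every
    # contributing project has reported progress against it.
    progress_reports = {"Project A": True, "Project B": False, "Project C": True}

    for milestone, projects in program_logic["intermediate_milestones"].items():
        status = "on track" if all(progress_reports[p] for p in projects) else "at risk"
        print(f"{milestone}: {status}")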
Linking Projects to Program Outcomes in Metrics for Technology Development Programs
John Mortensen,  Energetics Inc,  jmortensen@energetics.com
Using the logic models just described, we identified candidate metrics by mapping existing metrics against the logic, improving on existing metrics, and filling in gaps. Gaps included elements of the logic models that had no metrics, as well as gaps in the logical linkages, that is, in the 'performance story' that the set of metrics tells. Key stakeholders were then given the opportunity to refine the metrics, and a consensus was reached. We will show the metrics we identified for a variety of program types. These build on, and provide information for, other assessments such as Stage-Gate and peer reviews. The metrics help programs in the U.S. Department of Energy Office of Energy Efficiency and Renewable Energy (EERE) respond to multiple requests for performance information, many of them externally driven, in a streamlined manner with results readily available to management.
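A minimal sketch of the gap-mapping step described above (the element and metric names are invented for illustration, not taken from the EERE exercise): existing metrics are mapped to logic model elements, and any element left without a metric is flagged as a gap to be filled.

    # Hypothetical mapping of logic model elements to existing metrics.
    metrics_by_element = {
        "R&D activities": ["number of active projects"],
        "Research outputs": ["publications", "patents filed"],
        "Intermediate outcomes": [],          # no existing metric -> a gap
        "Long-term outcomes": ["energy saved per year"],
    }

    # Flag elements of the logic with no metrics; these gaps drive the
    # search for new or improved candidate metrics.
    gaps = [element for element, metrics in metrics_by_element.items() if not metrics]
    print("Elements needing metrics:", gaps)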
The Logic of Indirect Programs to Diffuse Technologies: The Example of Training
John Reed,  Innovologie,  jreed@innovologie.com
This presentation will provide a specific example of the logic of technology diffusion for technology-related training activities, tracing the difficult path to outcomes tied to program participants, as distinguished from the responses of others not influenced by the program. The U.S. Department of Energy's Impact Evaluation Framework for Deployment Programs (presented at AEA in 2006 and set to be released in summer 2007) is undergirded by ideas drawn from E. Rogers's (2003) widely used theory of the diffusion of innovations. The framework identifies detailed logic model templates for diffusion in four domains: End User and three infrastructure domains (Government/Policy, Business, and Knowledge). The logic suggests that, to be successful, training needs to address the characteristics of the product as well as those of all four domains in order to meet the needs of the trainees. The logic also accounts for communication patterns, replication effects, and sustainability.
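As a rough sketch of the four-domain structure (the checklist questions are illustrative assumptions, not drawn from the framework document), a training design review might check whether each domain's characteristics are addressed:

    # Hypothetical checklist keyed to the framework's four diffusion domains.
    domains = {
        "End User": "Does the training fit trainees' operations and skills?",
        "Government/Policy": "Are relevant codes, standards, and incentives covered?",
        "Business": "Are supplier and service-provider channels engaged?",
        "Knowledge": "Are reference materials and expert networks available?",
    }

    training_plan_addresses = {"End User": True, "Government/Policy": False,
                               "Business": True, "Knowledge": True}

    # Per the diffusion logic, training should address all four domains
    # (plus the product's characteristics) to meet trainees' needs.
    unaddressed = [d for d, ok in training_plan_addresses.items() if not ok]
    print("Domains not yet addressed:", unaddressed)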
Getting From Training to Credible Energy Savings: An Evaluation Template
Harley Barnes,  LM Business Process Solutions,  harley.h.barnes@lmco.com
EERE has developed an evaluation plan and generic survey for credibly determining the impacts of training programs. The logic model approaches described by the other panelists guided the selection of questions designed to determine (1) whether energy-efficiency measures related to the training were implemented, or are planned, subsequent to the training, and (2) if so, the degree to which the training, compared with other potential influences, affected the decision to implement the measures. Multiple questions will be used to assess the validity of the second of these two question batteries. We will explain why this approach, rather than others, was chosen to isolate the effect of the training. We will also describe how these results can be combined with an estimate of energy savings to enable the evaluator to calculate net energy savings.
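As an arithmetic sketch of that final step (the attribution approach and all numbers here are assumed for illustration, not taken from the EERE survey instrument): if the survey yields an estimate of the training's share of influence on the implementation decision, net energy savings can be computed by scaling gross savings by that share.

    # Hypothetical net-savings calculation: gross savings from implemented
    # measures, scaled by the survey-derived share of influence attributable
    # to the training (as opposed to other potential influences).
    gross_savings_mmbtu = 1_000.0   # assumed gross energy savings estimate
    training_influence_share = 0.6  # assumed attribution score from the survey

    net_savings_mmbtu = gross_savings_mmbtu * training_influence_share
    print(f"Net energy savings attributable to training: {net_savings_mmbtu} MMBtu")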