2011


Session Title: A NASA Approach to Program Evaluation: Use of Social Science Methods to Engineer Education Projects in NASA Education's Portfolio
Multipaper Session 876 to be held in Huntington C on Saturday, Nov 5, 9:50 AM to 11:20 AM
Sponsored by the Government Evaluation TIG
Chair(s):
Brian Yoder, National Aeronautics and Space Administration, brian.yoder@nasa.gov
Abstract: The NASA Office of Education has collaborated with a team of external evaluators to develop a comprehensive plan for evaluating its portfolio of education programs and for using evaluation findings for program improvement. This multipaper session examines how NASA Education is using evaluation to develop knowledge about its programs for the purpose of decision making, and provides examples drawn from two national project evaluations. The first paper outlines NASA's approach to evaluation and utilization of findings. The second and third papers describe the evaluations of two national projects within NASA's Elementary and Secondary Education Program. The fourth paper shares the perspective of the two national project managers. The panel will conclude with an audience discussion of the aptness of the described stakeholder involvement and evaluation utilization.
Overview: A NASA Approach to Program Evaluation
Brian Yoder, National Aeronautics and Space Administration, brian.yoder@nasa.gov
One of NASA's approaches to program evaluation aims at integrating evaluation with project development so that NASA education projects have a good chance of showing measurable impact after a few years of refinement. This presentation provides an overview of the key considerations that informed this approach: developing a program evaluation process that reflects NASA's engineering culture and emphasizes teamwork and innovation, adheres to NASA's project planning template (known as 7120.7), and aligns with current federal program evaluation guidance. The presentation will also highlight some less obvious goals of this evaluation approach, such as merging researcher knowledge and practitioner knowledge to better understand how program activities contribute to intended outcomes.
Evaluation of NASA Explorer Schools: The Formative Stage
Alina Martinez, Abt Associates Inc, alina_martinez@abtassoc.com
Sarah Sahni, Abt Associates Inc, Sarah_Sahni@abtassoc.com
Responding to recommendations from the National Research Council committee that reviewed NASA's elementary and secondary education projects (1), NASA embarked on a redesign of the NASA Explorer Schools (NES) project in 2008. At each stage of the redesign, NASA has integrated evaluation activities and incorporated findings for program improvement. As part of the pilot activities (Spring 2010), NES gathered data from teachers and students to identify ways to improve the project's performance, and the NES project incorporated these lessons into its September 2010 launch. The design of the formative evaluation has involved stakeholders and will lead to program modifications. These evaluation efforts will ultimately lead to an outcomes evaluation that investigates intended program outcomes, as laid out in the program logic model. (1) National Research Council. (2008). NASA's Elementary and Secondary Education Program: Review and Critique. Committee for the Review and Evaluation of NASA's Precollege Education Program, Helen R. Quinn, Heidi A. Schweingruber, and Michael A. Feder, Editors. Board on Science Education, Center for Education, Division of Behavioral and Social Sciences and Education. Washington, D.C.: The National Academies Press.
Evaluation of NASA's Summer of Innovation Project
Hilary Rhodes, Abt Associates Inc, hilary_rhodes@abtassoc.com
Kristen Neishi, Abt Associates Inc, kristen_neishi@abtassoc.com
In 2010, NASA's Office of Education launched Summer of Innovation (SoI), a NASA-infused summer experience for middle school students who are underperforming, underrepresented, and underserved in science, technology, engineering, and math (STEM) fields. Since the program's inception, process and outcomes evaluation has been an integral part of its development. By collecting planning and implementation data from awardees through interviews and reporting forms, and outcomes data from participating students and teachers through surveys, the evaluation has codified the lessons learned over the course of the pilot, producing actionable insight that has supported NASA's modification of the program for summer 2011. Ongoing formative evaluation efforts support the program's continued implementation, identify promising practices and models meriting more rigorous outcomes evaluation, examine how awardees meet NASA requirements and the feasibility of those expectations, and generate lessons learned for future implementations of SoI and of NASA's education activities more broadly.
Evaluation Utilization from NASA Project Managers' Perspectives
Rob LaSalvia, National Aeronautics and Space Administration, robert.f.lasalvia@nasa.gov
Rick Gilmore, National Aeronautics and Space Administration, richard.l.gilmore@nasa.gov
The NASA Explorer Schools (NES) and NASA Summer of Innovation (SoI) programs have integrated evaluation into their design, development, and refinement activities. The national project managers will discuss how building evaluation in at the ground level has differed from work on previous projects, and how evaluation has informed their thinking about both the national projects and the individual sites. They will also reflect on the experience of working closely with evaluators from the early stages of the projects, describing what has worked well as well as what has been challenging.
