
Session Title: When Does Evaluation Not Feel Like Evaluation? Embedding Evaluation Activities Into Programs
Panel Session 122 to be held in Fairmont Suite on Wednesday, November 7, 4:30 PM to 6:00 PM
Sponsored by the Pre-K - 12 Educational Evaluation TIG
Chair(s):
Leslie Goodyear,  Education Development Center Inc,  lgoodyear@edc.org
Discussant(s):
Sylvia James,  National Science Foundation,  sjames@nsf.gov
Abstract: Embedding evaluation within program activities is a way to encourage programs to engage in ongoing, continuous evaluation. These four presenters, evaluators of projects funded by the National Science Foundation's ITEST (Information Technology Experiences for Students and Teachers) program, will present the ways in which they have worked with projects to embed evaluation within project activities and the lessons, both programmatic and evaluative, that have come from their experiences. The chair and discussant for the session will tie together these presentations with information about the ITEST program and the evaluation research work in which these presenters are involved.
Using Embedded Evaluation to Assist Teachers in Using Inquiry-based Modules That Integrate Math, Science and Information Technology (IT)
Roxann Humbert,  Fairmont State University,  roxann.humbert@fairmontstate.edu
The Comprehensive Information Technology Education in Rural Appalachia (CITERA) program is designed to help teachers and students learn about, experience, and diffuse Information Technology (IT) concepts within the context of existing Science, Technology, and Mathematics courses. A key objective of CITERA is to assist teachers in the design of inquiry-based modules that integrate Math and Science content, IT concepts and skills, and national and state standards. This presentation will share the embedded evaluation methods used by the CITERA program, such as peer, student, and expert reviews of the teaching units and a daily blog that is monitored by the evaluation team.
Embedding Evaluation Activities to Promote Learning
Ann Howe,  SUCCEED Apprenticeship Program,  achowe@earthlink.net
The SUCCEED Apprenticeship program gives 8th-10th grade students, over a two-year period, the opportunity to have authentic and appropriate hands-on experiences in the use of technologies, techniques, and tools of IT within the context of science, mathematics, and engineering. Ongoing, continuous evaluation is part of the regular routine and is aimed at ensuring that each apprentice reaches a high level of competence in each topic in the curriculum so that he or she will be able to apply this knowledge and skill to complete the next required IT project. This presentation will focus on the specific techniques used by the evaluator to integrate evaluation into all aspects of the program in order to promote learning.
Using Engineering Notebooks as Embedded Evaluation
Neal Grandgenett,  University of Nebraska, Omaha,  ngrandgenett@mail.unomaha.edu
Neal Grandgenett is the evaluator of the SPIRIT educational robotics project in Nebraska. SPIRIT stands for the Silicon Prairie Initiative for Robotics in Information Technology. The SPIRIT project is striving to help middle school teachers learn how to teach science, technology, engineering, and mathematics topics in the context of a flexible, 'scrounged-parts' robotics platform called the TekBot. As part of SPIRIT's ongoing evaluation process, an embedded evaluation instrument is now being used: essentially, the individual student's 'engineering notebook'. The engineering notebook format was developed by teachers in the project and has been well embraced in the overall evaluation process. Selected examples of this embedded evaluation process will be given during the panel presentation.
Games as Embedded Assessments
Karen Peterman,  Goodman Research Group Inc,  peterman@grginc.com
Deborah Muscella,  Girls Get Connected Collaborative,  dbm@muscella.com
Since 2005, Goodman Research Group, Inc. (GRG) and the Girls Get Connected Collaborative (GGCC) have worked together to create innovative, authentic assessments to evaluate students' technology skills and science knowledge related to a program called Technology at the Crossroads. As a summer and after-school program, GGCC wanted to create assessment tools that engaged students and were learning activities in themselves. The assessment model used for this evaluation involves creating games and activities that require students to use the skills and knowledge they have gained from the program to compete in a series of field day competitions. These assessments have been used to capture both output and outcomes data. GRG and GGCC are now creating new assessments that can be linked with one another longitudinally. We will share examples of the games created for this evaluation as well as the lessons we have learned about using this methodology.