Session Title: Applications of Systems Thinking to Educational Evaluation
Multipaper Session 537 to be held in International Ballroom C on Friday, November 9, 10:20 AM to 11:05 AM
Sponsored by the Systems in Evaluation TIG
Chair(s):
Janice Noga, Pathfinder Evaluation and Consulting, jan.noga@stanfordalumni.org
Schooling as a Complex System: Appropriate Frameworks for Educational Evaluation
Presenter(s):
Tamara Walser, University of North Carolina, Wilmington, walsert@uncw.edu
Abstract: The purpose of this presentation is to propose appropriate frameworks for educational evaluation based on current research in complexity science and its application in education and the social sciences. Complex systems are, by definition, holistic, non-linear, unpredictable, emergent, adaptive, and changing. The brain and weather are examples of complex systems, as are social phenomena such as learning, teaching, and schooling. Although most would agree that schooling is a complex phenomenon, many of the frameworks used for instruction, student assessment, and educational evaluation do not account for this complexity; they are based on linear, cause-and-effect notions of schooling. Given the need for rigorous evaluations of educational programs and the increasing complexity of schooling, it is important that evaluators use appropriate frameworks so that results are valid and meaningful.
What Else is Happening With Squishy and Marvin: Combining Program Logic, Appreciative Inquiry, and Complex Adaptive Systems Frameworks in Evaluating a K-12 Science Education Project
Presenter(s):
Lois-ellin Datta, Datta Analysis, datta@ilhawaii.net
Abstract: Squishy and Marvin are two squid, enthusiastically dissected by fifth graders as part of a National Science Foundation project bringing together classroom teachers and graduate science students. The evaluation framework combines (1) program logic in assessing implementation, (2) appreciative inquiry in seeing what is happening in the classrooms, and (3) complex adaptive systems (CAS) to understand, through a contribution analysis, what may emerge from PRISM and how other concurrent events may affect it. The paper focuses on the methodological nuts-and-bolts of applying CAS, from explaining CAS to stakeholders, to training the CAS evaluator, to data analysis. It is a promises-and-pitfalls case example of learning how to understand context and consequences through the framework of CAS.