|
Applying Item Response Theory in the Evaluation of a Clinical Program
|
| Presenter(s):
|
| Mukaria Itang'ata,
Western Michigan University,
mukaria.itangata@wmich.edu
|
| Abstract:
This presentation will demonstrate the application of item response theory (IRT) in a clinical program evaluation study to assess students' abilities as they trained. IRT was necessary in the evaluation of the program to establish entry-level baselines and to develop items that test students' proficiencies (abilities) over time.
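For readers unfamiliar with the mechanics, the sketch below shows how ability estimates of the kind described above can be computed under a two-parameter logistic (2PL) IRT model. The item parameters, response vector, and function names are illustrative assumptions, not the presenter's data or code.

```python
# A minimal 2PL IRT sketch, assuming hypothetical item parameters and responses.
import numpy as np
from scipy.optimize import minimize_scalar

def p_correct(theta, a, b):
    """2PL probability of a correct response: discrimination a, difficulty b."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def estimate_ability(responses, a, b):
    """Maximum-likelihood estimate of ability (theta) for one examinee,
    given calibrated item parameters and a 0/1 response vector."""
    def neg_log_lik(theta):
        p = p_correct(theta, a, b)
        return -np.sum(responses * np.log(p) + (1 - responses) * np.log(1 - p))
    return minimize_scalar(neg_log_lik, bounds=(-4, 4), method="bounded").x

# Hypothetical calibration: five items of increasing difficulty,
# e.g. entry-level baseline items administered at the start of training.
a = np.array([1.2, 0.8, 1.0, 1.5, 1.1])    # discrimination
b = np.array([-1.5, -0.5, 0.0, 0.8, 1.6])  # difficulty
responses = np.array([1, 1, 1, 0, 0])      # one student's scored responses
print(round(estimate_ability(responses, a, b), 2))
```

Re-estimating theta for the same students at later time points is what allows proficiency to be tracked on a common scale over the course of training.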
|
|
A Mixed Method Approach to Evaluating Civic Learning Outcomes
|
| Presenter(s):
|
| Lisa O'Leary,
Tufts University,
lisa.o_leary@tufts.edu
|
| Abstract:
Recently, higher education has witnessed a renewed commitment to the mission of preparing students for lives of active citizenship (Boyer 1996; Checkoway 2001; Harkavy 2006). To this end, universities have begun to infuse programs focused on active citizenship into their campuses for students, faculty, and staff. This paper highlights a multi-faceted evaluation developed to gauge one university's success at cultivating “active citizens”, one of its core missions. The process of operationally defining civic engagement, developing and validating multiple measurement instruments through exploratory factor analysis, confirmatory factor analysis, and item response theory, and implementing a mixed-method outcomes evaluation is described. Specifically, the paper describes the methodology, instrumentation, and metrics of a multi-cohort, time-series evaluation that investigates undergraduates' participation in and attitudes toward civic engagement, simultaneously evaluating a specific co-curricular program while capturing formative data about students' development with regard to civic engagement.
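For illustration, the sketch below shows the exploratory-factor-analysis step of an instrument-validation workflow like the one described above, applied to simulated survey responses. The two hypothesized factors, item loadings, and sample size are assumptions made for the example, not the study's instruments or data.

```python
# A minimal EFA sketch on simulated civic-engagement survey items (hypothetical).
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
n_students = 300

# Two assumed latent dimensions (participation, attitudes), each driving
# three Likert-style items plus noise.
participation = rng.normal(size=n_students)
attitudes = rng.normal(size=n_students)
items = np.column_stack([
    0.8 * participation + rng.normal(scale=0.5, size=n_students),
    0.7 * participation + rng.normal(scale=0.5, size=n_students),
    0.9 * participation + rng.normal(scale=0.5, size=n_students),
    0.8 * attitudes + rng.normal(scale=0.5, size=n_students),
    0.7 * attitudes + rng.normal(scale=0.5, size=n_students),
    0.9 * attitudes + rng.normal(scale=0.5, size=n_students),
])

# Extract two factors; each item should load on its intended factor if the
# instrument measures distinct constructs.
fa = FactorAnalysis(n_components=2, rotation="varimax").fit(items)
print(np.round(fa.components_.T, 2))  # rows = items, columns = factors
```

In a validation workflow like the one the abstract describes, a factor structure recovered this way would then be tested with confirmatory factor analysis on an independent sample and refined with IRT item statistics.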
|
|
Maximizing Evaluation Impact by Maximizing Methods: Social Network Analysis Combined With Traditional Methods for Measuring Collaboration
|
| Presenter(s):
|
| Carl Hanssen,
Hanssen Consulting LLC,
carlh@hanssenconsulting.com
|
| Maryann Durland,
Durland Consulting,
mdurland@durlandconsulting.com
|
| Abstract:
This paper will review the Year 3 findings and preliminary Year 4 findings from the evaluations of the Milwaukee Mathematics Partnership (MMP), a comprehensive collaborative project designed to increase student achievement in mathematics. The logic from program design to outcomes (courses, relationships, math content, and teaching strategies leading to increased student achievement in mathematics) is complex. However, one clear factor critical for project success is collaboration. The evaluation designs incorporated social network analysis (SNA) as an evaluation model for exploring collaboration. This review illustrates SNA measures and how they were linked to other metrics to provide a more comprehensive evaluation of the effectiveness of the implementation strategies. In addition, the review will describe the development of an MMP leadership map based on the SNA results and how the leadership criteria guided and supported existing practices and changes to project implementation strategies and indicators of success.
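For illustration, the sketch below computes the kinds of SNA measures referred to above (network density and actor centrality) on a small hypothetical collaboration network; the tie list and role names are invented for the example and are not the MMP data.

```python
# A minimal SNA sketch on a hypothetical collaboration network.
import networkx as nx

# Invented who-collaborates-with-whom ties among project roles.
ties = [
    ("teacher_leader_1", "math_specialist"),
    ("teacher_leader_2", "math_specialist"),
    ("math_specialist", "district_coordinator"),
    ("district_coordinator", "university_partner"),
    ("teacher_leader_1", "teacher_leader_2"),
]
G = nx.Graph(ties)

# Whole-network cohesion and actor-level centrality: two common SNA measures
# that can be linked to other evaluation metrics (e.g., implementation ratings).
print("density:", round(nx.density(G), 2))
print("degree centrality:",
      {n: round(c, 2) for n, c in nx.degree_centrality(G).items()})
print("betweenness centrality:",
      {n: round(c, 2) for n, c in nx.betweenness_centrality(G).items()})
```

Actor-level centrality scores computed this way are the kind of result that can feed a leadership map, while density and other whole-network measures can be tracked year over year alongside implementation indicators.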
|
|
A Realist Synthesis Approach to Evaluating Complex Health Interventions
|
| Presenter(s):
|
| Sanjeev Sridharan,
University of Edinburgh,
sanjeev.sridharan@ed.ac.uk
|
| Abstract:
Using an ongoing example of an evaluation of a National Demonstration project, this class will discuss an iterative process by which findings from a single evaluation are set in the context of multiple evaluations. The process is informed by a realist synthesis framework but is focused on the evaluation of a single complex health intervention. Key features of the presentation include discussion of: (a) the role of evidence, innovations, and “muddling through” in learning from evaluations; (b) an iterative series of stakeholder dialogues to develop programme theory; (c) the use of data to test and explicate hypothesized links in a programme theory; and (d) the multiple types of learning that are possible from an evaluation, including organizational learning, process learning, understanding the risk landscapes of individuals, and individual-level impacts.
|