Evaluation 2011



Session Title: Evaluating Professional Development and Program Improvement by Using Mixed Methods
Multipaper Session 985 to be held in Redondo on Saturday, Nov 5, 2:20 PM to 3:50 PM
Sponsored by the Mixed Methods Evaluation TIG
Chair(s):
Chad Green,  Loudoun County Public Schools, chad.green@loudoun.k12.va.us
The Best Laid Plans... Often Go Astray: Conducting a Mixed-methods Evaluation of a Changing Project
Presenter(s):
Kimberly Cowley, Edvantia, kim.cowley@edvantia.org
Kimberly Good, Edvantia, kimberly.good@edvantia.org
Nicole Finch, Edvantia, nicole.finch@edvantia.org
Abstract: The Appalachia Regional Comprehensive Center at Edvantia began facilitating a Teacher/Leader Effectiveness Community of Practice in 2010 for its state education agency clients across the five-state region. A mixed-method evaluation plan was developed to determine outcomes and the value that regional stakeholders placed on their participation. Data collection methods were to include meeting observations; a survey to establish members' expectations for impact and their pre/post level of confidence in achieving that impact; informal interviews conducted during the meetings; and semi-structured interviews with staff and members. However, midway through the project year, clients' needs and interests changed, necessitating a programmatic shift away from regional-level work toward state-specific efforts. This presentation highlights how even the best-laid evaluation plans can change simply as a result of programmatic adjustments, how we realigned our evaluation design to the extent possible in response to those changes, and lessons learned along the way.
Evaluation of the Colorado Clinical and Translational Science Institute's (CCTSI) Leadership in Innovative Team Science (LITeS) Program: A Mixed-Methods Approach
Presenter(s):
Marc Brodersen, University of Colorado, Denver, marc.brodersen@ucdenver.edu
Anne Libby, University of Colorado, Denver, anne.libby@ucdenver.edu
Abstract: This paper details the mixed-methods approach used to evaluate the CCTSI's Leadership in Innovative Team Science (LITeS) program. The program was designed to provide leadership, teamwork, and mentoring training to principal investigators and program directors of federally funded T32 and K12 training programs, as well as relevant deans of the university. The training included eight full-day workshops scheduled in two-day blocks that spanned the academic calendar from September to May. The evaluation consisted of surveys administered to all participants at the end of each training block, as well as a one-year follow-up. These surveys were designed to assess knowledge gained and utilized in reference to domains relevant to NIH's Roadmap to Translational Research. Structured interviews were then conducted with participating deans to determine the effectiveness of the program in developing skills in these domains, and the value of the program in training leaders in medical research at the university.
Assessing Post-training Behaviors of Suicide Prevention Training Attendees Using Mixed Methods Over Time
Presenter(s):
Brandee B Hicks, ICF Macro, bbrewer@icfi.com
Adrienne G Pica, ICF Macro, gpica@icfi.com
Christine M Walrath, ICF Macro, cwalrath@icfi.com
Richard McKeon, Substance Abuse and Mental Health Services Administration, richard.mckeon@samhsa.hhs.gov
Abstract: Since 2004, the Garrett Lee Smith (GLS) Suicide Prevention Initiative has supported prevention activities in 65 different States, Tribes, and Territories and on 78 different college campuses across the United States. The implementation of training activities has been a key component of the GLS initiative since its inception, with over 350,000 individuals trained. As part of this SAMHSA-funded initiative, a multi-method evaluation was developed to learn about prevention strategies used across grantees. This paper will describe three methods that were developed to assess trainee outcomes as the evaluation has evolved, and the resources used to inform this process. The measures include: 1) a post-test specific to the training type to assess knowledge, self-efficacy, and intention to use the training; 2) a qualitative follow-up interview to learn about the trainee experience and the utilization of knowledge and skills gained; and 3) a quantitative follow-up survey that examines trainees' retention and utilization of training material.
Performance Evaluation on National Policies and Strategies for Child Development
Presenter(s):
Ujsara Prasertsin, Chulalongkorn University, ubib_p@hotmail.com
Chirdsak Khovasint, Srinakharinwirot University, chird91@gmail.com
Somjet Viyakarn, Silpakorn University, 
Abstract: The purpose of this research was to evaluate the effectiveness, efficiency, and results of the management and performance of national policies and strategies for child development using a mixed methods evaluation. The evaluation results showed that: context: provincial officers ran the operation as general work, not as a mission emerging from the strategies; input: the plan had standardized target groups, and its budget was adequately allocated; process: the plans were appropriate and effective, and monitoring and quality assessment were carried out; effectiveness: the percentage of target-group children did not decline as planned; impact: plan officers concentrated on target-group children's behavior rather than on skill or proficiency improvement; sustainability: society needed the planned operations and offered help and support for cooperation; transportability: the central national strategies and plans could not be operated effectively, so each province was expected to run its own plans and strategies with support from the center.

