| Session Title: You + Graphic Design = Fame, Glory |
| Demonstration Session 851 to be held in California A on Saturday, Nov 5, 9:50 AM to 11:20 AM |
| Sponsored by the Data Visualization and Reporting TIG |
| Presenter(s): |
| Stephanie Evergreen, Evergreen Evaluation, stephanie@evergreenevaluation.com |
| Abstract: "Death by Powerpoint" won't literally kill your audience. But it will cause them to check their phone messages, flip ahead in the handout, and fall asleep. In this demonstration, attendees will learn the science behind good slideshows and will leave with direct, pointed changes that can be administered to their own evaluation presentations. The demonstration will focus on evidence-based principles of slideshow design that support legibility, comprehension, and retention of our evaluation work in the minds of our clients. Grounded in visual processing theory, the principles will enhance attendees' ability to communication more effectively with peers, colleagues, and clients through a focus on the proper use of color, placement, and type in slideshow presentations. |
| Session Title: A Conversation With Michael Patton About How Values Undergird the Assessment of Program Effects |
| Panel Session 852 to be held in California B on Saturday, Nov 5, 9:50 AM to 11:20 AM |
| Sponsored by the Qualitative Methods TIG |
| Chair(s): |
| Charles Reichardt, University of Denver, creichar@du.edu |
| Abstract: The paradigmatic views and methodological values of a leading proponent of qualitative methods will be probed and challenged in a collegial conversation about assessing program effects. Although the questions that will be asked are meant to confront and prod, the purpose of the conversation is to understand rather than debate - to talk with, rather than past, each other. If qualitative and quantitative researchers are to resolve their longstanding animosities, they must come to understand each other's perspectives and values. The purpose of this panel is to provide a forum for a qualitative researcher to answer the pointed but respectful questions that quantitative researchers need to have answered if they are to understand and appreciate qualitative methods. In return, the answers will challenge and probe quantitative assumptions and perspectives. Ample time will be left for comments and questions from the audience. |
| |||
| |||
|
| Session Title: Starting and Succeeding as an Independent Consultant |
| Panel Session 853 to be held in California C on Saturday, Nov 5, 9:50 AM to 11:20 AM |
| Sponsored by the Independent Consulting TIG |
| Chair(s): |
| Jennifer Williams, Out of the Crossfire Inc, jenniferwilliams.722@gmail.com |
| Abstract: Independent Consultants will share their professional insights on starting and maintaining an Independent Evaluation Consulting business. Panelists will describe ways of building and maintaining client relationships and share their expertise related to initial business set-up and lessons they have learned. Discussions will include the pros and cons of having an independent consulting business, the various types of business structures, methods of contracting and fee setting, as well as the personal decisions that impact having your own business. Panelists will examine some consequences of evaluation in the context of conducting independent consulting in diverse settings. The session will include ample time for audience members to pose specific questions to the panelists. |
| |||
| |||
| |||
| |||
|
| Session Title: Diversity Dialogue: Strategies & Stories From the Evaluation Road |
| Think Tank Session 855 to be held in Pacific A on Saturday, Nov 5, 9:50 AM to 11:20 AM |
| Sponsored by the Presidential Strand |
| Presenter(s): |
| Fiona Cram, Katoa Ltd, fionac@katoa.net.nz |
| Discussant(s): |
| Kari Greene, Oregon Public Health Division, kari.greene@state.or.us |
| Maurice Samuels, University of Chicago, mcsamuels@uchicago.edu |
| Nicole Bowman, Bowman Performance Consulting, nicky@bpcwi.com |
| Sharon Brisolara, Evaluation Solutions, sharon@evaluationsolutions.net |
| Fiona Cram, Katoa Ltd, fionac@katoa.net.nz |
| Abstract: The diverse values of varied stakeholders in evaluation contexts constitute a vital domain for evaluation. Stakeholder values help shape the direction, substance, criteria, and intended uses of the evaluation. An inevitable challenge is to respond to the diversity and plurality of legitimate stakeholder values and interests. And too often, the values and interests of stakeholders who are least well served are quieted by other more powerful voices. In response, democratic evaluation inclusively seeks to provide spaces for all stakeholders, including those least well served, to have a 'say' in the shape of the evaluation through processes of dialogue and deliberation (House & Howe, 1999). This interactive session will highlight different approaches used to engage the challenge of generating meaningful avenues for inclusion of less powerful stakeholders' values, voice, and interests in evaluation. Attendees will receive practical tools and information through presentation, interactive breakout groups, and reflections from a discussant. |
| Session Title: Reliability: The Beginning of Value |
| Multipaper Session 857 to be held in Pacific C on Saturday, Nov 5, 9:50 AM to 11:20 AM |
| Sponsored by the Quantitative Methods: Theory and Design TIG |
| Chair(s): |
| Dale Berger, Claremont Graduate University, dale.berger@cgu.edu |
| Session Title: Changing Our Tune: Reinventing Evaluation While Your Organization Transforms |
| Panel Session 858 to be held in Pacific D on Saturday, Nov 5, 9:50 AM to 11:20 AM |
| Sponsored by the Non-profit and Foundations Evaluation TIG |
| Chair(s): |
| Lester Baxter, The Pew Charitable Trusts, lbaxter@pewtrusts.org |
| Abstract: Every organization with an internal evaluation function struggles with questions of that unit's mission, responsibilities, and relationship to the larger organization. This panel will examine what happened to an evaluation department when the private foundation within which it was well-established became a public charity. This change in legal status led to a dramatic transformation of the organization, including the shift to directly operating the majority of its policy reform projects and an expanded infrastructure. Panelists will discuss how the organization's rationale for evaluation (and its complementary roles in planning and knowledge sharing) evolved in response to internal changes. Discussion will focus on the unit's re-envisioning of its role, describing which efforts failed, which are succeeding, and which are still in flux. The panel will be of interest to evaluators working in complex or changing institutional environments, and those interested in the role of evaluation and planning in policy change efforts. |
| |||||
| |||||
|
| Roundtable Rotation I: Evaluation Challenges of Built Environment Policy-System-Environment (PSE) Changes |
| Roundtable Presentation 859 to be held in Conference Room 1 on Saturday, Nov 5, 9:50 AM to 11:20 AM |
| Sponsored by the Collaborative, Participatory & Empowerment Evaluation TIG |
| Presenter(s): |
| Laurie Ringaert, Seattle King County Public Health, laurie.ringaert@kingcounty.gov |
| Jim Krieger, Seattle King County Public Health, james.krieger@kingcounty.gov |
| Nadine Chan, Seattle King County Public Health, Nadine.chan@kingcounty.gov |
| Kadie Bell, Seattle King County Public Health, Kadie.bell@kingcounty.gov |
| Ryan Kellogg, Seattle King County Public Health, Ryan.Kellogg@kingcounty.gov |
| Abstract: Public Health - Seattle & King County was awarded two highly competitive federal stimulus grants to address the leading causes of death in our region as part of the CDC Communities Putting Prevention to Work (CPPW) initiative. This presentation focuses on the evaluation of the seven local government grantees, which involved a participatory, developmental approach and a focus on creating policy, systems, and environment (PSE) changes that would produce healthier built/food environments. PSE evaluation methodology and the introduction of health concepts into planning are both relatively new. As a result, creative evaluation plans, processes, and specific tools were developed to capture baseline conditions and changes over time. Team role challenges will be discussed. The evaluation takes into account what real-world changes are possible in an atmosphere of economic downturn and multiple stakeholder interests in policy development. The presenter will discuss the outcomes, challenges, and lessons learned from this evaluation approach. |
| Roundtable Rotation II: Examining the Impact of a Community Partnership to Increase Capacity to Train more Nurses and Provide Better Care for Older Adults |
| Roundtable Presentation 859 to be held in Conference Room 1 on Saturday, Nov 5, 9:50 AM to 11:20 AM |
| Sponsored by the Collaborative, Participatory & Empowerment Evaluation TIG |
| Presenter(s): |
| Paula Rowland, Independent consultant, paula@global-concerns.com |
| Afsaneh Rahimian, Independent Consultant, rahimianafsaneh@yahoo.com |
| Abstract: We used a mixed-method approach to evaluate the impact of a diverse community collaboration implementing multi-layered strategies to address the shortage of nurses specializing in elder care. Pacific Lutheran University's School of Nursing worked with the partners to refocus the nursing school curriculum, hire faculty, award scholarships, expand in-patient clinical placements, and create innovative community-based clinical opportunities to increase students' exposure to gerontology and influence their career choice. We measured the strength of the collaboration through a focus group with the community partners, phone interviews with nursing school faculty and community stakeholders, a focus group with the scholarship recipients, and an online survey of all PLU nursing students enrolled in the study year. Findings suggest that the strength of this collaboration is at the very core of the project's successful achievement of its short- and mid-term outcomes and that the partnership is on its way to becoming self-sustaining. |
| Roundtable Rotation I: Improving Evaluation Practice With Youth: A Checklist for Developmentally Sensitive Program Evaluation |
| Roundtable Presentation 860 to be held in Conference Room 12 on Saturday, Nov 5, 9:50 AM to 11:20 AM |
| Sponsored by the Non-profit and Foundations Evaluation TIG |
| Presenter(s): |
| Silvana Bialosiewicz, Claremont Graduate University, silvana@cgu.edu |
| Miriam Jacobson, Claremont Graduate University, jacobson.miriam@gmail.com |
| Tiffany Berry, Claremont Graduate University, tiffany.berry@cgu.edu |
| Abstract: The evaluation of programs that serve youth can be complex given the multifaceted nature of child and adolescent development. These evaluations require a developmentally sensitive approach, which includes thoughtful consideration of the characteristics this unique population brings to the evaluation. In this Round Table we will describe a developmental sensitivity checklist framed within the Centers for Disease Control's six-phase framework. Our goal is to document the pertinent considerations for typical as well as atypical youth across each stage of the evaluation process. This checklist was developed through an extensive literature review of best practices in applied research with youth and was validated by an expert panel of developmental psychologists and veteran youth-program evaluators. In this Round Table we will introduce our tool, receive feedback to refine the tool, as well as engage evaluators in a discussion about how we can continually improve the quality of program evaluations targeting youth. |
| Roundtable Rotation II: Assessing the Systems of Supports and Opportunities for Youth in Six Detroit Neighborhoods as a Building Block for Development |
| Roundtable Presentation 860 to be held in Conference Room 12 on Saturday, Nov 5, 9:50 AM to 11:20 AM |
| Sponsored by the Non-profit and Foundations Evaluation TIG |
| Presenter(s): |
| Della M Hughes, Brandeis University, dhughes@brandeis.edu |
| Brian Dates, Southwest Counseling Solutions, bdates@swsol.org |
| Sara Plachta Elliott, The Skillman Foundation, selliott@skillman.org |
| Abstract: What does it take at a neighborhood level to ensure young people can be safe, healthy, well educated, and prepared for adulthood? The Skillman Foundation in Detroit, Michigan is investing $100 M over ten years in six neighborhoods to (among other neighborhood capacities) create systems of supports and opportunities (SOSO) with an array of youth development, volunteer, college and career exposure and access, and youth employment preparation and placement programs. Brandeis University assessed the SOSOs to provide data for planning and decision making. Southwest Counseling Solutions is a Skillman grantee charged with SOSO development and management in two of the six neighborhoods. Participants will address how the assessment took place and what kind of results it produced, the practical applications of having a database for planning and system development, how the Foundation and a community-based organization are managing the data going forward, and whether the data really makes a difference. |
| Session Title: Organizational Learning and Approaches in Calling for, Conducting, and Using Evaluations |
| Multipaper Session 861 to be held in Conference Room 13 on Saturday, Nov 5, 9:50 AM to 11:20 AM |
| Sponsored by the International and Cross-cultural Evaluation TIG |
| Chair(s): |
| Jim Rugh, RealWorld Evaluation, jimrugh@mindspring.com |
| Session Title: Engaging Youth in School-based Youth Participatory Evaluation |
| Panel Session 862 to be held in Conference Room 14 on Saturday, Nov 5, 9:50 AM to 11:20 AM |
| Sponsored by the Collaborative, Participatory & Empowerment Evaluation TIG |
| Chair(s): |
| Robert Shumer, University of Minnesota, rshumer@umn.edu |
| Abstract: A lot has occurred since the early beginnings of youth-led evaluation. Ever since Kim Sabo asked Brad Cousins why youth weren't involved as evaluators in school participatory evaluation models, a field has been growing. From Massachusetts to Michigan, from Minnesota to California, youth-led evaluation has gone into high gear. Many programs are operating and expanding. However, most youth participatory evaluations are being developed through after-school programs and initiatives. Few courses exist in K-12 schools that teach youth how to be civically engaged citizens and conduct youth participatory evaluations. In this session we learn about a few programs that engage students in evaluation and then see and hear about two California programs that engage students through actual in-school efforts to evaluate service-learning and character education programs. |
| |||
| |||
|
| Session Title: When Monetary Quantification Is Not Sufficient: Other Factors That Are Useful for Determining Program Success |
| Multipaper Session 863 to be held in Avila A on Saturday, Nov 5, 9:50 AM to 11:20 AM |
| Sponsored by the Costs, Effectiveness, Benefits, and Economics TIG |
| Chair(s): |
| Nadini Persaud, University of the West Indies, npersaud07@yahoo.com |
| Session Title: Isn't Just Talk Talk Talk: How Systems Approaches Can Work in the Real World of Boundaries and Power |
| Think Tank Session 864 to be held in Avila B on Saturday, Nov 5, 9:50 AM to 11:20 AM |
| Sponsored by the Systems in Evaluation TIG |
| Presenter(s): |
| Mary McEathron, University of Minnesota, mceat001@umn.edu |
| Erin Watson, Michigan State University, droegeer@msu.edu |
| Discussant(s): |
| Martin Reynolds, The Open University, m.d.reynolds@open.ac.uk |
| Vidhya Shanker, University of Minnesota, shan0133@umn.edu |
| Abstract: Evaluators have long known, discussed, and written about the power imbalances evident in decisions made about how programs are funded, implemented, and evaluated. While a number of systems-thinking evaluation approaches identify multiple stakeholder perspectives and define program boundaries, two approaches - Soft Systems Methodology and Critical Systems Heuristics - can be used to address issues of power imbalances in human systems. In this interactive session, presenters share practice-based vignettes of these approaches to introduce questions of power relations and boundary judgments. Participants engage in small group discussions to explore: (1) What issues of power and boundaries are addressed using these approaches? (2) Who and/or what needs to be present to move awareness into skillful action? (3) How can these approaches address historical imbalances of power due to race, class, gender, and culture? When the full group reconvenes, discussants Martin Reynolds and Vidhya Shanker reflect on group responses and with participants create recommendations for practice. |
| Roundtable Rotation I: Evaluating K-12 Professional Development: English as a Second Language (ESL) Coaching and Inclusion |
| Roundtable Presentation 865 to be held in Balboa A on Saturday, Nov 5, 9:50 AM to 11:20 AM |
| Sponsored by the Pre-K - 12 Educational Evaluation TIG |
| Presenter(s): |
| Michelle Bakerson, Indiana University South Bend, mmbakerson@yahoo.com |
| Abstract: Evaluators are often contracted by school districts or organizations receiving grants to develop and facilitate programs to benefit the school or organization. In one such school district in Northern Indiana, K-12 teachers received professional development in coaching English as a Second Language (ESL). The evaluation was conducted to determine teacher attitudes and perceptions toward full ESL inclusion, in which ESL teachers work as coaches in conjunction with classroom teachers to provide both indirect and direct services in an inclusive setting, as proposed by the school district. The evaluation was designed to be a learning tool for facilitating the improvement of the professional development provided at this school. Accordingly, a collaborative evaluation approach was utilized to actively engage the school and the teachers during the whole process. The steps, advantages, and obstacles of this evaluation will be discussed. |
| Roundtable Rotation II: Improving Educational Leadership Through the Development of Professional Learning Communities |
| Roundtable Presentation 865 to be held in Balboa A on Saturday, Nov 5, 9:50 AM to 11:20 AM |
| Sponsored by the Pre-K - 12 Educational Evaluation TIG |
| Presenter(s): |
| Annie Woo, Oregon Department of Education, dranniewoo@gmail.com |
| Abstract: This evaluation study provided administrators of Professional Learning Communities (PLC) within eight school districts (32 individual schools) across four states with formative and summative data to assist them in making program changes as they addressed the challenges of high school reform. We focused our efforts on identifying the factors that contribute to the success of schools in achieving Adequate Yearly Progress through the implementation of PLC, with the aim of providing teachers with a more supportive teaching environment. Achievement data review, surveys, and interviews were conducted to measure the effectiveness of PLC in: a) developing effective professional learning communities in schools; b) providing teachers with both an intellectually challenging and emotionally supportive professional environment; and c) increasing student achievement. Findings pertain to PLC implementation, improvisation, challenges to evaluation, and lessons learned with relevance to high school reform. |
| Session Title: Examining and Understanding the Power and Impact of the Robert Wood Johnson Foundation's Community Health Leaders Programs |
| Panel Session 866 to be held in Balboa C on Saturday, Nov 5, 9:50 AM to 11:20 AM |
| Sponsored by the Non-profit and Foundations Evaluation TIG |
| Chair(s): |
| Laura Leviton, The Robert Wood Johnson Foundation, llevito@rwjf.org |
| Discussant(s): |
| Claire Reinelt, Leadership Learning Community, claire@leadershiplearning.org |
| Abstract: This session presents the evaluation findings of three interrelated community health leadership programs supported by the Robert Wood Johnson Foundation (RWJF). The first two presentations will focus on the RWJF Community Health Leaders (CHL) Award Program, which has recognized hundreds of innovative leaders who have made extraordinary contributions to increasing access to quality health care and improving health outcomes at the community level. The third presentation focuses on findings from the evaluation of the Ladder to Leadership: Developing the Next Generation of Community Health Leaders, a collaborative initiative of RWJF and the Center for Creative Leadership. The initiative focuses on developing critical leadership competencies of early- to mid-career professionals. Key questions to be addressed are: (1) What factors are most influential in determining leaders' pathways? (2) What are the most significant commonalities among the leaders? (3) What are the key challenges and lessons learned and their implications? |
| ||||
| ||||
|
| Session Title: Theory and Practice: Putting It All Together |
| Multipaper Session 867 to be held in Capistrano A on Saturday, Nov 5, 9:50 AM to 11:20 AM |
| Sponsored by the Organizational Learning and Evaluation Capacity Building TIG |
| Chair(s): |
| Erika Fulmer, Centers for Disease Control and Prevention, efulmer@cdc.gov |
| Session Title: Demonstrating the Use of Evaluation to the Social Work Student, Teacher, and Practitioner |
| Multipaper Session 868 to be held in Capistrano B on Saturday, Nov 5, 9:50 AM to 11:20 AM |
| Sponsored by the Social Work TIG |
| Chair(s): |
| Sarita Davis, Georgia State University, saritadavis@gsu.edu |
| Session Title: Design, Cost, and Initial Program Model Findings From the Substance Abuse and Mental Health Services Administration (SAMHSA) Grants for the Benefit of Homeless Individuals National Cross-Site Evaluation |
| Panel Session 869 to be held in Carmel on Saturday, Nov 5, 9:50 AM to 11:20 AM |
| Sponsored by the Alcohol, Drug Abuse, and Mental Health TIG |
| Chair(s): |
| Nahama Broner, RTI International, nbroner@rti.org |
| Abstract: SAMHSA's Grants for the Benefit of Homeless Individuals (GBHI) program funds grantees to provide services that expand and strengthen treatment for those with substance use and/or co-occurring mental disorders and to link treatment services with housing, with the goals of abstinence, housing stability, and decreased homelessness. In operation since 2001, this U.S. federal program has not previously been evaluated. This panel presents the evaluation framework (a socioecological model), the design, and the data collection and analysis methodology for each of the study's components (structure, process, outcome, and cost) developed to address the evaluation questions. We also present initial findings from the 25-site impact study regarding service models, intervention costs, and the leveraging of resources to develop integrative strategies for the provision of evidence-based treatment, wrap-around services, and housing. Implications for multi-site evaluation design of heterogeneous programs will be discussed in the context of assessing the effectiveness of interventions on abstinence, homelessness, and housing. |
| |||||||
| |||||||
| |||||||
|
| Session Title: Add Usability Testing to Your Evaluation Toolbox |
| Demonstration Session 870 to be held in Coronado on Saturday, Nov 5, 9:50 AM to 11:20 AM |
| Sponsored by the Distance Ed. & Other Educational Technologies TIG |
| Presenter(s): |
| Christine Paulsen, Concord Evaluation Group, cpaulsen@concordevaluation.com |
| Abstract: Many of the programs and initiatives that we evaluate today are technology-based. It is not uncommon for initiatives to provide information to their target audiences via websites, while other interventions are delivered as software applications for mobile, handheld, or other devices. To properly evaluate such initiatives, the evaluator must consider the usability (user-friendliness and accessibility) of the technology components. This demonstration will provide participants with an overview of the most common usability method: the one-on-one usability session, including the think-aloud procedure. Participants will learn how to develop a usability testing script, how to recruit participants, how to run a usability session, and how to analyze the data. Video examples of actual testing sessions will be included. |
| Session Title: Evaluating Complex Collaborative Science Initiatives: Utilization of Logic Models in Four Clinical Translational Science Institutes |
| Panel Session 871 to be held in El Capitan A on Saturday, Nov 5, 9:50 AM to 11:20 AM |
| Sponsored by the Health Evaluation TIG |
| Chair(s): |
| Kevin Wooten, University of Houston, Clear Lake, wooten@uhcl.edu |
| Discussant(s): |
| Nick Smith, Syracuse University, lsmith@syr.edu |
| Abstract: This panel will address the use of logic models in environments that are highly complex, non-linear, process driven, and institutionally political. Drawing upon a representative sample of academic institutions receiving Clinical and Translational Science Awards (CTSA) from the National Institutes of Health, four presenters will address the role of evaluation in CTSA settings that engender controversy and inquiry. Value conflicts such as the need for scientific rigor versus emergent design, outcome versus process evaluation, and empirical methods versus action research are all difficult to address using a logic model approach. Our presentations will involve using logic models to: 1) simultaneously evaluate outcomes and development; 2) integrate multiple constituencies for collaboration and a common vision; 3) develop and evaluate multidisciplinary teams in accordance with effective team science processes; and 4) utilize stakeholders and constituencies in the real-time design of evaluation tools to address relevance and dynamic institutional cultures. |
| ||||||
| ||||||
| ||||||
|
| Session Title: A Seven Year External Evaluation of an International Aid Program in 25 Countries |
| Panel Session 872 to be held in El Capitan B on Saturday, Nov 5, 9:50 AM to 11:20 AM |
| Sponsored by the International and Cross-cultural Evaluation TIG |
| Chair(s): |
| Michael Scriven, Claremont Graduate University, mjscriv1@gmail.com |
| Abstract: Four of the key participants will describe their experiences, approaches, and lessons learned in the course of an external impact evaluation that went on for seven years and involved putting evaluation teams into villages and homes in 25 countries from all the non-polar continents. Some issues and achievements that may be of general interest include: selecting and supervising interpreters and local researchers; facilitating open communications; developing a complex model that was both program specific and readily adaptable to other programs; finding a way to establish causation beyond reasonable doubt without using control groups; retaining reasonable independence despite a long and amiable relationship with the client (including contracts that were always limited to one year at a time); and getting at values and attitude changes as well as the directly observable housing, nutrition, and economic changes. |
| |||
| |||
| |||
|
| Roundtable Rotation I: How to Train Evaluators to Interview Deaf and Hard-of-Hearing Participants: Strategies for an Effective Interview |
| Roundtable Presentation 873 to be held in Exec. Board Room on Saturday, Nov 5, 9:50 AM to 11:20 AM |
| Sponsored by the Lesbian, Gay, Bisexual, Transgender Issues TIG and the Teaching of Evaluation TIG |
| Presenter(s): |
| Jennifer Morrow, University of Tennessee, jamorrow@utk.edu |
| Ann Cisney-Booth, University of Tennessee, acisneybooth@utk.edu |
| Lisa Rimmell, University of Tennessee, lrimmell@utk.edu |
| Abstract: In this roundtable we will discuss our experiences interviewing deaf and hard-of-hearing participants in our evaluation projects. We will discuss the various ways that deaf and hard-of-hearing individuals communicate (e.g., American Sign Language, Signed English, Cued Speech, Auditory-Oral Method). We will review how evaluators can best prepare beforehand (i.e., interview protocol, room arrangements) to interview an individual who is deaf or hard-of-hearing. Lastly, we will spend most of the time leading a discussion with the audience members on strategies for conducting an effective interview with participants who are deaf or hard-of-hearing. |
| Roundtable Rotation II: Don't Forget Us! Standardizing Methods of Data Collection That are Inclusive of Transgender and Gender Non-Conforming Individuals |
| Roundtable Presentation 873 to be held in Exec. Board Room on Saturday, Nov 5, 9:50 AM to 11:20 AM |
| Sponsored by the Lesbian, Gay, Bisexual, Transgender Issues TIG and the Teaching of Evaluation TIG |
| Presenter(s): |
| Loretta Worthington, Rainbow Health Initiative, loretta.worthington@rainbowhealth.org |
| Rachel Fletcher, Rainbow Health Initiative, rachel.fletcher@rainbowhealth.org |
| Abstract: Historically, local and national health data collection and evaluation efforts have mostly excluded LGBTQ people. Consequently, there is currently no set of standardized questions to collect sexual minority and gender identity information. Gender non-conforming individuals are often lost in data collection efforts. Evaluators must develop appropriate, standardized methods of asking about gender identity that yield relevant data to better serve the needs of gender non-conforming populations in the health, social justice, and social policy fields. Rainbow Health Initiative will discuss health assessment data collection over a 3-year period, including the survey instrument design, complications, and final evaluation questions leading to broad data collection with regard to sexual minorities and gender identity representation. If these questions become standardized, it could significantly increase research data on gender non-conforming populations and provide the means for more research, programs, and services. |
| Session Title: The World Is Not Flat |
| Multipaper Session 874 to be held in Huntington A on Saturday, Nov 5, 9:50 AM to 11:20 AM |
| Sponsored by the Integrating Technology Into Evaluation TIG |
| Chair(s): |
| Paul Lorton Jr, University of San Francisco, lorton@usfca.edu |
| Discussant(s): |
| Matthew Galen, Claremont Graduate University, matthew.galen@cgu.edu |
| Session Title: The Role of Government Evaluation Policies and How It Affects Quality of Services |
| Think Tank Session 875 to be held in Huntington B on Saturday, Nov 5, 9:50 AM to 11:20 AM |
| Sponsored by the Government Evaluation TIG |
| Presenter(s): |
| Stanley Capela, HeartShare Human Services, stan.capela@heartshare.org |
| Discussant(s): |
| Kristin Kaylor Richardson, Western Michigan University, kkayrich@comcast.net |
| Sathi Dasgupta, SONA Consulting Inc, sathi@sonaconsulting.net |
| Gabriel M Della-Piana, Independent Consultant, dellapiana@aol.com |
| Connie Kubo Della-Piana, National Science Foundation, cdellapi@nsf.gov |
| Abstract: The purpose of this think tank is to raise several questions. We would begin by asking whether government policy at the federal, state, or local level affects the quality of government-sponsored services and service delivery systems or programs. If we agree that it does, the group would then explore several other questions. First, what is the role of government evaluation policies in affecting quality of services? Second, are there concrete examples where government policy on evaluation had a positive effect on ensuring quality of services? Third, can one conclude that one way government policy can ensure quality of services is to require programs to seek accreditation as a way to evaluate their services? |
| Session Title: A NASA Approach to Program Evaluation: Use of Social Science Methods to Engineer Education Projects in NASA Education's Portfolio |
| Multipaper Session 876 to be held in Huntington C on Saturday, Nov 5, 9:50 AM to 11:20 AM |
| Sponsored by the Government Evaluation TIG |
| Chair(s): |
| Brian Yoder, National Aeronautics and Space Administration, brian.yoder@nasa.gov |
| Abstract: The NASA Office of Education has collaborated with a team of external evaluators to develop a comprehensive plan for evaluating its portfolio of education programs and using the findings from evaluations for program improvement. This multipaper session examines how NASA Education is using evaluation to develop knowledge about its programs for the purpose of decision making, and provides examples drawn from two national project evaluations. The first paper outlines NASA's approach to evaluation and utilization of findings. The second and third papers describe the evaluations of two national projects within NASA's Elementary and Secondary Education Program. The fourth paper shares the perspective of the two national project managers. The panel will conclude with an audience discussion of the aptness of the described stakeholder involvement and evaluation utilization. |
| Overview: A NASA Approach to Program Evaluation |
| Brian Yoder, National Aeronautics and Space Administration, brian.yoder@nasa.gov |
| One of NASA's approaches to program evaluation aims at integrating program evaluation with project development so that education projects at NASA have a good chance of showing a measurable impact after a few years of refinement. This presentation provides an overview of some important considerations that informed this approach. These considerations include developing a program evaluation process that reflects NASA's engineering culture and emphasizes teamwork and innovation; adheres to NASA's project planning template, known as 7120.7; and aligns with current federal program evaluation guidance. This presentation will also highlight some less obvious intended goals of this evaluation approach, such as merging researcher knowledge and practitioner knowledge to better understand how program activities contribute to intended outcomes. |
| Evaluation of NASA Explorer Schools: The Formative Stage |
| Alina Martinez, Abt Associates Inc, alina_martinez@abtassoc.com |
| Sarah Sahni, Abt Associates Inc, Sarah_Sahni@abtassoc.com |
| Responding to recommendations from the National Research Council committee that reviewed NASA's elementary and secondary education projects, (1) NASA embarked on a redesign of the NASA Explorer Schools (NES) project in 2008. At each stage of the redesign, NASA has integrated evaluation activities and incorporated findings for program improvement. As part of the pilot activities (Spring 2010), NES gathered data from teachers and students to identify ways to improve the project's performance, and the NES project incorporated these lessons into the project for its September 2010 launch. The design of the formative evaluation has involved stakeholders and will lead to program modifications. The evaluation efforts ultimately will lead to an outcomes evaluation that investigates intended program outcomes, as laid out in the program logic model. (1) National Research Council. (2008). NASA's Elementary and Secondary Education Program: Review and Critique. Committee for the Review and Evaluation of NASA's Precollege Education Program, Helen R. Quinn, Heidi A. Schweingruber, and Michael A. Feder, Editors. Board on Science Education, Center for Education. Division of Behavioral and Social Sciences Education. Washington, D.C. The National Academies Press. |
| Evaluation of NASA's Summer of Innovation Project |
| Hilary Rhodes, Abt Associates Inc, hilary_rhodes@abtassoc.com |
| Kristen Neishi, Abt Associates Inc, kristen_neishi@abtassoc.com |
| In 2010, NASA's Office of Education launched Summer of Innovation, a NASA-infused summer experience for middle school students who are underperforming, underrepresented, and underserved in science, technology, engineering, and math (STEM) fields. Since its inception, process and outcomes evaluation has been an integral part of the program's development. By collecting planning and implementation data from awardees through interviews and reporting forms, and outcomes data from participating students and teachers through surveys, the evaluation has codified the lessons learned over the course of the pilot, producing actionable insight that has supported NASA's modification of the program for summer 2011. Formative evaluation efforts continue to support the program's implementation, to identify promising practices and models meriting more rigorous outcomes evaluation, to understand how the awardees meet NASA requirements and the feasibility of these expectations, and to generate lessons learned for future implementations of SoI and of NASA's education activities more broadly. |
| Evaluation Utilization from NASA Project Managers' Perspectives |
| Rob LaSalvia, National Aeronautics and Space Administration, robert.f.lasalvia@nasa.gov |
| Rick Gilmore, National Aeronautics and Space Administration, richard.l.gilmore@nasa.gov |
| The NASA Explorer Schools (NES) and NASA Summer of Innovation (SoI) programs have integrated evaluation into the design, development, and refinement activities of the program. National project managers will discuss how building evaluation at the ground level of the program has differed from work on previous projects, and how evaluation has informed their thinking about the national projects and the individual sites. They will also provide reflections on the process of working closely with evaluators beginning at the early stages of the projects and this evaluation, describing what has worked well as well as what has been challenging. |
| Session Title: Building a Community to Set a Direction for Science Technology Engineering and Mathematics (STEM) Evaluation: What Can We Learn From Each Other and What Will Best Support Collaboration? |
| Think Tank Session 877 to be held in La Jolla on Saturday, Nov 5, 9:50 AM to 11:20 AM |
| Sponsored by the Pre-K - 12 Educational Evaluation TIG |
| Presenter(s): |
| Jack Mills, Independent Consultant, jackmillsphd@aol.com |
| Discussant(s): |
| Veronica Smith, data2insight LLC, veronicasmith@data2insight.com |
| Kathleen Haynie, Haynie Research and Evaluation, kchaynie@stanfordalumni.org |
| Tom McKlin, The Findings Group LLC, tom@thefindingsgroup.com |
| Abstract: This session will foster a conversation among evaluators of programs aimed at increasing the quality of science, technology, engineering and math (STEM) education. STEM program aims include: increasing precollege student achievement, increasing scientific literacy among the public, and broadening participation of underrepresented groups in STEM careers. Four experienced STEM evaluators began an ongoing discussion following AEA-2010, hoping to elevate the theory and practice of STEM evaluation by sharing insights, concepts, and tools. The think tank will be organized around the following questions: 1. Should we form a group devoted solely to STEM evaluation? How shall we organize ourselves? 2. What barriers prevent freely sharing approaches, methods, instruments, reports, and broad findings among STEM evaluators? How can we remove these barriers? 3. How might the quality of our work improve by being a part of a STEM evaluators' community? 4. What are next steps for building this community? |
| Session Title: Evaluation Of Learning Processes As Dialogue Research |
| Multipaper Session 878 to be held in Laguna A on Saturday, Nov 5, 9:50 AM to 11:20 AM |
| Sponsored by the Research on Evaluation TIG |
| Chair(s): |
| Annette Rasmussen, Aalborg University, anra@learning.aau.dk |
| Abstract: Recent years have seen quantitative evaluation research achieve a dominant position on the agenda of both social scientists and decision makers, a result of the widespread interest in evidence-based knowledge. Consequently, qualitative dialogue research is being placed in a more marginal position when it comes to government-funded research projects. However, there is great scientific potential in dialogue research when it comes to conceptualizing meaning, social processes, and contexts of learning. This multipaper session addresses the issues, basic assumptions, and contextual background of such dialogue-oriented approaches in education research from different social science perspectives. The first contribution discusses organizational aspects of conducting qualitative dialogue research. The second and third contributions look at research projects based on dialogue research, with special regard to the methodological and ethical dilemmas that arise from this approach and must be handled by the researcher. |
| Dilemmas of Just-in-Time Evaluation Research |
| Palle Rasmussen, Aalborg University, palleras@learning.aau.dk |
| This paper draws on experiences from a 3-year combined development and research project in general adult education conducted in three Danish regions. The aim of the project was to link courses in general adult education more closely to workplaces and local communities. In the development part, experimental course designs were formulated and tried out, while in the research part, the conditions for and impact of the course designs were assessed through a number of case studies. The research design was open; the research tasks were decided in continuous negotiation between project managers and researchers. This ensured that research resources were directed at tasks relevant to the overriding development objectives, but it also revealed differences between developers and researchers in the perception of relevance and validity of knowledge. The paper will discuss risks and potentials in this kind of collaboration between developers and researchers. |
| Dilemmas of Sense-making Evaluation and Assessment Research |
| Nanna Friche, Aalborg University, nanna@learning.aau.dk |
| This paper draws on experiences from a PhD study of evaluation and assessment practices in a Danish community college. The aim of the study was to investigate how teachers and students make sense of non-everyday phenomena such as assessment and evaluation. Drawing inspiration from user-participation evaluation models (the BIKVA model), the study took an explorative approach, applying qualitative methods step by step: starting with observation studies, followed by focus group interviews, and completed by individual interviews. This approach revealed several methodological and ethical dilemmas of asking questions about sense-making processes. In addressing the questions "How do people make sense of assessment?" and "What is credible evidence in assessment according to people?", this paper will discuss dilemmas of sense-making (and dialogue-oriented) assessment and evaluation research. |
| Dialogue Research Focused on Educational Frameworks |
| Annette Rasmussen, Aalborg University, anra@learning.aau.dk |
| This paper will describe the social science basis and background for a dialogue research approach in education. It takes as its point of departure the evaluation requirements given in education development projects that are led by specific aims and policies. Evaluations following a dialogue approach are interested not only in summing up the outcomes of processes, but in understanding the learning processes during the initiative, including both the underlying interests of the participants and their understandings of the development initiative in question. The approach is thus characterized by its orientation towards processes as well as participants, towards both structure and agency. In the paper, these dimensions will be further outlined, related to case studies on education, and discussed with special regard to methodology, research methods, and ethical considerations. |
| Session Title: Tools That Value Vulnerable Populations |
| Multipaper Session 879 to be held in Laguna B on Saturday, Nov 5, 9:50 AM to 11:20 AM |
| Sponsored by the Disabilities and Other Vulnerable Populations TIG |
| Chair(s): |
| Frank Martin, Mathematica Policy Research, fmartin@mathematica-mpr.com |
| Roundtable Rotation I: The Role of the Evaluator Reconsidered |
| Roundtable Presentation 880 to be held in Lido A on Saturday, Nov 5, 9:50 AM to 11:20 AM |
| Sponsored by the International and Cross-cultural Evaluation TIG |
| Presenter(s): |
| Ruofei Tian, Shenyang Normal University, rftian@163.com |
| Abstract: Education serves two basic functions for human beings: social and individual. In today's China, however, too many social values have been placed on the outcomes of education under the guise of 'All for the children'. While much is said about gender differences, cultural influence, social classification, policy/decision making, or public interests in an evaluation study, the interests of the students involved, which should indeed be given priority, are mostly neglected. The role of the evaluator is therefore to serve as the spokesperson for the students, whose voices are in many cases overwhelmed in the negotiations among different stakeholders. To do so, a shift of perspective in research is needed, viewing students as self-organized psychological systems exchanging information with their environment. |
| Roundtable Rotation II: Building Monitoring and Evaluation System in Georgia |
| Roundtable Presentation 880 to be held in Lido A on Saturday, Nov 5, 9:50 AM to 11:20 AM |
| Sponsored by the International and Cross-cultural Evaluation TIG |
| Presenter(s): |
| Ketevan Chomakhidze, Georgian Evaluation Association, kchomakhidze@evaluation.org.ge |
| Abstract: The Georgian Evaluation Association (GEA) is a pioneer, being the first and only formal national evaluation association among the former Soviet Union countries. This paper presents what the GEA has done in various directions to achieve its major goal: forming a branch of evaluation that will meet international standards. GEA started by analyzing the existing situation in the evaluation field and, based on these results, designed its strategic goals. The major achievement of GEA is the development of an institutional model of M&E in the Adjara Autonomous Republic of Georgia. GEA maintains the growth of qualified evaluator cadres and enhances their professional skills. For this purpose, GEA, in association with IPEN (International Project Evaluation Network), is organizing an international conference, 27-30 September, to be held in Batumi, Georgia. The conference will be preceded by pre-conference workshops led by worldwide evaluation experts. |
| Session Title: Evaluating Quality Improvement Projects in Public Health Departments: Lessons From the Robert Wood Johnson Foundation's Quality Improvement Collaborative |
| Panel Session 881 to be held in Lido C on Saturday, Nov 5, 9:50 AM to 11:20 AM |
| Sponsored by the Health Evaluation TIG |
| Chair(s): |
| William Riley, University of Minnesota, riley0011@umn.edu |
| Discussant(s): |
| Brenda Henry, Robert Wood Johnson Foundation, bhenry@rwjf.org |
| Elizabeth Lownik, University of Minnesota, beth.lownik@gmail.com |
| Abstract: Quality improvement (QI) is an increasingly important competency for local health departments, and multiple initiatives are currently underway to implement QI projects in health departments around the country. As this work continues, evaluation of these QI projects is essential to assess success and elucidate lessons for QI implementation and scaling up in public health departments nationwide. The Robert Wood Johnson Foundation funded a collaborative of 13 public health departments across the country to implement QI projects and to hire evaluators to begin the important work of assessing the projects and compiling information for dissemination. This session brings together evaluators from several of the projects in this collaborative to present their findings regarding how to evaluate quality improvement projects in local health departments. |
| Session Title: Tests of Two Frameworks for Evaluating the Impact of Health Research |
| Multipaper Session 882 to be held in Malibu on Saturday, Nov 5, 9:50 AM to 11:20 AM |
| Sponsored by the Research, Technology, and Development Evaluation TIG |
| Chair(s): |
| Mary Beth Hughes, Science and Technology Policy Institute, m.hughes@gmail.com |
| Session Title: Values in Action |
| Demonstration Session 883 to be held in Manhattan on Saturday, Nov 5, 9:50 AM to 11:20 AM |
| Sponsored by the Indigenous Peoples in Evaluation TIG |
| Presenter(s): |
| Joan LaFrance, Mekinak Consulting, lafrancejl@gmail.com |
| Richard Nichols, Colyer Nichols Inc, colyrnickl@cybermesa.com |
| Karen E Kirkhart, Syracuse University, kirkhart@syr.edu |
| Abstract: This session will provide an experience of "values in action" by engaging those attending in activities that contribute to entering an Indigenous world and building community. Come prepared to experience ceremony protocol, feasting, gifting, and sharing your lineage. These are actions that link to relationship, family, and community -- critical values that permeate Indigenous communities. Through experiencing the expression of these values in mock activities, participants will have an opportunity to learn and discuss how values mediate entry into a community and influence evaluator credibility. After the "values in action" segment, the facilitators will discuss ways in which Indigenous values inform program quality and the evaluation process. Examples will be provided regarding how the process of identifying shared or common values among Indigenous evaluation stakeholders contributes to a sense of ownership of the evaluation process. |
| Session Title: Lessons Learned From Conducting Evaluations |
| Multipaper Session 884 to be held in Monterey on Saturday, Nov 5, 9:50 AM to 11:20 AM |
| Sponsored by the Graduate Student and New Evaluator TIG |
| Chair(s): |
| Jason Burkhardt, Western Michigan University, jason.t.burkhardt@wmich.edu |
| Session Title: If Stakeholders Matter, Which Stakeholders do We Listen to First? |
| Panel Session 885 to be held in Oceanside on Saturday, Nov 5, 9:50 AM to 11:20 AM |
| Sponsored by the AEA Conference Committee |
| Chair(s): |
| Sanjeev Sridharan, University of Toronto, sridharans@smh.ca |
| Abstract: While working with different stakeholders across evaluations, consistent questions about evaluation use drive the discussion regarding the value of the product to the stakeholders: - To whom should evaluations be useful? - How do we get evaluations that meet the needs of all audiences? - Is this evaluation to be useful for learning, accountability, or both? While responses differ, evaluators worldwide are united in their exploration of these challenges. In Canada, within the context of a renewed federal evaluation policy, preliminary research with representatives from a cross-section of departments suggests a disconnect between evaluation criteria and stakeholder beliefs. In this panel, we will explore the legitimacy of different stakeholder viewpoints in making judgments of the quality, merit, or worth of an initiative. A combination of realistic evaluation and contribution analysis offers stakeholders an option by which to explore "what works for whom in what context under what conditions?" |
| Session Title: Framing Public Value, Building Identity, and Enhancing Learning Experiences: A Sampler of Visitor Studies in Zoos, Aquariums, and Natural History Museums |
| Panel Session 886 to be held in Palisades on Saturday, Nov 5, 9:50 AM to 11:20 AM |
| Sponsored by the Evaluating the Arts and Culture TIG |
| Chair(s): |
| Kathleen Tinworth, Denver Museum of Nature and Science, kathleen.tinworth@dmns.org |
| Abstract: How can public value and learning be measured in informal settings like a natural history museum, zoo, or aquarium? What role does value play in these environments? What kind of technology is employed to answer value and experience questions? And who does this work? For a third year, members of the Visitor Studies Association (VSA), an international network of professionals committed to understanding and enhancing visitor experiences in informal settings through research, evaluation, and dialogue, will present a showcase of studies in informal environments. This year, we focus on frameworks and technology used to measure the public value of zoos and natural history museums, as well as the ways in which interacting with these settings develops individuals' identities as environmental stewards. |
| Session Title: Weaving Networks, Weaving Change: Practical Uses and Experiences of Inter-Organizational Network Analysis Findings |
| Panel Session 887 to be held in Palos Verdes A on Saturday, Nov 5, 9:50 AM to 11:20 AM |
| Sponsored by the Social Network Analysis TIG |
| Chair(s): |
| Raul Martinez, Harder+Company Community Research, rmartinez@harderco.com |
| Discussant(s): |
| Paul Harder, Harder+Company Community Research, pharder@harderco.com |
| Abstract: Increasingly, evaluations of funders supporting local service providers focus on measuring improvements to the larger systems of care. Based on panelists' experiences conducting evaluations of First 5 funded programs in several California counties, this panel will discuss the implications of quantifying the measurement of systems of care from the perspectives of evaluators and stakeholders. Evaluators will present findings from a number of different First 5 county evaluations, all of which used the same method to assess inter-agency collaboration and map the networks. Stakeholders from these First 5 counties will discuss their experiences and raise issues regarding the appropriate uses, implementation, and meaning of network findings, including how network analysis can assess change over time and how best to use findings for different stakeholders and evaluation goals. The discussant will summarize the main themes from each paper to describe how inter-organizational network analysis can improve evaluation capacity. |
| Session Title: Voices From the Field: A Dialogue on Incorporating Cultural Responsiveness Into Evaluation Training and Practice |
| Think Tank Session 888 to be held in Palos Verdes B on Saturday, Nov 5, 9:50 AM to 11:20 AM |
| Sponsored by the Multiethnic Issues in Evaluation TIG |
| Presenter(s): |
| Krystal Tomlin, The Robert Wood Johnson Foundation, tomlinkrystal@gmail.com |
| Discussant(s): |
| Adina Wright, The Robert Wood Johnson Foundation, amwright777@gmail.com |
| Lauren Ramsey, The Robert Wood Johnson Foundation, laurenmeta@gmail.com |
| Jose Reyes, The Robert Wood Johnson Foundation, mr.josereyes@hotmail.com |
| Sudha Sivaram, National Institutes of Health, sudha.sivaram@nih.gov |
| Eric Wat, Special Services for Groups, ewat@ssgmain.org |
| Valerie Williams, University Corporation for Atmospheric Research, vwilliam@ucar.edu |
| Howard Walters, OMG Center for Collaborative Learning, howard@omg.org |
| Erica Lizano, University of Southern California, erica.lizano@gmail.com |
| Monica Getahun, OMG Center for Collaborative Learning, monica@omgcenter.org |
| Abstract: In response to the growth and cultural diversity of organizations and communities, evaluators have begun to embrace the practice of Culturally Responsive Evaluation (CRE), which focuses on underlying factors such as race and culture. While the practice is still in its infancy, several champions are leading the way, including the Robert Wood Johnson Foundation (RWJF), which has developed several fellowships that nurture professionals by providing them with the necessary skills and experience in CRE. Here, the RWJF evaluation fellows propose a discourse that focuses on the application of CRE in the "real world" by reflecting on their field placement experiences and on the application of the Race Matters Toolkit, which assesses the cultural responsiveness of organizations. In addition to the fellows' experiences, several seasoned professionals will share their personal insights on incorporating CRE values and approaches into their practice. |
| Session Title: Evaluating the 3 Rs: Remediation, Retention, and Roadblocks |
| Multipaper Session 890 to be held in Salinas on Saturday, Nov 5, 9:50 AM to 11:20 AM |
| Sponsored by the Assessment in Higher Education TIG |
| Chair(s): |
| Katherine Beck, Westwood College, kbeck@westwood.edu |
| Session Title: Performance Measurement for Policy Advocacy: Insights From Embedded MEL Practitioners |
| Panel Session 891 to be held in San Clemente on Saturday, Nov 5, 9:50 AM to 11:20 AM |
| Sponsored by the Advocacy and Policy Change TIG |
| Chair(s): |
| Gabrielle Watson, Oxfam America, gwatson@oxfamamerica.org |
| Abstract: Policy advocacy work has gained increasing recognition by Boards and the philanthropic supporters of the non-profit world as an effective way to achieve systemic change benefiting large numbers of people. Along with this growing support comes growing demand to demonstrate results. This has spawned significant innovation and development within the field of policy advocacy evaluation and a specific interest in tools used by the private sector, such as performance measurement. This panel presents the experiences of two organizations that have been using performance measurement of their policy advocacy work. Presenters, each an internal MEL staffer, explore issues such as the tension between strategic learning and accountability and the search for meaningful and manageable indicators. Finally, presenters will explore how performance measurement systems can be used as a way to prompt teams to engage in meaningful conversations that result in collective learning and corrective actions. |
| Session Title: From Positive Youth Development to Full Potential: Rubrics That Shift Practice and Evaluation |
| Panel Session 892 to be held in San Simeon A on Saturday, Nov 5, 9:50 AM to 11:20 AM |
| Sponsored by the Collaborative, Participatory & Empowerment Evaluation TIG |
| Chair(s): |
| Kim Sabo Flores, Thrive Foundation for Youth, kim@thrivefoundation.org |
| Abstract: To move the aim of the positive youth development field from youth competence to full potential, the Thrive Foundation for Youth has made deep investments in the area of thriving. Thriving is an idealized, dynamic state between the person and his or her context. The notion of thriving has serious implications for policy and programming, as it targets practices that grow a young person's pursuit of full potential. Additionally, this focus necessitates new measures that move away from a static snapshot of a young person's status at any given moment toward methods that capture ongoing growth and strategy. This panel will share the Thrive Foundation for Youth's innovative use of rubrics to influence the practice and measurement of thriving. Panelists will discuss the early lessons learned from five grantees over a six-month period. Examples will be drawn from youth development programs that include youth-led social change programs and volunteer mentoring programs. |
| Session Title: Human Services Initiatives in K-12 School Settings |
| Multipaper Session 893 to be held in San Simeon B on Saturday, Nov 5, 9:50 AM to 11:20 AM |
| Sponsored by the Pre-K - 12 Educational Evaluation TIG and the Human Services Evaluation TIG |
| Chair(s): |
| Javan Ridge, Colorado Springs School District 11, ridgejb@d11.org |
| Discussant(s): |
| Lisa Garbrecht, EVALCORP Research & Consulting, lgarbrecht@evalcorp.com |
| Roundtable Rotation I: The Good, the Bad and the Unanticipated: Exploring the Consequences of Changing an Evaluation Plan Midstream |
| Roundtable Presentation 894 to be held in Santa Barbara on Saturday, Nov 5, 9:50 AM to 11:20 AM |
| Sponsored by the Evaluation Use TIG |
| Presenter(s): |
| Kate Golden, University of Nebraska, kgolden@unmc.edu |
| Jill Kumke, University of Nebraska, jkumke@unmc.edu |
| Megan Borer, University of Nebraska, mborer@unmc.edu |
| Lisa St Clair, University of Nebraska, lstclair@unmc.edu |
| Abstract: When a program struggles with the quality and implementation of its model, evaluators weigh the risk and reward of modifying evaluation designs. The decision to change, even slightly, can present significant challenges but may also yield surprising benefits. We faced this when evaluating an early childhood coaching program for childcare centers. We responded by adding a qualitative component to a primarily objectives-focused quantitative design. Reflecting an evaluation approach that incorporates continuous improvement principles, this change provided a forum for reviewing the coaching model and considering improvements. This simple yet effective adjustment provided a surprising range of benefits and raised new challenges. Impacts are noted on funders, program staff, and the evaluation team. This roundtable presentation will encourage discussion around the unexpected effects of modifying evaluation designs. Questions presented will weigh the roles of context, evaluator/client relationship, and resources when considering whether to alter the evaluation design. |
| Roundtable Rotation II: Perspectives on Conflicts Between Policy Makers' and Evaluator's Values When Using Evaluation Research on Mentoring Programs |
| Roundtable Presentation 894 to be held in Santa Barbara on Saturday, Nov 5, 9:50 AM to 11:20 AM |
| Sponsored by the Evaluation Use TIG |
| Presenter(s): |
| Laura Lunsford, University of Arizona, lglunsfo@email.arizona.edu |
| Abstract: This roundtable proposes to examine the conflict in values that arises when an intervention, championed by a highly visible policy maker, is at odds with evaluation research. A case study of an undergraduate mentoring program for low-income students will be presented to frame the discussion. Mentoring is a valued activity, but research suggests that mentoring is a voluntary relationship, that it does not benefit all students, and that negative outcomes are possible. However, many programs require mentoring, assume mentoring will benefit all students, and have few or no controls for managing bad relationships. Staff may be unwilling to 'rock the boat' with program champions to make needed changes. What is the role and professional obligation of the evaluator when these value conflicts occur? The audience will share thoughts about value conflicts between evaluators and program administrators and suggest ways to resolve such conflicts. |
| Session Title: Surprise in Evaluation: Values and Valuing as Expressed in Political Ideology, Program Theory, Metrics, and Methodology |
| Think Tank Session 895 to be held in Santa Monica on Saturday, Nov 5, 9:50 AM to 11:20 AM |
| Sponsored by the Theories of Evaluation TIG |
| Presenter(s): |
| Jonathan Morell, Fulcrum Corporation, jamorell@jamorell.com |
| Discussant(s): |
| Tarek Azzam, Claremont Graduate University, tarek.azzam@cgu.edu |
| Joanne Farley, University of Kentucky, joanne.farley@uky.edu |
| Abstract: How does political ideology affect program theories, methodologies, and metrics? Participants will be randomly assigned to groups and asked to sketch an evaluation based on one of three positions. 1) Government has an obligation to alleviate social inequities and thereby promote the public good. 2) Government's role is to uphold civil order so people can pursue their own goals, with the consequences of their actions being their own personal responsibility. In general, less government is better. 3) The family is the primary unit of social cohesion, and it is the locus of decisions about issues such as health and education. Government can be active or passive, as long as it supports the centrality of the family as the locus of moral authority and daily living. During report-backs, we will compare how the evaluation designs differ with respect to program theory, metrics, and methodology. |
| Session Title: Strengthening Values for Child Welfare Through Participatory Evaluation: Service Commitment, Job Expectations and Goals |
| Multipaper Session 896 to be held in Sunset on Saturday, Nov 5, 9:50 AM to 11:20 AM |
| Sponsored by the Human Services Evaluation TIG |
| Chair(s): |
| Christine Mathias, University of California Berkeley, cmathias@berkeley.edu |
| Discussant(s): |
| Barrett Johnson, University of California Berkeley, barrettj@berkeley.edu |
| Abstract: Developing and implementing first a statewide graduate school MSW program and then a core curriculum for newly hired staff in the values-laden field of public child welfare services involved multi-level, multi-organizational, and cross-institutional arrangements and many diverse stakeholders. State legislation and federal regulations about child welfare training require program evaluation but are non-specific. Participatory evaluation was the method of choice for this evaluation. At first, each part of the project was evaluated separately, but integrating the core evaluation with the Title IV-E graduate school evaluation has improved our ability to systematically examine the effects of training and education. This session describes three aspects of the evaluation in detail: why we do it; how we involve stakeholders in the statewide evaluation of a standardized core curriculum; and building a chain of evidence, using the example of the examination of IV-E graduates' retention factors. |
| Developing a Participatory Training Evaluation for Child Welfare |
| Leslie Zeitler, University of California Berkeley, lzeitler@berkeley.edu |
As part of a coordinated strategic plan to revise and improve standardized core training, active participation by a variety of university and agency stakeholders is necessary. This paper will show how regional training academy staff, trainers, county staff developers, curriculum writers, subject matter experts, trainees, and evaluation consultants all contribute to the improvement of standardized in-service training. Among the ways these stakeholders participate are training pilots, quality assurance efforts, analyses of test data, use of trainer forums, and brief trainee focus groups. The design and decision-making roles of the Macro-Evaluation Committee will be presented. Consistent with an overall evaluation framework, trainee demographic and test data are analyzed for possible test item bias, aggregate performance, and potential demographic differences. We will describe the challenging process of coordinating stakeholder participation in the curriculum and test development/revision process.
| Valuing Workers' Values: Helping Social Workers Stay in Child Welfare Service |
| Sherrill Clark, University of California Berkeley, sjclark@berkeley.edu |
Research on the retention of public child welfare workers suggests that a service commitment is influential in attracting persons to the field. Thus, supporting this value may help facilitate workers' long-term retention in the profession. A qualitative data analysis of responses to a recent IV-E MSW graduate survey confirmed that helping others or making a difference in others' lives is a salient aspect of the job for new public child welfare social workers. This theme was present across all six years the survey was administered. In order to understand how to better foster the relationship between having a commitment to service and remaining in the child welfare field, the results of this analysis helped inform the focus of future evaluation of the factors that retain professional child welfare social workers.
| Child Welfare Workers' Values: A Longitudinal Assessment of Job Expectations and Career Goals |
| Susan Jacquet, University of California, Berkeley, sjcaquet@berkeley.edu |
| Sherrill Clark, University of California Berkeley, sjclark@berkeley.edu |
To evaluate the effects of educational and training programs on Title IV-E child welfare workers' values and career goals relative to their retention in the field, we have created a framework of questions focusing on service commitment, job/career expectations, and goals, which we ask at multiple points in the workers' careers. Cohorts are surveyed when newly graduated/hired, at 3 years post-graduation, and at 5 years post-graduation. Telephone interviews involve graduates in the evaluation development by soliciting new focus areas to examine. The sequence of surveys occurs annually; each cohort is followed for at least five years. Comparisons between IV-E and non-IV-E new hires indicate there are differences between IV-E graduates and others, which we will report in this session. The information gathered in this evaluation, along with the workers' retention status, informs the evaluation of education and training programs as well as the agencies that hire them.
| Session Title: Evaluating Teacher Professional Development in STEM: Examining Teacher Learning and Perceptions |
| Multipaper Session 897 to be held in Ventura on Saturday, Nov 5, 9:50 AM to 11:20 AM |
| Sponsored by the Pre-K - 12 Educational Evaluation TIG |
| Chair(s): |
| John Gargani, Gargani + Company Inc, john@gcoinc.com |
| Discussant(s): |
| John Gargani, Gargani + Company Inc, john@gcoinc.com |