| Session Title: Making Evaluation and Research Universally Accessible |
| Skill-Building Workshop 802 to be held in Centennial Section A on Saturday, Nov 8, 8:00 AM to 9:30 AM |
| Sponsored by the Special Needs Populations TIG |
| Presenter(s): |
| Yvonne Kellar-Guenther, University of Colorado Denver, yvonne.kellar-guenther@uchsc.edu |
| Nancy Koester, University of Colorado Denver, nancy.koester@uchsc.edu |
| William Betts, University of Colorado Denver, william.betts@uchsc.edu |
| Abstract: When people hear of universal access/universal usability (UA/UU), many think of access for persons with disabilities. Product designers, however, now realize that UA/UU leads to a process of keeping all situations and all people in mind (Vanderheiden, 2000). As evaluation aims to become more inclusive of groups that can benefit from UA/UU (e.g., the elderly, the poor, children too young to read, substance abusers), it is becoming vital that evaluators think about universal access to the whole evaluation process. The goal of this session is to talk about ways to increase accessibility for all groups who participate in evaluations. These ideas are a mix of what has been gleaned from the existing literature on UA/UU for product design and our own experience doing evaluation that includes persons of all ages with all types of disabilities. During this session we will also work through parts of a study to make it accessible. |
| Session Title: Creating Excellent Data Graphs for Everyday Evaluation Products |
| Demonstration Session 803 to be held in Centennial Section B on Saturday, Nov 8, 8:00 AM to 9:30 AM |
| Sponsored by the Quantitative Methods: Theory and Design TIG |
| Presenter(s): |
| Frederic Malter, University of Arizona, fmalter@email.arizona.edu |
| Abstract: Visualizing raw data and empirical findings is a crucial part of most evaluation work. The demonstration will first introduce theoretical principles for graphical excellence by drawing on the groundbreaking work of Edward Tufte, a leading authority in data visualization. One important goal of the demonstration is to convey to participants that visualizing data should be part of every data analytic endeavor in evaluation practice. Example data graphs will be shown to discuss the degree to which they succeed or fail in realizing graphical excellence. Two software tools, MS Excel and Tableau, will be employed in live demonstration exercises that aim to teach participants how graphical excellence can be achieved in their everyday work with simple means. A number of empirical findings from evaluation practice (percentages, mean differences with CI, time series data, geographically rendered measures) serve as a starting point for a live creation of graphs with widely used tools. |
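The abstract lists mean differences with confidence intervals among the findings to be graphed. As a minimal sketch of the quantity that would underlie such a graph (the scores below are invented for illustration, not data from the session), the interval can be computed before plotting:

```python
import math

# Hypothetical example: a mean difference with a 95% confidence interval,
# the kind of quantity the session proposes charting rather than tabling.
group_a = [72, 75, 78, 80, 74, 77]   # e.g., post-test scores, program group
group_b = [70, 71, 69, 73, 72, 68]   # comparison group

def mean(xs):
    return sum(xs) / len(xs)

def var(xs):
    # Sample variance (n - 1 denominator)
    m = mean(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

diff = mean(group_a) - mean(group_b)
# Standard error of the difference between two independent means
se = math.sqrt(var(group_a) / len(group_a) + var(group_b) / len(group_b))
# Normal approximation for a 95% interval
ci_low, ci_high = diff - 1.96 * se, diff + 1.96 * se

print(f"difference = {diff:.2f}, 95% CI [{ci_low:.2f}, {ci_high:.2f}]")
```

The resulting point estimate and interval endpoints are what an error-bar chart in Excel or Tableau would encode; with real data a t-based critical value would replace the 1.96 for small samples.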
| Session Title: Envisioning Culturally Responsive Evaluation Policy: Perspectives From the United States and New Zealand |
| Panel Session 804 to be held in Centennial Section C on Saturday, Nov 8, 8:00 AM to 9:30 AM |
| Sponsored by the Presidential Strand |
| Chair(s): |
| Stafford Hood, University of Illinois at Urbana-Champaign, slhood@illinois.edu |
| Discussant(s): |
| Finbar Sloane, Arizona State University, finbar.sloane@asu.edu |
| Bernice Anderson, National Science Foundation, banderso@nsf.gov |
| Abstract: Imagine that the newly elected US President has asked you for guidance on governmental evaluation policy for culturally responsive evaluation. The President is interested in practical guidelines that can inform and direct governmental evaluations that are culturally responsive, respectful and reparative. The President welcomes theoretical justification of the evaluation policy guidelines generated, but insists that the emphasis be on evaluation practice not theory. The President has elicited three distinct, albeit complementary, perspectives to inform this challenge. First, the perspectives of historically oppressed cultures in the US are critically important to any governmental evaluation policy on culturally responsive evaluation. Second, the perspectives of the Maori as indigenous people of New Zealand are to be highly valued, as they have proactively engaged in policies and practices that are intended to be genuinely bicultural. Third, the perspectives of the philanthropic community in the US can well complement the public perspective of the government. |
| Session Title: Evaluation Policy and Practice in Government Settings |
| Multipaper Session 805 to be held in Centennial Section D on Saturday, Nov 8, 8:00 AM to 9:30 AM |
| Sponsored by the Government Evaluation TIG |
| Chair(s): |
| Rakesh Mohan, Idaho Legislature, rmohan@ope.idaho.gov |
| Discussant(s): |
| Rakesh Mohan, Idaho Legislature, rmohan@ope.idaho.gov |
| Session Title: When Community Passions and Personal Callings Meet Empiricism: Exploring the Interpersonal Side of Program Evaluation Policy Shifts |
| Demonstration Session 806 to be held in Centennial Section E on Saturday, Nov 8, 8:00 AM to 9:30 AM |
| Sponsored by the Independent Consulting TIG |
| Presenter(s): |
| Michael Lyde, Lyde and Associates, drlyde@charter.net |
| Abstract: A community-based agency has a rich history of effecting positive change in the lives of its clients. One critical element missing from this history is a catalog of formal evaluation reports that provide a counterpoint to the many testimonials and other qualitative evidence of the agency's effectiveness. A new program evaluation team is contracted and takes numerous steps to remedy the evaluation limitations of this agency and they live happily ever after, right? Perhaps, but some of the underlying work (i.e., relationship building, empowering agency staff, etc.) is the focus of this demonstration session. Inherent in any paradigm shift is the clash of philosophies and resistance to change. This demonstration will provide a forum for the presentation, exchange, and refinement of strategies that professional evaluators can utilize to overcome these challenges. |
| Session Title: Techniques and Strategies to Increase Participation in Mental Health and Substance Abuse |
| Multipaper Session 807 to be held in Centennial Section F on Saturday, Nov 8, 8:00 AM to 9:30 AM |
| Sponsored by the Alcohol, Drug Abuse, and Mental Health TIG |
| Chair(s): |
| Garrett Moran, Westat, garrettmoran@westat.com |
| Session Title: Course-Evaluation Designs II: Faculty Perspectives on Practices and Continuing Development |
| Multipaper Session 808 to be held in Centennial Section G on Saturday, Nov 8, 8:00 AM to 9:30 AM |
| Sponsored by the Assessment in Higher Education TIG |
| Chair(s): |
| Rick Axelson, University of Iowa, rick-axelson@uiowa.edu |
| Discussant(s): |
| Jennifer Reeves, Nova Southeastern University, jennreev@nova.edu |
| Session Title: Engaging Stakeholders in the Scientific Enterprise: Using Concept Mapping for Research Priority Setting and Participatory Evaluation |
| Multipaper Session 809 to be held in Centennial Section H on Saturday, Nov 8, 8:00 AM to 9:30 AM |
| Sponsored by the Research, Technology, and Development Evaluation TIG |
| Chair(s): |
| Scott Rosas, Concept Systems Inc, srosas@conceptsystems.com |
| Discussant(s): |
| Scott Rosas, Concept Systems Inc, srosas@conceptsystems.com |
| Abstract: Recent transformation of the scientific research enterprise has led to a corresponding need for participatory methods that involve stakeholders in shaping directions for future research. This session examines multi-level stakeholder involvement and contributions across three projects focused on the planning and evaluation of scientific priorities. The first project engaged researchers and agency staff to co-construct a framework of success factors for evaluating an emerging infectious disease research program. The second project engaged internal and external stakeholders, including funding agency staff, researchers and collaborators of a multi-site research network, to identify cancer research priorities. The third project engaged residents, organizational associates, managers, and executives, family members, and experts to develop a community-articulated research agenda for future geriatric and aging research. This panel will summarize the common methodology, highlight its use across the three projects, and conclude with a discussion of the involvement of stakeholders at multiple levels. |
| Defining Success for the National Institute of Allergy and Infectious Diseases' Regional Centers of Excellence in Biodefense and Emerging Infectious Diseases Research Program: A Co-Authored Evaluation Framework and Plan |
| Mary Kane, Concept Systems Inc, mkane@conceptsystems.com |
| Kathleen M Quinlan, Concept Systems Inc, kquinlan@conceptsystems.com |
| The National Institute of Allergy and Infectious Diseases (NIAID) Regional Centers of Excellence in Biodefense and Emerging Infectious Diseases Research Program was first funded in 2003, as part of a large funding allocation for biodefense. Given the newness, the broad mandate and the innovative approaches of the program, an evaluation of its first five years was planned. The concept mapping methodology provided a rigorous, structured approach for scientists to articulate the conceptual model underlying their endeavor, a major challenge in this type of evaluation. Center researchers and agency staff co-constructed a framework of success factors that served as the foundation upon which a task force of leaders within NIAID and the RCEs collaboratively identified evaluation questions and measures. This thorough, participatory planning process set the stage for participant commitment to, involvement in and acceptance of the interim evaluation. |
| Identifying Research Priorities for the National Cancer Institute's Cancer Research Network: Developing a Collaboratively Authored Conceptual Framework |
| Kathleen M Quinlan, Concept Systems Inc, kquinlan@conceptsystems.com |
| Katy Hall, Concept Systems Inc, khall@conceptsystems.com |
| Leah Tuzzio, Group Health Care Cooperative, tuzzio.l@ghc.org |
| Wendy McLaughlin, National Institutes of Health, wendy.mclaughlin@nih.hhs.gov |
| Ed Wagner, National Institutes of Health, wagner.e@ghc.org |
| Martin Brown, National Institutes of Health, mbrown@mail.nih.gov |
| Robin Yabroff, National Institutes of Health, robin.yabroff@nih.hhs.gov |
| Entering its third 5-year funding cycle, the National Cancer Institute's (NCI) Cancer Research Network (CRN), consisting of 14 integrated health systems nationwide, is a cooperative research grant that encourages the generation of new research ideas and increased involvement by other cancer researchers. To support research agenda planning and decision-making, the CRN sought stakeholder input on scientific research priorities. Key leaders brainstormed 98 research topics and then used the concept mapping approach to organize the ideas conceptually. Both internal and external CRN stakeholders (those directly involved with the network and those with an interest in cancer research) were invited to rate the ideas to determine CRN research priorities. The framework includes elements related to the biological, behavioral, and economic aspects of cancer; informatics and diffusion research; and aspects of the healthcare system, setting a research agenda that will improve the quality and effectiveness of preventive, curative, and supportive interventions for cancer. |
| Setting the Research Agenda with Communities |
| Mary Kane, Concept Systems Inc, mkane@conceptsystems.com |
| This initiative yielded a collaboratively authored comprehensive framework to guide the selection of future research programs in the field of aging and wellness. The Institute for Optimal Aging (IOA) stakeholders were residents of three senior living communities in the Chicago area; professional and para-professional associates who provide caregiving and programming to the residents; executives and field employees of the operating corporation; and academics and researchers in geriatrics. Participants used a mix of web-based and on-site methods for collecting and organizing data; this input created the conceptual framework of priority research areas on aging. Through document review and key informant interviews, the conceptual research framework was enriched and rendered more relevant to future research needs in geriatrics. The benefits of engaging residents, associates and academics in one endeavor included greater depth in the research framework, and a strong sense of contributing to future geriatric research. |
| Session Title: Assessment and Improvement of Government Agency Collaboration for Disaster Preparedness and Recovery |
| Multipaper Session 810 to be held in Mineral Hall Section A on Saturday, Nov 8, 8:00 AM to 9:30 AM |
| Sponsored by the Disaster and Emergency Management Evaluation TIG |
| Chair(s): |
| Patricia Bolton, Battelle, bolton@battelle.org |
| Session Title: Randomized Control Trials: Regression Discontinuity and a Poor Relative |
| Multipaper Session 811 to be held in Mineral Hall Section B on Saturday, Nov 8, 8:00 AM to 9:30 AM |
| Sponsored by the Quantitative Methods: Theory and Design TIG |
| Chair(s): |
| Frederick Newman, Florida International University, newmanf@fiu.edu |
| Session Title: Distance Education: Course and Program Level Evaluation |
| Multipaper Session 812 to be held in Mineral Hall Section C on Saturday, Nov 8, 8:00 AM to 9:30 AM |
| Sponsored by the Distance Ed. & Other Educational Technologies TIG |
| Chair(s): |
| Diane Chapman, North Carolina State University, diane_chapman@ncsu.edu |
| Session Title: Working Together to Enhance the Quality of Science and Math Education Evaluations: GK-12 Project Evaluations |
| Think Tank Session 813 to be held in Mineral Hall Section D on Saturday, Nov 8, 8:00 AM to 9:30 AM |
| Sponsored by the Organizational Learning and Evaluation Capacity Building TIG |
| Presenter(s): |
| Patti Bourexis, The Study Group Inc, studygroup@aol.com |
| Discussant(s): |
| Rita Fierro, Fierro Evaluation, fierro.evaluation@gmail.com |
| Annelise Carleton-Hug, Trillium Associates, annelise@trilliumassociates.com |
| Mimi McClure, National Science Foundation, mmcclure@nsf.gov |
| Abstract: This think tank provides a forum for evaluators to discuss key issues in evaluating GK-12 projects, the National Science Foundation initiative which places graduate students from science, mathematics and engineering disciplines in K-12 classrooms to share their content knowledge. Think tank participants will have an opportunity to identify project-level evaluation questions we might pursue as a community, useful evaluation tools and resources that might be shared, and plans for sustaining a learning community of GK-12 evaluators. By working collectively within a learning community with shared interests, we anticipate our discussions will contribute to building the capacity not only of the individual project evaluations, but also of NSF to use project-level evaluation findings to inform the GK-12 program nationally. The think tank will offer opportunities for small group in-depth discussions along with whole-group dialogue designed to stimulate further thinking and exchange. |
| Session Title: Internal Review Boards' (IRB) Place in the Philanthropic and Nonprofit Sector: Are Foundations and the Vulnerable Populations They Serve at Risk? |
| Think Tank Session 814 to be held in Mineral Hall Section E on Saturday, Nov 8, 8:00 AM to 9:30 AM |
| Sponsored by the Non-profit and Foundations Evaluation TIG |
| Presenter(s): |
| Delia Carmen, Annie E Casey Foundation, dcarmen@aecf.org |
| Discussant(s): |
| Delia Carmen, Annie E Casey Foundation, dcarmen@aecf.org |
| Bill Bacon, Packard Foundation, bbacon@packard.org |
| Ben Kerman, Casey Family Services, bkerman@caseyfamilyservices.org |
| Abstract: As a matter of practice, foundations make major investments in evaluating both their own program initiatives and research in their fields of interest. Such investments go primarily to large university and research partners whose existing IRB credentials provide the needed protections to the constituents, or human subjects, included in these efforts. However, as foundations become more data-driven and results-oriented, the demand grows for performance measures that can be obtained only through original data collection, which is increasingly carried out by non-credentialed grantees acting as principal investigators. This presents a new set of challenges for foundations and the nonprofit sector in safeguarding the privacy, confidentiality, rights, and privileges of the individuals who participate in and share information for study. This think tank session will ask participants to explore and discuss the best ways to incorporate non-government-mandated IRB protocols into foundations' grant-giving protocols without creating onerous, unwieldy barriers to reflective learning and strategic use of data. Key questions to be raised and discussed include: 1. What is deemed to be research in the grant-giving field? Who makes this assessment for program officers? 2. What are the legal implications for a foundation funding research or evaluation, and where is the line between a foundation's and a grantee's responsibility? 3. What can foundations ethically do with older, existing, potentially rich and informative research data that may not have been collected under an IRB protocol, or that was collected under an IRB-approved protocol that does not comply with current standards? 4. How do foundations build the capacity of their non-university research partner grantees to develop and sustain the necessary informed consent and confidentiality safeguards in their original data collection activities? |
| Session Title: On the Outside Looking In: Lessons From the Field for External Evaluators |
| Multipaper Session 815 to be held in Mineral Hall Section F on Saturday, Nov 8, 8:00 AM to 9:30 AM |
| Sponsored by the Pre-K - 12 Educational Evaluation TIG |
| Chair(s): |
| Faith Connolly, Naviance, faith.connolly@naviance.com |
| Session Title: Federal Policy and Grass-Roots Practice: Explaining and implementing federal performance measurement (evaluation) policy requirements through self-help guides |
| Demonstration Session 816 to be held in Mineral Hall Section G on Saturday, Nov 8, 8:00 AM to 9:30 AM |
| Sponsored by the Collaborative, Participatory & Empowerment Evaluation TIG |
| Presenter(s): |
| Kenneth Terao, JBS International, kterao@jbsinternational.com |
| Anna Marie Schmidt, JBS International, aschmidt@jbsinternational.com |
| Nicole Vicinanza, JBS International, nvicinanza@jbsinternational.com |
| Edie Cook, Independent Consultant, elm20@juno.com |
| Susan Hyatt, Business Nonprofit Connections, shyatt@bnconnections.com |
| Abstract: For 13 years, JBS International's Project STAR has helped grantees of the Corporation for National and Community Service (CNCS) develop, implement, and report on performance measurement and evaluation. During this time Project STAR has developed a series of paper and web-based self-help evaluation documents for CNCS grantee programs (AmeriCorps, VISTA, Senior Corps and Learn and Serve America). This demonstration will walk participants through the steps we take to make our evaluation and performance measurement TA materials timely, accurate and user friendly. We will focus on how context and collaboration with grantees and CNCS shaped document development; the step-by-step approach in developing each document and document series; and how the documents are employed in training and remote technical assistance to grantees. STAR's materials meet the immediate performance measurement and evaluation policy needs of CNCS grantees, but the concepts used in developing them apply to many evaluation efforts and audiences. |
| Session Title: Multisite Evaluations: Challenges, Methods, and Approaches in Public Health |
| Panel Session 817 to be held in the Agate Room Section B on Saturday, Nov 8, 8:00 AM to 9:30 AM |
| Sponsored by the Cluster, Multi-site and Multi-level Evaluation TIG |
| Chair(s): |
| Thomas Chapel, Centers for Disease Control and Prevention, tchapel@cdc.gov |
| Abstract: Consensus of key players regarding a program and its components is optimal for any evaluation, but it is particularly challenging in multisite evaluations. In federal and state programs that are implemented by networks of grantees and frontline practitioners, the evaluation process is a formidable one because evaluation skills and availability of data sources vary site by site. More importantly, in multisite evaluations, beyond agreement on the high-level purpose of the program, the frontline activities can differ widely. Representatives from the three programs discussed on this panel have had to face the challenges of monitoring performance at multiple sites, and for some, they have had to use site data to illustrate overall program performance nationally. Program representatives will discuss their programs, involvement of their grantees and partners in developing evaluation approaches, and the approaches taken. The process for developing and implementing the evaluation will be discussed as will decisions on where to impose uniformity or grant autonomy in indicators and data collection. Transferable lessons from their experience will be identified. |
| Session Title: Do Schools Know Best? A Foundation Explores the Question |
| Panel Session 818 to be held in the Agate Room Section C on Saturday, Nov 8, 8:00 AM to 9:30 AM |
| Sponsored by the Non-profit and Foundations Evaluation TIG |
| Chair(s): |
| Albert Bennett, Roosevelt University, abennett@roosevelt.edu |
| Abstract: The Lloyd A Fry Foundation High School Initiative began in the fall of 2001. There were two primary goals of the Initiative. The first goal was to increase student achievement. The second goal was to create sustainable improvements in the teaching and learning environment through distributed leadership and collaborative decision making. The Initiative was funded for five years and each school was able to receive up to $1.25 million over the life of the project. In addition to this support, each of the six schools received $50,000 for planning and $25,000 for retreats. Two assumptions guided the Initiative. The first assumption was that schools know their problems (and solutions) best, so give them the resources they need and get out of the way. The second assumption was that schools will make better decisions if more individuals are involved in problem identification and decision-making. If principals become more inclusive, they will be able to share leadership responsibilities, thereby making what is quickly becoming an unmanageable job more manageable. Finally, it was assumed that the decisions of these leadership teams would be significantly different from (i.e., better than) previous decisions made under the old authoritarian pattern. |
| Session Title: Evaluating Math Science Partnership Projects in New York State: Finding Evidence and Documenting Results |
| Multipaper Session 819 to be held in the Granite Room Section A on Saturday, Nov 8, 8:00 AM to 9:30 AM |
| Sponsored by the Pre-K - 12 Educational Evaluation TIG |
| Chair(s): |
| Dianna Newman, University at Albany - State University of New York, dnewman@uamail.albany.edu |
| Abstract: The US Department of Education Math Science Partnership (MSP) program seeks to bridge the gap between current and expected content, pedagogy and student outcomes in math and science education. As federal and state priorities shift and funding increases, the evaluation component of this initiative has become increasingly important. This multi-paper presentation explores the evaluation methodologies proven to be successful in documenting local and statewide MSP programs and highlights "end of project" findings. The MSP grants discussed in this session assume that successful professional development empowers teachers with knowledge and skills to create an effective classroom environment, thus facilitating the transfer of learning. Evaluators focus not only on the professional development and student outcomes but also on the process by which teachers transferred the new skills and knowledge to the classroom. Additionally, insights are gleaned from assessing student work, thus making strong connections between teachers' practices and student performance. |
| Finding Evidence For Math Science Initiatives |
| Kathy Gullie, University at Albany - State University of New York, kp9854@albany.edu |
| Dianna Newman, University at Albany - State University of New York, dnewman@uamail.albany.edu |
| Evaluators of a federally funded Math Science Partnership grant will present quasi-experimental methods and findings related to documenting evidence that meets GPRA indicators for federal and state agencies, and will address how to develop a plan that facilitates finding evidence of success. The goal of the MSP partnerships is to foster student improvement in math by improving teacher knowledge of math content and math pedagogy. Teachers receive 60 hours of professional development from grant and district related resources, based on the assumption that successful professional development empowers teachers with knowledge and skills to create effective classroom environments. These papers investigate this transfer of learning while looking at the intermediate and integrated functions of teaching math. Analysis of student work and its relationship to grant-initiated professional development and student academic achievement will be discussed. Evaluators will present findings highlighting student academic achievement in individual student folders, on local report cards, and on district and state tests. |
| Teacher Professional Development and Student Math Achievement: Results from Two Large-Scale Grants |
| Anna Valtcheva, University at Albany - State University of New York, avaltcheva@gmail.com |
| Kristina Mycek, University at Albany - State University of New York, km1042@albany.edu |
| In this age of accountability, educators are urged to meet the requirements of the No Child Left Behind (NCLB) Act, which targets improvement in students' achievement while closing the racial achievement gap. In attempts to attain these goals, school districts across the country have initiated numerous programs to provide teachers with additional training. The purpose of this paper is to present the results of a study investigating the relationship between teachers' level of involvement in professional development offerings and students' mathematics achievement. Data were collected and analyzed using hierarchical linear modeling (HLM) as part of a multi-phase mixed method evaluation of a Math Science Partnership (MSP) Initiative. This program focuses on the enhancement of student outcomes in higher-level mathematics and science achievement in large urban settings as well as at-risk rural and small city schools. Results pertaining to students with special needs and Limited English Proficiency (LEP) will be discussed. |
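HLM is used here because students are nested within schools. A standard first step in such an analysis is to partition outcome variance into within-school and between-school components, summarized by the intraclass correlation (ICC). A minimal sketch of that step, using invented scores rather than data from the study:

```python
# Hypothetical data: math scores for students nested within three schools.
schools = {
    "A": [62, 65, 70, 63],
    "B": [75, 78, 74, 77],
    "C": [68, 66, 71, 69],
}

n = 4                      # students per school (balanced, for simplicity)
k = len(schools)
grand = sum(sum(s) for s in schools.values()) / (n * k)

# One-way ANOVA mean squares: between-school and within-school variation
ss_between = n * sum((sum(s) / n - grand) ** 2 for s in schools.values())
ss_within = sum((x - sum(s) / n) ** 2 for s in schools.values() for x in s)
ms_between = ss_between / (k - 1)
ms_within = ss_within / (k * (n - 1))

# ANOVA estimator of the ICC for balanced groups: the share of outcome
# variance that lies between schools rather than between students.
icc = (ms_between - ms_within) / (ms_between + (n - 1) * ms_within)
print(f"ICC = {icc:.2f}")
```

A nontrivial ICC is what justifies the multilevel model; the full analysis would then add student-level and school-level predictors (such as teachers' professional development hours) to a random-intercept model fit with dedicated HLM software.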
| Addressing Gaps in Evaluation: Balancing Priorities |
| Amy Germuth, Compass Consulting Group LLC, agermuth@mindspring.com |
| Math Science Partnerships operate under a relatively simple logic model: teacher professional development that emphasizes content and pedagogical skills in tandem should result in changes in teachers' knowledge and practice, thus benefiting students as evidenced by increased achievement. Despite this simple model, few evaluations have adequately addressed these different components, especially transfer of learning, the most critical component. As the state-level evaluator for MSP programs in New York, Compass has worked with multiple partners, including the USED, to address such gaps in evaluations. Compass will share lessons learned about potential evaluation models and instruments that may promote better understanding of MSPs and their potential outcomes, and will speak to the need to balance federal, state and USED priorities when conducting such evaluations. |
| Session Title: Various Approaches to Evaluating Provision of Health Care Services |
| Multipaper Session 820 to be held in the Granite Room Section B on Saturday, Nov 8, 8:00 AM to 9:30 AM |
| Sponsored by the Health Evaluation TIG |
| Chair(s): |
| Robert LaChausse, California State University at San Bernardino, rlachaus@csusb.edu |
| Session Title: Evaluation Education in Diverse Countries |
| Multipaper Session 821 to be held in the Granite Room Section C on Saturday, Nov 8, 8:00 AM to 9:30 AM |
| Sponsored by the International and Cross-cultural Evaluation TIG |
| Chair(s): |
| Pauline E Ginsberg, Utica College, pginsbe@utica.edu |
| Roundtable: What Can You Tell in the Short Term? |
| Roundtable Presentation 822 to be held in the Quartz Room Section A on Saturday, Nov 8, 8:00 AM to 9:30 AM |
| Sponsored by the Extension Education Evaluation TIG |
| Presenter(s): |
| Ben Silliman, North Carolina State University, ben_silliman@ncsu.edu |
| Abstract: This roundtable focuses on evaluation of short-term events with youth. Brief events attended by large numbers of youth provide optimal opportunities for 4-H and other youth programs to gather data on program quality and outcomes. Such events may even provide opportunities to monitor long-term experiences and outcomes. However, settings such as youth conferences, short courses, trips, and camps offer logistical challenges (e.g., scheduling, location, use of certain methods and technologies) and practical limitations (e.g., intervention duration, developmental and learning potential) that restrict the feasibility and accuracy of the evaluation process. These issues are discussed in the context of two evaluation projects done with different degrees of success. Discussion will focus on maximizing the value of short-term evaluations with youth. |
| Roundtable: Evaluating Health Messages: Using Laptops and Embedded Messages to Assess Anti-drug Messages |
| Roundtable Presentation 823 to be held in the Quartz Room Section B on Saturday, Nov 8, 8:00 AM to 9:30 AM |
| Sponsored by the Health Evaluation TIG |
| Presenter(s): |
| Jason Siegel, Claremont Graduate University, jason.siegel@cgu.edu |
| Eusebio Alvaro, Claremont Graduate University, eusebio.alvaro@cgu.edu |
| William Crano, Claremont Graduate University, william.crano@cgu.edu |
| Abstract: This roundtable will discuss an evaluation of experimentally manipulated anti-drug messages. The focus will not be the results, per se, but rather the advantages of using multiple laptops and embedded messages as a means of evaluating messages targeting young adolescents. The experimental anti-drug messages were embedded in an anti-bullying video to "bury the chestnut." The use of laptops allowed for short videos to keep participants entertained while filling out a long survey. Laptops also allowed participants to move at their own pace, and the synchronized voice-over assisted participants with reading difficulties. Additional advantages of using laptops to evaluate health messages will be discussed, along with some of the drawbacks and the costs. |