Session Title: Using the Program Evaluation Standards, Third Edition, to Define and Enhance Evaluation Quality
Skill-Building Workshop 362 to be held in Lone Star A on Thursday, Nov 11, 4:30 PM to 6:00 PM
Sponsored by the Presidential Strand
Presenter(s):
Donald Yarbrough, University of Iowa, d-yarbrough@uiowa.edu
Lyn Shulha, Queen's University at Kingston, lyn.shulha@queensu.ca
Rodney Hopson, Duquesne University, hopson@duq.edu
Flora Caruthers, Florida Legislature, caruthers.flora@oppaga.fl.gov
Abstract: Attendees will learn about and apply the newly published (August 2010) third edition of the Program Evaluation Standards (SAGE, 2010). The Joint Committee Task Force authors, serving as session leaders, will review quality control steps taken during the standards development process, guide individuals and groups in applications of the standards, and lead discussions about how the standards define and enhance evaluation quality in specific evaluation settings. Attendees will also have the opportunity to report their own evaluation dilemmas and discuss in small and large groups how to apply the program evaluation standards to increase and balance dimensions of evaluation quality, such as utility, feasibility, propriety, accuracy, and evaluation accountability, in these settings. The workshop will deal explicitly with metaevaluation and its role in evaluation quality improvement and accountability. Attendees will receive handouts to support reflective practice in their future evaluations and evaluation-related work.
Session Title: Systems in Evaluation TIG Business Meeting and Presentation: Meet and Greet With Systems in Evaluation Authors
Business Meeting Session 363 to be held in Lone Star B on Thursday, Nov 11, 4:30 PM to 6:00 PM
Sponsored by the Systems in Evaluation TIG
TIG Leader(s):
Janice Noga, Pathfinder Evaluation and Consulting, jan.noga@pathfinderevaluation.com
Margaret Hargreaves, Mathematica Policy Research, mhargreaves@mathematica-mpr.com
Mary McEathron, University of Minnesota, mceat001@umn.edu
Presenter(s):
Janice Noga, Pathfinder Evaluation and Consulting, jan.noga@pathfinderevaluation.com
Abstract: As the Systems in Evaluation TIG continues to grow and bring in new members, we strive to keep our membership informed of advances in this rapidly developing area within evaluation. In 2010, three new books by TIG members Michael Patton, Patricia Rogers, Bob Williams, and Richard Hummelbrunner were published. We invite you to meet these authors in person to hear about their new books, hear them discuss each other's work, ask questions about their ideas, and get first-hand advice on vexing systems-related problems. It is an exciting opportunity to have them all together to talk about the cutting edge of systems and evaluation.
Session Title: The Biggest Winners: Empowerment Evaluation Exercises to Strengthen Primary Prevention Capacity
Skill-Building Workshop 364 to be held in Lone Star C on Thursday, Nov 11, 4:30 PM to 6:00 PM
Sponsored by the Collaborative, Participatory & Empowerment Evaluation TIG
Presenter(s):
Sandra Ortega, Ohio DELTA and Rape Prevention Education, ortega.12@osu.edu
Rebecca Cline, The Ohio Domestic Violence Network, rclineodvn@aol.com
Amy Bush Stevens, Owl Creek Consulting, amybstevens@mac.com
Abstract: The Centers for Disease Control and Prevention (CDC) awarded funds for both the Rape Prevention Education and DELTA initiatives to Ohio, creating a unique opportunity to develop a collaboration among the CDC, the Ohio Department of Health, the Ohio Domestic Violence Network (a private, non-profit organization), and local prevention service providers to increase primary prevention capacity. The presenters will share lessons learned and empowerment evaluation exercises they developed over the past three years for increasing the primary prevention capacity of service providers and the state violence prevention coalition. These tools have served to align national, state, and local goals, objectives, and outcomes regarding violence prevention efforts. Moreover, they have worked in tandem with the Getting to Outcomes framework to increase accountability by integrating evaluation activities into service provision and state planning. Participants will have the opportunity to work with the tools to increase their collaborative evaluation capacity-building skills in this hands-on workshop.
Session Title: Non-profits & Foundations Evaluation TIG Business Meeting
Business Meeting Session 365 to be held in Lone Star D on Thursday, Nov 11, 4:30 PM to 6:00 PM
Sponsored by the Non-profit and Foundations Evaluation TIG
TIG Leader(s):
Lester Baxter, Pew Charitable Trusts, l.baxter@pewtrusts.org
Charles Gasper, Missouri Foundation for Health, cgasper@mffh.org
Joanne Carman, University of North Carolina at Charlotte, jgcarman@uncc.edu
Helen Davis Picher, William Penn Foundation, hdpicher@williampennfoundation.org
Session Title: Fundamentals of Power Analysis and Sample Size Determination
Demonstration Session 366 to be held in Lone Star E on Thursday, Nov 11, 4:30 PM to 6:00 PM
Sponsored by the Quantitative Methods: Theory and Design TIG
Presenter(s):
Steven Pierce, Michigan State University, pierces1@msu.edu
Abstract: In quantitative studies, statistical power (the probability of detecting an effect that actually exists) is closely tied to sample size. Evaluators can use power analysis to plan what sample size should be targeted during data collection to make best use of limited evaluation resources. This introductory session will cover the fundamental concepts involved in using power analysis and describe how power analysis can be used to improve the quality of a quantitative evaluation study. It will define key terms, explain why power analysis is important, and then discuss practical issues such as how to pick a power analysis method that matches your hypotheses, how to come up with reasonable numbers to plug into power analysis formulas, and why it is important to examine how sensitive the results are to your assumptions. Some examples will be presented, and software tools and other resources will be recommended.
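The kind of calculation the session describes is straightforward to script. Below is a minimal, illustrative sketch (not taken from the session materials) using Python's statsmodels package to solve for the per-group sample size of a two-sample t-test and to check how sensitive that number is to the assumed effect size; all numbers are conventional placeholders, not the presenter's.

```python
# Minimal power-analysis sketch (illustrative only; not from the session).
# Solves for the per-group sample size of a two-sample t-test, then shows
# how sensitive that number is to the assumed effect size.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()

# Participants needed per group to detect a medium effect (Cohen's d = 0.5)
# at alpha = 0.05 with 80% power.
n_per_group = analysis.solve_power(effect_size=0.5, alpha=0.05, power=0.8)
print(f"n per group: {n_per_group:.0f}")  # roughly 64

# Sensitivity check: required sample size under other assumed effect sizes.
for d in (0.2, 0.3, 0.5, 0.8):
    n = analysis.solve_power(effect_size=d, alpha=0.05, power=0.8)
    print(f"d = {d}: n per group = {n:.0f}")
```

The same call answers the reverse question: with a fixed, budget-limited sample size, pass `nobs1` and leave `power=None` to see what power the study can achieve.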
Session Title: Cluster, Multi-site, and Multi-level TIG Business Meeting and Demonstration: A Mixed Methods Approach to Measurement for Multi-site Evaluation
Business Meeting Session 367 to be held in Lone Star F on Thursday, Nov 11, 4:30 PM to 6:00 PM
Sponsored by the Cluster, Multi-site and Multi-level Evaluation TIG
TIG Leader(s):
Rene Lavinghouze, Centers for Disease Control and Prevention, shl3@cdc.gov
Martha Ann Carey, Maverick Solutions, marthaann123@sbcglobal.net
Presenter(s):
Fred Springer, Evaluation, Management & Training Associates Inc, fred@emt.org
Wendi Siebold, Evaluation, Management & Training Associates Inc, wendi@emt.org
Carrie Petrucci, Evaluation, Management & Training Associates Inc, cpetrucci@emt.org
Abstract: This mixed-methods approach to measurement for use in multi-site evaluations (MSEs) treats natural diversity in program context and implementation as a learning opportunity rather than a challenge to internal validity. Capitalizing on the differences within and across multiple sites, knowledge for evidence-based practice is built inductively by measuring naturally occurring variability across sites and using it to identify robust relationships with program effects. Multi-level analysis examines relations and interactions at and across individual, group, process, and context levels, and provides strong tests of external validity. Measurement is developed inductively and combines available data with primary data collection, including observational data, semi-structured interviews, standardized instruments, and administrative data. This multi-site, multi-level approach enhances the quality of evaluation by using a “site-level protocol” with measures that are pertinent to practice. The epistemological foundation will be discussed, followed by explicit examples of how this approach is implemented from design to analysis.
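As a purely illustrative sketch of the multi-level analysis the abstract mentions (not drawn from the presenters' protocol), a random-intercept model with individuals nested in sites can test whether a program effect holds up across naturally varying sites. Every variable name and value below is hypothetical.

```python
# Illustrative multi-level (mixed-effects) sketch; all data are synthetic and
# all variable names are hypothetical, not from the presenters' protocol.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_sites, n_per_site = 20, 30
site = np.repeat(np.arange(n_sites), n_per_site)
dosage = rng.uniform(0, 10, n_sites)[site]            # site-level implementation measure
engagement = rng.normal(0, 1, n_sites * n_per_site)   # individual-level measure
site_effect = rng.normal(0, 1, n_sites)[site]         # unobserved between-site variability
outcome = 0.4 * dosage + 0.6 * engagement + site_effect + rng.normal(0, 1, site.size)

data = pd.DataFrame({"site": site, "dosage": dosage,
                     "engagement": engagement, "outcome": outcome})

# Random intercept per site absorbs between-site variability; the fixed
# effects test whether program relationships are robust across sites.
model = smf.mixedlm("outcome ~ dosage + engagement", data, groups=data["site"])
result = model.fit()
print(result.summary())
```

The design choice mirrors the abstract's logic: rather than treating site differences as noise, the model measures them (the random intercept variance) and asks which program-outcome relationships persist across that variation.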
Roundtable Rotation I: Toward Universal Design for Evaluation: Continuing the Conversation
Roundtable Presentation 368 to be held in MISSION A on Thursday, Nov 11, 4:30 PM to 6:00 PM
Sponsored by the Special Needs Populations TIG
Presenter(s):
Jennifer Sulewski, University of Massachusetts, Boston, jennifer.sulewski@umb.edu
Abstract: Universal design refers to designing products or programs so that they are accessible to everyone. Originally conceived in the context of architecture and physical accessibility for people with disabilities, the concept of Universal Design has been adapted to a variety of contexts, including technology, education, and the design of programs and services. At Evaluation 2009, a panel presented on the idea of Universal Design for Evaluation, drawing on the panelists’ individual experiences conducting research with people with and without disabilities. As a follow-up to last year’s session, we invite this year’s conference attendees to a discussion of our collective experiences conducting evaluations with people with disabilities and other vulnerable populations. We will give a brief recap of last year’s session but plan to spend most of the session discussing promising practices, lessons learned, and what Universal Design might look like applied to the evaluation field.
Roundtable Rotation II: Special Populations: Strategies for Collecting Data, Giving Voice
Roundtable Presentation 368 to be held in MISSION A on Thursday, Nov 11, 4:30 PM to 6:00 PM
Sponsored by the Special Needs Populations TIG
Presenter(s):
Sheila A Arens, Mid-Continent Research for Education and Learning, sarens@mcrel.org
Andrea Beesley, Mid-Continent Research for Education and Learning, abeesely@mcrel.org
Abstract: The Guiding Principles direct evaluators to attend to differences among participants. Paying attention to diversity and actively seeking to include the voices of those who may be marginalized is not, however, just a matter of abiding by the Guiding Principles; it is a matter of the technical adequacy of data and hence the validity of evaluative endeavors. Presenters will draw on their experiences collecting data from special populations. Through a series of questions and scenarios, presenters will discuss the importance of clarifying values, issues in selecting participants (and concomitant concerns about attrition), planning for accommodations to data collection instruments, and following through. This roundtable is relevant to seasoned and newer evaluators. Discussing scenarios and sharing experiences will prepare participants for evaluations that include special populations.
Session Title: Historical Shifts in Evaluation Policy and Evaluation Practice: What We've Learned About Quality Evaluation
Multipaper Session 369 to be held in MISSION B on Thursday, Nov 11, 4:30 PM to 6:00 PM
Sponsored by the
Chair(s):
Ruth Anne Gigliotti, Synthesis Professional Services Inc., rgigliotti@synthesisps.com
Discussant(s):
Catherine Callow-Heusser, EndVision Research and Evaluation LLC, cheusser@endvision.net
Session Title: Independent Consulting TIG Business Meeting
Business Meeting Session 370 to be held in BOWIE A on Thursday, Nov 11, 4:30 PM to 6:00 PM
Sponsored by the Independent Consulting TIG
TIG Leader(s):
Frederic Glantz, Kokopelli Associates LLC, fred@kokopelliassociates.com
Rita Fierro, Independent Consultant, fierro.evaluation@gmail.com
Michelle Baron, The Evaluation Baron LLC, michelle@evaluationbaron.com
Session Title: Evaluation Quality From a Federal Perspective
Panel Session 371 to be held in BOWIE B on Thursday, Nov 11, 4:30 PM to 6:00 PM
Sponsored by the Multiethnic Issues in Evaluation TIG
Chair(s):
Elmima Johnson, National Science Foundation, ejohnson@nsf.gov
Abstract: “Evaluation Quality” has been identified as the conference theme, with a focus on its conceptualization and operationalization. Another area of importance is evaluation utilization. This panel will discuss the definitions of evaluation and “new ways of thinking about the systematic assessment of our evaluation work” from a Federal perspective, i.e., that of the National Science Foundation (NSF), its grantees and contractors, and the US Government Accountability Office (GAO). The utilization of a contextual/cultural perspective will be woven throughout the discussions of the various evaluation mechanisms described.
Session Title: Indigenous Peoples in Evaluation TIG Business Meeting
Business Meeting Session 372 to be held in BOWIE C on Thursday, Nov 11, 4:30 PM to 6:00 PM
Sponsored by the Indigenous Peoples in Evaluation TIG
TIG Leader(s):
Katherine A Tibbetts, Kamehameha Schools, katibbet@ksbe.edu
Kalyani Rai, University of Wisconsin, Milwaukee, kalyanir@uwm.edu
Joan LaFrance, Mekinak Consulting, lafrancejl@gmail.com
Roundtable Rotation I: Designing for Change: The Experience of the Quitline Iowa Evaluation
Roundtable Presentation 373 to be held in GOLIAD on Thursday, Nov 11, 4:30 PM to 6:00 PM
Sponsored by the Alcohol, Drug Abuse, and Mental Health TIG
Presenter(s):
Disa Cornish, University of Northern Iowa, disa.cornish@uni.edu
Gene Lutz, University of Northern Iowa, gene.lutz@uni.edu
Abstract: Since 2008, the Iowa Tobacco Cessation Program Evaluation has comprehensively evaluated state-funded tobacco cessation programs. One of these programs, Quitline Iowa, is a telephone-based counseling service that also offers a free two-week supply of nicotine replacement therapy (gum, patches, or lozenges) to Iowans who are trying to quit using tobacco products. The evaluation of Quitline Iowa includes two methods: follow-up interviews with Quitline Iowa callers and secret shopper calls to 1-800-QUIT-NOW. Follow-up interviews were conducted with participants 3, 6, and 12 months after their first call to Quitline Iowa. A questionnaire developed by the evaluator assessed changes in tobacco use behaviors and aspects of the callers’ experiences. In July 2010, the evaluation follow-up interview protocol will change to a protocol designed by the Centers for Disease Control and Prevention (CDC) for all state quitline evaluations to use. This presentation will discuss challenges and lessons learned through the change process.
Roundtable Rotation II: Adapting the Strategic Prevention Framework Model for Use in Suicide Prevention and Other Abbreviated Funding Cycles Benefiting From Grantee, Stakeholder, and Evaluator Collaboration
Roundtable Presentation 373 to be held in GOLIAD on Thursday, Nov 11, 4:30 PM to 6:00 PM
Sponsored by the Alcohol, Drug Abuse, and Mental Health TIG
Presenter(s):
Trena Anastasia, University of Wyoming, tanastas@uwyo.edu
Trish Worley, University of Wyoming, tworley1@uwyo.edu
Abstract: This session will demonstrate a process for adapting the Strategic Prevention Framework model to abbreviated grant cycles while maintaining fidelity to the process. Ideas for front-loading the needs assessment to jump-start coalition buy-in and move toward rapid adoption of needs-based prevention strategies will be shared. The example demonstrates the adaptation to a three-year grant cycle for suicide prevention in which baseline outcome measures were identified during the needs assessment phase. Bring your ideas for building community evaluation partnerships within the constraints of funding timelines to share with the group.
Roundtable Rotation I: Evaluating Innovation and Capacity Building in Arts Organizations: Challenges and Lessons Learned in Capturing the Complexity
Roundtable Presentation 374 to be held in SAN JACINTO on Thursday, Nov 11, 4:30 PM to 6:00 PM
Sponsored by the Evaluating the Arts and Culture TIG
Presenter(s):
Mary Piontek, EmcArts Inc, mpiontek@umich.edu
Abstract: In a time of unprecedented change for arts institutions, leaders recognize that business as usual cannot assure organizational health and success in the marketplace. Thriving organizations will be those that increase their emphasis on innovation, and make the most compelling case by demonstrating creative adaptation in their thinking and nimbleness in their response to change. This session will discuss evaluation strategies being used within the evolving, unpredictable, and non-linear context of innovation to (a) support intentionality in making change; (b) document how, when, why, and by whom changes were made; (c) critically explore the results and learnings, expected and unexpected; and (d) assist organizations in articulating how power, decision-making processes, policies, knowledge, and resources are used to promote and institutionalize innovation. This work draws upon developmental and formative evaluation practices, traditional program design and evaluation tools, and customized instruments and indicators for assessing capacity and impact.
Roundtable Rotation II: The Beauty of Internal Evaluation in the Arts: Using Metaphors and Symbols to Develop the Evaluation Capacity of the Board and Staff
Roundtable Presentation 374 to be held in SAN JACINTO on Thursday, Nov 11, 4:30 PM to 6:00 PM
Sponsored by the Evaluating the Arts and Culture TIG
Presenter(s):
Kathleen Norris, Plymouth State University, knorris@plymouth.edu
Abstract: A challenge that exists within non-profit arts organizations is the development of evaluation capacity. Though the Board and staff of the organization with whom I work have the capacity to respond to requests for data from funders of discrete projects, there is a need for a larger evaluation context that will assist with strategic planning and provide information about the organization as a whole. The use of metaphors and symbols in the process of evaluation development has engaged the Board and staff in new ways and has assisted in the integration of evaluation within the routine activities of the staff. In this roundtable presentation and discussion, the metaphors, symbols, and processes used within this organization will be shared, and participants will be asked to discuss whether they can imagine using metaphors and symbols to engage members of their organizations and, if so, what those might be.
Session Title: Evaluation Without Borders: Lessons From Other Countries
Multipaper Session 375 to be held in TRAVIS A on Thursday, Nov 11, 4:30 PM to 6:00 PM
Sponsored by the Assessment in Higher Education TIG
Session Title: Evaluation Managers and Supervisors TIG Business Meeting and Panel: Reflections on Evaluation Management Expertise and Competencies From Two Perspectives
Business Meeting with Panel Session 376 to be held in TRAVIS B on Thursday, Nov 11, 4:30 PM to 6:00 PM
Sponsored by the Evaluation Managers and Supervisors TIG
TIG Leader(s):
Ann Maxwell, United States Department of Health and Human Services, ann.maxwell@oig.hhs.gov
Sue Hewitt, Health District of Northern Larimer County, shewitt@healthdistrict.org
Laura Feldman, University of Wyoming, lfeldman@uwyo.edu
Chair(s):
Thomas Horwood, ICF International, thorwood@icfi.com
Abstract: “Managing evaluation is an almost invisible practice, one little studied and little written about. Illuminating everyday practice and perspectives on it serves to make the taken-for-granted, the seemingly invisible and often ineffable, available” (Baizerman & Compton, 2009, p. 8). This session will feature two evaluation managers who work together to manage several evaluation studies but who represent different perspectives. One perspective (the client) is that of an evaluation manager from a public agency who oversees multiple studies simultaneously. The other perspective (the contractor) is that of an evaluation manager from a consulting firm who also manages multiple studies at the same time. Each panelist will reflect individually on the types of evaluation and management expertise and competencies. Finally, the two panelists will compare and contrast these perspectives based on their roles in their individual organizations and end with an opportunity for attendees to offer their own observations or ask questions.
Session Title: Face to Face With the Authors of the Needs Assessment Kit: Challenging Questions (With a Twist) and Hopefully Meaningful Answers
Panel Session 377 to be held in TRAVIS C on Thursday, Nov 11, 4:30 PM to 6:00 PM
Sponsored by the Needs Assessment TIG
Chair(s):
James Altschuld, The Ohio State University, altschuld.1@osu.edu
Discussant(s):
Hsin-Ling Hung, University of Cincinnati, hunghg@ucmail.uc.edu
Abstract: Needs Assessment (NA) is a necessary part of the process of planning, implementing, and evaluating successful programs. The Needs Assessment KIT (five integrated books on the process) was published in late 2009. Its goal was to enhance the practice of NA. This panel is an opportunity to question the authors via a lively and highly interactive format: questions and comments will be solicited before the conference as well as from the audience during the session. The discussants will use themes from these to guide the discussion; the panelists will not have access to the issues and thoughts submitted in advance. Thus the discussion will be unscripted and spontaneous in nature.
Session Title: Health Indicator Systems for Evaluation of Local, State, and National Chronic Disease Prevention and Control Initiatives
Multipaper Session 378 to be held in TRAVIS D on Thursday, Nov 11, 4:30 PM to 6:00 PM
Sponsored by the Health Evaluation TIG
Chair(s):
Todd Rogers, Public Health Institute, txrogers@pabell.net
Session Title: Quality by Design: Statewide Human Services Workforce Evaluation Using an Integrated Framework
Panel Session 379 to be held in INDEPENDENCE on Thursday, Nov 11, 4:30 PM to 6:00 PM
Sponsored by the Human Services Evaluation TIG
Chair(s):
Chris Mathias, California Social Work Education Center, cmathias@berkeley.edu
Discussant(s):
Todd Franke, University of California, Los Angeles, tfranke@ucla.edu
Abstract: This state's university and human services agency partnership is a consortium of the state's schools of social work, public human service agencies, and other related professional organizations. It facilitates the integration of education and practice to assure effective, culturally competent service delivery in the human services. The partnership's goals are to: re-professionalize public human services through a specialized education program for public human services, develop a continuum that connects pre-service education to in-service training, engage in research and evaluation to develop evidence-based practices, and advocate for responsive policies and resources to support practice improvement and client outcomes. Evaluations from three of the partnership's programs will be presented. Plans for integrating the evaluations using theoretical constructs and longitudinal design as guiding principles will be discussed, with the goal of improving the ability to assess the impact of these programs on practice and client outcomes.
Session Title: Assessing Change Over Time
Multipaper Session 380 to be held in PRESIDIO A on Thursday, Nov 11, 4:30 PM to 6:00 PM
Sponsored by the Quantitative Methods: Theory and Design TIG
Chair(s):
Wendy Garrard, University of Michigan, wgarrard@umich.edu
Session Title: A Systems Approach to Building and Assessing Evaluation Plan Quality
Panel Session 381 to be held in PRESIDIO B on Thursday, Nov 11, 4:30 PM to 6:00 PM
Sponsored by the Organizational Learning and Evaluation Capacity Building TIG and the Systems in Evaluation TIG
Chair(s):
Jennifer Urban, Montclair State University, urbanj@mail.montclair.edu
Discussant(s):
William M Trochim, Cornell University, wmt1@cornell.edu
Abstract: The Cornell Office for Research on Evaluation (CORE) uses a systems-based approach to program evaluation and planning that is operationalized through the Systems Evaluation Protocol (SEP). The SEP has been developed, tested, and refined through “Evaluation Partnerships” established with forty education programs in two contexts: Cornell Cooperative Extension, and Outreach Offices in NSF Materials Research, Science and Engineering Centers. Drawing on the SEP, evaluation theory, and experience with these Partnerships, CORE’s concept of evaluation plan quality emphasizes the quality of the program model underlying the plan; how well an evaluation “fits” the program; and the “internal alignment” of the evaluation plan. The panel presents our definition of evaluation plan quality, tools we have developed to begin to assess quality, how we operationalize and observe the development of quality in the Evaluation Partnerships, and education research on the importance of inquiry-based approaches to learning that are embedded in the Evaluation Partnerships.
Session Title: Data for All: Democratizing Data Without Compromising Quality
Demonstration Session 382 to be held in PRESIDIO C on Thursday, Nov 11, 4:30 PM to 6:00 PM
Sponsored by the
Presenter(s):
Sarah Kohler Chrestman, Louisiana Public Health Institute, skohler@lphi.org
Lisanne Brown, Louisiana Public Health Institute, lbrown@lphi.org
Abstract: As the greater New Orleans (GNO) region is continuing to rebuild, the need for democratized data at the neighborhood level continues to grow. Instead of maximizing efforts, there is a great deal of duplication of data collection because agencies, organizations, and residents are unaware of what is already available. The Orleans Neighborhood Health Implementation Plan (ONHIP) is working to improve the availability of data through the development and use of a public website with neighborhood-specific data and interactive mapping and query capability. This presentation will discuss which data sources are publicly available, the issues to consider when democratizing data, and its benefits and challenges. Data quality is of utmost importance, as poor data can cause more harm than good. Participants will learn methods, including education, to ensure the quality of the data is maintained and to reduce any opportunities to misrepresent data.
Roundtable Rotation I: Evaluation Goes to College: The Collaborative Evaluation of a Graduate Program
Roundtable Presentation 383 to be held in BONHAM A on Thursday, Nov 11, 4:30 PM to 6:00 PM
Sponsored by the Collaborative, Participatory & Empowerment Evaluation TIG
Presenter(s):
Seriashia Chatters, University of South Florida, schatter@mail.usf.edu
EunKyeng Baek, University of South Florida, ebaek@mail.usf.edu
Thanh Pham, University of South Florida, tvpham2@mail.usf.edu
Yvonne Hunter, University of South Florida, yohunter@mail.usf.edu
Abstract: A need was identified at a large, public US university to determine the level of satisfaction of students and faculty in a Counselor Education graduate program. There is a wealth of research available regarding the evaluation of K-12 programs; however, there is limited information regarding the evaluation of higher education programs, especially evaluations applying a collaborative approach. This evaluation was conducted to identify student and faculty satisfaction levels with the graduate program and to recognize the differences between evaluating a K-12 program and evaluating a graduate program. In order to identify the needs and concerns of all relevant stakeholders, a Collaborative Evaluation approach was utilized. We will discuss how the process and procedures of the collaborative approach were implemented, and the strengths and challenges of utilizing this method in evaluating a graduate program will be addressed.
Roundtable Rotation II: Working Together to Design Effective Evaluation Tools
Roundtable Presentation 383 to be held in BONHAM A on Thursday, Nov 11, 4:30 PM to 6:00 PM
Sponsored by the Collaborative, Participatory & Empowerment Evaluation TIG
Presenter(s):
Rebeca Diaz, WestEd, rdiaz@wested.org
Abstract: This presentation will discuss a collaborative approach to developing effective evaluation instruments with key stakeholders carrying out a federal education grant. The main goals of the grant are to increase teacher content knowledge in U.S. history, enhance teacher practice, and increase student learning. The presenter has seven years of experience evaluating these federal grants designed to provide professional development for history teachers, and continues to explore new methods to effectively measure teacher outcomes. The evaluation approach, which consists of both qualitative and quantitative methods, is collaborative in nature. The evaluator employs an approach that involves not only project leaders but also the teachers involved in the program.
Session Title: San Antonio River Improvements Project: Field Trip to Ecosystem Restoration Sites
Demonstration Session 384 to be held in OFF SITE FIELDTRIP on Thursday, Nov 11, 4:30 PM to 6:00 PM
Sponsored by the Environmental Program Evaluation TIG
Presenter(s):
Annelise Carleton-Hug, Trillium Associates, annelise@trilliumassociates.com
Abstract: This demonstration session offers an opportunity to learn about the massive multi-agency river restoration efforts underway on the San Antonio River just south of downtown San Antonio. The session includes a site visit and walking tour of the Eagleland and Mission Reach portions of the river that were previously channelized for flood control. The current project is restoring a more natural channel morphology and native plants. The tour will be led by a specialist from the San Antonio River Authority, and discussions will include the pre-project assessment, project goals, and evaluation and monitoring plans. Additional topics will be the challenges of ecosystem restoration and conservation at the urban interface and addressing recreational uses. Space for this demonstration/field trip is limited.
Session Title: Education Evaluation: Connecting Professional Development to Changes in Classroom Practice
Multipaper Session 385 to be held in BONHAM C on Thursday, Nov 11, 4:30 PM to 6:00 PM
Sponsored by the Pre-K - 12 Educational Evaluation TIG
Chair(s):
Bianca Montrosse, Western Carolina University, bianca.montrosse@gmail.com
Discussant(s):
Susan Connors, University of Colorado, Denver, susan.connors@ucdenver.edu
Session Title: Addressing Schools as Organizations in Educational Evaluation
Multipaper Session 386 to be held in BONHAM D on Thursday, Nov 11, 4:30 PM to 6:00 PM
Sponsored by the Pre-K - 12 Educational Evaluation TIG
Chair(s):
Diane Binder, The Findings Group LLC, diane.binder@thefindingsgroup.com
Discussant(s):
Chad Green, Loudoun County Public Schools, chad.green@loudoun.k12.va.us
Session Title: Evaluating the Science of Discovery in Complex Health Systems: Challenges and Opportunities
Panel Session 387 to be held in BONHAM E on Thursday, Nov 11, 4:30 PM to 6:00 PM
Sponsored by the Research, Technology, and Development Evaluation TIG
Chair(s):
Alison Buchan, University of British Columbia, abuchan@medd.med.ubc.ca
Discussant(s):
Alison Buchan, University of British Columbia, abuchan@medd.med.ubc.ca
Abstract: Complex health problems such as chronic disease or pandemics require knowledge that transcends disciplinary boundaries in order to generate solutions. Such transdisciplinary discovery requires researchers to work and collaborate across boundaries, combining elements of basic and applied science. At the same time, calls for more interdisciplinary health science acknowledge that there are few metrics to evaluate the products associated with these new ways of working. The Research on Academic Research (RoAR) initiative was established to evaluate the process of discovery and impact of collaboration that emerged through the Life Sciences Institute at the University of British Columbia, a state-of-the-art facility designed to support researchers who self-organize around specific health problems rather than disciplines. A logic model depicting the factors influencing such collaboration is presented, along with a multi-method evaluation plan to assist understanding of the discovery process in this new environment and develop new metrics for assessing collaborative impact.
Session Title: Government Evaluation TIG Business Meeting and Panel: Happy Anniversary to Us! Celebrating Twenty Years of Government Evaluation
Business Meeting with Panel Session 388 to be held in Texas A on Thursday, Nov 11, 4:30 PM to 6:00 PM
Sponsored by the Government Evaluation TIG
TIG Leader(s):
Stanley Capela, HeartShare Human Services of New York, stan.capela@heartshare.org
David J Bernstein, Westat, davidbernstein@westat.com
Sam Held, Oak Ridge Institute for Science and Education, sam.held@orau.org
Chair(s):
Stanley Capela, HeartShare Human Services of New York, stan.capela@heartshare.org
Abstract: In 1990, the theme of the AEA conference was Evaluation and the Formulation of Public Policy, a topic that is central to the role of evaluation in government. During the 1990 conference, a session attended by about 20 people gathered to discuss the possibility of establishing a State and Local Government Evaluation Topical Interest Group (TIG), which was approved by the AEA Board in early 1991. In 2005, the TIG’s focus was broadened, and the name was changed to the Government Evaluation TIG. This panel will celebrate the 20th anniversary of the TIG with three highly relevant presentations, including a panel discussion with the current and past chairs of the Government Evaluation TIG, a keynote address by Joe Wholey, one of the leading experts on government evaluation in the United States, and the Government Evaluation TIG’s annual business meeting.
Session Title: Using Logic Models to Facilitate Comparisons of Evaluation Theory
Multipaper Session 389 to be held in Texas B on Thursday, Nov 11, 4:30 PM to 6:00 PM
Sponsored by the Theories of Evaluation TIG
Chair(s):
Marv Alkin, University of California, Los Angeles, alkin@gseis.ucla.edu
Discussant(s):
Robin Lin Miller, Michigan State University, mill1493@msu.edu
Session Title: Examining the Mixing in Mixed Methods Evaluation
Multipaper Session 390 to be held in Texas C on Thursday, Nov 11, 4:30 PM to 6:00 PM
Sponsored by the
Chair(s):
Jori Hall, University of Georgia, jorihall@uga.edu
Discussant(s):
Mika Yamashita, Academy for Educational Development, myamashita@aed.org
Session Title: Linking Professional Associations to Advance the Study of Science and Innovation Policy
Panel Session 391 to be held in Texas D on Thursday, Nov 11, 4:30 PM to 6:00 PM
Sponsored by the Research, Technology, and Development Evaluation TIG
Chair(s):
Susan Cozzens, Georgia Institute of Technology, susan.cozzens@iac.gatech.edu
Abstract: This panel of representatives of professional associations with interests in science and technology policy and evaluation will begin a dialogue to strengthen the linkages among these communities. After each presents an overview of their association, the topics that are central to their discussion, and “hot topics,” they will brainstorm specific ways they might interact more in the future. Organizations represented, in addition to the AEA Research, Technology and Development Topical Interest Group, are the Atlanta Conference on Science and Innovation, the Association for Public Policy Analysis and Management (APPAM), and the Academy of Management. Interaction with the audience will add other viewpoints, such as that of the Technology Transfer Society. Strengthening this community is a goal of two U.S. federal initiatives, the Science of Science Policy in the White House Office of Science and Technology Policy and the Science of Science and Innovation Policy program at the National Science Foundation.
Session Title: Case Studies in Evaluation Use
Multipaper Session 392 to be held in Texas E on Thursday, Nov 11, 4:30 PM to 6:00 PM
Sponsored by the Evaluation Use TIG
Chair(s):
Jennifer Iriti, University of Pittsburgh, iriti@pitt.edu
Session Title: Research on Evaluation Standards and Methods
Multipaper Session 393 to be held in Texas F on Thursday, Nov 11, 4:30 PM to 6:00 PM
Sponsored by the Research on Evaluation TIG
Chair(s):
Matthew Galen, Claremont Graduate University, matthew.galen@cgu.edu
Session Title: Evaluation Methodology
Multipaper Session 394 to be held in CROCKETT A on Thursday, Nov 11, 4:30 PM to 6:00 PM
Sponsored by the Extension Education Evaluation TIG
Chair(s):
Paul Pope, Texas A&M University, ppope@aged.tamu.edu
Session Title: Process Lessons for Applied Research and Evaluation from Capital Projects
Demonstration Session 395 to be held in CROCKETT B on Thursday, Nov 11, 4:30 PM to 6:00 PM
Sponsored by the Business and Industry TIG
Presenter(s):
Kate Rohrbaugh, Independent Project Analysis, krohrbaugh@ipaglobal.com
Abstract: Capital projects in industry are projects that require the investment of significant capital to maintain or improve a capital asset. In this demonstration, the presenter will provide an overview of the practices that are considered adequate planning in the world of capital projects and identify the parallels (of which there are many) for applied research and evaluation. These parallels were identified during the development of a research work process intended to improve and maintain the intellectual assets of Independent Project Analysis (IPA), a management consulting firm in Virginia that offers evaluation services and conferences for companies in the process industries. During this demonstration, the audience will become familiar with the practices and the phases of capital projects and how they apply to research and evaluation. Additionally, the presenter will identify areas of divergence and share implementation challenges.
Session Title: Improving the Quality of Peacebuilding Evaluation
Demonstration Session 396 to be held in CROCKETT C on Thursday, Nov 11, 4:30 PM to 6:00 PM
Sponsored by the International and Cross-cultural Evaluation TIG
Presenter(s):
Cheyanne Scharbatke-Church, Child Development Associate, cheyanne.church@tufts.edu
Abstract: This demonstration session will outline two new tools specific to the evaluation of peacebuilding in the international arena. The demonstration is targeted at evaluators working in conflict and post-conflict settings evaluating peacebuilding projects, though all international social change projects may find the demonstration has utility. The presentation is based on the results of over seven years of collaborative learning with peacebuilding practitioners and donors, led by the Reflecting on Peace Practice project of CDA. This collaborative learning included, but was not limited to, over 15 evaluations that sought to apply these tools and lessons, as well as the experience of incorporating key findings into the OECD-DAC guidance on peacebuilding evaluation. The demonstration will focus specifically on how to assess the effect of peacebuilding programming at the societal level, often called peace writ large. It will also demonstrate how to adopt a systemic approach to the evaluation of peacebuilding.
Session Title: Haiti: Challenges in Emergency Response and Recovery Bring Challenges (and Innovation) in Evaluation
Panel Session 397 to be held in CROCKETT D on Thursday, Nov 11, 4:30 PM to 6:00 PM
Sponsored by the Disaster and Emergency Management Evaluation TIG
Chair(s):
Nan Buzard, American Red Cross, buzardn@usa.redcross.org
Abstract: On January 12, 2010, a magnitude 7 earthquake struck the Haitian coast 10 miles from the capital of Port-au-Prince, causing massive damage and significant loss of life. The American Red Cross (ARC) delegates working in Haiti were among the first to respond, and the public was generous in providing funds to ARC for emergency response and recovery programs. In coordination with the International Federation of Red Cross and Red Crescent Societies (IFRC), the International Committee of the Red Cross, and UN and US Government agencies, ARC continued the immediate emergency response while an IFRC needs assessment team deployed to conduct field assessments in eleven sector areas, to begin defining medium-term recovery priorities. This panel by ARC staff (including the Lead of the IFRC Recovery Assessment Team) will discuss how this unprecedented urban disaster challenged existing models of needs assessment, data collection and analysis, and M&E design, leading to useful innovation.
Session Title: Student Centered Issues in Evaluation
Multipaper Session 398 to be held in SEGUIN B on Thursday, Nov 11, 4:30 PM to 6:00 PM
Sponsored by the Graduate Student and New Evaluator TIG
Chair(s):
Thelma Woodard, University of Tennessee, Knoxville, twoodar2@utk.edu
Session Title: Impact Evaluation at the Millennium Challenge Corporation (MCC): Theory, Application, and Complications
Panel Session 399 to be held in REPUBLIC A on Thursday, Nov 11, 4:30 PM to 6:00 PM
Sponsored by the International and Cross-cultural Evaluation TIG
Chair(s):
Marc Shapiro, Millennium Challenge Corporation, shapiromd@mcc.gov
Discussant(s):
Jack Molyneaux, Millennium Challenge Corporation, molyneauxjw@mcc.gov
Abstract: The Millennium Challenge Corporation (MCC) is committed to conducting rigorous independent impact evaluations of its programs as an integral part of its focus on results. MCC expects that the results of its impact evaluations will help guide future investment decisions and contribute to a broader understanding in the field of development effectiveness. MCC’s impact evaluations involve a variety of methods chosen as most appropriate to the context. This panel first provides an overview of evaluation at MCC, including the definition of evaluation objectives, the number and variety of approaches to evaluation supported, the criteria that underlie decisions about whether and how to evaluate, and the linkages between decisions to fund projects and eventual evaluation results. Next, the panel provides three examples of evaluations being conducted across sectors, involving different methods across countries. The presenters will discuss the challenges involved in implementing these evaluations and lessons learned.
Session Title: Health Evaluation TIG Business Meeting and Presentation: Bridging the Evidence Gap in Obesity Prevention - A Framework to Inform Decision Making
Business Meeting Session 400 to be held in REPUBLIC B on Thursday, Nov 11, 4:30 PM to 6:00 PM
Sponsored by the Health Evaluation TIG
TIG Leader(s):
Robert LaChausse, California State University, San Bernardino, rlachaus@csusb.edu
Jenica Huddleston, University of California, Berkeley, jenhud@berkeley.edu
Debora Goldberg, Virginia Commonwealth University, goetzdc@vcu.edu
Chair(s):
Laura Leviton, Robert Wood Johnson Foundation, llevito@rwjf.org
Presenter(s):
Shiriki Kumanyika, University of Pennsylvania, skumanyi@mail.med.upenn.edu
Discussant(s):
Jennifer Greene, University of Illinois at Urbana-Champaign, jcgreene@illinois.edu
Madhabi Chatterji, Columbia University, mb1434@columbia.edu
Abstract: In 2008, the Institute of Medicine of The National Academies convened a committee of experts to examine innovative ways in which the existing evidence base and research on obesity and obesity prevention programs could be accessed, evaluated, and made useful to a wide range of policy-makers and decision-makers. The charge was to develop a framework to guide decision-makers in locating and using evidence to make effective decisions. A practical, action-oriented framework was developed to guide policymakers on how to use the available base of research evidence and supplement it with complementary forms of credible evidence relevant to problem-solving and decision-making in obesity-prevention contexts. The framework, contained in the report “Bridging the Evidence Gap in Obesity Prevention: A Framework to Inform Decision Making,” was released in April 2010. This session will unveil the L.E.A.D. framework for evidence-based decision-making, which speaks directly to the theme of the 2010 AEA conference: Evaluation Quality.
Session Title: Building Capacity for Youth Participatory Evaluation
Panel Session 401 to be held in REPUBLIC C on Thursday, Nov 11, 4:30 PM to 6:00 PM
Sponsored by the Collaborative, Participatory & Empowerment Evaluation TIG
Chair(s):
Jane Powers, Cornell University, jlp5@cornell.edu
Discussant(s):
Shep Zeldin, University of Wisconsin, Madison, rszeldin@wisc.edu
Abstract: As the field of Youth Participatory Evaluation (YPE) has grown, a variety of resources have been developed to support its implementation and enhance its practice. Our four panelists have had extensive experience engaging youth in a variety of participatory evaluation projects, roles, and contexts (including cyber environments). They will describe effective training approaches and strategies, and share their collective lessons learned in building the capacity of youth and adults to carry out YPE efforts. This will include their experience with proven effective curricula, tools, and processes, and recommendations on how to successfully conduct YPE in a high quality, authentic manner. There will be ample time for dialogue with the audience to enable discussion about the application of these resources to potential YPE efforts.