
Session Title: Where Theory Meets Practice in Assessing Advocacy and Policy Change
Multipaper Session 711 to be held in BOWIE C on Saturday, Nov 13, 8:00 AM to 9:30 AM
Sponsored by the Advocacy and Policy Change TIG
Chair(s):
Ehren Reed, Innovation Network, ereed@innonet.org
How Do You Know You Are Making a Difference? A Collaborative Study With Social Justice Advocacy Organizations in Canada
Presenter(s):
Bessa Whitmore, Carleton University, elizabeth_whitmore@carleton.ca
Maureen Wilson, University of Calgary, mwilson@ucalgary.ca
Avery Calhoun, University of Calgary, calhoun@ucalgary.ca
Abstract: What is the meaning of success, and what factors or conditions contribute to it? These are the two questions addressed in a collaborative study with nine very diverse groups and organizations across Canada that are engaged in advocacy work. While policy change is certainly a key aspect of advocacy effectiveness, our findings broaden the understanding of what success means and how it is achieved. For example, raising public awareness, group functioning, and personal experience are regarded as significant aspects of success by those working directly in the field. Factors that contribute to effectiveness include activities that attract and retain participants, elements of group or organizational functioning, and a wide range of methods and strategies. In this presentation, we will discuss our methodology and findings in more detail, along with the implications for those working in the field of advocacy evaluation.
Hard Evidence for Hard Times: A Policy Analysis of the Rise of Evidence-based Practice in Evaluation of Home Visiting Programs Using Kingdon’s Multiple Streams Model
Presenter(s):
Stephen Edward McMillin, University of Chicago, smcmill@uchicago.edu
Abstract: This paper examines Kingdon’s problem, policy, and politics streams across two domains: 1) the rise of evidence-based practice in evaluation, and 2) the roughly contemporaneous rise of home visiting programs. This paper then elaborates five significant evaluation barriers that home visiting programs face: 1) Lack of external validity in evaluations due to inadequate attention to implementation issues in complex interventions such as home visiting (American Evaluation Association, 2008); 2) Lack of a coherent sociological perspective in evaluation that acknowledges how social conditions impact individual outcomes; 3) Truncated selection of only individually-focused outcome measures in evaluation that ignore social networks and variables; 4) Inadequate evaluation frameworks that fail to examine all relevant variable domains, such as neighborhood and program factors that influence implementation; and 5) Failure to evaluate home visiting as a broader safety net of potentially universal social supports rather than simply a narrowly targeted intervention.
The Politics and Ethics of Advocacy Evaluation
Presenter(s):
Denise L Baer, Johns Hopkins University, src_dlbaer@hotmail.com
Abstract: Can political advocacy be evaluated? A positive answer presumes that efforts to change the direction of public policy can be objectively identified and measured. The body of political science research on the nature of interests and on the terrain of public opinion change would say “no,” because no political interest is ever really neutral. Yet the field of advocacy evaluation is burgeoning as the advent of mission-oriented philanthropy leads foundations to shift from funding programs and services to seeking policy and systems change. This paper compares and contrasts existing approaches to the evaluation of advocacy with political science perspectives on the ethics and nature of interests and political change. Five types of strategies and their political impact are defined: agenda setting, public policy, communications and public relations, message framing, and electoral and GOTV (get-out-the-vote) strategies. Each uses different tools, has different effects, and requires a different time frame.
