Session Title: Policy Evaluation: Learning About What, When, and For Whom?

Panel Session 310 to be held in Carroll Room on Thursday, November 8, 9:35 AM to 11:05 AM

Sponsored by the Advocacy and Policy Change TIG

Chair(s): John Sherman, Headwaters Group, jsherman@headwatersgroup.com
Abstract:
This session will offer lessons learned about approaches to, challenges of, and rationale for undertaking evaluations of policy and advocacy efforts. The three presenters reflect on their hands-on experience and hard-earned lessons. They describe new approaches to policy change and advocacy evaluation that overcome some of these challenges, help them better understand which aspects of policy evaluation are most critical for which audiences, and show how evaluations can provide relevant lessons for each audience.
Let's Get Real About Real-Time Reporting

Julia Coffman, Harvard Family Research Project, jcoffman@evaluationexchange.org
In recent years, the term 'real time' has infiltrated the evaluation world. Evaluators use it to describe their reporting approaches, meaning that they report regularly so their work can inform ongoing learning and strategy decisions. Real-time reporting is particularly important for advocacy efforts, which often evolve without a predictable script. To make informed decisions, advocates need timely answers to the strategic questions they regularly face. But while real-time evaluation reporting makes good sense in theory, it can be difficult to implement successfully in practice. Even when regular reporting takes place, given the rapidly changing policy context, its success in informing advocacy strategy can be hit or miss. This presentation will offer ideas on successful real-time evaluation reporting for advocates and on how to create flexible evaluation plans that can adapt to changing learning needs.
Learning During Intense Advocacy Cycles

Ehren Reed, Innovation Network Inc, ereed@innonet.org
Through a multi-year evaluation of a collaborative effort to change national immigration policy, Innovation Network has employed creative methodological approaches to efficiently capture and manage the large amounts of data generated, and to effectively synthesize key learnings about how advocates gain access to, build relationships with, and influence policymakers. Innovation Network will discuss the environmental and contextual factors that posed challenges to employing traditional data collection methods, and then describe a specific focus group protocol designed to follow the peaks and valleys of the policy advocacy cycle. The resulting Intense Period Debrief Protocol fosters learning by eliciting qualitative information from a group of key players shortly after a policy window (and the inevitably corresponding period of intense advocacy activity) occurs.
Accountable Learning in Policy Evaluation: Politics and Practice

John Sherman, Headwaters Group, jsherman@headwatersgroup.com
The policy advocacy landscape is full of complexity. Its notable features include the policy focus (legislative, regulatory, or legal), the scale at which work occurs (local, state, federal, or international), the time period over which it unfolds (months, years, or even decades), the capacity of the groups involved, and the many unforeseeable factors affecting the work. With several cluster-level policy evaluations underway or completed, and years of experience as policy advocates, Headwaters offers its observations on effective evaluation approaches in this dynamic landscape, including approaches that help the target audience(s) determine which lessons identified in the evaluation are most important for them, and for which they should be accountable.