Date: Friday, January 16, 2026
Greetings from Washington University in St. Louis! I am Jenrose Fitzgerald, Program Manager for the Data for Social Impact (DSI) initiative at the Center for Social Development at WashU. My co-authors are Jillian Martin from Duke University, Dan Ferris from Washington University in St. Louis, and Sara Mohamed and Paul Sorenson from the University of Missouri–St. Louis. The objective of DSI is to build capacity and connection among social sector organizations working to increase impact for and with the communities they serve. While DSI programming reaches national and international audiences, we are particularly focused on the St. Louis region. Broadly, we aim to build community data capacity, which includes both evaluation capacity and equitable data practices.
The DSI initiative brings together a wide range of people with varying data skills, organizational contexts, and roles, and advances the idea that everyone is a data person. Because of this, it is important to meet people where they are and to develop offerings and tools that build connection and a sense of shared values both within and across organizations and sectors. Given these factors, our approach to evaluating DSI needed to be flexible enough to respond quickly to changing community priorities. We draw on developmental evaluation and emergent design evaluation, allowing us to incorporate our evaluation findings into our programming and pivot our approach when needed.
As the initiative has evolved, we have identified critical pivot points: moments in which we shifted course in response to participant feedback while maintaining our core values and vision. For example, interviews with social sector leaders shifted our approach from an original focus on ethical data science training aimed at executive leadership to collaborative and equitable data practices more broadly, with an emphasis on stronger communication and collaboration across roles. Feedback from our advisory committee and from public roundtables initiated a shift in curriculum areas, leading us to prioritize accessible language and programming designed to accommodate all levels of data fluency. Participant input even led to a name change from Data Science for Social Impact (DSSI) to Data for Social Impact (DSI). Finally, feedback from event and online module surveys led to the formation of communities of practice and data equity cohorts grounded in peer learning and knowledge exchange.
For DSI, a developmental and emergent approach to evaluation has provided a flexible framework for building responsive programming for a diverse range of participants with multiple, sometimes conflicting priorities, while simultaneously building connection and a sense of shared values and vision around data and impact.
Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators. The views and opinions expressed on the AEA365 blog are solely those of the original authors and other contributors. These views and opinions do not necessarily represent those of the American Evaluation Association, and/or any/all contributors to this site.