Date: Wednesday, March 25, 2026
Hi, I’m Pam Drake, a Senior Research Scientist at ETR. In this post, I share how we are using the PRISM model in a randomized controlled trial (RCT) of an adolescent pregnancy prevention program.
RCTs often emphasize testing the impact or effectiveness of a program or intervention and may overlook process data that are critical for understanding how interventions are used in real-world settings by real people. Effectiveness alone does not lead to sustainability. If we do not capture context, interventions risk being effective in theory but hard to maintain in practice. Integrating an implementation science framework into RCTs can be very powerful.
This is where the Practical, Robust Implementation and Sustainability Model (PRISM) comes in. It helps us examine key, multilevel contextual factors—from program planning through sustainment—alongside outcomes. PRISM extends the popular RE-AIM framework by adding more contextual depth to program planning and implementation evaluation.
PRISM extends RE-AIM by adding four contextual domains, each of which can be measured to some extent in an RCT:
- perspectives on the intervention from both the organizations delivering it and the people receiving it;
- characteristics of recipients, at both the organizational and individual levels;
- the external environment in which implementation takes place; and
- the implementation and sustainability infrastructure.
The use of PRISM in RCTs can also address equity issues that ultimately affect program sustainability. It helps ensure that varied perspectives are represented in the evaluation and can guide exploration of contextual factors that lead to differences in program implementation and outcomes.
We are currently applying PRISM in our RCT evaluation of Wrap It Up, a five-session teen pregnancy prevention booster program for older adolescents, based on Health Connected’s Teen Talk High School Refresher. Our evaluation is taking place in California high schools. Schools are randomized to either receive Wrap It Up or conduct business as usual. We collect our main evaluation data via surveys at baseline and at 6 and 12 months post-baseline. We used PRISM as a guide for our implementation evaluation, identifying questions related to the RE-AIM elements, as well as the four PRISM contextual domains. We then identified data sources and methods to answer these questions.
We measure perspectives from youth participants, program facilitators, and school staff to capture lived experiences, asking questions such as: What do key school staff think about Wrap It Up, and what recommendations do they have? We collect data on characteristics that influence implementation and uptake among participants, facilitators, and schools, including age and gender among youth, facilitator experience, and school size and type. In addition, we examine the external community context, with a focus on rurality and the presence of health services, and we track feedback from parents and other community members. Finally, we examine infrastructure, including facilitator training, school support needs, and the implementation context, all of which are key to long-term sustainability.
We are using RE-AIM as a framework to track recruitment, retention, and equity indicators, identifying disparities that could undermine program reach and looking at characteristics of those who decline participation or experience unintended impacts. Our data sources include youth surveys, facilitator surveys, attendance logs, fidelity logs, classroom observations, and archival school data.
Preliminary findings point to barriers, including high absence rates in some schools, and facilitating factors, such as strong school support for the program, that affect sustainability. Using PRISM helps us design interventions that work beyond the RCT.
PRISM bridges the gap between RCTs and real-world sustainability, advancing broader reach and lasting impact.
The American Evaluation Association is hosting Health Evaluation TIG Week with our colleagues in the Health Evaluation Topical Interest Group. The contributions all this week to AEA365 come from our Health Evaluation TIG members. Do you have questions, concerns, kudos, or content to extend this AEA365 contribution? Please add them in the comments section for this post on the AEA365 webpage so that we may enrich our community of practice. Would you like to submit an AEA365 Tip? Please send a note of interest to AEA365@eval.org. AEA365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators. The views and opinions expressed on the AEA365 blog are solely those of the original authors and other contributors. These views and opinions do not necessarily represent those of the American Evaluation Association, and/or any/all contributors to this site.