Date: Monday, July 21, 2025
Hello, AEA365 community! Liz DiLuzio here, Lead Curator of the blog. This week is Individuals Week, which means we take a break from our themed weeks and spotlight the Hot Tips, Cool Tricks, Rad Resources and Lessons Learned from any evaluator interested in sharing. Would you like to contribute to future individuals weeks? Email me at AEA365@eval.org with an idea or a draft and we will make it happen.
Hello, AEA365 community! I am Nancy Bernal, MPH, CHES, with Education, Training, and Research (ETR). At ETR, we are driven by our mission to improve health and well-being for youth and communities by championing science.
One of the projects we’re currently supporting is an Office of Population Affairs-funded initiative focused on helping youth ages 16–19 across California make informed sexual health decisions as they move from high school into adulthood.
As part of our efforts to strengthen the design of this randomized controlled trial of a sexual health education program, we used cognitive interviews to refine the baseline survey.
We conducted 10 cognitive interviews with California youth ages 16–19. To recruit participants, we reached out to local nonprofit organizations, the broader community, and the school district. We conducted interviews online; each lasted 30 to 45 minutes. To cover all of our questions while minimizing burden, we divided the questions into sets and rotated them across interviews. All participants reviewed a core set of questions focused on survey administration (e.g., how they prefer to receive surveys, when they prefer to take surveys, what might encourage more honest responses to sexual health-related questions). Then, each participant reviewed another thematic set of questions.
During interviews, we walked participants through the draft survey questions on the screen. Instead of asking them to respond with their actual answers, we invited them to think aloud, sharing how they interpreted each question and what came to mind as they read. We also asked for ways to improve the questions to make them clearer. This process helped us better understand how young people might interpret the survey and identify areas where wording, format, or clarity could be improved.
Through our cognitive interviews, we gathered valuable insights into how young people prefer to receive and engage with surveys, as well as how they interpret our questions. A majority of participants (60%) preferred to receive the survey link via text rather than email, explaining that texting is more convenient and that they rarely check their email. Similarly, 60% indicated they would most likely complete the survey on their phone rather than on a computer, tablet, or laptop. Nearly all participants (90%) said they would be more likely to answer honestly if they were assured that the survey was private and confidential.
Participants felt that the survey overall used inclusive language but still suggested small adjustments to improve tone. They encouraged us to add examples and definitions for unfamiliar terms, and they advised against overly formal language, reinforcing the need to align survey wording more closely with how young people speak.
Cognitive interviews helped us improve our survey questions and served as a youth engagement strategy by involving young people in shaping the evaluation instruments we used to tell their stories.
This post was co-developed using ChatGPT-4o (“Sol”), an AI writing partner who helped structure the author’s reflections and refine tone for publication.
Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators. The views and opinions expressed on the AEA365 blog are solely those of the original authors and other contributors. These views and opinions do not necessarily represent those of the American Evaluation Association, and/or any/all contributors to this site.