Partnering with Communities to Improve Program Implementation and Research
Insights from the Promoting Adolescent Sexual Health and Safety Evaluation
Promoting Adolescent Sexual Health and Safety (PASS) is an aspirational adolescent pregnancy prevention program that empowers youth and their caretakers to challenge gendered and societal norms, build healthy relationships, and connect with local health services. The Urban Institute developed the PASS program in 2012 in partnership with the DC Housing Authority and residents and community-based organizations from the Benning Terrace Development using a community-based participatory research approach. From 2017 to 2022, Urban conducted a community-engaged, quasi-experimental evaluation of the program in partnership with Sasha Bruce Youthwork and other community-based organizations in DC.
Drawing on the evaluation’s process documentation and reflections from the research team, community partners, and program facilitators, these fact sheets highlight program successes, challenges, and lessons learned throughout the PASS evaluation in the following areas:
- Strengthening Program Implementation and Evaluation Through Community-Based Partnerships
- A Community-Centered Approach to Program Implementation
- Collaborating with Community-Based Partners to Conduct Surveys
These fact sheets provide insights from the PASS evaluation that are widely applicable to community-engaged research and program implementation across policy areas. Researchers, program developers, and policymakers alike can learn from how the PASS evaluation approached partnerships, program implementation, and surveying to balance the requirements of a rigorous evaluation with the qualities of authentic community engagement.
- Strengthening Program Implementation and Evaluation Through Community-Based Partnerships:
  - Balancing the structural requirements of a rigorous program and evaluation with investment in strong relationships with partners, community members, and young people can ultimately improve program and evaluation outcomes.
  - Ensuring the project and collaboration are mutually beneficial to all partners and finding time and resources to collaborate with partners beyond the immediate confines of the evaluation can help build trust and strong relationships that can be sustained over time.
  - It is important to discuss the sustainability or future of the partnership from the start and to revisit plans and goals periodically to develop a long-term vision for the work together.
- A Community-Centered Approach to Program Implementation:
  - A curriculum is only ever as good as its facilitators and the relationships between facilitators and participants; investing in facilitators from the communities you are working with promotes sustainability and minimizes turnover.
  - Communities, even those within the same city, are diverse and have different needs. For a community-centered model to be effective, implementation partners need space and flexibility to adapt the model to each community’s individual context.
  - The rigorous evaluation requirements needed to establish PASS as an evidence-based model constrained program implementation and ultimately impeded the quality of program delivery. A community-centered approach to program implementation makes fidelity to a rigid research design and model difficult, as rigidity leaves no room for the deep relationship and trust building this work requires.
- Collaborating with Community-Based Partners to Conduct Surveys:
  - Including community partners in evaluation design and data collection processes can improve the feasibility of the design, survey retention and completion rates, and, ultimately, overall data quality.
  - Training community partners in research and evaluation design and data collection can increase their capacity to develop and administer their own surveys in the future.
  - Future evaluations of community-centered youth programs could better assess and improve effectiveness by incorporating more robust qualitative data collection and by structuring questions to determine not only whether an intervention works but also what is and is not working. This includes designing questions that are directly tied to key program outcomes and that capture the broader context in which these programs operate.