Dissemination & Implementation Science
Examining Externally Observable Factors that Predict Community Partner Interest in Joining an Implementation Study
Rafael T. Esteva Hache, B.A.
Post-Baccalaureate Student
University of California, Berkeley
Berkeley, California
Marlen Diaz, B.A.
Graduate Student
University of California, Berkeley
Berkeley, California
Allison G. Harvey, Ph.D.
Professor & Clinical Psychologist
University of California, Berkeley
Berkeley, California
Background: Community-Academic Partnerships (CAPs) are fundamental to closing the gap between science and practice, as they combine academic expertise and community knowledge to develop implementable, evidence-based practices (EBPs). Commonly studied factors that determine the success of CAPs include the community partner’s readiness for change, familiarity with EBPs, and trust in researchers. However, these factors are typically assessed only after a CAP is formed, introducing potential selection bias. This study aims to examine whether factors that can be observed pre-partnership are associated with a Community Behavioral Health Agency’s (CBHA) interest in joining an implementation study. Identifying these factors would alert the field to differences between organizations that engage in CAPs and those that do not, and may allow researchers to adapt their outreach strategies when contacting prospective partners.
Method: CBHAs (N=13) in California were contacted biweekly for a month to gauge their interest in forming a CAP. CBHAs were offered free training in sleep treatment as part of an NIMH-funded implementation study. Interest in forming a CAP was assessed based on whether CBHA leadership met with researchers. After the outreach period, publicly available data were collected from CBHA websites, the CDC, and election records. Factors of interest included overall budget, workforce training budget, and strategic fit, defined as whether the CBHA’s strategic plan mentioned evidence-based treatments (EBTs). Counties’ COVID vaccination rates and Democratic vote share in 2020 were also considered, since these factors have been associated with trust in scientists. Lastly, the CDC COVID Community Vulnerability Index (CCVI), a measure that includes socioeconomic and healthcare system variables, was included.
Results: Three CBHAs were interested in a CAP and ten were uninterested. One interested CBHA included EBTs in its strategic plan; the remaining twelve did not. As predicted, interested CBHAs had higher overall budgets (M=$346.36 million) compared to uninterested CBHAs (M=$35.69 million), yielding a large effect (d=0.83, 95% CI [-0.01, 1.67]). However, this difference was not significant (p=.20). Also as predicted, interested CBHAs had larger workforce training budgets (medium effect size, d=0.75, 95% CI [-0.08, 1.59]), but this difference was not significant (p=.29). Similarly, we found large but nonsignificant effects in the predicted direction for vaccination rates (d=7.07, 95% CI [4.89, 9.25]), Democratic vote share (d=5.23, 95% CI [3.53, 6.93]), and CCVI (d=1.35, 95% CI [0.45, 2.25]) based on CBHA interest.
Conclusion: The small sample size likely explains the lack of statistical significance, yet an examination of effect sizes lends tentative support to the hypotheses. Specifically, CBHAs with larger budgets, and those located in highly vaccinated, predominantly Democratic counties, were more interested in joining a CAP. These results raise the possibility that implementation science findings are based on a subset of community settings. Research with a larger sample of community partners, and considering a wider array of factors, is needed to better understand how to forge CAPs with a broader range of CBHAs.