Symposia
Dissemination & Implementation Science
Amber Calloway, Ph.D.
Research Associate
The Penn Collaborative for CBT and Implementation Science, Perelman School of Medicine, University of Pennsylvania
Philadelphia, Pennsylvania
Samantha Hernandez, Ph.D.
Project Coordinator
National Center for PTSD; Stanford University
Menlo Park, California
Jiyoung Song, Ph.D.
PhD Student
University of California, Berkeley
Berkeley, California
Alayna Park, Ph.D.
University of Oregon
Eugene, Oregon
Kimberlye E. Dean, Ph.D.
Clinical Research Fellow
Massachusetts General Hospital/Harvard Medical School
Boston, Massachusetts
Soo Jeong Youn, Ph.D.
Assistant Professor
Harvard Medical School
Boston, Massachusetts
Robert DeRubeis, Ph.D.
Professor
University of Pennsylvania
Philadelphia, Pennsylvania
Dawne Vogt, Ph.D.
Health Science Specialist
National Center for PTSD
Boston, Massachusetts
Luana Marques, Ph.D.
Associate Professor
Harvard Medical School
Weston, Massachusetts
Shannon Wiltsey Stirman, Ph.D.
Associate Professor/Acting Deputy Director
Stanford University
Menlo Park, California
Torrey A. Creed, Ph.D.
Assistant Professor
Perelman School of Medicine at the University of Pennsylvania
Philadelphia, Pennsylvania
Fidelity measurement is key for clinical trials, training, and quality improvement efforts, but the “gold standard” of observer ratings is time-consuming, requires extensive rater training, and may feel intrusive. Worksheets are employed in most cognitive behavioral therapies (CBTs) to support interventions and may therefore also offer information about session activities. This study, part of a larger R01, aimed to validate a worksheet-based measure of fidelity in cognitive processing therapy (CPT) and CBT. It was hypothesized that, for both CPT and CBT, fidelity ratings of worksheets would be associated with (1) observer-rated fidelity and (2) symptom change.
Clinicians (N=164) treated clients (N=403) with CPT for PTSD or CBT for depression/anxiety in routine care settings. Clients completed or reviewed (from homework) CPT (N=190) or CBT (N=81) worksheets with therapists in session. A fidelity scoring system for CBT and CPT worksheets was refined from Stirman et al. Trained observers rated treatment sessions for competence (N=552 CPT; N=268 CBT) using the Cognitive Therapy Rating Scale (for CBT) or the CPT Therapist Adherence and Competence Scale (for CPT). Treatment outcome was measured with the PTSD Checklist for DSM-5 (PCL-5), the Beck Anxiety Inventory (BAI), and the Patient Health Questionnaire-9 (PHQ-9). Spearman rank correlations and multilevel modeling were used for analyses.
In our preliminary analyses, worksheet competence was associated with observer-rated competence for CBT (r=.18, p=.11; Beta=0.36, p=.01) but not CPT (r=.15, p=.03; Beta=0.55, p=.13). Time-lagged models showed that worksheet adherence was associated with change on the PCL-5 in CPT (Beta=0.52, p=.02), but worksheet competence was not (Beta=0.02, p=.79). Worksheet competence did not predict change on the PHQ-9 (Beta=0.06, p=.29) or the BAI (Beta=-0.14, p=.33) in CBT. Observer-rated adherence and competence were associated with outcomes for CPT (Beta=-0.05, p=.01; Beta=-0.01, p=.03). Observer-rated CBT competence predicted anxiety outcomes (Beta=0.02, p=.01) but not depression outcomes (Beta=-0.01, p=.48).
The results offer preliminary evidence that worksheets can provide information about CBT competence in routine care settings but may not reflect CPT adherence. Worksheet fidelity does not appear to predict treatment outcome for CBT but may be predictive for CPT. These results are consistent with the mixed findings on fidelity-outcome relationships in the broader literature. Implications for fidelity monitoring and treatment outcome will be discussed.