Implementation support for contingency management: preferences of opioid treatment program leaders and staff

Abstract

Background

Contingency management (CM), a behavioral intervention that provides incentives for achieving treatment goals, is an evidence-based adjunct to medication to treat opioid use disorder. Unfortunately, many front-line treatment providers do not utilize CM, likely due to contextual barriers that limit effective training and ongoing support for evidence-based practices. This study applied user-informed approaches to adapt a multi-level implementation strategy called the Science-to-Service Laboratory (SSL) to support CM implementation.

Methods

Leaders and treatment providers working in community-based opioid treatment programs (OTPs; N = 43) completed qualitative interviews inquiring about their preferences for training and support implementation strategies (didactic training, performance feedback, and external facilitation). Our team coded interviews using a reflexive team approach to identify common a priori and emergent themes.

Results

Leaders and providers expressed a preference for brief training that included case examples and research data, along with experiential learning strategies. They reported a desire for performance feedback from internal supervisors, patients, and clinical experts. Providers and leaders had mixed feelings about audio-recording sessions but were open to the use of rating sheets to evaluate CM performance. Finally, participants desired both on-call and regularly scheduled external facilitation to support their continued use of CM.

Conclusions

This study provides an exemplar of a user-informed approach to adapt the SSL implementation support strategies for CM scale-up in community OTPs. Study findings highlight the need for user-informed approaches to training, performance feedback, and facilitation to support sustained CM use in this setting.


Background

Opioid use disorder (OUD) and opioid-related overdose deaths are a major public health crisis in the USA. Medication for opioid use disorder (MOUD) is considered the frontline treatment and typically involves methadone or buprenorphine [1]. Opioid treatment programs (OTPs) are regulated by the Substance Abuse and Mental Health Services Administration (SAMHSA) and provide medications and counseling to individuals in the community with OUD [2]. Providers working in OTPs face a number of unique challenges, including high patient volume and a fast-paced work environment [3], as well as systems-level issues such as strict federal guidelines regulating patient contact and low reimbursement for services [4]. Combined, these challenges increase the risk of provider burnout and turnover and make it difficult to train and retain the OTP workforce in evidence-based practices [5,6,7].

Contingency management (CM), or the provision of patient incentives to promote achievement of treatment-related goals, is an evidence-based behavioral intervention with strong support as an adjunctive intervention in OTPs [8]. CM typically involves providing tangible incentives (e.g., gift cards, small prizes) for achieving treatment goals, with positive reinforcement serving to increase the likelihood of continued substance abstinence and/or treatment attendance. Incentives are provided on a consistent (e.g., weekly) schedule, with prizes awarded immediately following achievement of the goal (e.g., a negative urine toxicology screen or attendance at counseling [9]). CM has large effect sizes when delivered in combination with MOUD [8] and has emerged as a superior treatment when compared head-to-head with cognitive behavioral therapy as an adjunctive intervention [10]. Unfortunately, a myriad of barriers have limited OTP providers’ ability to successfully implement CM with fidelity [11, 12].
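To make the reinforcement logic concrete, the Python sketch below illustrates how a prize-based CM schedule of this kind might track consecutive goal achievement. The escalation and reset rules, parameter values, and function names are illustrative assumptions drawn from common prize-based CM protocols, not the specific schedule used in this study.

```python
# Illustrative sketch of a prize-based CM schedule (hypothetical parameters,
# not this study's protocol). Prize-based CM commonly escalates the number
# of prize draws with each consecutive goal met (e.g., a negative urine
# screen or a counseling visit attended) and resets after a missed goal.

def draws_earned(consecutive_successes: int, base: int = 1,
                 step: int = 1, cap: int = 8) -> int:
    """Number of prize draws earned at the current visit."""
    if consecutive_successes == 0:
        return 0
    return min(base + step * (consecutive_successes - 1), cap)

def run_schedule(visit_results: list[bool]) -> list[int]:
    """Map a sequence of goal outcomes (True = goal met) to draws per visit."""
    streak, draws = 0, []
    for goal_met in visit_results:
        streak = streak + 1 if goal_met else 0  # reset on a missed goal
        draws.append(draws_earned(streak))
    return draws

# Example: four negative screens, one positive, then two negative.
print(run_schedule([True, True, True, True, False, True, True]))
# -> [1, 2, 3, 4, 0, 1, 2]
```

The escalating-then-reset structure is what ties reinforcement to sustained abstinence or attendance rather than to isolated successes.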

Traditional approaches to scaling up evidence-based practices like CM often involve didactic workshops, usually as part of continuing education requirements for licensure or for new practice implementation at an agency [12, 13]. Although these trainings are effective for enhancing provider knowledge about evidence-based practices, they are often insufficient for sustaining provider behavior change or new practice implementation [14]. An extensive body of research has shown that didactic training is enhanced when paired with the provision of training and support strategies such as feedback on provider performance and access to an external facilitator [15,16,17].

The Science-to-Service Laboratory (SSL), developed by the SAMHSA-funded network of Addiction Technology Transfer Centers (ATTCs), is an example of a multi-level implementation strategy that combines three empirically supported elements: didactic training, performance feedback, and external facilitation [18]. These three elements address contextual determinants at both the provider and organizational levels. The SSL typically commences with didactic training, often delivered as a 1-day workshop with experiential learning, followed by ongoing feedback on the fidelity of intervention delivery. These elements address provider-level determinants such as knowledge and perceptions of the intervention. Meanwhile, technology transfer specialists provide external facilitation to address organization-level determinants such as how the intervention fits into the workflow, ensuring time and funding for intervention delivery, provider turnover/retraining, and leadership engagement. The New England ATTC has been evaluating the SSL strategy since 2008 and has described the strategy extensively in prior work (see [18,19,20]).

In early SSL work, we found that agencies providing substance use treatment were more likely to adopt CM when they participated in all of these SSL elements than when they only completed some of the elements [18]. In a more recent study specifically focused on OTPs, we compared implementation outcomes in seven OTPs that received the SSL to 11 OTPs that only received didactic training-as-usual (without performance feedback or external facilitation). We found that the SSL resulted in both significantly higher adoption and faster implementation of CM when compared to didactic training-as-usual, providing evidence for the additional utility of the performance feedback and facilitation strategies [20, 21].

We employed a user-informed approach grounded in formative qualitative research to understand the unique needs of OTP staff regarding training and support strategies for CM implementation. Leaders and providers from multiple OTPs shared their preferences regarding the format, content, frequency, and delivery of CM didactic training, performance feedback, and external facilitation. The long-term goal of this study was to inform the final design of the SSL implementation strategy used in a cluster randomized implementation trial with 30 OTPs throughout New England.

Methods

Recruitment

At the time of this study (see [22]), there were 13 OTPs in the state of Rhode Island, all of which were invited to nominate staff for participation. Research staff contacted executive leaders and directors from the OTPs via both phone and email to describe the qualitative study and the researchers’ interest in receiving OTP input on the SSL strategy, and to request nominations of eligible staff. Each OTP nominated two providers and two leaders to participate. Eligibility criteria for leaders included supervising or managing providers and at least 6 months of employment at their site. Providers needed to have been employed for at least 3 months and have an active caseload that involved providing psychosocial counseling services to patients at their treatment facility.

Participant enrollment

The Institutional Review Board at the Miriam Hospital, an affiliate of the Alpert Medical School of Brown University, approved this study and granted a waiver of documented consent (Project Number: 210718 45CFR 46.110(7)); all nominated participants completed informed consent verbally over the phone or in person at their OTP office. As part of the informed consent process, leaders and providers were invited to participate in 45- to 60-min audio-recorded interviews. Participants were assured of their rights to confidentiality and privacy, and were also assured that their responses would not be shared with their employer and that no identifying information would be collected. Participants were told that decisions about participation would not be shared with OTP leaders and would not affect their employment at their OTP. Participants were offered $100 for completion of the interview.

Interview procedures

We conducted audio-recorded interviews both in person and over the phone with providers and leaders. Four interviewers trained by the study PI (SJB) in semi-structured interview methods conducted the interviews one-on-one with participants either on-site at the OTPs or via phone. Interviewers included two postdoctoral fellows (KS and CM), one Bachelor’s level Research Assistant, and one Master’s level Research Assistant. Interviews included questions about a wide range of CM design and training preferences (see Becker et al., 2019 for the interview guide [22]). All providers were given a working definition of CM at the start of the interview to ensure that all participants had sufficient knowledge of CM principles.

The current study focused on questions regarding participants’ preferences for the SSL implementation strategy to scale up CM, including providers’ and leaders’ preferences for didactic training elements (e.g., content, format, and delivery), performance feedback (e.g., how often, by whom), and external facilitation (e.g., how often, how accessed, focal topics). These questions were prioritized to inform our adaptation of the SSL using a user-informed formative evaluation approach [23]. Interviewers also took notes regarding provider demographic characteristics and any feedback received on the interview questions.

Data analysis

Interview recordings were transcribed and cleaned to ensure the removal of all identifying information. Transcripts were not returned to participants for correction. Three independent coders completed transcript coding and thematic analysis using a reflexive team analysis approach [24, 25]. Coders included two Research Assistants (KY and SM; one Bachelor’s and one Master’s level, both new to the study at the time of coding) and one Postdoctoral Fellow (KS; also an interviewer). The coders collaboratively developed a coding dictionary that included both a priori themes (i.e., didactic training, performance feedback, and facilitation) and emergent themes based on review of the transcripts. The coders then imported all codes and transcripts into NVivo version 12 and applied the dictionary to all 43 transcripts. Two coders independently coded half of the transcripts each, and the third coder coded 20% of the transcripts to assess inter-rater reliability. Coders met weekly to discuss and resolve coding discrepancies and achieve 100% consensus in coding decisions. Coders also discussed additional emergent themes identified during the coding process and modified the coding dictionary until thematic saturation was achieved.

After coding was complete, the coding team ran queries in NVivo to identify the most commonly endorsed preferences for CM training format, performance feedback, and external facilitation. The most common themes and sub-themes were tabulated through transcript frequency counts. Exemplar quotes were identified for each theme. Findings were shared with leadership at each OTP to give them the opportunity to provide feedback on the interpretation of results.
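As a hypothetical illustration of this tabulation step, the sketch below shows how transcript-level theme frequencies and simple percent agreement on the double-coded subset might be computed. The data structures and theme names are invented for illustration; the study itself used NVivo queries rather than custom code.

```python
from collections import Counter

# transcript_id -> set of themes applied by the primary coder (invented data)
primary = {
    "t01": {"didactic_training", "performance_feedback"},
    "t02": {"performance_feedback", "external_facilitation"},
    "t03": {"didactic_training"},
}

# The ~20% reliability subset, coded independently by the third coder
reliability = {"t01": {"didactic_training", "performance_feedback"}}

# Transcript-level frequency counts: each transcript counts once per theme
theme_counts = Counter(theme for themes in primary.values() for theme in themes)
print(theme_counts.most_common())

# Simple percent agreement on the double-coded subset
# (per-theme presence/absence judgments compared between the two coders)
all_themes = set().union(*primary.values())
agreements = total = 0
for tid, third_coder_codes in reliability.items():
    for theme in all_themes:
        total += 1
        agreements += (theme in primary[tid]) == (theme in third_coder_codes)
print(f"Percent agreement: {agreements / total:.0%}")
```

Counting at the transcript level (rather than counting every coded excerpt) is what allows statements such as "2 leaders, 4 providers" in the Results below.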

Results

The primary goal of this analysis was to evaluate how front-line providers and organization leadership would adapt a CM-focused implementation strategy at their site, considering their unique organizational context. Table 1 presents characteristics of the final sample, and Table 2 presents a summary of emergent themes and illustrative quotes.

Table 1 Participant sociodemographic characteristics (N = 43)
Table 2 Themes related to the design of the implementation strategy with definitions and illustrative quotes

Sample characteristics

Administrators from 11 of the 13 approached OTPs (85%) agreed to nominate staff. OTP leaders and directors nominated a total of 44 staff (22 leaders, 22 providers) to participate. Twenty-one leaders (95% of nominated) and 22 providers (100% of nominated) enrolled and completed qualitative interviews. Participants were primarily White (93%) and female (72%), and most commonly had earned a bachelor’s degree as their highest education level (42%). Years of employment at participants’ current OTP varied widely, from 3.5 months to 41 years; average tenure was just under 5 years. There were no significant demographic differences between providers and leaders.

Didactic training

Preferences regarding didactic training encompassed three emergent sub-themes: training format, content, and learning tools. In general, preferences for didactic training were similar for leaders and providers.

Training format

The training format sub-theme encompassed preferences pertaining to both the location and duration of training. Seven participants (4 leaders, 3 providers) shared their opinions about where didactic training should occur. Three leaders and two providers expressed a desire for on-site training at their OTP. One of these leaders noted the convenience of in-house training, stating that, “If somebody came out to us, that would be absolutely perfect.” By contrast, two participants (1 leader, 1 provider) explicitly stated a preference for off-site training; one leader emphasized the value of going to a secondary location for training by suggesting, “I think the workshops and seminars are good ‘cause it takes the staff away from here, and they can concentrate on just that.” Four participants made suggestions about training duration (1 leader, 3 providers). Preferences ranged from a minimum of “an hour” to a maximum of 2 days (“couple of days’ worth”).

Training content

With regard to training content, quotes repeatedly referenced a desire for case examples paired with research evidence. Requests for specific case examples of successful CM implementation were common (2 leaders, 4 providers). One leader suggested “…having concrete examples of where it’s been successful would be very helpful and how another agency may have implemented it.” Other respondents suggested that case examples would help with “buy-in” of OTP counselors and leaders.

Six participants (4 leaders, 2 providers) shared impressions about the value of objective research evidence as a teaching tool. For example, one leader stated, “Definitely giving them just a literature review… perhaps I’m old-school and very academic, for that’s the best way to disseminate information.” Remaining quotes supported the value of research data in convincing staff that “it’s not just someone’s idea” and inspiring them to adopt a new intervention.

Learning tools

In terms of learning tools, participant comments reflected a desire for active learning strategies during the training and supplementary resources to take home after the training. Two participants (1 leader, 1 provider) requested experiential strategies such as role plays and behavioral rehearsal, and both thought that active learning could help the staff become “comfortable” with the intervention. As an example, the leader recommended, “... doing a few role-plays sometimes helps for some people…I think just making sure they have all the tools in their tool belt, as we say, to make sure that they can implement it.”

Meanwhile, two participants (1 leader, 1 provider) spontaneously requested supplemental resources such as handouts and training materials as a means of helping training participants to retain information. The leader explicitly recommended handouts, noting “...[staff] love handouts, because if they’re not hearing you because they’re burnt out that day, they have somethin’ to take with them.” Similarly, the provider recommended that all didactic materials be compiled and shared after the training, noting, “when trainings happen, you’re not able to get all this information in one shot. You can try, but you can’t so if everything could be in a nice packet…”

Performance feedback

In the performance feedback theme, sub-themes that emerged included the feedback source (i.e., who would provide performance feedback) and the feedback delivery (i.e., how feedback would be evaluated and shared with the provider).

Feedback source

Participants suggested several potential sources of performance feedback, with substantial consistency between providers and leaders. The most common recommendation (10 leaders, 17 providers) was an in-house supervisor. Participants suggesting this option generally cited a desire for comfort and rapport with the individual providing feedback. For example, one provider explained “I just think getting feedback from somebody you don’t know is a lot tougher than getting feedback from somebody you have supervision with once a month.”

The next most popular suggestion (5 leaders, 15 providers) was to receive performance feedback directly from CM patients. Respondents recommending this option shared the belief that patients would have the most “accurate information” about how helpful the counselor’s CM delivery was for enhancing their normal care. Other providers advocated that patient feedback should be a central component of intervention evaluation, because patients are the target end user (i.e., “who I’m helping”).

Finally, seven participants (2 leaders, 5 providers) indicated a preference to receive performance feedback from a CM expert outside of their clinic. Various benefits cited of expert feedback included objective input, reduced potential for conflict between co-workers, and assurance of equitable feedback. For example, a provider shared her opinion that, “the best bet would be someone outside of here. It would become unequal, I think, if it was someone within the clinic.” Similarly, a leader shared the view that “somebody that doesn’t know us that well, or hasn’t worked with us, and doesn’t have a personal relationship, is gonna tell us the truth and is gonna lay it [feedback] out how it needs to be laid out.”

Feedback delivery

Regarding feedback delivery, providers and leaders shared their impressions about both audio recordings and rating scales. Of the thirteen participants who expressed their opinions on the use of audio recordings for feedback, five (3 leaders, 2 providers) were in favor and eight (3 leaders, 5 providers) expressed some concern over such a tool. Those in favor touted the potential for high-quality feedback. For example, one leader shared, “…We have done that with MI [motivational interviewing] where we’ve had to tape ourselves with a patient and then send that out, and then get a ratings sheet on that, that could be a good idea.” By contrast, other participants expressed wariness over audio recordings as a viable option due to concerns about discomfort (both their own and their patients’), as indicated in this provider’s response, “Yeah, it’s tough. As far as recorded, I, personally, don’t feel like any of my patients, or a very limited number of my patients, would feel comfortable having anything documented on record so openly such as that….”

Reactions to performance feedback rating scales were more consistently positive. A greater number of providers than leaders (5 providers, 3 leaders) shared positive feedback about such scales, with six recommending them in the context of supervision and two in the context of patient care. Beyond explicit suggestions of rating scales, participants also had positive views about performance evaluations using patient surveys/questionnaires (3 providers, 2 leaders) and CM checklists (2 providers). One leader shared her endorsement of scales, noting “With a checklist or with the rating scale, you can see it. Then when you’re talking about it, you can process through what’s getting in the way. I like rating scales.”

External facilitation

Participants also reported on their preferences for external facilitation. Responses indicated interest in either a remote support system (2 leaders, 6 providers) or in-person contact offered in-house (4 leaders, 3 providers). Nine participants (2 leaders, 7 providers) expressed a desire to receive “as needed,” “on-call,” or “as things happen” ongoing support. In addition, ten participants (6 leaders, 4 providers) expressed a desire for additional structured facilitation sessions at pre-scheduled intervals. The most popular suggestion for the frequency of sessions was monthly (4 leaders, 2 providers), closely followed by quarterly (2 leaders, 2 providers). Requests for support included help with both CM delivery (e.g., “type of incentive [CM] for someone, and …how we’re going to do it”) and CM implementation (e.g., “track it how we’re as a staff buying into it”). Desired support for CM delivery encompassed several issues unique to CM, including how to monitor whether CM was working for patients, how to identify which patients earned prizes, and how to monitor and award the actual prizes. Meanwhile, ideal support for CM implementation encompassed topics such as promoting staff “buy in,” providing ongoing training via seminars and workshops, and ongoing monitoring of staff CM use.

Remote facilitation recommendations encompassed a range of options including email, phone, or video-conference sessions. One provider shared her perspective that remote support would help her to feel more confident about intervention delivery: “If I could get the facilitator’s contact information to send them an email about something if I needed help in a situation—Just knowing that I have the support, I think I’d feel a lot better.” Meanwhile, those participants that advocated for in-person support noted the convenience of being able to consult with a facilitator in the course of routine operations. Some participants suggested having the facilitator drop into the OTP at random to check in, while others suggested having the facilitator join at pre-determined intervals (e.g., at routinely scheduled staff meetings).

Discussion

This study conducted user-informed formative research (e.g., recruiting potential users, defining target users and their needs, and conducting interviews with target users to understand their preferences) to solicit feedback from OTP leaders and front-line providers about a comprehensive CM implementation strategy. Analysis of emergent themes informed adaptation of the SSL implementation strategy for delivery to OTPs in a large hybrid type 3 implementation-effectiveness trial (see Curran et al. for details on hybrid trial designs [26]). In general, there was high concordance between providers and leaders in terms of their preferences.

With regard to didactic training, respondents indicated a preference for a relatively brief (e.g., half-day to 2 days) workshop, buttressed by case examples, research data, experiential learning, and resources. These findings are in alignment with previous research regarding effective aspects of didactic training, as experiential learning strategies (i.e., role plays) have been shown to expand a workshop’s potential to increase intervention skills and subsequent implementation with patients [13, 14]. Role plays act as a form of behavioral rehearsal (i.e., rehearsal of how CM will be delivered with a patient), which increases training success and intervention fidelity [27]. Literature also suggests that the provision of case examples renders evidence-based interventions more compelling and increases clinician interest in gaining training [28]. Feedback from OTP staff was highly consistent with the SSL model, which typically consists of 1 day of didactic training pairing research data with experiential learning, and suggests that the inclusion of CM-focused case examples and resources would be of significant value to OTP staff [18].

Participants had varied views on how to best receive performance feedback but were generally in favor of receiving feedback to enhance CM fidelity, particularly in the form of objective rating scales. Some respondents preferred feedback from external CM experts, though more respondents were comfortable getting feedback from an internal source (i.e., a supervisor) or from their patients directly. These findings suggest that future research on our SSL approach, typically reliant on external technology transfer specialists, might benefit from evaluating additional CM training for internal clinical supervisors (i.e., a train-the-trainer approach [29]). A train-the-trainer model could enhance provider comfort with receiving performance feedback and improve CM sustainability potential by limiting the need for continued external support [30].

Respondents also had varying views of the utility of audio-recording CM sessions, with some highlighting the value of recordings and others expressing concerns about patient privacy. Participants’ general receptivity to feedback was encouraging given the literature supporting the effectiveness of performance feedback for enhancing training outcomes and maximizing evidence-based practice fidelity [11, 12, 15, 17, 31]. The ambivalence about audio recordings was not surprising given the brief tenure and limited education (i.e., bachelor’s level of education) of many OTP providers in the current sample [32]. Indeed, prior work has demonstrated that providers with limited training may experience increased evaluation anxiety when receiving supervision on audio- or video-recorded sessions [32]. Though common, such ambivalence presents a unique training challenge given that audio recordings are considered one of the gold-standard approaches for performance and fidelity monitoring [33, 34]. These results suggest that the SSL strategy would likely benefit from an explicit orientation to the performance feedback process that clearly outlines expectations for the use of audio recording and socializes OTP staff into their role [32]. Additionally, to assure OTP staff of equitable, fair assessment, feedback would ideally be provided via well-validated scales such as the Contingency Management Competence Scale (CMCS [35]) to measure the quality of CM delivery.

In terms of external facilitation, participants expressed an interest in both on-call/as needed consultation and more structured remote support (ideally offered monthly) to help them learn CM skills and troubleshoot problems while implementing CM. Facilitation has been identified as a core component of novel practice implementation across numerous studies, many of which have used an external coach or facilitator to enhance the effectiveness of didactic training [31, 36]. The combination of formal and informal support is also a key component of the SSL implementation strategy: a technology transfer specialist offers partner sites formal monthly facilitation calls and informal consultation as needed, focused on addressing obstacles to implementation [18]. The current results suggest that for OTP staff, the facilitation sessions should not only focus on implementation support, but also on skillful delivery of CM, given some of the unique challenges associated with CM intervention delivery.

Implications: user-informed modifications to the SSL implementation strategy

The current study suggested that the SSL three-tiered implementation strategy would benefit from adaptations to improve fit with OTP staff, many of whom had limited tenure at their OTP, limited familiarity with CM, and limited formal education. Our research team made several key adaptations to each component of the strategy to incorporate OTP staff feedback while maintaining the SSL’s key evidence-based components (didactic training, performance feedback, and external facilitation). First, we adapted our typical 1-day CM didactic workshop by reducing the amount of time spent on research data and increasing time spent discussing case examples of successful CM implementation (including review of behavioral targets, prizes, and reinforcement schedules) and engaging in experiential learning. We also augmented the workshop with a wide array of CM resources (including training videos and recorded role plays), made highly accessible via a project website (https://www.sites.brown.edu/projectmimic).

Additionally, we added explicit training content orienting providers to the audio-recording process and required that providers submit an audio-recorded role play (rated as adequate on the CMCS) prior to CM delivery with patients. To address participants’ preference for performance feedback from an internal supervisor, we had each site identify 1–2 leaders who would be responsible for supervising CM delivery over the longer term. Identified leaders received monthly performance feedback reports on their providers’ CM delivery (i.e., copies of their CMCS performance reports) and CM implementation (i.e., consistency and reach of CM delivery). These leaders were also instructed in the use of the CMCS so that performance feedback could be institutionalized after active support was withdrawn. This approach was a more feasible first step than a train-the-trainer model given the large number of partner programs and the need to monitor trainer fidelity; however, as noted earlier, evaluating train-the-trainer models is a worthy direction for future research.

Finally, we offered two distinct monthly remote facilitation sessions: one led by a national CM expert and another led by a technology transfer specialist to provide support in both intervention and implementation delivery. This was a significant change to the SSL approach as external facilitation is typically only provided by technology transfer specialists who are experts in implementation support, but not in the actual intervention. In between remote facilitation sessions, OTP staff could call a project hotline answered by multiple research staff with any questions about either CM delivery (e.g., how to calculate prize draws if a patient missed a session) or the nuts and bolts of implementation (e.g., how to use the audio recorder).

Limitations

Several limitations may qualify the implications of these findings, as this work is a first step in obtaining user feedback about the SSL strategy. Our sample consisted primarily of White, female providers, which may have implications for whether these findings transfer to other populations or users of the SSL strategy [37]. Of note, these demographics are representative of addiction treatment providers in New England [38], highlighting a need to improve diversity in the workforce at large. Next, our sample consisted of providers nominated by their organization for participation. This recruitment method introduces the potential for selection bias by highlighting the perspectives of the strongest CM leaders and providers at their agencies. Finally, we acknowledge that participants’ verbalized preferences for CM implementation may be prone to social desirability (i.e., due to speaking with research staff who may be perceived as vested in the training). We attempted to mitigate this concern by conducting this study with different OTPs than those that ultimately participated in the cluster randomized trial.

Conclusions and future directions

These limitations notwithstanding, the current study represents a novel attempt to apply formative research to adapt a multi-level implementation strategy to improve fit within the OTP-specific context. While this study did not engage in a comprehensive user-centered design approach (see Lyon et al., 2019 and Lyon & Koerner, 2016, for descriptions of comprehensive approaches [39, 40]), the use of interviews to inform SSL modifications for community-based OTPs represents a potential first step in adapting an implementation strategy in line with the Discover phase of the Discover, Design, Build, and Test (DDBT) user-centered design framework [39]. User-centered design principles provide an opportunity to develop and adapt both interventions and implementation strategies with the end user in mind through the use of stakeholder feedback [40, 41]. The Discover phase of the DDBT framework focuses on discovering areas for strategy modification by identifying influential stakeholders (i.e., OTP leaders and providers), evaluating stakeholders’ needs (i.e., organizational contextual factors), and identifying barriers or usability issues through interviews, focus groups, direct observation, and usability testing (i.e., direct interaction with SSL techniques [39]). Results of the current study highlighted a number of needs and barriers to the SSL that informed an initial set of SSL modifications, including adjustments to each of the three elements of the evidence-based implementation strategy: didactic training (e.g., structure, format, techniques used), performance feedback (e.g., source and frequency of feedback), and external facilitation (e.g., source and frequency of facilitation).

Future research could build upon the methods employed in the current study and apply user-centered design principles from the Design and Build phases of the DDBT framework to beta-test and refine implementation strategy elements prior to formal testing and deployment in a specific setting. The DDBT framework would also facilitate a process of determining the transferability of study findings to other populations and settings seeking to employ the SSL [39, 42].

Overall, the current study serves as a model for applying a user-informed approach to modify existing implementation strategies and maximize their fit in novel settings. User-informed adaptation of implementation strategies is not often employed in the implementation science literature but has the potential to increase the uptake of both an implementation strategy and the evidence-based practice being implemented. In an ongoing cluster randomized trial, our team will specifically evaluate the extent to which the adapted SSL strategy is associated with improvements in both implementation outcomes (i.e., CM exposure, CM skill, CM sustainment) and patient outcomes (i.e., patient abstinence, patient attendance).

Availability of data and materials

The data that support the findings of this study are available on request from the corresponding author, KS. The data are not publicly available because they contain information that could compromise research participant consent.

Abbreviations

MOUD: Medication for opioid use disorder

OUD: Opioid use disorder

CM: Contingency management

OTP: Opioid treatment program

SSL: Science-to-Service Laboratory

ATTC: Addiction Technology Transfer Center

CMCS: Contingency Management Competence Scale

DDBT: Discover, Design, Build, and Test Framework

References

  1. Connery HS. Medication-assisted treatment of opioid use disorder: review of the evidence and future directions. Harv Rev Psychiatry. 2015;23(2):63–75. https://doi.org/10.1097/HRP.0000000000000075.

  2. Alderks CE. Trends in the use of methadone, buprenorphine, and extended-release naltrexone at substance abuse treatment facilities: 2003–2015 (update); 2017.

  3. Beitel M, Oberleitner L, Muthulingam D, Oberleitner D, Madden LM, Marcus R, et al. Experiences of burnout among drug counselors in a large opioid treatment program: a qualitative investigation. Subst Abus. 2018;39(2):211–7. https://doi.org/10.1080/08897077.2018.1449051.

  4. Oser CB, Biebel EP, Pullen EL, Harp KLH. The influence of rural and urban substance abuse treatment counselor characteristics on client outcomes. J Soc Serv Res. 2011;37(4):390–402. https://doi.org/10.1080/01488376.2011.582020.

  5. Oser CB, Biebel EP, Pullen E, Harp KLH. Causes, consequences, and prevention of burnout among substance abuse treatment counselors: a rural versus urban comparison. J Psychoactive Drugs. 2013;45(1):17–27. https://doi.org/10.1080/02791072.2013.763558.

  6. Shoptaw S, Stein JA, Rawson RA. Burnout in substance abuse counselors: impact of environment, attitudes, and clients with HIV. J Subst Abus Treat. 2000;19(2):117–26. https://doi.org/10.1016/S0740-5472(99)00106-3.

  7. Garner BR, Hunter BD, Modisette KC, Ihnes PC, Godley SH. Treatment staff turnover in organizations implementing evidence-based practices: turnover rates and their association with client outcomes. J Subst Abus Treat. 2012;42(2):134–42. https://doi.org/10.1016/j.jsat.2011.10.015.

  8. Griffith JD, Rowan-Szal GA, Roark RR, Simpson DD. Contingency management in outpatient methadone treatment: a meta-analysis. Drug Alcohol Depend. 2000;58(1-2):55–66. https://doi.org/10.1016/S0376-8716(99)00068-X.

  9. Petry NM. Contingency management: what it is and why psychiatrists should want to use it. Psychiatrist. 2011;35(5):161–3. https://doi.org/10.1192/pb.bp.110.031831.

  10. Rawson RA, Huber A, McCann M, Shoptaw S, Farabee D, Reiber C, et al. A comparison of contingency management and cognitive-behavioral approaches during methadone maintenance treatment for cocaine dependence. Arch Gen Psychiatry. 2002;59(9):817–24. https://doi.org/10.1001/archpsyc.59.9.817.

  11. Rash CJ, Petry NM, Kirby KC, Martino S, Roll J, Stitzer ML. Identifying provider beliefs related to contingency management adoption using the contingency management beliefs questionnaire. Drug Alcohol Depend. 2012;121(3):205–12. https://doi.org/10.1016/j.drugalcdep.2011.08.027.

  12. Petry NM, Simcic F. Recent advances in the dissemination of contingency management techniques: clinical and research perspectives. J Subst Abus Treat. 2002;23:81–6. https://doi.org/10.1016/S0740-5472(02)00251-9.

  13. Hartzler B, Jackson TR, Jones BE, Beadnell B, Calsyn DA. Disseminating contingency management: impacts of staff training and implementation at an opiate treatment program. J Subst Abus Treat. 2014;46(4):429–38. https://doi.org/10.1016/j.jsat.2013.12.007.

  14. Beidas RS, Kendall PC. Training therapists in evidence-based practice: a critical review of studies from a systems-contextual perspective. Clin Psychol Sci Pract. 2010;17(1):1–30. https://doi.org/10.1111/j.1468-2850.2009.01187.x.

  15. Colquhoun HL, Carroll K, Eva KW, Grimshaw JM, Ivers N, Michie S, et al. Advancing the literature on designing audit and feedback interventions: identifying theory-informed hypotheses. Implement Sci. 2017;12(1):117. https://doi.org/10.1186/s13012-017-0646-0.

  16. Powell BJ, Beidas RS, Lewis CC, Aarons GA, McMillen JC, Proctor EK, et al. Methods to improve the selection and tailoring of implementation strategies. J Behav Health Serv Res. 2017;44:177–94. https://doi.org/10.1007/s11414-015-9475-6.

  17. Sholomskas DE, Syracuse-Siewert G, Rounsaville BJ, Ball SA, Nuro KF, Carroll KM. We don’t train in vain: a dissemination trial of three strategies of training clinicians in cognitive-behavioral therapy. J Consult Clin Psychol. 2005;73(1):106–15. https://doi.org/10.1037/0022-006X.73.1.106.

  18. Squires DD, Gumbley SJ, Storti SA. Training substance abuse treatment organizations to adopt evidence-based practices: the Addiction Technology Transfer Center of New England Science to Service Laboratory. J Subst Abus Treat. 2008;34(3):293–301. https://doi.org/10.1016/j.jsat.2007.04.010.

  19. Mello MJ, Becker SJ, Bromberg J, Baird J, Zonfrillo MR, Spirito A. Implementing alcohol misuse SBIRT in a national cohort of pediatric trauma centers: a type III hybrid effectiveness-implementation trial. Implement Sci. 2018;13(1):35. https://doi.org/10.1186/s13012-018-0725-x.

  20. Becker SJ, Squires DD, Strong DR, Barnett NP, Monti PM, Petry NM. Training opioid addiction treatment providers to adopt contingency management: a prospective pilot trial of a comprehensive implementation science approach. Subst Abus. 2016;37(1):134–40. https://doi.org/10.1080/08897077.2015.1129524.

  21. Helseth SA, Janssen T, Scott K, Squires DD, Becker SJ. Training community-based treatment providers to implement contingency management for opioid addiction: time to and frequency of adoption. J Subst Abus Treat. 2018;95:26–34. https://doi.org/10.1016/j.jsat.2018.09.004.

  22. Becker SJ, Scott K, Murphy CM, Pielech M, Moul SA, Yap KRK, et al. User-centered design of contingency management for implementation in opioid treatment programs: a qualitative study. BMC Health Serv Res. 2019;19(1):466. https://doi.org/10.1186/s12913-019-4308-6.

  23. Bauer MS, Damschroder L, Hagedorn H, Smith J, Kilbourne AM. An introduction to implementation science for the non-specialist. BMC Psychol. 2015;3(1):32. https://doi.org/10.1186/s40359-015-0089-9.

  24. Hsieh HF, Shannon SE. Three approaches to qualitative content analysis. Qual Health Res. 2005;15(9):1277–88. https://doi.org/10.1177/1049732305276687.

  25. Graneheim UH, Lundman B. Qualitative content analysis in nursing research: concepts, procedures and measures to achieve trustworthiness. Nurse Educ Today. 2004;24(2):105–12. https://doi.org/10.1016/j.nedt.2003.10.001.

  26. Curran GM, Bauer M, Mittman B, Pyne JM, Stetler C. Effectiveness-implementation hybrid designs: combining elements of clinical effectiveness and implementation research to enhance public health impact. Med Care. 2012;50(3):217–26. https://doi.org/10.1097/MLR.0b013e3182408812.

  27. Beidas RS, Cross W, Dorsey S. Show me, don’t tell me: behavioral rehearsal as a training and analogue fidelity tool. Cogn Behav Pract. 2014;21(1):1–11. https://doi.org/10.1016/j.cbpra.2013.04.002.

  28. Stewart RE, Chambless DL. Interesting practitioners in training in empirically supported treatments: research reviews versus case studies. J Clin Psychol. 2009;66. https://doi.org/10.1002/jclp.20630.

  29. Martino S, Ball SA, Nich C, Canning-Ball M, Rounsaville BJ, Carroll KM. Teaching community program clinicians motivational interviewing using expert and train-the-trainer strategies. Addiction. 2011;106(2):428–41. https://doi.org/10.1111/j.1360-0443.2010.03135.x.

  30. Bearman SK, Bailin A, Terry R, Weisz JR. After the study ends: a qualitative study of factors influencing intervention sustainability. Prof Psychol Res Pract. 2019;51(2):134–44. https://doi.org/10.1037/pro0000258.

  31. Powell BJ, Waltz TJ, Chinman MJ, Damschroder LJ, Smith JL, Matthieu MM, et al. A refined compilation of implementation strategies: results from the Expert Recommendations for Implementing Change (ERIC) project. Implement Sci. 2015;10(1):21. https://doi.org/10.1186/s13012-015-0209-1.

  32. Huhra RL, Yamokoski-Maynhart CA, Prieto LR. Reviewing videotape in supervision: a developmental approach. J Couns Dev. 2008;86(4):412–8. https://doi.org/10.1002/j.1556-6678.2008.tb00529.x.

  33. Borrelli B. The assessment, monitoring, and enhancement of treatment fidelity in public health clinical trials. J Public Health Dent. 2011;71(s1):S52–63. https://doi.org/10.1111/j.1752-7325.2011.00233.x.

  34. Baer JS, Ball SA, Campbell BK, Miele GM, Schoener EP, Tracy K. Training and fidelity monitoring of behavioral interventions in multi-site addictions research. Drug Alcohol Depend. 2007;87(2-3):107–18. https://doi.org/10.1016/j.drugalcdep.2006.08.028.

  35. Petry NM, Ledgerwood DM. The contingency management competence scale for reinforcing attendance; 2010.

  36. Lyon AR, Stirman SW, Kerns SEU, Bruns EJ. Developing the mental health workforce: review and application of training approaches from multiple disciplines. Adm Policy Ment Health Ment Health Serv Res. 2011;38(4):238–53. https://doi.org/10.1007/s10488-010-0331-y.

  37. Polit DF, Beck CT. Generalization in quantitative and qualitative research: myths and strategies. Int J Nurs Stud. 2010;47(11):1451–8. https://doi.org/10.1016/j.ijnurstu.2010.06.004.

  38. Rieckmann T, Farentinos C, Tillotson CJ, Kocarnik J, McCarty D. The substance abuse counseling workforce: education, preparation, and certification. Subst Abus. 2011;32(4):180–90. https://doi.org/10.1080/08897077.2011.600122.

  39. Lyon AR, Munson SA, Renn BN, Atkins DC, Pullmann MD, Friedman E, et al. Use of human-centered design to improve implementation of evidence-based psychotherapies in low-resource communities: protocol for studies applying a framework to assess usability. JMIR Res Protoc. 2019;8(10):e14990. https://doi.org/10.2196/14990.

  40. Lyon AR, Koerner K. User-centered design for psychosocial intervention development and implementation. Clin Psychol Sci Pract. 2016;23(2):180–200. https://doi.org/10.1111/cpsp.12154.

  41. Dopp AR, Parisi KE, Munson SA, Lyon AR. A glossary of user-centered design strategies for implementation experts. Transl Behav Med. 2019;9(6):1057–64. https://doi.org/10.1093/tbm/iby119.

  42. Chilana PK, Ko AJ, Wobbrock J. From user-centered to adoption-centered design. In: Proc. 33rd Annu. ACM Conf. Hum. Factors Comput. Syst. New York: ACM; 2015. p. 1749–58. https://doi.org/10.1145/2702123.2702412.


Acknowledgements

The authors are incredibly grateful to the opioid treatment programs affiliated with CODAC Behavioral Healthcare and Discovery House for their collaboration and partnership on this study.

Funding

Data collection for this work was supported by the National Institute of General Medical Sciences [P20GM125507, PI: Becker] and the National Institute on Drug Abuse [R01DA046941; MPIs: Becker and Garner]. The time and effort of Dr. Scott was supported by a fellowship grant from the National Institute on Alcohol Abuse and Alcoholism [T32AA007459; PI: Monti].

Author information

Authors and Affiliations

Authors

Contributions

KS contributed to the conceptualization, qualitative codebook development, training in qualitative analysis, data analysis and interpretation, drafting, and editing of the full manuscript. SJ contributed to the data analysis, full manuscript drafting, and editing. SM and KY contributed to the qualitative interview coding, drafting of the manuscript’s “Methods” and “Results” sections, and data analysis and interpretation. CMM contributed to the data analysis and interpretation and full manuscript editing. SJB contributed to the qualitative codebook development and training in the qualitative analysis. BRG and SJB contributed to the manuscript and study conceptualization, data analysis and interpretation, and full manuscript editing. The authors read and approved the final manuscript.

Corresponding author

Correspondence to Kelli Scott.

Ethics declarations

Ethics approval and consent to participate

The Institutional Review Board at the Miriam Hospital, an affiliate of the Alpert Medical School of Brown University, approved this study and granted a waiver of documented consent (Project Number: 210718 45CFR 46.110(7)).

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Scott, K., Jarman, S., Moul, S. et al. Implementation support for contingency management: preferences of opioid treatment program leaders and staff. Implement Sci Commun 2, 47 (2021). https://doi.org/10.1186/s43058-021-00149-2
