
Professionals’ management of the fidelity–adaptation dilemma in the use of evidence-based interventions—an intervention study

Abstract

Background

Evidence-based interventions (EBIs) can be effective tools for the prevention of disease and health promotion. However, their implementation often requires a delicate balance between the need to adjust the intervention to the context in which it is implemented and the need to keep the core components that make the intervention effective. This so-called dilemma between fidelity and adaptation is often handled by health professionals in the sustainment phase of an implementation (i.e., once the intervention has been adopted and institutionalized in an organization), but not much is known about how and to what extent health professionals are affected by this dilemma. Focusing on the sustainment phase, this project aims to study (1) how fidelity and adaptation are managed by professionals using an EBI, (2) how the fidelity–adaptation dilemma affects professionals’ psychosocial working conditions, and (3) how a structured decision support influences professionals’ management of the dilemma and their psychosocial working conditions.

Methods

The study is set in Sweden, and the EBI in focus is a parental program (All Children in Focus). A longitudinal within-person intervention design is used, combined with a cross-sectional survey design. Data sources include web-based questionnaires, brief interviews, fidelity ratings, paper-and-pen questionnaires, and written documentation, collected at multiple time points with both group leaders and parents as respondents.

Discussion

This project approaches fidelity and adaptation from the perspective of the professionals who manage EBIs during the sustainment phase of implementation. Although it is well known that EBIs continue to change over time, it remains to be understood how the fidelity–adaptation dilemma can be managed so that the effectiveness of interventions is retained or improved, not diluted. Moreover, the project adds to the literature by presenting an occupational health perspective on the fidelity–adaptation dilemma. It is acknowledged that fidelity and adaptation may have consequences not only for clients but also for the occupational wellbeing of the professionals managing the dilemma and, subsequently, their willingness and ability to deliver EBIs in a sustainable way.

Background

Evidence-based interventions (EBIs) are widely recognized as effective tools for improving individuals’ health and wellbeing. To be considered “evidence-based,” an intervention must go through rigorous evaluations to prove it is effective, often in a series of trials in both controlled settings (i.e., efficacy trials) and real-world settings (i.e., effectiveness trials). The idea behind this process is that if an intervention can show positive effects even in an effectiveness trial, then the intervention is ready to be disseminated and implemented without major adaptations.

Contrary to these expectations, adaptations to EBIs are common. Between 44 and 88% of EBI users (e.g., health professionals) have reported modifying some part of the original intervention, such as the procedure, dosage, content, format, or target group [1,2,3]. These adaptations are often motivated by differences between the context for which the EBIs were designed and evaluated and the context in which they are being used, leading to a misfit between the EBI and the context of its application that must be addressed [4, 5].

Although there are studies showing high fidelity to be related to better outcomes than low fidelity [6, 7], other studies have suggested that adapted EBIs may be more effective than non-adapted EBIs [8, 9]. This tension has been referred to as the fidelity–adaptation dilemma, a term that implies a matter of either–or. Yet, contemporary conceptualizations tend to emphasize that fidelity and adaptations can coexist; that is, that the relationship is both–and, not either–or [10,11,12]. This makes fidelity and adaptation a paradox rather than a dilemma [13].

Reconciling this paradox entails considering how fidelity and adaptation should be managed so that the EBI can fit a specific context while retaining its core components [14, 15]. This ensures that the adapted EBI is at least non-inferior to and possibly better than the original version [16, 17]. This can be achieved either by adapting the context by removing obstacles to high fidelity or by adapting the EBI to fit the context, for example, by omitting or modifying components that are not applicable or feasible in a specific context [14]. Moreover, in order to ensure thoughtful decisions that promote fit without threatening the integrity of the EBI, adaptation decisions should be made proactively and with careful consideration of how they affect the EBI’s core components and subsequent effectiveness [1, 2, 18,19,20]. This implies a structured process involving multiple stakeholders and experts on the EBI, implementation, and the local context [1, 2, 18,19,20].

Several models and frameworks have been developed to structure and support the process of making decisions regarding adaptations [1, 2, 20]. The purpose is often to enable the adoption of an EBI by facilitating the dissemination of EBIs in settings that may differ from the EBI’s original development context. The models consist of elaborate processes involving multiple steps, including identification of mismatches that call for adaptations and pilot-testing and evaluation of the adapted version [21]. Thus, the adaptation process spans all implementation stages (i.e., exploration, preparation, implementation, and sustainment) [22]. For example, during the exploration and preparation phases, it is proposed that multiple stakeholders are engaged in a structured, rational decision process to identify obstacles to implementation that affect the fit between the EBI and the context (e.g., [2, 23]). During the active implementation phase, the models might propose systems to monitor fidelity (and adaptation) data, enabling data-driven decisions [24]. In this phase, substantial implementation support may be available to support iterative troubleshooting processes where fidelity and adaptation decisions are part of continuous improvement cycles [25, 26].

However, less attention has been paid to how fidelity and adaptation decisions play out later in the implementation process (i.e., the sustainment phase), when the EBI has become part of everyday practice (i.e., institutionalized). In the best-case scenario, there is no additional need for decisions about fidelity and adaptation during the sustainment phase because tensions have already been resolved. However, this assumption does not hold if the context changes, as this may throw off any carefully negotiated fit between an EBI and a context. This dynamic view is captured by the dynamic sustainability framework [17], which points to the need for continued learning and problem solving through ongoing adaptations. It is also consistent with definitions of sustainability as the ability to continuously deliver value, rather than as the consistent delivery of an EBI [27]. Therefore, the management of fidelity and adaptation during the sustainment phase needs to be further investigated.

The territory for fidelity and adaptation management is different in the sustainment phase than in the earlier implementation phases. The extra resources allocated to the implementation, including experts and data management support, have generally come to an end. Therefore, once an EBI has been adopted and implemented, the professionals delivering the EBI become the default decision-makers regarding fidelity and adaptation. With the additional resources to manage the implementation withdrawn, professionals are likely to make decisions about fidelity and adaptations under bounded conditions characterized by a lack of time, information, and thinking space in combination with multiple concurrent obligations [28,29,30], contrary to the rational, structured approach outlined in most adaptation frameworks [2, 19, 23]. However, little is known about how professionals manage fidelity and adaptation during sustained use of EBIs, and there have been few attempts to develop ways to support them in doing so.

The way in which professionals manage the fidelity–adaptation dilemma has important implications not only for the effectiveness of EBIs but also for the professionals themselves. Having to make decisions in these far from ideal situations might burden professionals. For instance, they may find themselves in emotionally or ethically charged situations in which they want to adhere to protocol but doing so is not possible, or in which they feel that adaptations would be appropriate but feel compelled to adhere to the protocol [31, 32]. This dilemma is, therefore, likely to be perceived as a potential work-related stressor [8, 33]. Thus, health professionals might be negatively affected cognitively and emotionally by dealing with the fidelity–adaptation dilemma. This highlights the need to increase our knowledge about the fidelity–adaptation dilemma as a part of professional psychosocial working conditions and to explore how professionals’ confidence and skills in managing the dilemma can be improved. To the best of our knowledge, such research is currently missing. This project aims to fill that gap.

Aims

In this project, we will study fidelity and adaptation during the sustainment phase of the implementation of an EBI (a parenting program), from the perspective of the professionals. The program, All Children in Focus, has been shown to have positive effects on children’s and parents’ health [34], and it is widely offered by community services in Sweden.

To address the research gaps mentioned above, the aims of this project are threefold: (1) to study how adaptation and fidelity are managed by professionals when using an EBI, (2) to study how the fidelity–adaptation dilemma affects professionals’ psychosocial working conditions, and (3) to study how a structured decision support intervention influences professionals’ management of the dilemma and their psychosocial working conditions. Thus, the project focuses on professionals’ experiences of managing fidelity and adaptations when using an EBI as part of their everyday work, including how they can be supported in their decision-making.

Research questions

The aims are elaborated in the following research questions (RQs):

  • RQ1) How are fidelity and adaptation managed by professionals during the (sustained) use of an EBI?

  • RQ2) What consequences does the fidelity–adaptation dilemma have for professionals’ experience of their psychosocial working conditions?

  • RQ3) How does a decision support intervention affect the fidelity–adaptation dilemma and its consequences?

RQ3 is addressed through the following four subquestions:

  • How does the decision support function, and how is it perceived by professionals?

  • How does the decision support affect professionals’ psychosocial working conditions?

  • How does the decision support affect how fidelity and adaptations are managed?

  • How does the decision support affect the value created by the EBI (outcomes for parents, for professionals, and for the organization)?

Theoretical framework

The formulation of research questions and the interpretation of results are guided by the recently proposed value equation framework [14]. This theoretical framework proposes a way in which the fidelity–adaptation dilemma can be reconciled by focusing on the value that an EBI can produce rather than just the intervention effects. The framework construes implementation strategies as a method to create a fit between EBIs and context, emphasizing that fit can be achieved either by adapting the intervention to the context or the context to the intervention. The optimal decision is the one that maximizes the value that can be achieved across clients, professionals, the organization, and the system. The value equation, in turn, relies on the dynamic sustainability framework [5], which emphasizes the need to assess care settings and outcomes on an ongoing basis, not just prior to implementation, as the analyses of the process and outcome data provide ample opportunities to refine and improve the intervention [5].
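
Purely as an illustration of this decision logic, and not a formula given in the framework itself [14], the choice among candidate fidelity and adaptation decisions can be sketched as maximizing the value created across stakeholders; the symbols below are our own shorthand.

```latex
% Illustrative shorthand only; the value equation framework [14] is conceptual
% and does not prescribe this notation.
% D: the set of feasible decisions (adhere, adapt the EBI, or adapt the context)
% S: the stakeholders (clients, professionals, the organization, the system)
% v_s(d): the value created for stakeholder s under decision d
d^{*} = \arg\max_{d \in D} \sum_{s \in S} v_{s}(d)
```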

Furthermore, we use the job demands-resources model as the guiding theory underlying the study of the dilemma as part of the professional psychosocial work environment [35]. The theory suggests that individuals’ experiences of strain and motivation at work are a function of the cognitive, emotional, and physical demands they face and the resources that are available to deal with these demands. Having too many demands and limited resources increases the risk of strain and stress, whereas sufficient resources in relation to demands result in a motivational process. Thus, the theory suggests that using an EBI under conditions that require decisions about fidelity and adaptation may put the professional under emotional and cognitive pressure if not met by sufficient resources.

Methods

Study design

This study will use a longitudinal within-person intervention design to address RQ1 and RQ3, combined with a cross-sectional survey design to address RQ2. The within-person design requires that the same individuals participate in both the control condition (using the parenting program as usual) and the experimental condition (using the parenting program after participating in the decision support intervention). This design is particularly useful when randomization or recruitment of a control group is unfeasible. The within-person design may, in addition, be complemented with a pretest–posttest design without a control group for professionals for whom a within-person design is not possible (e.g., because they do not run enough parent groups).

Setting, recruitment, and participants

The study will be performed in the context of the All Children in Focus (ABC) parent program in Sweden, in close collaboration with the organization providing training and support for ABC. The ABC program has been provided since 2011. The target group of the study is professionals (group leaders) who lead the parenting program. Parents in the groups will also be invited to participate.

The group leaders will be recruited to the study through the collaborating organization. Group leaders come from various backgrounds (e.g., teachers, psychologists, social workers); work in community organizations, such as social services and pre-schools; and are primarily from the capital region. All group leaders (N ≈ 800) who have received training in ABC from the organization will be invited to participate in the study through the organization’s mailing list. After receiving written information about the aim of the project, a description of what participation entails, how their privacy will be protected, and how participation is voluntary and can be withdrawn at any time, the group leaders will be asked to provide informed consent and their contact information through an online form.

Parents will be recruited by the group leaders participating in the study. They will be introduced to the project through a video presented at the beginning of the first parent group session or by the group leaders. Parents will also receive information about the aim of the project, a description of what participation entails, how their privacy will be protected, and how participation is voluntary and can be withdrawn at any time. They will then be asked to provide their informed consent.

The parenting program

ABC is a universal health-promoting parenting program developed for parents of children aged 3–12 years. It targets the parent–child relationship and parents’ everyday experiences with the aim of promoting children’s development [34]. The program consists of four 2.5-h group meetings in groups of up to 14 participants and one follow-up session. The program is provided in several languages, and parents can choose to attend individually or as a couple. It has previously been evaluated in an RCT and shown to be effective [34] and cost-effective [36] (for further details, see [37]).

The decision support intervention

The structured decision support targets the group leaders with the following core functions:

  1. To provide group leaders with knowledge and awareness of the relationship between how EBIs are used and the value they produce for clients, professionals, organizations, and systems, in keeping with the value equation framework [14];

  2. To enable group leaders to make informed choices concerning the adaptation of the EBI; that is, to maximize the value that the EBI can produce by optimizing adherence to core components, making changes to the context if needed, and adapting the EBI if needed to improve its use and functioning.

The decision support intervention is based on the Planned Adaptation Model [2] and the Useful Evidence Model [13] and guided by the value equation framework [14]. Participants will be guided through a process of making decisions about fidelity and adaptation based on the identification of intervention and contextual components that combine to produce the desired outcome: value. This includes the identification of core components and the activities needed to retain them, as well as the identification of components of the context that are incompatible with the achievement of this value.

The decision support intervention will be delivered in two 2.5-h workshops held 2–3 weeks apart. It will be conducted by the researchers and held either face-to-face or digitally, if needed due to pandemic restrictions.

The decision support intervention was developed in conjunction with an ongoing project aiming to support fidelity and adaptation decisions during earlier phases of implementation [38]. The face-to-face version has been pilot-tested as part of that project. The digital version has been tested separately with (a) staff from the collaborating organization and (b) four professionals delivering the parent program. Additional file 1 presents the Template for Intervention Description and Replication (TIDieR) checklist [39].

Data collection procedure

Multiple data sources will be used to address the RQs, collected at multiple time points with both group leaders and parents as respondents. We will use web-based questionnaires, brief interviews, fidelity ratings, paper-and-pen questionnaires, and written documentation (for the type of data collection methods and timing, see Fig. 1).

Fig. 1 Overview of data collections

The group leaders will first be invited to respond to a web-based group leader questionnaire (Q0), which will include an invitation to participate in the decision support intervention. Thus, the questionnaire will be used for the cross-sectional study to address RQ2. It will also be used as a baseline for the decision support intervention (RQ3). Those agreeing to participate in the intervention will also be invited to further data collections, as outlined in Fig. 1. The web-based group leader questionnaire (Q0) will be repeated at two time points (Q1 and Q2). Additional data will be collected during the parent group sessions (with parents and group leaders as respondents) and the decision support workshops (with group leaders) (see Fig. 1).

Three sources of data will be collected in conjunction with the parent group sessions. First, a parental questionnaire (P0 and P1 in Fig. 1) will be distributed to parents during the first and last parent group sessions and used to evaluate the effect of the parent groups on parent and child behavior. Second, parents will be asked to provide fidelity ratings in a fidelity questionnaire (F0–F4) at the end of each parent group session. Group leaders will distribute questionnaires to parents. Third, brief telephone interviews will be held with the group leaders after each parent group session (i1–i4) to assess fidelity and adaptations. In line with the within-person design, the data collection will be repeated for a second group after the group leader has participated in the decision support workshops.

Two data collections will take place in conjunction with the decision support workshops. First, documentation (D0 and D1) produced by the participating group leaders during the sessions describing the fidelity–adaptation decision-making process will be copied and compiled. Second, a brief process evaluation questionnaire, including appraisals of the workshop as well as knowledge of and attitudes towards fidelity and adaptations, will be collected at four time points: before the first workshop (W0), after the first (W1) and second (W2) workshops, and at a follow-up (W3) in conjunction with the last web-based group leader questionnaire (Q2).
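
For orientation, the data collection points described above (and in Fig. 1) can be summarized as a simple lookup structure. This is our own illustrative shorthand, not part of the study materials.

```python
# Illustrative summary of the planned data collection points; the labels follow
# the text above, but this structure is hypothetical shorthand, not a study tool.
data_collections = {
    "group_leader_questionnaire": ["Q0", "Q1", "Q2"],           # web-based, repeated
    "parent_questionnaire": ["P0", "P1"],                        # first and last session
    "parent_fidelity_ratings": ["F0", "F1", "F2", "F3", "F4"],   # end of each session
    "group_leader_interviews": ["i1", "i2", "i3", "i4"],         # brief telephone interviews
    "workshop_documentation": ["D0", "D1"],                      # copied during workshops
    "process_evaluation": ["W0", "W1", "W2", "W3"],              # around the two workshops
}
```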

Study variables and instruments

Table 1 presents the study variables and instruments for the data collection.

Table 1 Overview of study variables and instruments used in the data collection

Analysis

This is a multimethod study involving both qualitative and quantitative data. The qualitative data will be analyzed through content analysis. The quantitative data will be analyzed with descriptive analyses (e.g., frequencies, correlations) and more advanced analyses, such as multilevel analysis (e.g., to account for dependencies in parental data), as well as mixed-methods approaches to investigate and triangulate changes over time.
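
To make the planned quantitative approach concrete, the sketch below shows one way a multilevel (mixed-effects) model could account for parents being nested within parent groups when estimating pre–post change. The variable names and data file are hypothetical, and the actual models will be specified once the data structure is known.

```python
# Minimal sketch of a multilevel pre-post analysis, assuming a long-format dataset
# with one row per parent and time point. Column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical columns: parent_id, group_id, time (0 = first session, 1 = last), outcome
df = pd.read_csv("parent_outcomes.csv")

# A random intercept for parent group accounts for dependencies among parents who
# attended the same group; the coefficient for 'time' estimates pre-post change.
model = smf.mixedlm("outcome ~ time", data=df, groups=df["group_id"])
result = model.fit()
print(result.summary())
```

A comparable model with an added condition indicator (parent group run before vs. after the group leader’s workshops) could serve the within-person comparison for RQ3.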

Discussion

The goal of this project is to investigate how professionals experience and manage the fidelity–adaptation dilemma during the sustainment phase of implementation and how the dilemma affects their psychosocial working conditions. Moreover, we test a decision support intervention that focuses on professionals and might serve as a tool for managing the fidelity–adaptation dilemma during sustained use.

The project will contribute to the development of knowledge on the implementation of EBIs in four main ways. First, we focus on how the fidelity–adaptation dilemma plays out after the active implementation phase, when EBIs are used as part of regular services. This complements the growing literature on how adaptations are managed, which primarily focuses on earlier implementation phases, i.e., exploration, preparation, and, to some extent, active implementation. In doing so, we address a research gap concerning how the fidelity–adaptation dilemma plays out during the sustainment phase, a gap that has persisted even though it is well known that adaptations are common as EBIs are spread and used [29, 42, 57].

Second, the study adds to the current knowledge of how professionals manage fidelity–adaptation during the use of an EBI. Understanding what guides the choices professionals make when dealing with the fidelity–adaptation dilemma can contribute to the advancement of implementation science by showing what issues remain to be solved once the main implementation support is removed. The findings may also inform the design of parental programs by indicating which parts of a program are challenging for professionals to sustain.

Third, to our knowledge, little attention has been paid to how professionals themselves are affected when managing fidelity–adaptation dilemmas. The literature so far has primarily focused on the effects of the dilemma on clients (i.e., how it impacts intervention effectiveness). Consequently, we know little about the consequences of the dilemma from an occupational health perspective. For example, a group leader who struggles with the fidelity–adaptation dilemma may experience cognitive load or emotional distress. In addition to the effect this may have on the professionals themselves, it may also impact their performance as group leaders, as emotional distress has been shown to be inversely related to empathic skills [58], which in turn have been shown to affect the benefits for participating parents [59].

Fourth, the project will complement the literature on how decisions on fidelity and adaptation can be supported by testing a structured decision support intervention that focuses on professionals who are already using an EBI in daily practice (i.e., during sustainment). The intervention is novel in targeting how professionals make decisions, aiming to provide them with the awareness, knowledge, and skills to consider how their choices might impact the value of the EBI. The decision support intervention is the first attempt to develop and test an intervention for the sustainment phase of implementation based on the newly proposed value equation, offering a theoretical grounding for fidelity–adaptation decisions [14]. In this way, the decision support intervention provides a practical tool for supporting professionals in considering the impact their decisions can have on the value that the EBI can produce.

Availability of data and materials

The datasets used will be available from the corresponding author on reasonable request.

Abbreviations

EBI:

Evidence-based intervention

References

  1. Aarons GA, Green AE, Palinkas LA, Self-Brown S, Whitaker DJ, Lutzker JR, et al. Dynamic adaptation process to implement an evidence-based child maltreatment intervention. Implement Sci. 2012;7(1):1–9.
  2. Lee SJ, Altschul I, Mowbray CT. Using planned adaptation to implement evidence-based programs with new populations. Am J Community Psychol. 2008;41(3-4):290–303.
  3. Kumpfer K, Alvarado R, Smith P, Bellamy N. Cultural sensitivity and adaptation in family-based prevention interventions. Prev Sci. 2002;3(3):241–6.
  4. Glasgow RE, Vinson C, Chambers D, Khoury MJ, Kaplan RM, Hunter C. National Institutes of Health approaches to dissemination and implementation science: current and future directions. Am J Public Health. 2012;102(7):1274–81.
  5. Chambers DA, Glasgow RE, Stange KC. The dynamic sustainability framework: addressing the paradox of sustainment amid ongoing change. Implement Sci. 2013;8(1):117.
  6. Elliott DS, Mihalic S. Issues in disseminating and replicating effective prevention programs. Prev Sci. 2004;5(1):47–53.
  7. Mihalic S. The importance of implementation fidelity. Emotional Behavior Disord Youth. 2004;4(4):83–105.
  8. Sundell K, Beelmann A, Hasson H, von Thiele Schwarz U. Novel programs, international adoptions, or contextual adaptations? Meta-analytical results from German and Swedish intervention research. J Clin Child Adolesc Psychol. 2016;45(6):784–96.
  9. Barrera M Jr, Castro FG, Strycker LA, Toobert DJ. Cultural adaptations of behavioral health interventions: a progress report. J Consult Clin Psychol. 2013;81(2):196.
  10. Mejia A, Leijten P, Lachman JM, Parra-Cardona JR. Different strokes for different folks? Contrasting approaches to cultural adaptation of parenting interventions. Prev Sci. 2017;18(6):630–9.
  11. Pérez D, Van der Stuyft P, del Carmen ZM, Castro M, Lefèvre P. A modified theoretical framework to assess implementation fidelity of adaptive public health interventions. Implement Sci. 2015;11(1):91.
  12. Anyon Y, Roscoe J, Bender K, Kennedy H, Dechants J, Begun S, et al. Reconciling adaptation and fidelity: implications for scaling up high quality youth programs. J Prim Prev. 2019;40(1):35–49.
  13. Hasson H, von Thiele Schwarz U. Användbar evidens - om anpassningar och följsamhet [Useful evidence - about adaptations and fidelity]. Stockholm: Natur & Kultur; 2017.
  14. von Thiele Schwarz U, Aarons GA, Hasson H. The value equation: three complementary propositions for reconciling fidelity and adaptation in evidence-based practice implementation. BMC Health Serv Res. 2019;19(1):868.
  15. Castro FG, Yasui M. Advances in EBI development for diverse populations: towards a science of intervention adaptation. Prev Sci. 2017;18(6):623–9.
  16. Aarons GA, Sklar M, Mustanski B, Benbow N, Brown CH. “Scaling-out” evidence-based interventions to new populations or new health care delivery systems. Implement Sci. 2017;12(1):111.
  17. Chambers D, Glasgow R, Stange K. The dynamic sustainability framework: addressing the paradox of sustainment amid ongoing change. Implement Sci. 2013;8(1):117.
  18. Wiltsey Stirman S, Gamarra JM, Bartlett BA, Calloway A, Gutner CA. Empirical examinations of modifications and adaptations to evidence-based psychotherapies: methodologies, impact, and future directions. Clin Psychol Sci Pract. 2017;24(4):396–420.
  19. Castro F, Barrera M Jr, Martinez C Jr. The cultural adaptation of prevention interventions: resolving tensions between fidelity and fit. Prev Sci. 2004;5(1):41–5.
  20. Escoffery C, Lebow-Skelley E, Udelson H, Böing EA, Wood R, Fernandez ME, et al. A scoping study of frameworks for adapting public health evidence-based interventions. Transl Behav Med. 2019;9(1):1–10.
  21. Movsisyan A, Arnold L, Evans R, Hallingberg B, Moore G, O’Cathain A, et al. Adapting evidence-informed complex population health interventions for new contexts: a systematic review of guidance. Implement Sci. 2019;14(1):105.
  22. Aarons GA, Hurlburt M, Horwitz SM. Advancing a conceptual model of evidence-based practice implementation in public service sectors. Adm Policy Ment Health. 2011;38(1):4–23.
  23. Aarons GA, Green AE, Palinkas LA, Self-Brown S, Whitaker DJ, Lutzker JR, et al. Dynamic adaptation process to implement an evidence-based child maltreatment intervention. Implement Sci. 2012;7(1):32.
  24. Frykman M, Hasson H, Athlin ÅM, von Thiele Schwarz U. Functions of behavior change interventions when implementing multi-professional teamwork at an emergency department: a comparative case study. BMC Health Serv Res. 2014;14(1):1–13.
  25. Becan JE, Bartkowski JP, Knight DK, Wiley TR, DiClemente R, Ducharme L, et al. A model for rigorously applying the exploration, preparation, implementation, sustainment (EPIS) framework in the design and measurement of a large scale collaborative multi-site study. Health Justice. 2018;6(1):9.
  26. Lyon AR, Koerner K. User-centered design for psychosocial intervention development and implementation. Clin Psychol Sci Pract. 2016;23(2):180–200.
  27. Moore JE, Mascarenhas A, Bain J, Straus SE. Developing a comprehensive definition of sustainability. Implement Sci. 2017;12(1):110.
  28. Todd PM, Brighton H. Building the theory of ecological rationality. Mind Mach. 2016;26(1-2):9–30.
  29. Mosson R, Hasson H, Wallin L, von Thiele Schwarz U. Exploring the role of line managers in implementing evidence-based practice in social services and older people care. Br J Soc Work. 2017;47(2):542–60.
  30. Moore JE, Bumbarger BK, Cooper BR. Examining adaptations of evidence-based programs in natural contexts. J Prim Prev. 2013;34(3):147–61.
  31. Durlak J, DuPre E. Implementation matters: a review of research on the influence of implementation on program outcomes and the factors affecting implementation. Am J Community Psychol. 2008;41(3-4):327–50.
  32. Kälvemark S, Höglund AT, Hansson MG, Westerholm P, Arnetz B. Living with conflicts-ethical dilemmas and moral distress in the health care system. Soc Sci Med. 2004;58(6):1075–84.
  33. Burston AS, Tuckett AG. Moral distress in nursing: contributing factors, outcomes and interventions. Nurs Ethics. 2013;20(3):312–24.
  34. Ulfsdotter M, Enebrink P, Lindberg L. Effectiveness of a universal health-promoting parenting program: a randomized waitlist-controlled trial of All Children in Focus. BMC Public Health. 2014;14(1):1083.
  35. Demerouti E, Bakker AB, Nachreiner F, Schaufeli WB. The job demands-resources model of burnout. J Appl Psychol. 2001;86(3):499–512.
  36. Ulfsdotter M, Lindberg L, Månsdotter A. A cost-effectiveness analysis of the Swedish universal parenting program All Children in Focus. PLoS One. 2015;10(12):e0145201.
  37. Lindberg L, Ulfsdotter M, Jalling C, Skärstrand E, Lalouni M, Rhodin KL, et al. The effects and costs of the universal parent group program–All Children in Focus: a study protocol for a randomized wait-list controlled trial. BMC Public Health. 2013;13(1):1–12.
  38. Hasson H, Gröndal H, Rundgren ÅH, Avby G, Uvhagen H, von Thiele Schwarz U. How can evidence-based interventions give the best value for users in social services? Balance between adherence and adaptations: a study protocol. Implement Sci Commun. 2020;1(1):1–9.
  39. Hoffmann TC, Glasziou PP, Boutron I, Milne R, Perera R, Moher D, et al. Better reporting of interventions: template for intervention description and replication (TIDieR) checklist and guide. BMJ. 2014;348:g1687.
  40. Weiner BJ, Lewis CC, Stanick C, Powell BJ, Dorsey CN, Clary AS, et al. Psychometric assessment of three newly developed implementation outcome measures. Implement Sci. 2017;12(1):1–12.
  41. Rabin BA, McCreight M, Battaglia C, Ayele R, Burke RE, Hess PL, et al. Systematic, multimethod assessment of adaptations across four diverse health systems interventions. Front Public Health. 2018;6:102.
  42. Kakeeto M, Lundmark R, Hasson H, von Thiele Schwarz U. Meeting patient needs trumps adherence. A cross-sectional study of adherence and adaptations when national guidelines are used in practice. J Eval Clin Pract. 2017;23(4):830–8.
  43. Stirman S, Miller C, Toder K, Calloway A. Development of a framework and coding system for modifications and adaptations of evidence-based interventions. Implement Sci. 2013;8(1):65.
  44. Huijg JM, Gebhardt WA, Dusseldorp E, Verheijden MW, van der Zouwe N, Middelkoop BJ, et al. Measuring determinants of implementation behavior: psychometric properties of a questionnaire based on the theoretical domains framework. Implement Sci. 2014;9(1):1–15.
  45. Proctor E, Silmere H, Raghavan R, Hovmand P, Aarons G, Bunger A, et al. Outcomes for implementation research: conceptual distinctions, measurement challenges, and research agenda. Adm Policy Ment Health Ment Health Serv Res. 2011;38(2):65–76.
  46. McEachern AD, Dishion TJ, Weaver CM, Shaw DS, Wilson MN, Gardner F. Parenting young children (PARYC): validation of a self-report parenting measure. J Child Fam Stud. 2012;21(3):498–511.
  47. Malmberg M, Rydell A-M, Smedje H. Validity of the Swedish version of the strengths and difficulties questionnaire (SDQ-Swe). Nord J Psychiatry. 2003;57(5):357–63.
  48. Brestan EV, Jacobs JR, Rayfield AD, Eyberg SM. A consumer satisfaction measure for parent-child treatments and its relation to measures of child behavior change. Behav Ther. 1999;30(1):17–30.
  49. Attkisson C, Greenfield T. The client satisfaction questionnaire (CSQ) scales. In: Sederer LL, Dickey B, editors. Outcome assessment in clinical practice. Baltimore: Williams & Wilkins; 1996.
  50. Jacobs SR, Weiner BJ, Bunger AC. Context matters: measuring implementation climate among individuals and groups. Implement Sci. 2014;9(1):1–14.
  51. Pejtersen JH, Kristensen TS, Borg V, Bjorner JB. The second version of the Copenhagen psychosocial questionnaire. Scand J Public Health. 2010;38(3 suppl):8–24.
  52. Berthelsen H, Westerlund H, Søndergård KT. COPSOQ II: en uppdatering och språklig validering av den svenska versionen av en enkät för kartläggning av den psykosociala arbetsmiljön på arbetsplatser [COPSOQ II: an update and linguistic validation of the Swedish version of a questionnaire for mapping the psychosocial work environment at workplaces]: Stressforskningsinstitutet; 2014.
  53. Semmer NK, Jacobshagen N, Meier LL, Elfering A, Beehr TA, Kälin W, et al. Illegitimate tasks as a source of work stress. Work Stress. 2015;29(1):32–56.
  54. Aarons GA. Mental health provider attitudes toward adoption of evidence-based practice: the evidence-based practice attitude scale (EBPAS). Ment Health Serv Res. 2004;6(2):61–74.
  55. Rigotti T, Schyns B, Mohr G. A short version of the occupational self-efficacy scale: structural and construct validity across five countries. J Career Assess. 2008;16(2):238–55.
  56. Mosson R, Augustsson H, Bäck A, Åhström M, von Thiele Schwarz U, Richter A, et al. Building implementation capacity (BIC): a longitudinal mixed methods evaluation of a team intervention. BMC Health Serv Res. 2019;19(1):287.
  57. Moore J, Bumbarger B, Cooper B. Examining adaptations of evidence-based programs in natural contexts. J Prim Prev. 2013;34(3):147–61.
  58. Thomas MR, Dyrbye LN, Huntington JL, Lawson KL, Novotny PJ, Sloan JA, et al. How do distress and well-being relate to medical student empathy? A multicenter study. J Gen Intern Med. 2007;22(2):177–83.
  59. Giannotta F, Özdemir M, Stattin H. The implementation integrity of parenting programs: which aspects are most important? Child Youth Care Forum. 2019;48:917–33.


Acknowledgements

We would like to thank Ms Emma Hedberg Rundgren for her invaluable assistance in the pilot-testing of the decision support and in the preparation for the data collections.

Funding

This study has received research grant funding from the Swedish Research Council (Ref No 2016-01261) after a competitive peer-review process. The council is Sweden’s largest governmental research funding body, distributing close to 7 billion SEK each year to support Swedish research within all scientific fields. The funder has no role in the design and conduct of the study, including the collection, analysis, and interpretation of the data or the writing of the manuscript. The content is solely the responsibility of the authors and does not necessarily represent the official views of the Swedish Research Council. Open Access funding provided by Mälardalen University.

Author information

Authors and Affiliations

Authors

Contributions

UvTS, HH, MN, JZ, and FG designed the project. UvTS secured funding for the project and was responsible for the ethics application, supported by MN and JZ. FG drafted the first version of the study protocol. All authors discussed the draft, revised it, and approved the final manuscript.

Corresponding author

Correspondence to Ulrica von Thiele Schwarz.

Ethics declarations

Ethics approval and consent to participate

Ethical approval for this project, including all data collections, was obtained from the Swedish Ethical Review Authority (ref no. 2019-06276). Informed consent will be obtained from all study participants. Individuals who decline to participate will not be included in the dataset used for analyses.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1.

TIDieR checklist

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article


Cite this article

von Thiele Schwarz, U., Giannotta, F., Neher, M. et al. Professionals’ management of the fidelity–adaptation dilemma in the use of evidence-based interventions—an intervention study. Implement Sci Commun 2, 31 (2021). https://doi.org/10.1186/s43058-021-00131-y

