
Unraveling implementation context: the Basel Approach for coNtextual ANAlysis (BANANA) in implementation science and its application in the SMILe project



Designing intervention and implementation strategies with careful consideration of context is essential for successful implementation science projects. Although the importance of context has been emphasized and methodology for its analysis is emerging, researchers have little guidance on how to plan, perform, and report contextual analysis. Therefore, our aim was to describe the Basel Approach for coNtextual ANAlysis (BANANA) and to demonstrate its application on an ongoing multi-site, multiphase implementation science project to develop/adapt, implement, and evaluate an integrated care model in allogeneic SteM cell transplantatIon facILitated by eHealth (the SMILe project).


BANANA builds on guidance for assessing context by Stange and Glasgow (Contextual factors: the importance of considering and reporting on context in research on the patient-centered medical home, 2013). Based on a literature review, BANANA was developed in ten discussion sessions with implementation science experts and a medical anthropologist to guide the SMILe project’s contextual analysis. BANANA’s theoretical basis is the Context and Implementation of Complex Interventions (CICI) framework. Working from an ecological perspective, CICI acknowledges contextual dynamics and distinguishes between context and setting (the implementation’s physical location).


BANANA entails six components: (1) choose a theory, model, or framework (TMF) to guide the contextual analysis; (2) use empirical evidence derived from primary and/or secondary data to identify relevant contextual factors; (3) involve stakeholders throughout contextual analysis; (4) choose a study design to assess context; (5) determine contextual factors’ relevance to implementation strategies/outcomes and intervention co-design; and (6) report findings of contextual analysis following appropriate reporting guidelines. The first three components, which can partly run simultaneously, form the basis both for identifying relevant contextual factors and for the remaining components of the BANANA approach.


Understanding of context is indispensable for a successful implementation science project. BANANA provides much-needed methodological guidance for contextual analysis. In subsequent phases, it helps researchers apply the results to intervention development/adaptation and to the choice of contextually tailored implementation strategies. For future implementation science projects, BANANA’s principles will guide researchers first to gather relevant information on their target context, then to use it to inform all subsequent phases of their project, strengthening every part of their work and helping them fulfill their implementation goals.



The importance of context for a successful and sustainable implementation has gained significant attention in implementation science (IS), with contextual analysis increasingly being recognized as vital to IS methodology [1,2,3,4]. While contextual analyses’ value is widely accepted, guidance on how to conduct one is lacking and no unified definition of contextual analysis in IS exists. We understand contextual analysis as a foundational phase within IS projects to which specific research questions and IS theories, models, or frameworks (TMFs) are applied [3, 5, 6]. It entails the mapping of relevant qualitative and quantitative information about the context (e.g., multilevel implementation determinants, practice patterns) in which an intervention will be delivered. Starting (prospectively) at the beginning of each IS project, the results of the contextual analysis become the basis of all subsequent phases: they inform intervention development or adaptation, guide choices regarding implementation strategies, help interpret implementation and effectiveness outcomes, and guide the selection of sustainability strategies [7,8,9,10]. As context is dynamic and evolves, continuous monitoring of context, including, for example, repeated assessments throughout the project, is important. In our view, contextual analysis requires methodological strengthening: very limited guidance is available so far, and conceptual and methodological clarity on contextual analysis is lacking.

Conceptual inconsistencies between the applied methods and approaches hamper the development of a standardized approach [11]. In their systematic review of 64 empirical implementation studies, Rogers et al. identified over 40 distinct strategies to study context via quantitative, qualitative, and mixed-methods approaches [11]. Whereas assessment of contextual factors often focuses on the meso-level (e.g., organizational culture and climate, readiness for implementation), macro-level factors (e.g., political and economic climate) are rarely considered [12,13,14,15,16].

TMFs provide guidance on which contextual factors to study, but not on how to study context per se [17,18,19]. Commonly applied TMFs that incorporate context include the Consolidated Framework for Implementation Research (CFIR) [20], the Integrated Promoting Action on Research Implementation in Health Services (i-PARIHS) framework [21], and the Theoretical Domains Framework (TDF) [22]. This emphasis on theory contrasts with an increasing number of IS studies that map single facilitators and barriers to implementation but follow no specific theory. This absence both obscures the researchers’ rationale for choosing their variables and limits theoretical development based on empirical evidence. Such studies also commonly lack a multilevel perspective and assessments of interactions between context, intervention, and implementation [11, 15, 23,24,25,26,27]. Moreover, what contextual information they generate is rarely linked to subsequent phases of their IS project [28]. Where contextual analysis is treated not as a separate foundational phase of an IS project but as an add-on, contextual data are commonly collected only retrospectively, as part of a process evaluation [19, 29]. This precludes applying any contextual information to the IS project’s next phases. Furthermore, as findings of contextual analyses are rarely published (particularly meso-level factors, e.g., regarding practice patterns), a project’s transparency may be limited: valuable methodological observations, e.g., regarding intervention development, that might be important to other researchers’ projects go unreported ([30]; Mielke J, Brunkert T, Zúñiga F, Simon M, Zullig L, De Geest S: Methodological approaches to study context in implementation science studies: an evidence gap map, Under review).
To conclude, implementation science lacks comprehensive guidance on contextual analysis that goes beyond individual aspects (e.g., theoretical underpinning or methods to assess context).

As part of a series of guidelines commissioned by the Agency for Healthcare Research and Quality, Stange and Glasgow [31] provided an initial step-wise approach to assessing and reporting contextual factors throughout the phases of patient-centered medical home research. Since this approach was not initially developed for IS projects, it lacks further details on the operationalization of context, specific methods to assess context, and the use of contextual information to inform later IS project phases. To fill these gaps and to provide guidance on “how to do” contextual analysis in IS projects, we developed the Basel Approach for coNtextual ANAlysis (BANANA), which entails six components. Accordingly, this paper has two objectives: first, to describe the six components of BANANA; and second, to describe its application to the SMILe project, i.e., the development/adaptation, implementation, and evaluation of an integrated care model (ICM) in allogeneic stem cell transplantation facilitated by eHealth (SMILe) (Table 1) [32, 33].

Table 1 Description case example SMILe projecta ([32,33,34]; Valenta S, Ribaut J, Leppla L, Mielke J, Teynor A, Koehly K, Gerull S, Grossmann F, Witzig-Brändli V, De Geest S, for the SMILe study team: Context-specific adaptation of an eHealth-facilitated, integrated care model and tailoring its implementation strategies – a mixed-methods study as a part of the SMILe implementation science project, Under review)


To develop BANANA, we used a multiphase approach. First, we conducted a literature review focusing on methodological IS papers available via major electronic databases (PubMed, EMBASE, and Web of Science). To identify existing methodological approaches for contextual analysis, we screened the identified papers’ reference lists. The only authors to provide a methodological description of an entire contextual analysis were Stange and Glasgow [31]; others addressed only individual aspects, e.g., the use of TMFs or methods to study context [3, 11]. Although Stange and Glasgow’s approach is not IS-specific, we used it as a basis to develop BANANA and adapted it as necessary to guide a contextual analysis for the SMILe project and for IS projects in general. Briefly, SMILe is a multi-site, multiphase IS project (Table 1): Phase A entailed analyzing the context and target users’ technology acceptance, as well as developing/adapting and extending the SMILe-ICM and its setting-specific implementation strategies [34, 36, 37]. In fact, BANANA was originally developed to guide contextual analysis in this phase to inform intervention development (German setting) [34] and intervention adaptation (Swiss setting). Phase B focused on the SMILe-ICM’s implementation and evaluation using a hybrid effectiveness-implementation randomized controlled trial [32, 33].

Second, research group members (SDG, LL, SV, AT, JM) consulted with IS experts (LLZ, FZ) and a medical anthropologist (SS) in iterative discussion sessions about the identified literature and how to elaborate Stange and Glasgow’s approach for application to IS projects. We used their three-step approach as a basis, adapted it to IS (e.g., IS terminology, relevance of context for different phases of an IS project), and complemented it with identified evidence on context and its assessment (e.g., methods for data collection) as well as other relevant aspects (e.g., use of empirical evidence) (Additional file 1). Our understanding of context and how it is reflected in BANANA is conceptually based on the Context and Implementation of Complex Interventions (CICI) framework [1] and is described in detail elsewhere [38]. CICI is a meta-framework that, in contrast to other frameworks, explicitly focuses on the multilevel, dynamic nature of context, i.e., the interactions between intervention, implementation, and context. CICI operationalizes context across seven domains (geographical, epidemiological, socio-cultural, socio-economic, ethical, legal, political), each of which entails micro-, meso-, and macro-level contextual factors. CICI also differentiates between setting and context, defining the setting as the physical location within a context in which an intervention is implemented [1]; it is in the setting that interactions between the intervention, the implementation, and other contextual factors occur. Thus, contextual analysis includes an assessment not only of contextual aspects but also of the setting in which the implementation takes place. BANANA is built around CICI’s constructs (i.e., multidimensional, multilevel, dynamic), and its methodological guidance acknowledges the differences between context and setting.
Starting with an initial version of BANANA compiled by the first and last authors (JM, SDG), we further refined each of the steps (e.g., specifying reporting of contextual findings). After ten discussion rounds between all study authors, we reached a consensus on BANANA (i.e., conceptual underpinning, different components, and methodological aspects). The initial and final version of BANANA, as well as which aspects of BANANA were informed by the CICI framework, can be found in Additional file 1.


The BANANA approach entails six components (Table 2): (1) choosing a TMF, (2) using empirical evidence, (3) involving stakeholders, (4) designing a study specifically for the contextual analysis, (5) determining the relevance of contextual factors for implementation strategies/outcomes and intervention co-design, and (6) reporting on the contextual analysis. Stakeholder involvement represents a key element of contextual analysis and thus relates to all other components. BANANA’s components are presented linearly, but depending on the project, components 1–3 can occur concurrently and determine the need for and focus of component 4. That is, depending on the project aims and the contextual information already available, researchers need to carefully review in their specific case whether component 4 is relevant to gain further contextual information, and to reflect on how to realize these components. Working in an interdisciplinary research team that combines different expertise and skills can therefore be instrumental in planning and executing a contextual analysis. BANANA is explained in detail below; for each component, an example from the SMILe project is provided. Additional file 2 lists key resources that provide further guidance on each component. A detailed description of the SMILe study and its methods for contextual analysis can be found elsewhere [34].

Table 2 Description of the six components of the Basel Approach for coNtextual ANAlysis (based on Stange and Glasgow [31])

Component 1: Choosing a TMF to guide contextual analysis

Identification and selection of TMFs

In general, the use of TMFs is essential to inform and guide all phases of IS projects and increase the findings’ generalizability [24, 41,42,43]. Regarding contextual analysis, a TMF can serve as “a comprehensive starting point” to identify contextual factors that influence implementation.

The selection of a framework is often perceived as difficult, as a large number of IS and other TMFs are available [24, 41, 44, 45]. Therefore, following Moullin et al.’s recommendations [46], we suggest considering four criteria when selecting a TMF: (1) it is intended/designed for contextual analysis; (2) it acknowledges the multidimensional, multilevel, and dynamic nature of context; (3) it includes guidance on operationalization of concepts (e.g., contextual factors); (4) it fits the intervention and setting.

Resources that provide an overview of TMFs or support the identification, selection, and combining of TMFs include key IS papers [3, 24, 44, 47] and tools such as the D&I Models Webtool [48]. To justify and report TMF selection, the implementation theory comparison and selection tool (T-CaST) can provide useful guidance [41]. Based on 16 criteria relating to applicability, usability, testability, and acceptability, T-CaST provides a first attempt to select and compare TMFs [41]. Furthermore, to ensure a TMF’s fit and applicability for a specific setting and/or context, stakeholders can be involved (cf. component 3) [46].

Combining TMFs for context and setting

As context differs from the setting, we suggest combining a context-specific and a setting-specific TMF, as such combinations enhance the granularity of contextual analysis. While context-specific TMFs provide an overview of factors that may influence implementation (e.g., socio-cultural characteristics), setting-specific TMFs indicate factors that influence a specific intervention’s implementation in a specific setting (e.g., site characteristics, practice patterns, workflows, and processes within that setting). A broad variety of TMFs are available for specific settings and/or interventions, e.g., the Chronic Care Model [49] or the Primary Care Practice Development Model [39].

Case example — use of TMFs in the SMILe project

In the SMILe project, we chose the CICI framework [1] as the overarching framework for contextual analysis; as it acknowledges contextual dynamics and distinguishes between context and setting, it is, in our view, currently the most mature framework available. Working with the CICI framework, we assessed relevant micro- and meso-level contextual factors from three context domains—geographic (i.e., Internet access, type of connection), epidemiological (i.e., patient demographics), and socio-cultural (i.e., self-management, health behavior). We did not explicitly assess further contextual factors: as the SMILe project leaders (LL, SV) had themselves worked for years within the SMILe-ICM’s implementation setting, they already had implicit and explicit contextual knowledge (e.g., of organizational culture, leadership, and legal aspects).

As the SMILe project’s focus is on developing and implementing an eHealth-facilitated ICM, we combined the CICI framework with the eHealth Enhanced Chronic Care Model (eCCM) to gain a deeper understanding of the target setting (the stem cell transplant center) (Fig. 1) [1, 35]. The eCCM supports the re-design of acute care-oriented processes towards chronic illness management [35, 49]. The SMILe researchers assessed factors from the model’s five building blocks (i.e., self-management support, delivery system design, clinical decision support, clinical information systems, eHealth education) [34]. Micro-level factors of interest included self-management support and technology openness; on the meso-level, they included transplant center structural characteristics, practice patterns in follow-up care, the level of chronic illness management, team composition, and clinician demographics [35]. Macro-level factors were considered but not explicitly assessed and reported (e.g., legal aspects).

Fig. 1

Combination of the Context and Implementation of Complex Interventions (CICI) framework [1] and the eHealth Enhanced Chronic Care Model (eCCM) [35] to guide contextual analysis within the SMILe project. Figure adapted from Pfadenhauer et al. [1] and Gee et al. [35]

Component 2: Using empirical evidence for contextual analysis

TMFs provide a comprehensive overview of how context is conceptualized and which contextual factors are relevant for implementation. However, not all aspects mentioned in TMFs are relevant to each IS project. Therefore, using available empirical evidence can help to determine what is already known about the specific implementation context and relevant contextual factors. Four sources of evidence exist: (1) local data and information, (2) professional knowledge/clinical experience, (3) patient experiences and preferences, and (4) research [50]. The first three can be considered through stakeholder input (cf. component 3); local data can also be identified, e.g., by studying audit and performance data. To assess evidence from research, a literature review on relevant contextual factors should be conducted [1, 11, 51].

Additional file 3 provides an overview of micro-, meso-, and macro-level contextual factors we identified (via our literature review) as the most commonly reported influencing implementation [11, 13, 27, 52]. Additionally, Rogers et al. identified team-level factors (e.g., team characteristics and teamwork, team stability, morale) important to implementation [11]. Reviews such as these provide broad views of relevant contextual factors. However, whenever possible, researchers need to consider evidence on implementation determinants for specific interventions, target groups, or settings, e.g., Evans et al.’s research toolkit to study organizational contextual factors influencing the implementation of ICMs [53]. That toolkit includes a framework and measurement tools to study those factors.

Empirical evidence can shed light on gaps in our understanding of context and can point to aspects of context that need further inquiry. It thus informs the focus and extent of the primary data collection in component 4, or may even reveal that, in certain circumstances, no further primary data collection is required to inform subsequent steps of the implementation project [1].

Case example — use of empirical evidence in the SMILe project

In order to optimally inform the SMILe contextual analysis, all sources of evidence were used. First, a literature review revealed limited evidence on follow-up practice patterns in allogeneic stem cell transplantation (research evidence). Other identified studies reported on challenges with eHealth implementation (e.g., low adoption rates) [54,55,56,57], including relevant contextual factors that tend to hinder implementation (e.g., technology acceptance, interoperability of technology, financial resources) [58,59,60,61,62]. The findings of the literature review served as the basis for deciding which factors to explore in more detail as part of our contextual analysis [63]. Based on the factors identified, questionnaire surveys and interview guides for contextual analysis were chosen—and, where necessary, complemented—to clarify our picture of the studied context. For example, as part of the contextual analysis, target patients’ technology openness was assessed, and patients’ and clinicians’ experiences using eHealth to support health or healthy behaviors were explored [34, 64].

Second, in addition to the literature review, our studies in allogeneic stem cell transplantation as well as clinical experience of the SMILe team and patient feedback highlighted the challenges patients face in trying to improve their self-management behavior, e.g., medication non-adherence or physical activity [63, 65]. Based on this evidence, we added specific questions about self-management challenges and how to overcome them to our interviews and focus groups [34].

Furthermore, the SMILe project leaders’ knowledge of, among other things, the organizational culture, leadership, work processes, and structures in the implementation setting helped (1) to specify factors relevant to the contextual analysis (e.g., level of chronic illness management); (2) to identify relevant stakeholders in the clinical setting and to rate their influence on the implementation; (3) to provide a basic understanding of the setting’s readiness for change and its openness towards eHealth technology; and (4) to inform the study design (e.g., feasibility of the recruitment strategy and inclusion/exclusion criteria) and the selection of appropriate methods (e.g., a combination of individual and focus group interviews with clinicians was selected due to conflicting time schedules). Finally, information on the setting’s resources (e.g., financial, staffing) and the operability of the IT system was gained in individual, informal stakeholder meetings (local data and information). Those meetings were essential for the context-based development and adaptation of the SMILe-ICM, for identifying potential hindering factors for its implementation in the specific setting, and for planning ahead for its sustainability.

Component 3: Stakeholder involvement in contextual analysis

Stakeholder involvement is essential in every component of a contextual analysis. Stakeholders are “those individuals [or organizations] without whose support and feedback an organization, or a project within an organization [or beyond] cannot subsist” [40]. They may be targeted or affected by an intervention (e.g., patients and caregivers), actively implement it (e.g., healthcare practitioners), or decide on whether it will be implemented (e.g., organizational leaders, policy makers) [1, 66,67,68,69]. It is also possible to ask specific experts (e.g., epidemiologists, researchers) for input on dedicated topics.

Identification of stakeholders and development of a stakeholder strategy

Which stakeholders to involve in contextual analysis always depends on the project’s specific focus [40]. To help ensure productive and robust stakeholder involvement, developing a stakeholder strategy is key. This strategy indicates which stakeholders will be involved, and how, at each step of the contextual analysis; specifies each group’s tasks; and outlines methods or tools to involve each group. Essentially, stakeholder selection must be systematic, i.e., involving multiple stakeholder perspectives from every relevant level (micro, meso, and/or macro), while balancing power and bridging inter-group disparities, e.g., between patients and care specialists. Identified stakeholders can be mapped on a matrix (i.e., an influence-interest-capacity matrix) that specifies their characteristics, e.g., role, degree of influence, anticipated effects, and outcomes of involving them [40]. Throughout the project, continuous changes in context require continuous involvement of the stakeholders, e.g., via regular stakeholder meetings [10, 30]. Furthermore, stakeholders’ needs must be continuously evaluated and the strategy adapted as necessary.
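The quadrant logic behind such a matrix can be sketched in code. The following is a minimal, hypothetical illustration: the stakeholder names, the 1–5 rating scale, and the quadrant labels (which follow common stakeholder-mapping conventions) are our own examples, not data from the SMILe project or a prescribed part of BANANA.

```python
# Illustrative sketch of an influence-interest stakeholder matrix.
# Stakeholder names, ratings, and thresholds are hypothetical examples.

def map_stakeholder(influence: int, interest: int, threshold: int = 3) -> str:
    """Classify a stakeholder into a quadrant of the influence-interest matrix.

    Ratings are assumed to be on a 1-5 scale; quadrant labels follow
    common stakeholder-mapping conventions, not SMILe terminology.
    """
    high_influence = influence >= threshold
    high_interest = interest >= threshold
    if high_influence and high_interest:
        return "manage closely"   # e.g., involve in co-design sessions
    if high_influence:
        return "keep satisfied"   # e.g., brief decision-makers regularly
    if high_interest:
        return "keep informed"    # e.g., share findings and updates
    return "monitor"              # minimal, periodic contact

# Hypothetical stakeholder register: (influence, interest) on a 1-5 scale
stakeholders = {
    "transplant director": (5, 3),
    "outpatient nurse": (3, 5),
    "hospital IT": (4, 2),
    "patient representative": (2, 5),
}

for name, (influence, interest) in stakeholders.items():
    print(f"{name}: {map_stakeholder(influence, interest)}")
```

In practice, such a register would also record each stakeholder’s role, anticipated effects, and the resources needed for involvement, and would be revisited as the context evolves.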

Stakeholder tasks and tools for involvement

Since no specific guidance is available regarding stakeholder involvement in IS projects, general guidelines such as INVOLVE [70] or the PARADIGM Patient Engagement Toolbox [71] can support researchers to plan stakeholder tasks and choose tailored tools for stakeholder involvement. Within a contextual analysis, stakeholder tasks can include helping to choose a TMF, identifying/selecting relevant contextual factors for analysis, and evaluating and monitoring those factors throughout the project. By helping research teams interpret the findings of the contextual analysis, stakeholders can also deepen their understanding of inherent inter-factor relationships. Further tasks include supporting the development of contextually adapted intervention and implementation strategies [31, 46, 66, 72]. In these ways, stakeholder involvement can contribute to interventions’ acceptance, adoption, and feasibility. That is, engaged stakeholders will add considerably to an intervention’s successful implementation and sustainment [73].

Case example — stakeholder involvement in the SMILe project

The SMILe project involves stakeholders at multiple levels throughout the project and thus also in the contextual analysis [34]. Potential stakeholders were identified in brainstorming sessions and via one-to-one in-depth discussions by the SMILe research team, project leaders, and clinicians working in the field. Selections were based on their expert opinions and their perceptions of each stakeholder’s influence on the implementation process and sustainability of the SMILe-ICM, as well as each stakeholder’s interest in the SMILe-ICM. In the second center, a stakeholder matrix was developed, indicating each stakeholder’s impact/influence and the resources necessary for their involvement (Valenta S, Ribaut J, Leppla L, Mielke J, Teynor A, Koehly K, Gerull S, Grossmann F, Witzig-Brändli V, De Geest S, for the SMILe study team: Context-specific adaptation of an eHealth-facilitated, integrated care model and tailoring its implementation strategies – a mixed-methods study as a part of the SMILe implementation science project, Under review). The final stakeholder group included the target group (stem cell-transplanted patients) and implementers (transplant team members, e.g., in- and outpatient nurses, physicians, and psycho-oncologists). Furthermore, extending beyond the “typical end-users,” we also involved decision-makers (e.g., transplant directors, nursing directors, and head nurses) and other stakeholders, including hospital IT staff, medical controllers, and patients’ family members. In addition to being tremendously useful in identifying the most appropriate participants for focus groups, these stakeholders participated in data collection, supported interpretation of study findings, helped develop/adapt the SMILe-ICM, and helped choose/adapt implementation strategies [34]. The stakeholders were involved longitudinally (i.e., over the whole project period) to keep track of changes in the dynamic and evolving context.
Over the course of the contextual analysis, other stakeholders, including experts in medical device regulation and health insurers, were involved via individual in-depth interviews.

Component 4: Study design for contextual analysis

As contextual analysis is, in our view, a foundational phase that functions as a separate study within IS projects, it requires additional considerations regarding research methods and study design. Data collection concerning relevant contextual factors is informed by components 1–3 of BANANA, i.e., theory, empirical evidence, and stakeholder input; the choice of methods is driven by the research questions. In addition, considering that available resources (time, funding, personnel) for contextual analysis are usually constrained, researchers need to strike a “balance between speed and rigor” [74]. This balance will influence how extensively a contextual analysis can be carried out and which methods can be applied [75].

Methods and measurement tools to study context

To deepen the research team’s understanding of the context, a combination of quantitative and qualitative methods is typically used [11, 76, 77]. Where mixed-methods approaches are used [78,79,80,81], the overall focus can be on quantitative data, qualitative data, or any supported mix of the two [78, 82].

Quantitative methods include numerous types of surveys (e.g., online surveys, paper-pencil questionnaires, telephone surveys), systematic interviews, direct observations, or routine data. Quantitatively assessed contextual factors include, e.g., implementation climate, organizational culture and climate, available resources, and readiness for change. Several reviews provide overviews of current measurement tools and their psychometric properties [14, 15, 25, 83,84,85,86,87,88]. Furthermore—for instance, on the CFIR [89] and EPIS framework [90] project websites—measurement and data extraction tools are available to assess aspects of context mentioned in the frameworks. However, before applying any such measurement tools, research teams must ensure that they are appropriate for their intended use, produce psychometrically sound results, and will be used consistently over time to ensure the comparability of results [16].

To explore qualitative contextual factors, interviews (unstructured or semi-structured), focus groups, observations, or document analysis can be applied [91]. Qualitative methods are particularly suitable for identifying stakeholders’ preferences and needs, values, beliefs, and attitudes, and how these influence their behavior. Published recommendations for the use of qualitative methods in IS include a white paper by the National Cancer Institute [91,92,93,94]. Furthermore, for certain frameworks, such as the CFIR or CICI, interview guides have been developed to guide the exploration of context constructs [1, 89].

Some of these quantitative and qualitative approaches have been criticized for focusing only on specific levels (e.g., the meso/organizational level) or “only provid[ing] a cursory view of complex and dynamic contexts” [29]. However, ethnographic methods can complement quantitative and/or qualitative data, facilitating in-depth insights into the organizational and contextual processes that influence implementation. An ethnographic approach can help highlight interactions within the context that remain undetected by other methods but that may have a substantial impact on the intended implementation [29]. Furthermore, details that may not be obvious to the interviewee (e.g., ritualized everyday actions, cultural and social norms) or differences between what is said and what is done can be identified via ethnographic methods [95,96,97].

Considering the limited resources available for contextual analysis, the current trend is increasingly toward rapid qualitative or rapid ethnographic approaches. For example, the Rapid Assessment Procedure-Informed Clinical Ethnography (RAPICE) method combines rapid assessment procedures with ethnography [74, 98]. Initial evidence suggests that rapid research methods can be as effective and rigorous as traditional approaches while being more time- and cost-effective [98,99,100]. However, a research team planning to use these methods for the first time should be aware that applying them effectively and efficiently may require special training, multiple attempts, and methodological adaptations to fit their research setting [99,100,101,102].

Timepoints for data collection

BANANA focuses on the prospective assessment of context; however, as context is dynamic and continues to evolve, timepoints for considering context should be planned throughout the IS project. This does not mean starting over with component 1 of BANANA, but rather “keeping the thermometers in the system.” This can be achieved, for instance, through repeated assessments (e.g., surveys or use of routine data) or other methods (e.g., observations, site visits, document analysis), as well as through regular exchanges or informal conversations with stakeholders [38]. Currently, little guidance is available regarding which contextual factors to record at which timepoints and how frequently [31, 103]. Further insights may be gained from Ariadne Labs’ Atlas Initiative, which aims to develop a data repository of contextual factors related to the implementation success of different interventions in different settings and at different timepoints of analysis (before implementation, 6 weeks after implementation, and monthly thereafter) [104, 105].

Case example — SMILe project data collection and analysis

For the SMILe contextual analysis, an explanatory mixed-methods (quantitative/qualitative) design was applied based on the research aims formulated [34]. Specific aims of this analysis are described in Table 1. Data collection and analysis were guided by the eCCM and the CICI framework. First, questionnaire surveys were conducted with each participating center’s patients, clinicians, and transplant director. These questionnaires allowed us to assess each center’s structural characteristics, practice patterns regarding chronic illness management, overall level of chronic illness management, current levels of self-management behavior, and technology openness and acceptance [34]. We also gathered the demographic characteristics of patients and clinicians.

The questionnaires cover the eCCM’s five building blocks and were applied by the research team in previous work on heart and solid organ transplantation [64, 106,107,108,109]. All questionnaires were adapted as appropriate to the allogeneic stem cell transplant setting. In some cases, we supplemented the questionnaires with further contextually relevant factors (e.g., patients’ acceptance of symptom monitoring and data sharing), based on aspects described in the context domain of the CICI framework ([35]; Valenta S, Ribaut J, Leppla L, Mielke J, Teynor A, Koehly K, Gerull S, Grossmann F, Witzig-Brändli V, De Geest S, for the SMILe study team: Context-specific adaptation of an eHealth-facilitated, integrated care model and tailoring its implementation strategies – a mixed-methods study as a part of the SMILe implementation science project, Under review).

Second, to support our understanding of the quantitative findings and to gain a deeper understanding of further aspects relevant to the development/adaptation and implementation of the SMILe-ICM (e.g., patients’ performance of self-management tasks, patients’ and clinicians’ barriers to technology use), we conducted focus groups with clinicians as well as focus groups and individual interviews with patients. In both cases, our interview guides followed the eCCM’s building blocks [34]. In the second center, where we implemented an adapted version of the SMILe-ICM, we explored factors facilitating or hindering the SMILe-ICM’s implementation by means of focus group discussions ([35]; Valenta S, Ribaut J, Leppla L, Mielke J, Teynor A, Koehly K, Gerull S, Grossmann F, Witzig-Brändli V, De Geest S, for the SMILe study team: Context-specific adaptation of an eHealth-facilitated, integrated care model and tailoring its implementation strategies – a mixed-methods study as a part of the SMILe implementation science project, Under review). An overview of the variables assessed in the surveys and themes explored during the focus groups and individual interviews can be found in Additional file 4.

Quantitative and qualitative data were collected over a 1-year period. Ongoing changes in context (e.g., changes in leadership) were noted by the SMILe project leaders and documented. The team also had a regular exchange with stakeholders via informal conversations and official stakeholder meetings. The data analysis followed three eCCM-guided steps: (1) analysis of quantitative (descriptive tables) and qualitative results (meta-maps), (2) mapping of quantitative and qualitative findings in a joint display, and (3) reflection on findings and their implications for intervention development and choices of implementation strategies (again in a joint display) [34].

Component 5: Identifying and describing the relevance of contextual and setting factors for intervention co-design, implementation strategies, and outcomes

Implementation success and sustainment of an intervention and the implementation strategy depend heavily on how well they align to the target context [8, 110,111,112,113,114].

Intervention development and selection of implementation strategies

Numerous frameworks/guidelines help researchers develop interventions and select implementation strategies. One of these is the Medical Research Council (MRC) guidance for the development and evaluation of complex interventions in healthcare. Whereas context was previously considered mainly during process evaluation, i.e., retrospectively, the MRC guidance now recommends examining interactions between the intervention and context across all phases of intervention development, implementation, and evaluation (see footnote 3) [114].

Another guidance focusing on both intervention development and implementation strategies is Bartholomew, Parcel, and Kok’s “Intervention Mapping”—a five-step process, the foundation of which is a contextual analysis [115]. Other methods that can be applied to match implementation strategies to contextual factors are concept mapping, group model building, and conjoint analysis [116].

Furthermore, originally designed to facilitate implementation strategy choices, the CFIR–ERIC Implementation Strategy Matching Tool speeds the identification of implementation strategies available in the Expert Recommendations for Implementing Change (ERIC) compilation. The ERIC compilation’s strategies address specific constructs described in the CFIR framework [7]. Just as with implementation strategies, specific sustainability strategies can also be selected to ensure that a successfully implemented intervention remains in clinical practice.

Adaptation of interventions and implementation strategies

Even where proven interventions or implementation strategies are available, adaptations are usually required to ensure their effectiveness in a new context [110, 111, 117]. However, before making changes, it is necessary to distinguish between core intervention components—which have to be implemented as they are to achieve the desired effect—and those adaptable to various contexts [111]. Building on the idea of Intervention Mapping, Implementation Mapping was developed for use with interventions that have already been developed and tested [25]. To ensure that an adaptation is transparent and reproducible, a description should be given of which contextual details necessitate it and how the proposed adaptation addresses those details [118]. Another source of guidance for adapting interventions is the three-step ADAPT guidance [110]. When adapting an intervention, it is always necessary to record which intervention components or implementation strategies were adapted, in which ways, and why. Frameworks such as FRAME [118] and FRAME-IS [119] can support this process.

Interpretation of implementation and effectiveness outcomes

An intervention’s likely effects will vary across contexts and settings [120]. The findings of the contextual analysis help to understand mechanisms that influence the implementation process (i.e., what was implemented and how well), and how these mechanisms will likely influence the intended intervention’s effectiveness. Usually, this component is part of a process evaluation [120].

To describe how and why a specific intervention leads to its expected effects, as well as to trace causal pathways between intervention components, implementation strategies, and contextual factors, it will be necessary to develop a program theory [114, 121].

Case example — relevance of the contextual analysis for development/adaptation of the SMILe-ICM and implementation strategies

Contextual analysis guided the development/adaptation of the SMILe-ICM and the selection of implementation strategies. All quantitative and qualitative findings were synthesized in a joint display and their implications for the intervention summarized. Identified gaps both in self-management (e.g., symptom recognition) and in delivery system design (e.g., chronic care delivery, continuity of care) highlighted the need to re-engineer the current acute care model towards an ICM [34]. Following the Behavior Change Wheel methodology, we considered the identified determinants to help us choose intervention functions and behavioral change techniques [33, 122]. As patients and clinicians were open to the use of eHealth technology but expressed concerns that technology might replace human contact, the SMILe-ICM intervention includes both human- and technology-based components [33]. The adaptations of the SMILe-ICM and its implementation strategies followed the FRAME and FRAME-IS frameworks (Valenta S, Ribaut J, Leppla L, Mielke J, Teynor A, Koehly K, Gerull S, Grossmann F, Witzig-Brändli V, De Geest S, for the SMILe study team: Context-specific adaptation of an eHealth-facilitated, integrated care model and tailoring its implementation strategies – a mixed-methods study as a part of the SMILe implementation science project, Under review). The ERIC taxonomy was used to choose and describe context-specific implementation strategies (e.g., conducting local consensus discussions, creating new clinical teams) [34, 123]. In addition, the contextual analysis itself represented a valuable implementation strategy: conducting a local needs assessment. Finally, as part of phase B, the implementation pathway and outcomes (i.e., acceptability, appropriateness, feasibility, and fidelity) will be assessed from patients’ and healthcare providers’ perspectives, and likely influences of context will be considered [32].

Component 6: Reporting of contextual analysis

As contextual analysis informs subsequent phases of an IS project—affecting, for example, intervention development—it is a critical component of that project and needs to be reported as such [124, 125]. However, given the limited space available in journal articles, detailed findings of contextual analyses and their uses should be reported in separate papers. These are by no means restricted to dedicated IS journals but can also include journals with a clinical focus [126]. An even more serious impediment to the reporting and dissemination of contextual findings is the lack of clear, comprehensive guidelines on how to report contextual analyses [124, 125]. For instance, the Standards for Reporting Implementation Studies (StaRI checklist) recommend the CFIR for reporting relevant contextual factors; however, information specifying which aspects of the contextual analysis to report is missing [20, 127, 128].

Case example — reporting of the SMILe contextual analysis

The SMILe project’s contextual analysis findings for its first study site were published in a separate paper, which also described the research team’s implementation strategies and outlined the findings’ implications for re-engineering stem cell transplant follow-up care [34]. A second paper described how the research team based their choices of intervention components and modes of delivery on information from the contextual analysis [33]. In applying the BANANA approach to the SMILe project, we focused on making our decision-making processes and results transparent and replicable. That is, at each step, we ensured that both the results and the processes used to achieve them can be employed by other researchers (e.g., for scale-up).


In our view, contextual analysis should be the foundation of every IS project. As noted above, contextual analysis results inform all subsequent project phases, enhancing interventions’ implementation and sustainability in real-world settings. In comparison to previous studies on facilitators and barriers, the BANANA approach does not only describe individual methods to study context (e.g., surveys or interviews); it also provides methodological guidance on planning and conducting contextual analyses in IS projects. Furthermore, BANANA describes how contextual information can be reported and used to inform further project phases (e.g., intervention development). While we have described BANANA in terms of six individual components, not all components have to be performed in every contextual analysis, nor do they always operate sequentially (Fig. 2). The first three in particular—choosing a TMF, identifying empirical evidence, and involving stakeholders—are partly concurrent or can be executed in a different order, with stakeholder involvement being a key component linked to all other components. Once in place, they form a firm foundation upon which to identify and assess relevant contextual factors (component 4).

Fig. 2
figure 2

Overview of the six components of the Basel Approach for coNtextual ANAlysis (BANANA)

When presenting BANANA at conferences or in workshops, participants often ask us for a checklist they can apply to their project’s contextual analysis. However, we have deliberately avoided a “checklist approach”: the aspects of context to be studied and the methods chosen always depend on the individual research project and its research questions. Applying a checklist risks oversimplifying the context and undervaluing the complex interconnections of contextual factors, many of which differ from one setting to the next [77, 129]. In the worst cases, only superficial contextual knowledge would be generated, limiting the contextual analysis’ capacity either to inform later phases of the IS project or to ensure the implementation’s success [77, 130]. Therefore, planning and conducting a contextual analysis usually requires a high degree of reflexivity and an experienced transdisciplinary research team with expertise in IS (e.g., knowledge and use of implementation TMFs, understanding of all implementation phases) and broad knowledge of how to apply research methods.

In addition to the project and its research questions, pre-existing contextual knowledge and the researchers’ roles influence the planning and conduct of a more targeted contextual analysis [38, 131]. The SMILe project leaders (LL, SV) had both worked for several years in the SMILe-ICM’s target setting: both have ample experience in the care of stem cell-transplanted patients as well as implicit and explicit knowledge of the target context and setting (e.g., work processes, available resources, leadership, organizational culture, and legal aspects). These experiences and their roles within the clinical setting may have shaped the focus of our contextual analysis [38]. Given their background as advanced practice nurses, specific values, assumptions, and beliefs (i.e., mental models) drive their understanding of context—including, for example, which contextual factors they perceive as relevant to implementation, or how they interpret specific findings of the contextual analysis (i.e., confirmation bias). However, the use of a theoretical underpinning to guide the contextual analysis (i.e., the eCCM and the CICI framework), as well as various evidence sources in addition to the project leaders’ professional knowledge, might counteract potential bias [38]. Furthermore, being embedded in the context gives them an insider perspective that may ultimately be helpful not only for conducting the contextual analysis (e.g., involving stakeholders from the setting), but also for supporting the implementation and sustainability of the intervention in practice [38, 132].

Furthermore, although the importance of context in IS projects has been widely emphasized, funding agencies remain hesitant to fund contextual analyses, as these are not yet recognized as a crucial part of an IS project. A contextual analysis’ rigor and thoroughness both reflect the available resources, such as time, personnel, and especially funding [74, 75]. These circumstances should be considered when evaluating a contextual analysis and interpreting its results. In projects where resources are constrained, we do not recommend omitting components of BANANA that are relevant to the individual project, but rather rethinking how the components are carried out. In terms of empirical evidence, for example, informal meetings with key stakeholders could provide sufficient understanding of relevant contextual factors to narrow down the amount of data collected during the contextual analysis. Other resource-related considerations include the timeframe of the contextual analysis (e.g., one vs. several timepoints), the number of participants involved, and the methods chosen (e.g., rapid qualitative methods). Informal conversations with stakeholders can also complement data collection or recurrent measures of context when changes in context need to be observed. In addition, funding agencies need to acknowledge contextual analysis as a foundational phase of IS projects and provide specific funding mechanisms to resource it adequately.

Strengths and limitations

BANANA was developed based on evidence and expert discussion and was successfully applied within the SMILe project to guide intervention development [34] as well as intervention adaptation ([35]; Valenta S, Ribaut J, Leppla L, Mielke J, Teynor A, Koehly K, Gerull S, Grossmann F, Witzig-Brändli V, De Geest S, for the SMILe study team: Context-specific adaptation of an eHealth-facilitated, integrated care model and tailoring its implementation strategies – a mixed-methods study as a part of the SMILe implementation science project, Under review). Its six components provide overall, theory-based guidance for “how to do” contextual analyses in IS projects and raise questions regarding contextual analysis that need to be answered individually for each project. BANANA is not limited to the stem cell transplant population or to any particular clinical setting; indeed, contextual analyses based on its principles have already been conducted in community-based care [133] and geriatric care [134], and further applications, for example in the community pharmacy setting, are planned. Based on our operationalization of contextual analysis, BANANA is particularly useful for earlier-stage work (e.g., preparatory work, hybrid 1 studies) as well as for IS studies that include the development/adaptation, implementation, and evaluation of an intervention and its implementation strategies. For studies focusing on the sustainment or scale-up of interventions, the extent to which BANANA’s components are relevant might differ. Additional testing will be necessary to ensure its reliability for other project phases (e.g., sustainability, scale-up) and other settings (e.g., in low- and middle-income countries). Furthermore, we are considering methods of finding a broader consensus among implementation experts regarding BANANA’s six components, e.g., by applying a Delphi approach.
Another limitation of BANANA is that interactions in context—particularly regarding how individuals are embedded within a context, and how they are influenced by and shape that context—require more consideration than was possible within the scope of this study. Therefore, we plan to develop BANANA further and complement it with social science elements [38].

Implications for research and funders

Improving researchers’ consideration of context and their reporting of it in IS studies will clearly require conceptual and methodological developments; however, further measures are also required. First, acknowledging contextual analysis as the foundational first phase of every IS project, and recognizing its relevance to implementation success, requires funding agencies to rethink how to support this phase. That is, adequately resourcing contextual analyses will require specific funding schemes, with timelines that accommodate both a thorough contextual analysis and the subsequent components (e.g., intervention development) [77]. Second, the reporting of context should be a condition for the publication of IS projects; appropriate standards and guidelines must be developed to support researchers in meeting this requirement.


Contextual analysis is a foundational phase of every IS project, providing essential information to all subsequent phases. The BANANA approach successfully guided the SMILe project’s contextual analysis. To help researchers make sense of their target contexts, and to strengthen every part of their work, this approach’s principles can also be applied to other IS projects. However, further adaptation and testing of BANANA in other projects are required. Equally importantly, considering the vast heterogeneity of the studies we reviewed, a coordinated campaign will be required to unify and enhance IS researchers’ efforts to conduct and report on contextual analyses. As a first step, a common set of analysis and reporting guidelines would do much to improve the success and quality of implementation efforts.

Availability of data and materials

Data analyzed during this study are available from the corresponding author upon reasonable request.


  1. Context can be defined as “a set of characteristics and circumstances that consist of active and unique factors, within which the implementation is embedded” [1]. Context is multilevel, multidimensional, and dynamic. It interacts with an intervention and its implementation in the setting, i.e., the “physical location, in which an intervention is put into practice” [1].

  2. Document analysis can be quantitative as well.

  3. Four phases of the MRC guidance: (1) developing a new intervention or identifying an already existing intervention, (2) assessing the feasibility and acceptability of an intervention, (3) assessing an intervention (evaluation), and (4) implementing an intervention [114].



Abbreviations

BANANA: Basel Approach for coNtextual ANAlysis in implementation science

CICI framework: Context and Implementation of Complex Interventions framework


eCCM: eHealth Enhanced Chronic Care Model


ICM: Integrated care model


IS: Implementation science


TMF: Theory, model, or framework


References

  1. Pfadenhauer LM, Gerhardus A, Mozygemba K, Lysdahl KB, Booth A, Hofmann B, et al. Making sense of complexity in context and implementation: the Context and Implementation of Complex Interventions (CICI) framework. Implement Sci. 2017;12(1):21.

  2. Rogers L, De Brún A, McAuliffe E. Development of an integrative coding framework for evaluating context within implementation science. BMC Med Res Methodol. 2020;20(1):158.

  3. Nilsen P, Bernhardsson S. Context matters in implementation science: a scoping review of determinant frameworks that describe contextual determinants for implementation outcomes. BMC Health Serv Res. 2019;19(1):189.

  4. Dryden-Palmer KD, Parshuram CS, Berta WB. Context, complexity and process in the implementation of evidence-based innovation: a realist informed review. BMC Health Serv Res. 2020;20(1):81.

  5. De Geest S, Zúñiga F, Brunkert T, Deschodt M, Zullig LL, Wyss K, et al. Powering Swiss health care for the future: implementation science to bridge “the valley of death”. Swiss Med Wkly. 2020;150:w20323.

  6. Davis M, Beidas RS. Refining contextual inquiry to maximize generalizability and accelerate the implementation process. Implement Res Pract. 2021;2:2633489521994941.

  7. Waltz TJ, Powell BJ, Fernández ME, Abadie B, Damschroder LJ. Choosing implementation strategies to address contextual barriers: diversity in recommendations and future directions. Implement Sci. 2019;14(1):42.

  8. Craig P, Di Ruggiero E, Frolich KL, Mykhalovskiy E, White M, Campbell R, et al. Taking account of context in population health intervention research: guidance for producers, users and funders of research; 2018.

  9. Squires JE, Aloisio LD, Grimshaw JM, Bashir K, Dorrance K, Coughlin M, et al. Attributes of context relevant to healthcare professionals’ use of research evidence in clinical practice: a multi-study analysis. Implement Sci. 2019;14(1):52.

  10. Chambers DA, Glasgow RE, Stange KC. The dynamic sustainability framework: addressing the paradox of sustainment amid ongoing change. Implement Sci. 2013;8(1):117.

  11. Rogers L, De Brún A, McAuliffe E. Defining and assessing context in healthcare implementation studies: a systematic review. BMC Health Serv Res. 2020;20(1):591.

  12. Allen JD, Towne SD, Maxwell AE, DiMartino L, Leyva B, Bowen DJ, et al. Measures of organizational characteristics associated with adoption and/or implementation of innovations: a systematic review. BMC Health Serv Res. 2017;17(1):591.

  13. Watson DP, Adams EL, Shue S, Coates H, McGuire A, Chesher J, et al. Defining the external implementation context: an integrative systematic literature review. BMC Health Serv Res. 2018;18(1):209.

  14. Chor KHB, Wisdom JP, Olin S-CS, Hoagwood KE, Horwitz SM. Measures for predictors of innovation adoption. Adm Policy Ment Health. 2015;42(5):545–73.

  15. Chaudoir SR, Dugan AG, Barr CH. Measuring factors affecting implementation of health innovations: a systematic review of structural, organizational, provider, patient, and innovation level measures. Implement Sci. 2013;8:22.

  16. Lewis CC, Stanick CF, Martinez RG, Weiner BJ, Kim M, Barwick M, et al. The Society for Implementation Research Collaboration Instrument Review Project: a methodology to promote rigorous evaluation. Implement Sci. 2015;10(1):2.

  17. Davidoff F. Understanding contexts: how explanatory theories can help. Implement Sci. 2019;14(1):23.

  18. Johns G. Reflections on the 2016 decade award: incorporating context in organizational research. Acad Manage Rev. 2017;42(4):577–95.

  19. Kirk MA, Kelley C, Yankey N, Birken SA, Abadie B, Damschroder L. A systematic review of the use of the consolidated framework for implementation research. Implement Sci. 2015;11(1):72.

  20. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009;4:50.

  21. Harvey G, Kitson A. PARIHS revisited: from heuristic to integrated framework for the successful implementation of knowledge into practice. Implement Sci. 2016;11:33.

  22. Atkins L, Francis J, Islam R, O’Connor D, Patey A, Ivers N, et al. A guide to using the Theoretical Domains Framework of behaviour change to investigate implementation problems. Implement Sci. 2017;12(1):77.

  23. Moullin JC, Sabater-Hernandez D, Fernandez-Llimos F, Benrimoj SI. A systematic review of implementation frameworks of innovations in healthcare and resulting generic implementation framework. Health Res Policy Syst. 2015;13:16.

  24. Tabak RG, Khoong EC, Chambers DA, Brownson RC. Bridging research and practice: models for dissemination and implementation research. Am J Prev Med. 2012;43(3):337–50.

  25. Fernandez ME, Walker TJ, Weiner BJ, Calo WA, Liang S, Risendal B, et al. Developing measures to assess constructs from the Inner Setting domain of the Consolidated Framework for Implementation Research. Implement Sci. 2018;13(1):52.

  26. Pfadenhauer LM, Mozygemba K, Gerhardus A, Hofmann B, Booth A, Lysdahl KB, et al. Context and implementation: a concept analysis towards conceptual maturity. Z Evid Fortbild Qual Gesundhwes. 2015;109(2):103–14.

  27. Squires JE, Graham I, Bashir K, Nadalin-Penno L, Lavis J, Francis J, et al. Understanding context: a concept analysis. J Adv Nurs. 2019;0(0):1–23.

  28. Szymczak JE. Beyond barriers and facilitators: the central role of practical knowledge and informal networks in implementing infection prevention interventions. BMJ Qual Saf. 2018;27(10):763–5.

  29. Haines ER, Kirk MA, Lux L, Smitherman AB, Powell BJ, Dopp A, et al. Ethnography and user-centered design to inform context-driven implementation. Transl Behav Med. 2022;12(1):ibab077.

  30. Neta G, Glasgow RE, Carpenter CR, Grimshaw JM, Rabin BA, Fernandez ME, et al. A framework for enhancing the value of research for dissemination and implementation. Am J Public Health. 2015;105(1):49–57.

  31. Stange KC, Glasgow RE. Contextual factors: the importance of considering and reporting on context in research on the patient-centered medical home. Rockville: Agency for Healthcare Research and Quality; 2013. ARHQ Publication No. 13-0045-EF

  32. De Geest S, Valenta S, Ribaut J, Gerull S, Mielke J, Simon M, et al. The SMILe Integrated Care Model in Allogeneic SteM Cell TransplantatIon faciLitated by eHealth: a protocol for a hybrid 1 effectiveness-implementation randomized controlled trial. BMC Health Serv Res. 2022;22(1):1067.

  33. Leppla L, Schmid A, Valenta S, Mielke J, Beckmann S, Ribaut J, et al. Development of an integrated model of care for allogeneic stem cell transplantation facilitated by eHealth-the SMILe study. Support Care Cancer. 2021;29:8045–57.

  34. Leppla L, Mielke J, Kunze M, Mauthner O, Teynor A, Valenta S, et al. Clinicians and patients perspectives on follow-up care and eHealth support after allogeneic hematopoietic stem cell transplantation: a mixed-methods contextual analysis as part of the SMILe study. Eur J Oncol Nurs. 2020;45:101723.

  35. Gee PM, Greenwood DA, Paterniti DA, Ward D, Miller LM. The eHealth enhanced chronic care model: a theory derivation approach. J Med Internet Res. 2015;17(4):e86.

  36. Ribaut J, Leppla L, Teynor A, Valenta S, Dobbels F, Zullig LL, et al. Theory-driven development of a medication adherence intervention delivered by eHealth and transplant team in allogeneic stem cell transplantation: the SMILe implementation science project. BMC Health Serv Res. 2020;20(1):827.

  37. Leppla L, Hobelsberger S, Rockstein D, Werlitz V, Pschenitza S, Heidegger P, et al. Implementation science meets software development to create eHealth components for an integrated care model for allogeneic stem cell transplantation facilitated by eHealth: the SMILe study as an example. J Nurs Scholarsh. 2020;53(1):35–45.

  38. Mielke J, De Geest S, Zúñiga F, Brunkert T, Zullig LL, Pfadenhauer LM, et al. Understanding dynamic complexity in context—enriching contextual analysis in implementation science from a constructivist perspective. Front Health Serv. 2022;2:1–7.

  39. Miller WL, Crabtree BF, Nutting PA, Stange KC, Jaén CR. Primary care practice development: a relationship-centered approach. Ann Fam Med. 2010;8(Suppl 1):S68–79; s92.

  40. Barkhordarian A, Demerjian G, Jan A, Sama N, Nguyen M, Du A, et al. Stakeholder engagement analysis - a bioethics dilemma in patient-targeted intervention: patients with temporomandibular joint disorders. J Transl Med. 2015;13(1):15.

  41. Birken SA, Rohweder CL, Powell BJ, Shea CM, Scott J, Leeman J, et al. T-CaST: an implementation theory comparison and selection tool. Implement Sci. 2018;13(1):143.

  42. Birken SA, Powell BJ, Shea CM, Haines ER, Alexis Kirk M, Leeman J, et al. Criteria for selecting implementation science theories and frameworks: results from an international survey. Implement Sci. 2017;12(1):124.

  43. Davidoff F, Dixon-Woods M, Leviton L, Michie S. Demystifying theory and its use in improvement. BMJ Qual Saf. 2015;24(3):228–38.

  44. Strifler L, Cardoso R, McGowan J, Cogo E, Nincic V, Khan PA, et al. Scoping review identifies significant number of knowledge translation theories, models, and frameworks with limited use. J Clin Epidemiol. 2018;100:92–102.

  45. Liang L, Bernhardsson S, Vernooij RW, Armstrong MJ, Bussières A, Brouwers MC, et al. Use of theory to plan or evaluate guideline implementation among physicians: a scoping review. Implement Sci. 2017;12(1):1–12.

  46. Moullin JC, Dickson KS, Stadnick NA, Albers B, Nilsen P, Broder-Fingert S, et al. Ten recommendations for using implementation frameworks in research and practice. Implement Sci Commun. 2020;1(1):42.

  47. Minogue V, Matvienko-Sikar K, Hayes C, Morrissey M, Gorman G, Terres A. The usability and applicability of knowledge translation theories, models, and frameworks for research in the context of a national health service. Health Res Policy Syst. 2021;19(1):105.

  48. D&I Models Webtool []. Accessed 29 Sept 2021.

  49. Wagner EH, Austin BT, Von Korff M. Organizing care for patients with chronic illness. Milbank Q. 1996;74(4):511–44.

  50. Rycroft-Malone J, Seers K, Titchen A, Harvey G, Kitson A, McCormack B. What counts as evidence in evidence-based practice? J Adv Nurs. 2004;47(1):81–90.

  51. Squires JE, Hutchinson AM, Coughlin M, Bashir K, Curran J, Grimshaw JM, et al. Stakeholder perspectives of attributes and features of context relevant to knowledge translation in health settings: a multi-country analysis. Int J Health Policy Manag. 2021;11(8):1373–90.

  52. Li S-A, Jeffs L, Barwick M, Stevens B. Organizational contextual features that influence the implementation of evidence-based practices across healthcare settings: a systematic integrative review. Syst Rev. 2018;7(1):72.

  53. Evans JM, Grudniewicz A, Gray CS, Wodchis WP, Carswell P, Baker GR. Organizational context matters: a research toolkit for conducting standardized case studies of integrated care initiatives. Int J Integr Care. 2017;17(2):1–10.

  54. Simblett S, Greer B, Matcham F, Curtis H, Polhemus A, Ferrão J, et al. Barriers to and facilitators of engagement with remote measurement technology for managing health: systematic review and content analysis of findings. J Med Internet Res. 2018;20(7):e10480.

  55. Jeffs E, Vollam S, Young JD, Horsington L, Lynch B, Watkinson PJ. Wearable monitors for patients following discharge from an intensive care unit: practical lessons learnt from an observational study. J Adv Nurs. 2016;72(8):1851–62.

  56. Thies K, Anderson D, Cramer B. Lack of adoption of a mobile app to support patient self-management of diabetes and hypertension in a federally qualified health center: interview analysis of staff and patients in a failed randomized trial. JMIR Hum Factors. 2017;4(4):e24.

  57. Glasgow RE, Phillips SM, Sanchez MA. Implementation science approaches for integrating eHealth research into practice and policy. Int J Med Inform. 2014;83(7):e1–e11.

  58. Asthana S, Jones R, Sheaff R. Why does the NHS struggle to adopt eHealth innovations? A review of macro, meso and micro factors. BMC Health Serv Res. 2019;19(1):984.

  59. Dünnebeil S, Sunyaev A, Blohm I, Leimeister JM, Krcmar H. Determinants of physicians’ technology acceptance for e-health in ambulatory care. Int J Med Inform. 2012;81(11):746–60.

  60. Faber S, van Geenhuizen M, de Reuver M. eHealth adoption factors in medical hospitals: a focus on the Netherlands. Int J Med Inform. 2017;100:77–89.

  61. Granja C, Janssen W, Johansen MA. Factors determining the success and failure of eHealth interventions: systematic review of the literature. J Med Internet Res. 2018;20(5):e10235.

  62. Ross J, Stevenson F, Lau R, Murray E. Factors that influence the implementation of e-health: a systematic review of systematic reviews (an update). Implement Sci. 2016;11(1):146.

  63. Kirsch M, Berben L, Johansson E, Calza S, Eeltink C, Stringer J, et al. Nurses’ practice patterns in relation to adherence-enhancing interventions in stem cell transplant care: a survey from the Nurses Group of the European Group for Blood and Marrow Transplantation. Eur J Cancer Care. 2014;23(5):607–15.

  64. Vanhoof JMM, Vandenberghe B, Geerts D, Philippaerts P, De Mazière P, DeVito DA, et al. Technology experience of solid organ transplant patients and their overall willingness to use interactive health technology. J Nurs Scholarsh. 2018;50(2):151–62.

  65. Gresch B, Kirsch M, Fierz K, Halter J, Nair G, Denhaerynck K, et al. Medication nonadherence to immunosuppressants after adult allogeneic haematopoietic stem cell transplantation: a multicentre cross-sectional study. Bone Marrow Transplant. 2017;52(2):304–6.

  66. Ramanadhan S, Davis MM, Armstrong R, Baquero B, Ko LK, Leng JC, et al. Participatory implementation science to increase the impact of evidence-based cancer prevention and control. Cancer Causes Control. 2018;29(3):363–9.

  67. Glasgow RE, Chambers D. Developing robust, sustainable, implementation systems using rigorous, rapid and relevant science. Clin Transl Sci. 2012;5(1):48–55.

  68. Goodman MS, Sanders Thompson VL. The science of stakeholder engagement in research: classification, implementation, and evaluation. Transl Behav Med. 2017;7(3):486–91.

  69. Varvasovszky Z, Brugha R. A stakeholder analysis. Health Policy Plan. 2000;15(3):338–45.

  70. INVOLVE. Briefing notes for researchers: involving the public in NHS, public health and social care research. Eastleigh: INVOLVE; 2012.

  71. Patients Active in Research and Dialogues for an Improved Generation of Medicines (PARADIGM) []. Accessed 29 Sept 2021.

  72. Churruca K, Ludlow K, Taylor N, Long JC, Best S, Braithwaite J. The time has come: embedded implementation research for health care improvement. J Eval Clin Pract. 2019;25(3):373–80.

  73. Bombard Y, Baker GR, Orlando E, Fancott C, Bhatia P, Casalino S, et al. Engaging patients to improve quality of care: a systematic review. Implement Sci. 2018;13(1):98.

  74. Ramanadhan S, Revette AC, Lee RM, Aveling EL. Pragmatic approaches to analyzing qualitative data for implementation science: an introduction. Implement Sci Commun. 2021;2(1):70.

  75. Subramanian L, Elam M, Healey AJ, Paquette E, Henrich N. Context matters—but what aspects? The need for evidence on essential aspects of context to better inform implementation of quality improvement initiatives. Jt Comm J Qual Patient Saf. 2021;47(11):748–52.

  76. Pfadenhauer LM. Conceptualizing context and intervention as a system in implementation science: learning from complexity theory; comment on “Stakeholder Perspectives of Attributes and Features of Context Relevant to Knowledge Translation in Health Settings: A Multi-country Analysis”. Int J Health Policy Manag. 2021;11(8):1570–3.

  77. Howarth E, Devers K, Moore G, O’Cathain A, Dixon-Woods M. Contextual issues and qualitative research. In: Challenges, solutions and future directions in the evaluation of service innovations in health care and public health. Health Serv Deliv Res, vol. 4; 2016. p. 105–20.

  78. Palinkas LA, Aarons GA, Horwitz S, Chamberlain P, Hurlburt M, Landsverk J. Mixed method designs in implementation research. Adm Policy Ment Health. 2011;38(1):44–53.

  79. Beidas RS, Wolk CL, Walsh LM, Evans AC Jr, Hurford MO, Barg FK. A complementary marriage of perspectives: understanding organizational social context using mixed methods. Implement Sci. 2014;9:175.

  80. Albright K, Gechter K, Kempe A. Importance of mixed methods in pragmatic trials and dissemination and implementation research. Acad Pediatr. 2013;13(5):400–7.

  81. Green CA, Duan N, Gibbons RD, Hoagwood KE, Palinkas LA, Wisdom JP. Approaches to mixed methods dissemination and implementation research: methods, strengths, caveats, and opportunities. Adm Policy Ment Health. 2015;42(5):508–23.

  82. Creswell JW, Plano Clark VL. Designing and conducting mixed methods research. 3rd ed. Los Angeles: Sage; 2018.

  83. McHugh S, Dorsey CN, Mettert K, Purtle J, Bruns E, Lewis CC. Measures of outer setting constructs for implementation research: a systematic review and analysis of psychometric quality. Implement Res Pract. 2020;1:2633489520940022.

  84. Clinton-McHarg T, Yoong SL, Tzelepis F, Regan T, Fielding A, Skelton E, et al. Psychometric properties of implementation measures for public health and community settings and mapping of constructs against the Consolidated Framework for Implementation Research: a systematic review. Implement Sci. 2016;11(1):148.

  85. Kien C, Schultes M-T, Szelag M, Schoberberger R, Gartlehner G. German language questionnaires for assessing implementation constructs and outcomes of psychosocial and health-related interventions: a systematic review. Implement Sci. 2018;13(1):150.

  86. Gagnon M-P, Attieh R, Ghandour EK, Légaré F, Ouimet M, Estabrooks CA, et al. A systematic review of instruments to assess organizational readiness for knowledge translation in health care. PLoS One. 2014;9(12):e114338.

  87. Rabin BA, Lewis CC, Norton WE, Neta G, Chambers D, Tobin JN, et al. Measurement resources for dissemination and implementation research in health. Implement Sci. 2016;11:42.

  88. Weiner BJ, Mettert KD, Dorsey CN, Nolen EA, Stanick C, Powell BJ, et al. Measuring readiness for implementation: a systematic review of measures’ psychometric and pragmatic properties. Implement Res Pract. 2020;1:2633489520933896.

  89. Consolidated Framework for Implementation Research []. Accessed 10 Oct 2021.

  90. EPIS Framework []. Accessed 10 Oct 2021.

  91. National Cancer Institute. Qualitative methods in implementation science; 2015.

  92. Hamilton AB, Finley EP. Qualitative methods in implementation research: an introduction. Psychiatry Res. 2019;280:112516.

  93. Palinkas LA, Mendon SJ, Hamilton AB. Innovations in mixed methods evaluations. Annu Rev Public Health. 2019;40(1):423–42.

  94. Creswell JW, Poth CN. Qualitative inquiry & research design: choosing among five approaches. 4th ed. Thousand Oaks: SAGE Publications; 2018.

  95. Checkland K, Harrison S, Marshall M. Is the metaphor of ‘barriers to change’ useful in understanding implementation? Evidence from general medical practice. J Health Serv Res Policy. 2007;12(2):95–100.

  96. Soom Ammann E, Van Holten K. Mit allen Sinnen ins Feld - Teilnehmende Beobachtung als Methode [Into the field with all senses - participant observation as a method]. QuPuG. 2017;4(1):6–14.

  97. Daae J, Boks C. A classification of user research methods for design for sustainable behaviour. J Clean Prod. 2015;106:680–9.

  98. Palinkas LA, Zatzick D. Rapid assessment procedure informed clinical ethnography (RAPICE) in pragmatic clinical trials of mental health services implementation: methods and applied case study. Adm Policy Ment Health. 2019;46(2):255–70.

  99. Nevedal AL, Reardon CM, Opra Widerquist MA, Jackson GL, Cutrona SL, White BS, et al. Rapid versus traditional qualitative analysis using the Consolidated Framework for Implementation Research (CFIR). Implement Sci. 2021;16(1):67.

  100. Vindrola-Padros C, Johnson GA. Rapid techniques in qualitative research: a critical review of the literature. Qual Health Res. 2020;30(10):1596–604.

  101. Vindrola-Padros C, Vindrola-Padros B. Quick and dirty? A systematic review of the use of rapid ethnographies in healthcare organisation and delivery. BMJ Qual Saf. 2018;27(4):321–30.

  102. Gale RC, Wu J, Erhardt T, Bounthavong M, Reardon CM, Damschroder LJ, et al. Comparison of rapid vs in-depth qualitative analytic methods from a process evaluation of academic detailing in the Veterans Health Administration. Implement Sci. 2019;14(1):11.

  103. Coles E, Anderson J, Maxwell M, Harris FM, Gray NM, Milner G, et al. The influence of contextual factors on healthcare quality improvement initiatives: a realist review. Syst Rev. 2020;9(1):94.

  104. The Atlas Initiative: five questions with Natalie Henrich []. Accessed 30 Nov 2021.

  105. Atlas Initiative []. Accessed 30 Nov 2021.

  106. Berben L, Denhaerynck K, Dobbels F, Engberg S, Vanhaecke J, Crespo-Leiro MG, et al. Building research initiative group: chronic illness management and adherence in transplantation (BRIGHT) study: study protocol. J Adv Nurs. 2015;71(3):642–54.

  107. Denhaerynck K, Berben L, Dobbels F, Russell CL, Crespo-Leiro MG, Poncelet AJ, et al. Multilevel factors are associated with immunosuppressant nonadherence in heart transplant recipients: the international BRIGHT study. Am J Transplant. 2018;18(6):1447–60.

  108. Berben L, Russell CL, Engberg S, Dobbels F, De Geest S. Development, content validity and inter-rater reliability testing of the Chronic Illness Management Implementation – Building Research Initiative Group: Chronic Illness Management and Adherence in Transplantation: an instrument to assess the level of chronic illness management implemented in solid organ transplant programmes. Int J Care Coord. 2014;17(1-2):59–71.

  109. Gugiu PC, Coryn C, Clark R, Kuehn A. Development and evaluation of the short version of the Patient Assessment of Chronic Illness Care instrument. Chronic Illn. 2009;5(4):268–76.

  110. Moore G, Campbell M, Copeland L, Craig P, Movsisyan A, Hoddinott P, et al. Adapting interventions to new contexts—the ADAPT guidance. BMJ. 2021;374:n1679.

  111. Chambers DA, Norton WE. The adaptome: advancing the science of intervention adaptation. Am J Prev Med. 2016;51(4, Supplement 2):S124–31.

  112. Bleijenberg N, de Man-van Ginkel JM, Trappenburg JCA, Ettema RGA, Sino CG, Heim N, et al. Increasing value and reducing waste by optimizing the development of complex interventions: enriching the development phase of the Medical Research Council (MRC) Framework. Int J Nurs Stud. 2018;79:86–93.

  113. Haines ER, Dopp A, Lyon AR, Witteman HO, Bender M, Vaisson G, et al. Harmonizing evidence-based practice, implementation context, and implementation strategies with user-centered design: a case example in young adult cancer care. Implement Sci Comm. 2021;2(1):45.

  114. Skivington K, Matthews L, Simpson SA, Craig P, Baird J, Blazeby JM, et al. A new framework for developing and evaluating complex interventions: update of Medical Research Council guidance. BMJ. 2021;374:n2061.

  115. Bartholomew LK, Parcel GS, Kok G. Intervention mapping: a process for developing theory and evidence-based health education programs. Health Educ Behav. 1998;25(5):545–63.

  116. Powell BJ, Beidas RS, Lewis CC, Aarons GA, McMillen JC, Proctor EK, et al. Methods to improve the selection and tailoring of implementation strategies. J Behav Health Serv Res. 2017;44(2):177–94.

  117. Powell BJ, Fernandez ME, Williams NJ, Aarons GA, Beidas RS, Lewis CC, et al. Enhancing the impact of implementation strategies in healthcare: a research agenda. Front Public Health. 2019;7(3):1–9.

  118. Wiltsey Stirman S, Baumann AA, Miller CJ. The FRAME: an expanded framework for reporting adaptations and modifications to evidence-based interventions. Implement Sci. 2019;14(1):58.

  119. Miller CJ, Barnett ML, Baumann AA, Gutner CA, Wiltsey-Stirman S. The FRAME-IS: a framework for documenting modifications to implementation strategies in healthcare. Implement Sci. 2021;16(1):36.

  120. Moore GF, Audrey S, Barker M, Bond L, Bonell C, Hardeman W, et al. Process evaluation of complex interventions: Medical Research Council guidance. BMJ. 2015;350:h1258.

  121. O'Cathain A, Croot L, Duncan E, Rousseau N, Sworn K, Turner KM, et al. Guidance on how to develop complex interventions to improve health and healthcare. BMJ Open. 2019;9(8):e029954.

  122. Michie S, van Stralen MM, West R. The behaviour change wheel: a new method for characterising and designing behaviour change interventions. Implement Sci. 2011;6(1):42.

  123. Powell BJ, Waltz TJ, Chinman MJ, Damschroder LJ, Smith JL, Matthieu MM, et al. A refined compilation of implementation strategies: results from the Expert Recommendations for Implementing Change (ERIC) project. Implement Sci. 2015;10:21.

  124. Wells M, Williams B, Treweek S, Coyle J, Taylor J. Intervention description is not enough: evidence from an in-depth multiple case study on the untold role and impact of context in randomised controlled trials of seven complex interventions. Trials. 2012;13(1):95.

  125. Datta J, Petticrew M. Challenges to evaluating complex interventions: a content analysis of published papers. BMC Public Health. 2013;13(1):568.

  126. Mielke J, Brunkert T, Zullig LL, Bosworth HB, Deschodt M, Simon M, et al. Relevant journals for identifying implementation science articles: results of an international implementation science expert survey. Front Public Health. 2021;9(458):1–8.

  127. Pinnock H, Barwick M, Carpenter CR, Eldridge S, Grandes G, Griffiths CJ, et al. Standards for reporting implementation studies (StaRI) statement. BMJ. 2017;356:i6795.

  128. Pinnock H, Barwick M, Carpenter CR, Eldridge S, Grandes G, Griffiths CJ, et al. Standards for reporting implementation studies (StaRI): explanation and elaboration document. BMJ Open. 2017;7(4):e013318.

  129. Tomoaia-Cotisel A, Scammon DL, Waitzman NJ, Cronholm PF, Halladay JR, Driscoll DL, et al. Context matters: the experience of 14 research teams in systematically reporting contextual factors important for practice change. Ann Fam Med. 2013;11(Suppl 1):S115–23.

  130. Rogers L, De Brún A, Birken SA, Davies C, McAuliffe E. Context counts: a qualitative study exploring the interplay between context and implementation success. J Health Organ Manag. 2021;35(7):802–24.

  131. Meier N, Dopson S. Theoretical lenses on context. In: Meier N, Dopson S, editors. Context in Action and How to Study It: Illustrations from Health Care. 1st ed. United States: Oxford University Press; 2019. p. 13–32.

  132. Vindrola-Padros C, Pape T, Utley M, Fulop NJ. The role of embedded research in quality improvement: a narrative review. BMJ Qual Saf. 2017;26(1):70–80.

  133. Yip O, Huber E, Stenz S, Zullig LL, Zeller A, De Geest SM, et al. A contextual analysis and logic model for integrated care for frail older adults living at home: the INSPIRE project. Int J Integr Care. 2021;21(2):1–16.

  134. AdvantAGE - Development and implementation of an ADVANced Practice Nurse-led interprofessional Transitional cAre model for frail GEriatric adults. []. Accessed 18 Aug 2022.


Acknowledgements

The authors would like to thank Thekla Brunkert, Hélène Schoemans, and Kristina Arnahoutova for their critical feedback on the manuscript and Chris Shultis for the manuscript editing.


Funding

Not applicable.

Author information

Authors and Affiliations



Contributions

JM and SDG conceptualized the study and developed an initial version of BANANA. BANANA was iteratively further developed by LL, SV, LLZ, FZ, SS, AT, SDG, and JM. The manuscript was drafted by JM; SDG, FZ, LLZ, LL, SV, SS, and AT provided ongoing feedback and critically revised the manuscript. The authors read and approved the final manuscript.

Corresponding author

Correspondence to Sabina De Geest.

Ethics declarations

Ethics approval and consent to participate

Not applicable.

Consent for publication

Not applicable.

Competing interests

LLZ reports research support from Proteus Digital Health and the PhRMA Foundation, as well as consulting for Pfizer and Novartis. SDG consults for Sanofi and Novartis. All activities are unrelated to the current work.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1.

The Basel Approach for coNtextual ANAlysis (BANANA): Overview of its development process and theoretical underpinning.

Additional file 2.

Key resources for each component of the Basel Approach for coNtextual ANAlysis (BANANA).

Additional file 3.

Overview of contextual factors most commonly reported in empirical evidence to influence implementation.

Additional file 4.

Overview of variables assessed and themes explored in the SMILe project.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Mielke, J., Leppla, L., Valenta, S. et al. Unraveling implementation context: the Basel Approach for coNtextual ANAlysis (BANANA) in implementation science and its application in the SMILe project. Implement Sci Commun 3, 102 (2022).
