
Process mapping with failure mode and effects analysis to identify determinants of implementation in healthcare settings: a guide

Abstract

Background

Generating and analyzing process maps can help identify and prioritize barriers to the implementation of evidence-based practices in healthcare settings. Guidance on how to systematically apply and report these methods in implementation research is scant. We describe a method combining a qualitative approach to developing process maps with a quantitative evaluation of maps using a method drawn from the quality improvement literature called failure mode and effects analysis (FMEA).

Methods

We provide an outline and guidance for how investigators can use process mapping with FMEA to identify and prioritize barriers when implementing evidence-based clinical interventions. Suggestions for methods and reporting were generated based on established procedures for process mapping with FMEA and through review of original research papers which apply both methods in healthcare settings. We provide case examples to illustrate how this approach can be operationalized in implementation research.

Results

The methodology of process mapping with FMEA can be divided into four broad phases: 1) formulating a plan, 2) generating process maps to identify and organize barriers over time, 3) prioritizing barriers through FMEA, and 4) devising an implementation strategy to address priority barriers. We identified 14 steps across the four phases. Two illustrative examples are provided. Case 1 describes the implementation of referrals to chiropractic care for adults with low back pain in primary care clinics. Case 2 describes the implementation of a family navigation intervention for children with autism spectrum disorder seeking care in pediatric clinics. For provisional guidance for reporting, we propose the REporting Process mapping and Analysis for Implementation Research (REPAIR) checklist.

Conclusions

Process mapping with FMEA can elucidate barriers and facilitators to successful implementation of evidence-based clinical interventions. This paper provides initial guidance for more systematic applications of this methodology in implementation research. Future research should use a consensus-building approach, such as a multidisciplinary Delphi panel, to further delineate the reporting standards for studies that use process mapping with FMEA.


Background

Accurately identifying the determinants that impede (i.e. barriers) or allow (i.e. facilitators) the uptake of evidence-based practices is a central mission in implementation research [1]. Implementation strategies can target these determinants with the goal of increasing the rate at which evidence-based practices become part of routine clinical care [2]. Prioritizing which determinants are targeted in a specific setting may be critical to developing feasible and effective implementation strategies [3]. In this paper, we consider identifying barriers and facilitators through the development of process maps, along with their subsequent evaluation through failure mode and effects analysis (FMEA).

Current methods for identifying barriers are limited in that they often do not explicitly explore relationships between barriers and/or do not consider how barriers occur or interact over time, even though they may rely on frameworks that consider time or stages of implementation (e.g., EPIS framework) [4]. Visualizing the occurrence of barriers as they impede the implementation of an evidence-based practice may help to prioritize barriers and inform where or when to intervene with specific implementation strategies. Furthermore, widely used qualitative approaches for identifying barriers and facilitators—such as in-depth interviews or focus groups—do not necessarily create the opportunity to prioritize barriers based on when or how often they occur. Quantitative surveys alone are also limited, as they may lack the flexibility needed to identify and prioritize distinct barriers across different healthcare settings.

The development and analysis of process maps using FMEA may address some of these challenges. FMEA, which has origins in engineering and manufacturing operations, is increasingly used in healthcare settings. Prior guides to this methodology have focused on the general steps of conducting FMEA rather than a systematic approach to creating process maps, or vice versa [5,6,7]. Furthermore, FMEA is often used to assess quality or safety of delivery of a procedure rather than to identify barriers or facilitators of successful implementation of interventions in or across healthcare and community settings. Indeed, a case study by Kononowech et al. noted that pragmatic, tangible examples of process mapping techniques are lacking in the implementation science literature [8]. Further guidance on this method is needed so that it may be more broadly applied in implementation efforts within healthcare settings. In turn, additional applications of this method are needed to evaluate its validity or utility for identifying determinants of implementation in comparison with more traditional methods.

Use of process maps with FMEA may also help to align implementation science with improvement practice through partnering with improvement scientists who have experience with this methodology [9]. However, additional development of this approach is needed so that methods are clear, systematic and reproducible, particularly for projects that use this method across multiple clinics or health systems. Thus, there is an opportunity for synergistic collaboration across quality improvement and implementation science fields to advance the methodology and achieve the shared goal of improving patient care [10].

Methods

In this manuscript we describe the use of process mapping with FMEA to identify and prioritize barriers that can be the target of tailored implementation strategies. The approach we describe was informed by: 1) prior guides or literature reviews on use of FMEA with or without guidance on process mapping; and 2) reviewing original research papers which apply process mapping with FMEA in healthcare settings. In addition to presenting a step-by-step guide with a reporting checklist, we provide two case studies to illustrate the use of this methodology in implementation research.

Results

Literature review

We identified several guides and reviews involving process mapping with FMEA [5, 6, 11] or FMEA only [7, 12,13,14,15,16,17,18,19,20,21,22] in healthcare settings. We handsearched these articles (for citations forwards and backwards in time) to find a list of original research articles that report use of both process mapping and FMEA. For example, a recent systematic review identified 105 studies using process mapping in healthcare settings [5], 8 of which used FMEA to analyze process maps [23,24,25,26,27,28,29,30]. We did not find published guidelines for conducting process mapping and FMEA in implementation research.

Additionally, we searched PubMed for articles through January 2024. Our search strategy, described further in Supplemental File 1, identified 721 potentially relevant articles. In total, we identified 54 original research articles that presented both process maps and FMEA from our literature review and handsearching citations of relevant articles. Of these 54, only 1 self-identified as being ‘implementation research’ [31]. The rest self-identified as quality improvement studies, although several also indicated that the end goal of their work was to implement an innovation. In Supplemental Table 1, we present characteristics of select articles (n=38/54, 70%) [23,24,25,26,27,28,29,30,31,32,33,34,35,36,37,38,39,40,41,42,43,44,45,46,47,48,49,50,51,52,53,54,55,56,57,58,59,60] which describe in detail the use of multidisciplinary teams for process mapping and FMEA within a healthcare setting.

Guide for applying process mapping with FMEA

Based on our literature review, we developed the following guide for conducting process mapping and analysis with FMEA to identify and prioritize barriers that can be the target of tailored implementation strategies. In Fig. 1 we propose a method with four phases: formulate an overall plan; generate process maps; analyze process maps by FMEA; and devise a tailored implementation strategy that targets the barriers prioritized by FMEA.

Fig. 1

Guided approach to using process mapping and FMEA in implementation research

Phase 1: Formulating a plan

Step 1: Choose a guiding framework

Investigators should choose a framework that is expected to effectively guide the thematic organization of constructs and analysis of process-related data, contextualize findings, and allow for generalization of results [61]. Without effective use of frameworks, investigators may come to false conclusions regarding their implementation effort, and the study’s impact on the field may also be compromised. The chosen framework can come from the field of implementation science or related fields (e.g., improvement, behavioral, or organizational sciences). It should reflect both the specific context and goals of the study. The Theory Comparison and Selection Tool (T-CaST) was created to help research teams choose the appropriate theories, models, and frameworks for their implementation efforts [62]. Research teams wanting to study the factors that influence implementation outcomes—which is the focus of this guide—will choose a determinant framework such as the Theoretical Domains Framework (TDF) [63] or the Consolidated Framework for Implementation Research (CFIR) [64, 65]. For research teams studying the beliefs and behaviors of healthcare workers, TDF may be a sensible choice. For teams aiming to understand or change health system infrastructure or policies, CFIR may be suitable. It is also possible to employ multiple frameworks, particularly when one framework can complement the other, e.g., using CFIR and TDF [66].

Step 2: Form teams

A team of stakeholders is assembled; these individuals will generate the data for building process maps. If investigators are involving multiple sites (e.g., more than one clinic or health system), a stakeholder team should be developed to represent each site [37]. The number and types of stakeholders will depend on the study goals. Even when the implementation effort involves a clinical procedure with specific steps (e.g., surgery), it can be important to consider other, non-clinician perspectives [67]. In fact, it is often crucial to involve a diverse mix of stakeholder perspectives when the aim is to increase adoption of an evidence-based practice in usual care [29, 33, 37, 55]. This ranges from an end-user point of view (e.g., patients who can voice their own needs and resources) to a macro-level point of view (e.g., administrators who can identify policies or other factors that influence implementation within the organization) [14, 37]. Although researchers may identify stakeholders with membership in one particular role as the basis for their recruitment, it is important to note that the same individual can have membership in multiple roles through their own lived experiences, e.g., clinic staff or providers who also identify as patients or caregivers. Researchers who acknowledge and encourage participants to share their full perspective are likely to gain a richer level of detail when developing process maps.

Step 3: Recruit stakeholders for process mapping

Sampling strategies may include convenience sampling (involving stakeholders or a particular site to which the investigators have easy access) or purposive sampling (seeking out clinic sites or stakeholders that represent variation around the implementation of the practice being studied) [68]. The number of stakeholders to involve depends on the complexity and scope of the process of interest. Those involved should have knowledge of and experience with different parts of the process. To mitigate potential time constraints of stakeholders, investigators are advised to work closely with clinic site champions to ensure that the team is being strategic about when and how to ask for stakeholders’ time.

Phase 2: Generating process maps

Step 4: Conduct initial process map meetings

The goal of these meetings is to collect data from stakeholders for an initial process map. Meetings can be one-on-one with a research team member, which may be easier to schedule for individuals working in busy clinical settings. When it is feasible to convene multiple or all team members, it may be more efficient to create multiple maps at once as part of a focus group or workshop facilitated by the research team. Researchers should be intentional about ensuring participants’ psychological safety, i.e., their willingness to speak up and disagree openly without fear of repercussions. Psychological safety may depend on the inherent power dynamics of the group or the sensitivity/urgency of the topic being studied. Employing the following general measures should promote psychological safety and participant candor: conducting process map meetings without the presence of clinical leaders/management and/or ensuring that group interviewees are from the same organizational level, and communicating that all participant data will be de-identified.

In these process map meetings, the researcher asks the stakeholder(s) to tell a comprehensive narrative based on their experiences and understanding of the process being studied. The process has a predetermined start and end point, such that the scope and timeline of the process is clear. The investigator can display a basic skeleton of a map (e.g., 2-3 nodes or boxes representing steps in the process) to orient the stakeholder(s) (Fig. 2, Panel A). The investigator prompts the stakeholder by asking them to note specific steps in the process, along with contextual factors that may determine success at that step, e.g., the barriers or facilitators of implementation. Interviews or focus groups may be recorded and transcribed. Alternatively, research team members can take comprehensive notes. Researchers should choose a means of data collection that allows them to carefully document the following types of information: general steps of the process, barriers, facilitators, and other contextual factors of participants or the clinical setting.

Fig. 2

Visual guide to generating process maps. A shows a process map skeleton that includes a starting point (1), an end point (4) and some of the known intermediate steps (2 and 3). This figure can orient the stakeholder for the creation of their own process map. The dotted line indicates that there may be additional intermediate steps and that participants should name such steps in telling the narratives for their own process maps. B provides an example that retains the original steps of the skeleton map (black) with an individual stakeholder or group adding additional steps (pink) and barriers/failure modes (blue) that may prevent one from getting to that step. Some of the steps have a single barrier while others have multiple. C shows how parts of the map can be organized by swim lanes that indicate a particular context or perspective. For example, Context A in the figure could indicate the inner setting (e.g., a clinic) and Context B the outer setting (e.g., the community). Distinguishing these contexts can indicate which barriers may be more or less addressable by a clinic or health system. Swim lanes may also be used to show a particular perspective; see Broder-Fingert [31], where separate ‘lanes’ are given to patients, providers, and staff. It is important to note that some stakeholders may contribute to different parts of the map. For example, if Context A indicates the clinical setting, it may be that Steps 2 and 3 and the associated barriers are primarily identified by clinicians and clinical staff, while Steps 5 and 6 involve a transition to the community setting and thus may be identified by other stakeholders, such as patients or community leaders

Step 5: Chronologically order notes from stakeholder meetings

Notes from stakeholder interviews should be organized chronologically—that is, all steps and determinants (barriers or facilitators) are ordered according to how the stakeholder reported the progression of the process over time. If the research team would like to capture illustrative “cases,” interview notes might also serve as the basis for generating a comprehensive narrative for a given stakeholder. Narratives could also be revisited after collecting the data from FMEA to shed light on the rationale that participants may have used when answering the FMEA rating questions.

Step 6: Build a process map skeleton

Chronological notes are organized to build initial process maps by first identifying all of the distinct steps of the process (Fig. 2, Panel B). Each step will be a general node in the process map. These represent subprocesses that are not failure modes and thus will not be rated by FMEA. Basic rules for generating process maps include expressing each node or subprocess from left to right (or top to bottom) as they occur over time [11, 13]. This should be illustrated in appropriate software (e.g., Microsoft PowerPoint, Adobe Illustrator). Events that recur over time can be expressed as they occur or only once to simplify the map (frequency of occurrence will be evaluated in FMEA). The framework or theory chosen to guide the creation and analysis of finalized process maps may also inform the formatting of maps [64]. The precise structural format of these maps may vary but should be defined clearly in a protocol document and methods section of a manuscript.

Step 7: Incorporate failure modes into process map

From the stakeholder narratives, additional elaboration on contextual details or reasons for why the subprocess would not be successful can be categorized as being a: “failure mode” (a barrier that prevents the success of the subprocess), “facilitator” (a factor that promotes the success of the subprocess), or “other” (a potentially important contextual detail that does not clearly prevent or facilitate the subprocess but may be relevant when designing implementation strategies). Once conceptually categorized in this way, each failure mode (barrier) is added to the map under the appropriate node.

Step 8: Decide on consistent language and layout for process maps

The process map is refined to be appropriately concise and clear. Language used to describe steps or failure modes in initial maps may be reworded using terminology from the guiding theoretical framework. To make subsequent FMEA ratings easier, all failure modes are described using a uniform format with a clear subject and verb. Perspectives from different stakeholders should be clearly depicted in the map. This can be achieved by color-coding failure mode boxes (e.g., blue boxes for Provider and yellow boxes for Patient), organizing the map into horizontal “swim lanes” as shown in Fig. 2, Panel C, and/or simply following a standard text format where the stakeholder is always identified first (e.g., patient cannot financially afford the evidence-based practice).

Step 9: Finalize process maps

Process map meetings should continue until enough initial maps have been generated for a given team, that is, until investigators see indications of thematic saturation (e.g., an increasing lack of “new” steps or failure modes identified in process map meetings with new stakeholders) [69]. Our experiences with this method indicate that 8-12 stakeholder interviews at a given clinic allow for successful process mapping. This is consistent with existing recommendations for thematic analyses of qualitative data [70]. However, more complex processes may require additional interviews to achieve thematic saturation.

At this point, the failure modes from each individual stakeholder map are coded using the framework selected in Step 1 so that individual maps can be combined and a comprehensive list of failure modes can be generated. As in traditional deductive coding, the research team should build a codebook that operationalizes definitions for the thematic codes that will be used to label failure modes. Thematic coding of failure modes can be accomplished using software such as NVivo to code narratives, or by coding failure modes from process maps in a matrix in Microsoft Excel. Thematic coding of failure modes within a matrix draws inspiration from rapid qualitative analysis approaches [71, 72]. Coding failure modes from individual maps within a matrix for the same clinic team will facilitate map consolidation into one cumulative map per team. This final process map, which reflects the process as described by all stakeholder members of one team, can be sent out to those stakeholders for member checking before proceeding to FMEA analysis [73]. Researchers should note the total number of unique barriers, which involves verifying that each barrier is indeed distinct. For example, investigators may have to decide whether a “cultural barrier” is different from a “language barrier.”
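The consolidation described above can be sketched in code. The following is a minimal, hypothetical illustration (the step names, codes, and barrier wordings are invented for this sketch, not drawn from the paper): failure modes from several individual stakeholder maps are keyed by process step and framework code, duplicates are collapsed, and a deduplicated list of unique barriers is produced for the team's cumulative map.

```python
# Hypothetical sketch: consolidating failure modes from individual
# stakeholder maps into one deduplicated list per clinic team.
from collections import defaultdict

# Invented (step, framework_code, failure_mode) triples from three
# individual stakeholder maps for the same clinic team.
individual_maps = [
    [("referral placed", "cost", "patient cannot afford care"),
     ("visit scheduled", "access", "patient lacks transportation")],
    [("referral placed", "cost", "patient cannot afford care"),
     ("referral placed", "awareness", "provider unaware of pathway")],
    [("visit scheduled", "access", "patient lacks transportation")],
]

# Group failure modes by (step, code); a set collapses duplicates.
consolidated = defaultdict(set)
for stakeholder_map in individual_maps:
    for step, code, failure_mode in stakeholder_map:
        consolidated[(step, code)].add(failure_mode)

# The deduplicated list of unique barriers for the cumulative map.
unique_barriers = sorted(fm for modes in consolidated.values() for fm in modes)
print(f"{len(unique_barriers)} unique barriers")
for barrier in unique_barriers:
    print("-", barrier)
```

In practice this matrix-style consolidation is often done in Excel, as noted above; the code simply makes the grouping and deduplication logic explicit.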

Phase 3: Analyzing maps by FMEA

Step 10: Plan FMEA workshops

In accordance with the Joint Commission International, FMEA scoring yields a risk priority number (RPN), which is calculated by multiplying the individual scores of occurrence (O), detection (D), and severity (S) for each failure mode/barrier (Fig. 3, Panel A) [15, 16]. Higher numbers in Fig. 3, Panel B, indicate worse scores for occurrence and severity (i.e., greater probability of occurrence and greater severity of impact) and better scores for detection, i.e., a greater probability of detecting the barrier.
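The RPN calculation is simple arithmetic and can be sketched as follows. This is a hypothetical illustration only: the barrier names and ratings are invented, and the sketch assumes 1-10 scales with the modified detection scale described in this guide (10 = very high detection), so that high RPNs flag barriers that are frequent, severe, and feasible to detect.

```python
# Hypothetical sketch: RPN = O x D x S for each failure mode/barrier,
# assuming 1-10 scales and the modified detection scale (10 = very
# high detection) described in this guide.

def rpn(occurrence: int, detection: int, severity: int) -> int:
    """Risk priority number for one failure mode (each score 1-10)."""
    for score in (occurrence, detection, severity):
        if not 1 <= score <= 10:
            raise ValueError("FMEA scores must be between 1 and 10")
    return occurrence * detection * severity

# Invented (O, D, S) ratings for three example barriers.
failure_modes = {
    "patient cannot afford the evidence-based practice": (8, 9, 7),
    "provider is unaware of the referral pathway": (6, 4, 5),
    "clinic lacks staff to coordinate referrals": (3, 7, 9),
}

# Rank barriers from highest to lowest RPN.
ranked = sorted(failure_modes.items(), key=lambda kv: rpn(*kv[1]), reverse=True)
for name, scores in ranked:
    print(f"RPN {rpn(*scores):3d}  {name}")
```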

Fig. 3

Visual guide to failure mode and effects analysis (FMEA). A depicts how to calculate the risk priority number (RPN) for a failure mode, based on the individual scores of severity (S), occurrence (O), and detection (D). B depicts a table of raw FMEA results for example failure modes a-i. The language for these illustrative examples was created using CFIR constructs [64, 65]. The numerical scoring here is for illustrative purposes only and does not represent actual data. C depicts an example graphic of visualizing FMEA results for failure modes a-i. Note how the severity dimension (which is anchored to the outcome of the process) stands alone on the vertical axis, while a composite dimension for occurrence and detection (both of which are anchored to the failure mode itself) is on the horizontal axis. This may aid in identifying the failure modes whose risk is skewed towards the process outcome, which may be relevant for developing an implementation strategy

In preparation for the FMEA workshop, investigators can use the process map generated in Phase 2 to develop rating guides to be distributed to all participants prior to performing map analysis, such that questions can be resolved early and participants arrive at the workshop prepared to engage. Rating guides can include the following: an overview of FMEA with a straightforward example, descriptions of how the O, D, and S dimensions are defined for the study, a copy of the stakeholder team’s final process map, and a cumulative list of identified barriers. As part of describing the O, D, and S dimensions of FMEA, the research team develops a uniform scoring system that lists each numerical score and its corresponding description. This is demonstrated in Denny et al. [37], in which the lowest occurrence score of 1 corresponded to remote occurrence, meaning no known occurrence or happens <10% of the time, while the lowest detection score of 1 corresponded to very high detection, meaning that the error [is] almost always detected or [is caught] 9 out of 10 times. To provide clarity, the rating guide also shows how the occurrence (O) and detection (D) dimensions are anchored to the barrier itself, while the severity (S) dimension is anchored to the outcome of the process; this is exemplified in Rienzi et al. [55], in which only the S dimension is anchored to the process outcome of “injury for gametes, embryos, or patients.” As shown in Fig. 3, we have modified the D dimension so that "10" indicates very high detection rather than "1", which allows high RPN scores to prioritize barriers that are feasible to detect (rather than those that are hard to detect), frequently occurring, and severe. Depending on the resolution of detail preferred, either a 5-point or 10-point FMEA scale can be used.
Because such rating scales can be subject to response bias and ceiling effect, researchers should highlight the objective anchors embedded in their scales and rating guides to mitigate these issues. General survey methodology articles can provide further guidance on fine-tuning Likert-scaled survey instruments to minimize information loss and bias [74].

The goal of this analysis is ultimately to prioritize which of the identified barriers could be effectively targeted with implementation strategies. If multiple stakeholder teams are involved, it may be helpful to compile a comprehensive list of all identified barriers across stakeholder teams. Then, all stakeholders can rate every barrier with FMEA, even if a particular team did not originally identify certain barriers in their process map. This allows comparisons to be made across stakeholder teams, since all teams would rate the same barriers but not every team would rate these barriers the same way. This list of barriers should include brief descriptions that might explain reasons for why each barrier occurs and examples of each barrier occurring [41, 49].

When considering the final FMEA survey, if the research team decides that there are too many identified barriers to feasibly rate, the list can first be reduced according to stakeholder priority. The decision of how many barriers is too many may depend on participating stakeholders, their availability, and resources for implementation. To reduce survey burden and allow for high-quality assessments, a preliminary survey or workshop could allow stakeholders to choose their top priority barriers to be rated with FMEA. The word priority indicates the most consequential failure modes, meaning those most likely to occur, most feasible to detect (per the modified detection scale described above), and most severe in their effect on the overall process. These top priority failure modes would be expected to have the highest risk priority numbers (RPNs) by scoring high in each of the three FMEA dimensions, as described [31]. Preliminary prioritization can determine a shortened list of barriers to be rated in subsequent FMEA analysis. However, another option is seen in work by Kisling et al. [49] and Denny et al. [37] that rated all barriers, such that only after FMEA were priority barriers selected for further investigation based on RPNs.

Step 11: Piloting FMEA workshops and data collection

As described above, a preliminary survey or workshop can be conducted to gain feedback before finalizing survey items or rating guides. This pilot work will also help investigators navigate any technical issues with distributing the survey items, e.g., through a secure online survey. A final survey can then be distributed. Participants will rate the barriers according to the O, D, and S dimensions of FMEA either asynchronously or together during a workshop.

Step 12: Data analysis

Analysis of numerical data collected from FMEA surveys should be summarized using descriptive statistics, e.g., the mean or median and range of values for the O, D, and S scores and the resulting RPNs. The highest RPN scores can be reported in tables or figures as illustrated in Fig. 3, Panels B and C.

Exploratory analysis can be designed to see if barriers differ at various levels (e.g., health systems, clinics, or among participating stakeholder groups). Making comparisons should involve statistical tests when appropriate, e.g., ANOVA for comparisons of scores across clinics. However, sample sizes may be too small to support between-group comparisons.
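The descriptive summary described in Step 12 can be sketched with standard library tools. The ratings below are invented for illustration: five hypothetical stakeholders rate one barrier on the O, D, and S dimensions, and the sketch reports the mean, median, and range for each dimension and for the per-rater RPNs.

```python
# Hypothetical sketch: descriptive statistics for FMEA ratings of one
# barrier from five stakeholders (invented data, 1-10 scales).
from statistics import mean, median

ratings = {
    "O": [7, 8, 6, 9, 7],
    "D": [9, 8, 9, 7, 8],
    "S": [6, 7, 7, 5, 6],
}

# Summarize each FMEA dimension.
for dim, values in ratings.items():
    print(f"{dim}: mean={mean(values):.1f} median={median(values)} "
          f"range={min(values)}-{max(values)}")

# Per-stakeholder RPNs (O x D x S) and their summary.
rpns = [o * d * s for o, d, s in zip(ratings["O"], ratings["D"], ratings["S"])]
print(f"RPN: mean={mean(rpns):.1f} median={median(rpns)} "
      f"range={min(rpns)}-{max(rpns)}")
```

With real data, the same summaries would be computed per barrier and per clinic or stakeholder group, which is the input needed for the exploratory between-group comparisons noted above.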

Phase 4: Devising an implementation strategy

Step 13: Choose which barriers are to be addressed

The goal of rating barriers according to FMEA in implementation research is to inform an implementation strategy. Specifically, the calculated RPNs can provide a starting point for deciding which of the identified barriers ought to be addressed during implementation. This prioritization of barriers according to FMEA results in preparation for process redesign and optimization is commonly seen in quality improvement work for health services [29, 37, 38, 41]. There is also opportunity here for investigators to consider the relevance of unique factors other than failure modes (e.g., facilitators and setting-specific contextual factors) as they decide how to target their implementation efforts. Opportunities for devising implementation strategies will depend on the context within which the research team is operating. For example, the team may need to decide if it is feasible to address important high-RPN barriers that occur outside of the healthcare system, e.g., through partnering with community-based organizations.

Step 14: Match implementation strategy to chosen barriers

Once a list of priority barriers has been generated, investigators devise or refine an implementation strategy targeting these factors. The specificity of such strategies will also depend heavily on contextual features, such as implementation climate. Thus, decisions about targeting strategies will be aided by referring back to the comprehensive narratives from initial process mapping.

This mapping of strategies to known barriers, sometimes termed "implementation mapping," is an emerging approach that may lead to more effective or efficient implementation strategies [75]. For barriers that align to CFIR constructs, the CFIR-Expert Recommendations for Implementing Change (ERIC) Strategy Matching Tool can be used to identify expert-endorsed candidate strategies [76]. Additional approaches can be leveraged to ensure systematic modification of implementation strategies to better address barriers that are relevant to specific contexts [77]. If use of these formal implementation mapping approaches is not feasible, use of less formal yet structured and established group discussion facilitation techniques (e.g., often used for quality improvement) can also be useful [78]. These techniques involve brainstorming discussions to identify potential strategies that account for multiple perspectives (e.g., including individuals involved with the different steps of the mapped process) and decision-making discussions that weigh the effort versus impact of potential strategies to select ones that are reasonable to pursue. Regardless of which approach is used, the main focus of this step is to meaningfully incorporate the contextual and experiential knowledge of those impacting or impacted by the mapped process in devising and refining the strategies.

Case studies

Two cases are detailed in Supplemental File 2 (which includes Supplemental Table 2) and summarized below to illustrate how process mapping with FMEA can be operationalized in implementation research. Our first case explores referrals by primary care providers (PCPs) to chiropractic care for patients with low back pain. While chiropractic care is an evidence-based practice that is recommended by the American College of Physicians as first-line care for low back pain, relatively few PCPs refer their patients with low back pain to chiropractic care [79,80,81]. This study sought to identify barriers to accessing chiropractic care for patients with low back pain in low-income, racially diverse communities, whose members often access chiropractic care at the lowest rates [82, 83]. Our second case involves implementation of a Family Navigation intervention for children with autism spectrum disorder. The investigation was embedded in a larger randomized controlled trial of Family Navigation, supporting an overall hybrid type I trial design [84]. Family Navigation is an evidence-based practice to improve access to care; it consists of an individual with “lived experience” providing both logistical and interpersonal support to families during a time-limited period of care needs [85].

The principal investigators of the two cases followed similar guidance for design, so overlap was expected. However, we also highlight differences in the evidence-based practice, context of participating clinics, and operationalization of each of the four phases of process mapping with FMEA. For phase 1, formulating a plan, Case 1 involved four primary care clinics at two safety net hospitals, including two clinics with embedded chiropractic care and two without. Clinic PCPs, clinic staff, and chiropractors were invited to participate. For Case 2, the multidisciplinary research team working with pediatric clinics of the pragmatic clinical trial participated in the first phase. For phase 2, one-on-one interviews were used in both cases to draft process maps, and stakeholders were given the opportunity to review and refine the maps. Of note, these cases differed in their approach to the conduct of FMEA in the third phase, showcasing how this methodology can be adapted to suit the context of different teams. For example, in the first case, FMEA workshops were conducted separately per clinic site and also engaged stakeholders asynchronously through pre-recorded video introductions, such that most participants completed the survey (rating more than 30 priority barriers) on their own time. In the second case, the FMEA workshop was conducted more collectively, including case examples to practice rating in real time and group consensus that led to participants rating 7 priority barriers. While FMEA results for Case 1 are being prepared for publication, the FMEA results for Case 2 are reported in detail elsewhere [31]. For development of implementation strategies in phase 4, the FMEA results from Case 1 will inform a pilot study of an implementation strategy for increasing adoption of chiropractic care for low back pain in three primary care clinics (K23-AT010487). For Case 2, FMEA data were used to design a trial to empirically test new implementation strategy components [86].
The data have also informed a separate ongoing trial of family navigator implementation (R34MH120190).

Guidance on reporting

In Table 1 we present the REPAIR (REporting Process mapping and Analysis for Implementation Research) checklist. Informed by checklists from the EQUATOR Network [87], REPAIR is a 17-item checklist intended to guide future reporting of studies that use process mapping with FMEA in implementation research.

Table 1 Reporting Process mapping and Analysis for Implementation Research (REPAIR)

Discussion

Process mapping with FMEA is underutilized in implementation research. This article proposes dividing the process mapping and FMEA methodology into four broad phases with 14 total steps. Application of these steps in recent or ongoing implementation research is illustrated using two case studies. While this article is not intended as a final guide to process mapping with FMEA, it is, to our knowledge, the first tailored to implementation research. We acknowledge that this methodology is commonly used by others (e.g., improvement scientists, engineers) and encourage interdisciplinary efforts to further advance and harmonize guidance for this methodology and its reporting standards.

Our first phase, formulating a plan, involves choosing a study question, a guiding framework, and selecting team members. Implementation scientists can partner with local improvement scientists who may already be familiar with process mapping and analysis. Taking advantage of the potential synergy of these disciplines—along with their unique skills and resources—is likely to advance this method and align efforts to achieve a common goal of improving the patient experience and outcomes [10, 88].

Our second phase, generating process maps, allows for visualization of the process and of distinct barriers to implementation of an evidence-based practice as they occur over time. Furthermore, this visualization fosters a shared understanding among different stakeholders regarding the process being targeted for implementation strategies.

Our third phase, using FMEA to analyze barriers from the maps, identifies the most important barriers for a specific context. This is especially relevant when it comes to issues of practicality and financial burden; it is crucial to allocate resources towards addressing the barriers that have been deemed high priority for implementation. Patient populations in underserved settings may be different from those in which evidence-based interventions were originally developed; thus, supporting needs-matched implementation can promote health equity. There may be situations in which analytic approaches other than FMEA are appropriate for prioritizing the identified barriers. For example, when there are too many identified barriers, in lieu of surveying stakeholders to condense the number of barriers, the research team can reduce barriers by using affinity diagramming [89, 90] to collapse similar barriers into one. Similarly, root cause analyses [67, 91] may help identify groups of barriers with the same cause. Additionally, if rating guides are difficult to develop and/or the stakeholders prefer to seek consensus on which barriers to prioritize, teams can instead hold consensus-building discussions [92, 93] regarding how the barriers compare in terms of the effort it would take to overcome them and expected impact of overcoming them. To inform this effort-versus-impact assessment [94, 95], discussions can use the Suppliers-Inputs-Process-Outputs-Customers (SIPOC) framework [96, 97] to consider who needs to supply how much input (i.e., effort) to address each barrier, and which outputs will best serve customers or stakeholders (i.e., impact). The barriers that have low expected effort coupled with high expected impact can then be prioritized.
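The effort-versus-impact assessment described above can be sketched in a few lines of code. In this illustrative Python sketch, the barrier names, rating scale (1–5), and quadrant thresholds are hypothetical; in practice the ratings would come from a consensus-building discussion rather than fixed cutoffs.

```python
# Illustrative effort-vs-impact prioritization of barriers.
# Barriers rated low effort and high impact form the quadrant of an
# effort/impact matrix that is typically pursued first.

def prioritize(barriers):
    """Return barrier names rated low effort (<= 2) and high impact (>= 4),
    using hypothetical thresholds on a 1-5 rating scale."""
    return [name for name, (effort, impact) in barriers.items()
            if effort <= 2 and impact >= 4]

# Hypothetical (effort, impact) ratings agreed on by a stakeholder group.
barriers = {
    "no referral order in EHR": (2, 5),
    "limited clinic capacity": (5, 4),
    "unclear eligibility criteria": (1, 4),
    "outdated contact list": (1, 2),
}

print(prioritize(barriers))  # low-effort, high-impact barriers only
```

A real application would replace the fixed thresholds with whatever cutoffs (or consensus ranking) the group agrees on.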

In the fourth phase, implementation strategies are developed or refined to address the prioritized barriers. This stems from the use of thematic coding to finalize combined maps and FMEA to analyze priority barriers. Once RPNs have been calculated to reveal the highest priority barriers, process map data on barrier category and chronology allow investigators to describe where and/or when in the process to intervene. This is a timely contribution to implementation science, as the field is increasingly focused on understanding how best to match implementation strategies to specific contexts and barriers. Even in cases where an implementation strategy has already been selected for use, FMEA results can inform key modifications to the existing strategy to better meet the unique needs of an implementation setting. Miller and colleagues’ Framework for Reporting Adaptations and Modifications to Evidence-based Implementation Strategies (FRAME-IS) delineates various aspects of an implementation strategy to consider when making such modifications [77]. These include the strategy’s content, approach to evaluating its performance, associated training of implementers, and context (e.g., format, setting, personnel, or population). In considering potential modifications as indicated by the identified barriers, FRAME-IS can also provide a structure for documenting each modification’s nature, goal, and level (e.g., patient, provider, implementer, organizational, or sociopolitical level), such that the feasibility and intended impact can be methodically considered alongside complementary modifications.
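As a minimal illustration of the RPN calculation referenced above, the following Python sketch ranks barriers by risk priority number. The barrier names and ratings are hypothetical; the conventional FMEA formulation takes the RPN as the product of severity, occurrence, and detectability ratings.

```python
# Illustrative RPN ranking for FMEA. Barrier names and 1-10 ratings
# are hypothetical examples, not data from either case study.

def rpn(severity: int, occurrence: int, detectability: int) -> int:
    """Risk priority number: the product of the three FMEA ratings."""
    return severity * occurrence * detectability

# Hypothetical ratings: (severity, occurrence, detectability)
barriers = {
    "referral order not placed": (8, 6, 4),
    "patient unable to schedule visit": (7, 5, 7),
    "intake paperwork incomplete": (4, 3, 2),
}

# Rank barriers from highest to lowest RPN to surface priorities.
ranked = sorted(barriers.items(), key=lambda kv: rpn(*kv[1]), reverse=True)
for name, ratings in ranked:
    print(f"{name}: RPN = {rpn(*ratings)}")
```

The highest-RPN barriers would then be located on the process map (by category and chronology) to decide where and when to intervene.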

Following our guide, we also introduced the REPAIR checklist for reporting standards of process mapping with FMEA in implementation research. Despite the use of process mapping with FMEA in multiple fields within and outside of healthcare, relevant reporting guidelines are not well established. While we illustrated use of the method in implementation research with two case studies, the utility of this methodology has yet to be evaluated across a wide variety of clinical interventions and implementation contexts. Future work should convene experts from the quality improvement and implementation science fields to refine this methodology and its reporting, e.g., through a multidisciplinary Delphi panel. While the REPAIR checklist can guide researchers using this method, further development and consensus are needed for definitive guidance on reporting findings in peer-reviewed journals.

While process mapping has its advantages, it may not always be feasible given the time and training it requires. This method may become more feasible when research strategies are aligned with the goals of a healthcare system, when research funding can support training, or when investigators partner with quality improvement teams already familiar with process mapping and analysis. Feasibility of detecting or addressing barriers is also relevant to consider. If the number of identified barriers is high, investigators may choose to prioritize some barriers before FMEA, using methods described above. Furthermore, it is not yet known whether the use of process mapping with FMEA results in more effective implementation of an evidence-based practice compared to other methods of identifying barriers (e.g., interviews, focus groups, surveys). However, its advantages of visualization, incorporation of time, and flexible tailoring to specific contexts and perspectives have long been embraced for quality improvement and are thus highly promising for implementation research. It is our hope that, by providing foundational guidance on process mapping and FMEA, this paper will encourage the field of implementation science to take up this methodology more broadly, such that it may be studied more empirically.

Conclusions

Process mapping with FMEA can elucidate barriers and facilitators to successful implementation of evidence-based clinical interventions. This paper provides initial guidance to making this approach more systematic. Future research should use a consensus-building approach, such as a Delphi panel, to further delineate the reporting standards for studies that use process mapping and analysis with FMEA.

Availability of data and materials

This is a paper describing a methodology. The data presented are preliminary and reflect only the initial phase of work by Drs. Roseen and Broder-Fingert. While the resulting datasets may eventually be made available, we are currently unable to do so.

Abbreviations

CFIR:

Consolidated Framework for Implementation Research

DC:

Doctor of Chiropractic

FMEA:

Failure mode and effects analysis

FRAME-IS:

Framework for Reporting Adaptations and Modifications to Evidence-based Implementation Strategies

PCP:

Primary care provider

PI:

Principal investigator

REDCap:

Research Electronic Data Capture

RPN:

Risk priority number

SIPOC:

Suppliers-Inputs-Process-Outputs-Customers

TDF:

Theoretical Domains Framework

References

  1. Nilsen P, Bernhardsson S. Context matters in implementation science: a scoping review of determinant frameworks that describe contextual determinants for implementation outcomes. BMC Health Serv Res. 2019;19(1):189.

  2. Bauer MS, Kirchner J. Implementation science: What is it and why should I care? Psychiatry Res. 2020;283:112376.

  3. Wensing M, Grol R. Determinants of Implementation. In: Wensing M, ed. Improving Patient Care: The Implementation of Change in Health Care. 3rd ed. Wiley; 2020.

  4. Aarons GA, Hurlburt M, Horwitz SM. Advancing a conceptual model of evidence-based practice implementation in public service sectors. Adm Policy Ment Health. 2011;38(1):4–23.

  5. Antonacci G, Lennox L, Barlow J, Evans L, Reed J. Process mapping in healthcare: a systematic review. BMC Health Serv Res. 2021;21(1):342.

  6. Thornton E, Brook OR, Mendiratta-Lala M, Hallett DT, Kruskal JB. Application of failure mode and effect analysis in a radiology department. Radiographics. 2011;31(1):281–93.

  7. DeRosier J, Stalhandske E, Bagian JP, Nudell T. Using health care Failure Mode and Effect Analysis: the VA National Center for Patient Safety’s prospective risk analysis system. Jt Comm J Qual Improv. 2002;28(5):248–67, 209.

  8. Kononowech J, Landis-Lewis Z, Carpenter J, et al. Visual process maps to support implementation efforts: a case example. Implement Sci Commun. 2020;1(1):105.

  9. Leeman J, Rohweder C, Lee M, et al. Aligning implementation science with improvement practice: a call to action. Implement Sci Commun. 2021;2(1):99.

  10. Koczwara B, Stover AM, Davies L, et al. Harnessing the Synergy Between Improvement Science and Implementation Science in Cancer: A Call to Action. J Oncol Pract. 2018;14(6):335–40.

  11. AHRQ. https://digital.ahrq.gov/health-it-tools-and-resources/evaluation-resources/workflow-assessment-health-it-toolkit/all-workflow-tools/flowchart. Accessed 10 May 2022.

  12. Chiozza ML, Ponzetti C. FMEA: a model for reducing medical errors. Clin Chim Acta. 2009;404(1):75–8.

  13. IHI. Failure Modes and Effects Analysis (FMEA) tool. https://www.ihi.org/resources/tools/failure-modes-and-effects-analysis-fmea-tool. Accessed 12 Apr 2022.

  14. Dawson A. A Practical Guide to Performance Improvement: Beginning the Process. AORN J. 2019;109(3):318–24.

  15. An introduction to FMEA: using failure mode and effects analysis to meet JCAHO's proactive risk assessment requirement. Health Devices. 2002;31(6):223–6.

  16. Joint Commission Resources. Joint Commission International. Failure Mode and Effects Analysis in Health Care: Proactive Risk Reduction. 2010.

  17. Ashley L, Armitage G, Neary M, Hollingsworth G. A practical guide to failure mode and effects analysis in health care: making the most of the team and its meetings. Jt Comm J Qual Patient Saf. 2010;36(8):351–8.

  18. Asgari Dastjerdi H, Khorasani E, Yarmohammadian MH, Ahmadzade MS. Evaluating the application of failure mode and effects analysis technique in hospital wards: a systematic review. J Inj Violence Res. 2017;9(1):51–60.

  19. Bramstedt KA. Failure mode and effects analysis as an informed consent tool for investigational cardiothoracic devices. ASAIO J. 2002;48(3):293–5.

  20. Coughlin K, Posencheg MA. Quality improvement methods - Part II. J Perinatol. 2019;39(7):1000–7.

  21. Rath F. Tools for developing a quality management program: proactive tools (process mapping, value stream mapping, fault tree analysis, and failure mode and effects analysis). Int J Radiat Oncol Biol Phys. 2008;71(1 Suppl):S187–190.

  22. Sheridan-Leos N, Schulmeister L, Hartranft S. Failure mode and effect analysis: a technique to prevent chemotherapy errors. Clin J Oncol Nurs. 2006;10(3):393–8.

  23. Prabhakaran S, Khorzad R, Brown A, Nannicelli AP, Khare R, Holl JL. Academic-Community Hospital Comparison of Vulnerabilities in Door-to-Needle Process for Acute Ischemic Stroke. Circ Cardiovasc Qual Outcomes. 2015;8(6 Suppl 3):S148–154.

  24. Walsh KE, Mazor KM, Roblin D, et al. Multisite parent-centered risk assessment to reduce pediatric oral chemotherapy errors. J Oncol Pract. 2013;9(1):e1–7.

  25. Schuller BW, Burns A, Ceilley EA, et al. Failure mode and effects analysis: A community practice perspective. J Appl Clin Med Phys. 2017;18(6):258–67.

  26. Mattsson TO, Lipczak H, Pottegard A. Patient Involvement in Evaluation of Safety in Oral Antineoplastic Treatment: A Failure Mode and Effects Analysis in Patients and Health Care Professionals. Qual Manag Health Care. 2019;28(1):33–8.

  27. Kricke GS, Carson MB, Lee YJ, et al. Leveraging electronic health record documentation for Failure Mode and Effects Analysis team identification. J Am Med Inform Assoc. 2017;24(2):288–94.

  28. Teixeira FC, de Almeida CE, Saiful HM. Failure mode and effects analysis based risk profile assessment for stereotactic radiosurgery programs at three cancer centers in Brazil. Med Phys. 2016;43(1):171.

  29. Sorrentino P. Use of Failure Mode and Effects Analysis to Improve Emergency Department Handoff Processes. Clin Nurse Spec. 2016;30(1):28–37.

  30. Ibanez-Rosello B, Bautista JA, Bonaque J, et al. Failure modes and effects analysis of total skin electron irradiation technique. Clin Transl Oncol. 2018;20(3):330–65.

  31. Broder-Fingert S, Qin S, Goupil J, et al. A mixed-methods process evaluation of Family Navigation implementation for autism spectrum disorder. Autism. 2019;23(5):1288–99.

  32. Anthony D, Chetty VK, Kartha A, McKenna K, DePaoli MR, Jack B. Re-engineering the Hospital Discharge: An Example of a Multifaceted Process Evaluation. In: Henriksen K, Battles JB, Marks ES, Lewin DI, eds. Advances in Patient Safety: From Research to Implementation (Volume 2: Concepts and Methodology). Rockville (MD); 2005.

  33. Babiker A, Amer YS, Osman ME, et al. Failure Mode and Effect Analysis (FMEA) may enhance implementation of clinical practice guidelines: An experience from the Middle East. J Eval Clin Pract. 2018;24(1):206–11.

  34. Cantone MC, Ciocca M, Dionisi F, et al. Application of failure mode and effects analysis to treatment planning in scanned proton beam radiotherapy. Radiat Oncol. 2013;8:127.

  35. Chilakamarri P, Finn EB, Sather J, et al. Failure Mode and Effect Analysis: Engineering Safer Neurocritical Care Transitions. Neurocrit Care. 2021;35(1):232–40.

  36. Daniels LM, Barreto JN, Kuth JC, et al. Failure mode and effects analysis to reduce risk of anticoagulation levels above the target range during concurrent antimicrobial therapy. Am J Health Syst Pharm. 2015;72(14):1195–203.

  37. Denny DS, Allen DK, Worthington N, Gupta D. The use of failure mode and effect analysis in a radiation oncology setting: the Cancer Treatment Centers of America experience. J Healthc Qual. 2014;36(1):18–28.

  38. Ford EC, Gaudette R, Myers L, et al. Evaluation of safety in a radiation oncology setting using failure mode and effects analysis. Int J Radiat Oncol Biol Phys. 2009;74(3):852–8.

  39. Frewen H, Brown E, Jenkins M, O’Donovan A. Failure mode and effects analysis in a paperless radiotherapy department. J Med Imaging Radiat Oncol. 2018;62(5):707–15.

  40. Gates EDH, Wallner K, Tiwana J, et al. Improved safety and quality in intravascular brachytherapy: A multi-institutional study using failure modes and effects analysis. Brachytherapy. 2023;22(6):779–89.

  41. Gilmore MDF, Rowbottom CG. Evaluation of failure modes and effect analysis for routine risk assessment of lung radiotherapy at a UK center. J Appl Clin Med Phys. 2021;22(5):36–47.

  42. Gray T, Antolak A, Ahmed S, et al. Implementing failure mode and effect analysis to improve the safety of volumetric modulated arc therapy for total body irradiation. Med Phys. 2023;50(7):4092–104.

  43. Haroun A, Al-Ruzzieh MA, Hussien N, et al. Using Failure Mode and Effects Analysis in Improving Nursing Blood Sampling at an International Specialized Cancer Center. Asian Pac J Cancer Prev. 2021;22(4):1247–54.

  44. Hosoya K, Mochinaga S, Emoto A, et al. Failure mode and effects analysis of medication adherence in patients with chronic myeloid leukemia. Int J Clin Oncol. 2015;20(6):1203–10.

  45. Ibanez-Rosello B, Bautista-Ballesteros JA, Bonaque J, et al. Failure mode and effects analysis of skin electronic brachytherapy using Esteya((R)) unit. J Contemp Brachytherapy. 2016;8(6):518–24.

  46. Intra G, Alteri A, Corti L, et al. Application of failure mode and effect analysis in an assisted reproduction technology laboratory. Reprod Biomed Online. 2016;33(2):132–9.

  47. Jones RT, Handsfield L, Read PW, et al. Safety and feasibility of STAT RAD: Improvement of a novel rapid tomotherapy-based radiation therapy workflow by failure mode and effects analysis. Pract Radiat Oncol. 2015;5(2):106–12.

  48. Kim J, Miller B, Siddiqui MS, Movsas B, Glide-Hurst C. FMEA of MR-Only Treatment Planning in the Pelvis. Adv Radiat Oncol. 2019;4(1):168–76.

  49. Kisling K, Johnson JL, Simonds H, et al. A risk assessment of automated treatment planning and recommendations for clinical deployment. Med Phys. 2019;46(6):2567–74.

  50. Kunac DL, Reith DM. Identification of priorities for medication safety in neonatal intensive care. Drug Saf. 2005;28(3):251–61.

  51. Manger R, Rahn D, Hoisak J, Dragojevic I. Improving the treatment planning and delivery process of Xoft electronic skin brachytherapy. Brachytherapy. 2018;17(4):702–8.

  52. Nealon KA, Balter PA, Douglas RJ, et al. Using Failure Mode and Effects Analysis to Evaluate Risk in the Clinical Adoption of Automated Contouring and Treatment Planning Tools. Pract Radiat Oncol. 2022;12(4):e344–53.

  53. Nishioka S, Okamoto H, Chiba T, et al. Identifying risk characteristics using failure mode and effect analysis for risk management in online magnetic resonance-guided adaptive radiation therapy. Phys Imaging Radiat Oncol. 2022;23:1–7.

  54. Noel CE, Santanam L, Parikh PJ, Mutic S. Process-based quality management for clinical implementation of adaptive radiotherapy. Med Phys. 2014;41(8): 081717.

  55. Rienzi L, Bariani F, Dalla Zorza M, et al. Comprehensive protocol of traceability during IVF: the result of a multicentre failure mode and effect analysis. Hum Reprod. 2017;32(8):1612–20.

  56. Takemori M, Nakamura S, Sofue T, et al. Failure modes and effects analysis study for accelerator-based Boron Neutron Capture Therapy. Med Phys. 2023;50(1):424–39.

  57. Xu AY, Bhatnagar J, Bednarz G, et al. Failure modes and effects analysis (FMEA) for Gamma Knife radiosurgery. J Appl Clin Med Phys. 2017;18(6):152–68.

  58. Xu Z, Lee S, Albani D, et al. Evaluating radiotherapy treatment delay using Failure Mode and Effects Analysis (FMEA). Radiother Oncol. 2019;137:102–9.

  59. Yarmohammadian MH, Abadi TN, Tofighi S, Esfahani SS. Performance improvement through proactive risk assessment: Using failure modes and effects analysis. J Educ Health Promot. 2014;3:28.

  60. Yousefinezhadi T, Jannesar Nobari FA, Behzadi Goodari F, Arab M. A Case Study on Improving Intensive Care Unit (ICU) Services Reliability: By Using Process Failure Mode and Effects Analysis (PFMEA). Glob J Health Sci. 2016;8(9):52635.

  61. Moullin JC, Dickson KS, Stadnick NA, et al. Ten recommendations for using implementation frameworks in research and practice. Implement Sci Commun. 2020;1:42.

  62. Birken SA, Rohweder CL, Powell BJ, et al. T-CaST: an implementation theory comparison and selection tool. Implement Sci. 2018;13(1):143.

  63. Atkins L, Francis J, Islam R, et al. A guide to using the Theoretical Domains Framework of behaviour change to investigate implementation problems. Implement Sci. 2017;12(1):77.

  64. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009;4:50.

  65. Damschroder LJ, Reardon CM, Widerquist MAO, Lowery J. The updated Consolidated Framework for Implementation Research based on user feedback. Implement Sci. 2022;17(1):75.

  66. Birken SA, Powell BJ, Presseau J, et al. Combined use of the Consolidated Framework for Implementation Research (CFIR) and the Theoretical Domains Framework (TDF): a systematic review. Implement Sci. 2017;12(1):2.

  67. Shahid R, Chaya M, Lutz I, Taylor B, Xiao L, Groot G. Exploration of a quality improvement process to standardised preoperative tests for a surgical procedure to reduce waste. BMJ Open Qual. 2021;10(3):e001570.

  68. Palinkas LA, Horwitz SM, Green CA, Wisdom JP, Duan N, Hoagwood K. Purposeful Sampling for Qualitative Data Collection and Analysis in Mixed Method Implementation Research. Adm Policy Ment Health. 2015;42(5):533–44.

  69. Saunders B, Sim J, Kingstone T, et al. Saturation in qualitative research: exploring its conceptualization and operationalization. Qual Quant. 2018;52(4):1893–907.

  70. Guest G, Namey E, Chen M. A simple method to assess and report thematic saturation in qualitative research. PLoS ONE. 2020;15(5): e0232076.

  71. Gale RC, Wu J, Erhardt T, et al. Comparison of rapid vs in-depth qualitative analytic methods from a process evaluation of academic detailing in the Veterans Health Administration. Implement Sci. 2019;14(1):11.

  72. Taylor B, Henshall C, Kenyon S, Litchfield I, Greenfield S. Can rapid approaches to qualitative analysis deliver timely, valid findings to clinical leaders? A mixed methods study comparing rapid and thematic analysis. BMJ Open. 2018;8(10): e019993.

  73. Birt L, Scott S, Cavers D, Campbell C, Walter F. Member Checking: A Tool to Enhance Trustworthiness or Merely a Nod to Validation? Qual Health Res. 2016;26(13):1802–11.

  74. Westland JC. Information loss and bias in likert survey responses. PLoS ONE. 2022;17(7): e0271949.

  75. Fernandez ME, Ten Hoor GA, van Lieshout S, et al. Implementation Mapping: Using Intervention Mapping to Develop Implementation Strategies. Front Public Health. 2019;7:158.

  76. CFIR Research Team Center for Clinical Management Research. Consolidated Framework for Implementation Research. 2024; https://cfirguide.org/choosing-strategies/. Accessed 5 Jul 2024.

  77. Miller CJ, Barnett ML, Baumann AA, Gutner CA, Wiltsey-Stirman S. The FRAME-IS: a framework for documenting modifications to implementation strategies in healthcare. Implement Sci. 2021;16(1):36.

  78. Bens I. Facilitating with Ease! Core Skills for Facilitators, Team Leaders and Members, Managers, Consultants, and Trainers. John Wiley & Sons, Inc.; 2018.

  79. Qaseem A, Wilt TJ, McLean RM, Forciea MA; Clinical Guidelines Committee of the American College of Physicians. Noninvasive Treatments for Acute, Subacute, and Chronic Low Back Pain: A Clinical Practice Guideline From the American College of Physicians. Ann Intern Med. 2017;166(7):514–30.

  80. Traeger AC, Qaseem A, McAuley JH. Low back pain. JAMA. 2021;326(3):286.

  81. Joyce CT, Roseen EJ, Smith CN, Patterson CG, McDonough CM, Hurstak E, Morone NE, Beneciuk J, Stevans JM, Delitto A, Saper RB. A Cluster Analysis of Initial Primary Care Orders for Patients with Acute Low Back Pain. J Am Board Fam Med. 2024;36(6):986–95.

  82. Roseen EJ, Patel KV, Ward R, de Grauw X, Atlas SJ, Bartels S, Keysor JJ, Bean JF. Trends in chiropractic care and physical rehabilitation use among adults with low back pain in the United States, 2002 to 2018. J Gen Intern Med. 2024;39(4):578–86.

  83. Roseen EJ, Joyce C, Winbush S, Pavco-Luttschwager N, Morone NE, Saper RB, Bartels S, Patel KV, Keysor JJ, Bean JF, Laird LD. Primary care barriers and facilitators to nonpharmacologic treatments for low back pain: A qualitative pilot study. PM R. 2024. https://doi.org/10.1002/pmrj.13183.

  84. Broder-Fingert S, Walls M, Augustyn M, et al. A hybrid type I randomized effectiveness-implementation trial of patient navigation to improve access to services for children with autism spectrum disorder. BMC Psychiatry. 2018;18(1):79.

  85. Broder-Fingert S, Stadnick NA, Hickey E, Goupil J, Diaz Lindhart Y, Feinberg E. Defining the core components of Family Navigation for autism spectrum disorder. Autism. 2020;24(2):526–30.

  86. Broder-Fingert S, Kuhn J, Sheldrick RC, et al. Using the Multiphase Optimization Strategy (MOST) framework to test intervention delivery strategies: a study protocol. Trials. 2019;20(1):728.

  87. Equator Network. https://www.equator-network.org. Accessed 14 Feb 2024.

  88. Check DK, Zullig LL, Davis MM, et al. Improvement Science and Implementation Science in Cancer Care: Identifying Areas of Synergy and Opportunities for Further Integration. J Gen Intern Med. 2021;36(1):186–95.

  89. Plourde CL, Varnado WT, Gleaton BJ, Das DG. Reducing Infusion Clinic Wait Times Using Quality Improvement. JCO Oncol Pract. 2020;16(8):e807–13.

  90. Keurhorst M, Heinen M, Colom J, et al. Strategies in primary healthcare to implement early identification of risky alcohol consumption: why do they work or not? A qualitative evaluation of the ODHIN study. BMC Fam Pract. 2016;17:70.

  91. Nandwana SB, Walls G, Reich S. Learning From Experience: “Minimizing Patient Delays in Radiology: Optimizing On-Time Starts for CT Procedures.” Curr Probl Diagn Radiol. 2021;50(1):11–5.

  92. Kim B, McCullough MB, Simmons MM, et al. A novel application of process mapping in a criminal justice setting to examine implementation of peer support for veterans leaving incarceration. Health Justice. 2019;7(1):3.

  93. LaMonica HM, Davenport TA, Ottavio A, et al. Optimising the integration of technology-enabled solutions to enhance primary mental health care: a service mapping study. BMC Health Serv Res. 2021;21(1):68.

    Article  PubMed  PubMed Central  Google Scholar 

  94. Ursu A, Greenberg G, McKee M. Continuous quality improvement methodology: a case study on multidisciplinary collaboration to improve chlamydia screening. Fam Med Community Health. 2019;7(2): e000085.

    Article  PubMed  PubMed Central  Google Scholar 

  95. Kim B, Wilson SM, Mosher TM, Breland JY. Systematic Decision-Making for Using Technological Strategies to Implement Evidence-Based Interventions: An Illustrated Case Study. Front Psychiatry. 2021;12: 640240.

    Article  PubMed  PubMed Central  Google Scholar 

  96. Cheung YY, Goodman EM, Osunkoya TO. No More Waits and Delays: Streamlining Workflow to Decrease Patient Time of Stay for Image-guided Musculoskeletal Procedures. Radiographics. 2016;36(3):856–71.

    Article  PubMed  Google Scholar 

  97. Baughn JM, Lechner HG, Herold DL, et al. Enhancing the patient and family experience during pediatric sleep studies. J Clin Sleep Med. 2020;16(7):1037–43.

    Article  PubMed  PubMed Central  Google Scholar 

Download references

Funding

The work of Drs. Roseen and Broder-Fingert was supported by NIH career development awards (K23-AT010487 and K23-MH109673, respectively). The views expressed in this article are those of the authors and do not necessarily reflect the position or policy of NIH, the Department of Veterans Affairs or the United States government.

Author information

Contributions

All authors collaborated on the design of the manuscript. EJR and AN drafted the initial version. All authors (EJR, AN, BK, SB) critically revised the manuscript for important intellectual content and approved the final version.

Corresponding author

Correspondence to Eric J. Roseen.

Ethics declarations

Ethics approval and consent to participate

Not applicable.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License, which permits any non-commercial use, sharing, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if you modified the licensed material. You do not have permission under this licence to share adapted material derived from this article or parts of it. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by-nc-nd/4.0/.

About this article

Cite this article

Roseen, E.J., Natrakul, A., Kim, B. et al. Process mapping with failure mode and effects analysis to identify determinants of implementation in healthcare settings: a guide. Implement Sci Commun 5, 110 (2024). https://doi.org/10.1186/s43058-024-00642-4


Keywords