A research agenda to advance the study of implementation mechanisms

Abstract

Background

Implementation science scholars have made significant progress identifying factors that enable or obstruct the implementation of evidence-based interventions, and testing strategies that may modify those factors. However, little research sheds light on how or why strategies work, in what contexts, and for whom. Studying implementation mechanisms—the processes responsible for change—is crucial for advancing the field of implementation science and enhancing its value in facilitating equitable policy and practice change. The Agency for Healthcare Research and Quality funded a conference series to achieve two aims: (1) develop a research agenda on implementation mechanisms, and (2) actively disseminate the research agenda to research, policy, and practice audiences. This article presents the resulting research agenda, including priorities and actions to encourage its execution.

Method

Building on prior concept mapping work, in a semi-structured, 3-day, in-person working meeting, 23 US-based researchers used a modified nominal group process to generate priorities and actions for addressing challenges to studying implementation mechanisms. During each of the three 120-min sessions, small groups responded to the prompt: “What actions need to be taken to move this research forward?” The groups brainstormed actions, which were then shared with the full group and discussed with the support of facilitators trained in structured group processes. Facilitators grouped critical and novel ideas into themes. Attendees voted on six themes they prioritized to discuss in a fourth, 120-min session, during which small groups operationalized prioritized actions. Subsequently, all ideas were collated, combined, and revised for clarity by a subset of the authorship team.

Results

From this multistep process, 150 actions emerged across 10 priority areas, which together constitute the research agenda. Actions included discrete activities, projects, or products, and ways to shift how research is conducted to strengthen the study of implementation mechanisms.

Conclusions

This research agenda elevates actions to guide the selection, design, and evaluation of implementation mechanisms. By delineating recommended actions to address the challenges of studying implementation mechanisms, this research agenda facilitates expanding the field of implementation science, beyond studying what works to how and why strategies work, in what contexts, for whom, and with which interventions.

Background

Some see implementation science as not just a pathway, but the pathway for advancing equity in healthcare access and outcomes, and equitable population health [1]. Although this research pathway can lead to equity, it is certainly not guaranteed, and in fact, like many fields, most implementation science theories, models, and frameworks did not center equity until recently [2]. This omission leaves implementation studies and strategies vulnerable to unintended consequences (or ripple effects) that might actually exacerbate disparities [3, 4]. The field of implementation science has made significant progress in this regard. Scholars like Woodward et al. [5] offer practical guidance for incorporating health equity domains into implementation determinant frameworks, and Gaias et al. [6] proposed a process to evaluate and adapt implementation strategies to promote equity. Walsh-Bailey is developing a resource to guide the integration of equity into strategy selection, design, and specification [7]. Moreover, numerous efforts collate factors that enable or obstruct the implementation of evidence-based interventions [8,9,10], and compile behavior change techniques and implementation strategies that may modify these factors [11,12,13,14,15,16]. Even with these advances, little research sheds light on how or why strategies work, in what contexts, and for whom [17,18,19,20,21]. Studying implementation mechanisms, or the processes through which strategies exert their effects on outcomes, can address this research gap to meaningfully advance the field of implementation science and enhance its value in facilitating equitable policy and practice change. Mechanistic implementation research can identify potential mediators or moderators that illuminate differential strategy impact based on factors such as gender, race/ethnicity, and socioeconomic status, and can thereby center understanding of equitable approaches to implementation science and practice.

One of the principles of implementation science is that context matters, and by nature, each context is unique. The people, their interactions, their physical environment and resources, and their history and beliefs about the future are among the many aspects that vary across clinics in the same organization, schools in the same district, and hospitals in the same health system. As implementation science evolves, complex and costly strategies are increasingly being deployed, making equity issues especially pronounced for those receiving care in under-resourced settings [20]. Evidence suggests that tailored implementation may be superior to standardized approaches [22, 23], but tailoring in the absence of understanding strategy mechanisms may compromise outcomes for some or undermine efforts to scale positive outcomes. Establishing strategy mechanisms of action means that the essence of how a strategy works is known and empirically supported. Therefore, when tailoring, adapting, or modifying to fit different contexts, the essence of the strategy’s operation can be retained. When strategies are streamlined to fit contextual constraints or adapted to be a better fit, the mechanism must still be activated if the same outcome is to be expected. Conversely, if strategies underperform or fail to work in certain settings, unpacking the causal pathway can isolate contextual factors that threaten mechanism activation or demand a new mechanism altogether. This is not to say that simply studying mechanisms will guarantee equitable outcomes, but in studying them, equitable implementation processes and outcomes are more likely.

To this end, in 2017, the Society for Implementation Research Collaboration (SIRC) conference theme centered implementation mechanisms to elevate dialogue and research about, “What Makes Implementation Work and Why?” [24]. SIRC is a not-for-profit society that convenes scholars, practitioners, policy makers, and others interested in advancing rigorous evaluation of implementation initiatives. SIRC’s call to action was motivated by the observation across trials that heterogeneity is the rule, not the exception, resulting in weak main effects. Thus, advancing the study of implementation mechanisms may offer benefits to research and practice communities. For example, identifying and evaluating mechanisms can help researchers learn from null studies [17] and optimize strategies for subsequent efforts or different objectives (e.g., equity, effectiveness, scalability) [25]. Articulating mechanisms can guide the practice community to identify the impact that strategies might have on their outcomes and inform their design or tailoring of strategies to the local context [26, 27]. Despite this call, only 7% of abstracts included at the subsequent (2019) SIRC conference [28] explicitly “featured the study of implementation mechanisms” [29].

In response to this need to advance the study of mechanisms, we convened an Agency for Healthcare Research and Quality-funded 3-year conference series titled, “Advancing mechanisms of implementation to accelerate sustainable evidence-based practice integration” [30]. The specific aims were to (1) develop a research agenda on implementation mechanisms, and (2) disseminate the research agenda to research, policy, and practice audiences. Similar to processes used for generating related research agendas (e.g., sustainability research [31]), concept mapping was employed in the first two years of the grant to elucidate challenges to advancing implementation mechanisms research [30, 32] and to organize these ideas into conceptually distinct clusters. Reported in more detail elsewhere [30, 32], concept mapping analyses yielded a 12-cluster solution that organized 105 challenge statements within five “super clusters” of mechanism research domains: (1) Accumulating Knowledge, (2) Conceptualization and Measurement, (3) Methods and Design, (4) Strategy, Mechanisms, Determinant, Outcome Linkages, and (5) Theory, Causality, and Context. See Table 1 for a complete list of identified challenges organized by cluster. These concept mapping results provided the basis for the research agenda. This paper describes how actions that could overcome those challenges were identified and presents the resulting research agenda.

Table 1 Concept mapping super-clusters, clusters, and statements

Method

Mechanisms Network of Expertise (MNoE)

The research agenda was developed by the Mechanisms Network of Expertise (MNoE). The MNoE is composed of over 40 invited implementation scientists who are diverse with respect to several dimensions (e.g., gender, race/ethnicity, stage of career, focus on priority populations, research settings), but who are predominantly United States (US)-based (4 scholars are from outside the US); See Additional File 1. Expertise ranged across various aspects of implementation mechanism research including strategy development, measurement, design, theory, and practice. We gathered collective wisdom and engaged in reciprocal learning with these experts through immersive, multi-day “Deep Dive” meetings.

Identifying research priorities via nominal group technique: MNoE data generation

A US-based subset of the MNoE (N = 23) met in person for a 3-day Deep Dive to address two goals: (1) expand upon the challenges derived from the previously completed concept mapping, and (2) generate ideas or actions (hereafter just referred to as actions) organized by priority areas, which constitute the research agenda, to advance the study of implementation mechanisms. To this end, attendees received handouts with the cluster solution from concept mapping and the list of statements associated with each cluster (Table 1). These two goals were pursued through four 120-min sessions, each comprising a 75-min small-group activity followed by a 45-min large-group activity (Table 2). Group activities were structured using evidence-informed, semi-structured group problem solving activities—called “scripts”— derived from operations, consulting, and systems science methods [33, 34] (Table 3). Scripts include discussion prompts, guidelines about how time is spent (e.g., in small versus large groups), roles to be assumed by individuals (e.g., timekeeper), and session goals (e.g., brainstorming actions for a given cluster). A core planning team (n = 5) selected scripts from a repository and tailored them to Deep Dive objectives (e.g., identifying actions for addressing challenges to studying implementation mechanisms) across the sessions. Tailoring of scripts included adjusting the time allocated for each script, the examples used, and the wording of the prompts. The planning team assigned small group membership beforehand to ensure diverse groups regarding career stage and content or methodological expertise. The small group composition changed by session to stimulate creative conversation and cross-pollinate ideas by hearing new perspectives.

Table 2 Clusters reviewed during deep dive sessions
Table 3 Group model building “scripts”

A tailored Nominal Group Technique process was followed for the first three sessions. Instead of first brainstorming individually, as in the traditional Nominal Group Technique [35], small groups first generated action ideas before sharing, discussing, and voting on priority ideas with the large group. Attendees did the following in small groups before converging as a large group (Table 3):

  1. Assign group roles, including scribe (to record discussion), reporter for the large group, and timekeeper. Individuals could assume more than one role.

  2. Brainstorm actions for inclusion in the research agenda that address the challenges from the five super-clusters (the planning team assigned which super-clusters were discussed during each of these sessions). Actions could include methods, tools, activities, meetings, research products, research foci, disciplines, or people/perspectives to be engaged.

  3. Prioritize two actions for full group discussion: based on consensus, groups selected one idea they favored and one idea that was complex, underdeveloped, or surprising to work through.

Small groups were encouraged to spend approximately 60 min brainstorming and 15 min prioritizing actions. Each prioritized action was submitted on paper for sharing in the large-group session. Groups were encouraged to write as many actions as they could generate. Scribes’ notes were later analyzed (see below). All actions generated, not just those prioritized for deeper discussion, were considered in developing the research agenda.

During the first three large-group discussions, each group’s reporter briefly described how their two prioritized actions would advance the study of implementation mechanisms. Each group had 5 min to share and take questions. While groups shared, facilitators collected the papers and grouped similar actions on a wall visible to all. After all small groups shared, the facilitator summarized the action themes. The large group collectively reflected on these and used the remaining time to further develop prioritized actions.

The fourth (final) session synthesized and expanded actions brought forth in the preceding sessions. Each attendee used five votes to indicate preferred actions (or groups of actions) [36]. The highest-voted actions (n = 6) were prioritized for this session. Attendees self-selected into small groups based on which prioritized actions they wanted to discuss. During the final large-group session, each group shared how the actions had evolved or whether new actions emerged. One facilitator synthesized actions and asked clarifying questions, while another captured actions and priorities on large pieces of paper for the large group to see and discuss.

Data extraction and consolidation

To populate the research agenda, a subgroup (N = 6) of attendees extracted data from notes taken across the Deep Dive. Please refer to Table 4 for terms (and definitions) used to organize the research agenda. All unique actions were extracted from each session note that covered at least one super-cluster. Each session note was assigned a primary and a secondary coder. Coders met monthly as a group to refine the process and discuss emergent content. The primary coder extracted action data and refined the language to represent a succinct, coherent action based on: (1) the content of the notes, (2) the context of the larger discussion in the notes, (3) discussion with colleagues (during and/or after the Deep Dive), and (4) consideration of the broader literature. The secondary coder checked data accuracy, separated or grouped actions to ensure each reflected a singular activity, and refined the action verbiage. Coders were encouraged to interpret data to generate additional actions. Coders then worked across sessions to clarify and condense the list of actions, reduce redundancy, and organize actions into priority areas (“priorities”). Given the number of actions identified for each priority area, it became clear that organizing actions within priorities by goals could offer a useful, high-level summary. Coders reviewed all actions in a priority and articulated 2–4 goals that could be achieved by a subset of actions. Each action was then labeled with its corresponding goal. Lastly, the first author synthesized all actions and associated goals within each priority, solicited input from the full authorship team, and refined the data to yield the final research agenda.

Table 4 Terms and Definitions

Results

Table 5 presents the refined list of the MNoE-generated actions, organized by priorities and goals, into a research agenda to advance the study of implementation mechanisms. Although not required per our method, priorities reflected all five super-clusters from the concept mapping solution. In addition, priorities emerged specific to Engagement (of policy and practice communities, as well as funders) and Growing the Field in terms of capacity (number of knowledgeable researchers) and skills specific to studying mechanisms. The MNoE generated 150 unique actions across 10 priority areas (range: 11–19 actions per area). These actions included a mix of discrete activities, projects, or products, as well as ways to shift how research is conducted to center implementation mechanisms. Wherever possible, citations are included in the table to offer exemplars that illustrate the intent behind each proposed action.

Table 5 Research agenda to advance the study of implementation mechanisms

Here, we briefly describe each priority and the types of associated actions. Table 5 presents additional details, including goals that each priority area might achieve. The first six priorities align directly with the concept mapping solution super-clusters.

  • Accumulate Knowledge within and Across Disciplines includes 19 actions featuring, for example, specific systematic reviews and meta-analyses, and the research questions that would drive this type of evidence synthesis (e.g., determine whether mechanisms are universal, or whether variation across contexts is observed).

  • Prioritize Mechanism Research and Incorporate Other Knowledge includes 11 actions that would bring together transdisciplinary teams across fields where mechanisms are likely a prominent area of research, such as psychology and epidemiology.

  • Overcome Design Challenges and Innovate Methods includes 18 actions where new methods are needed (e.g., modeling time in quantitative assessment to isolate specific mechanisms) and identifies underused methods offering specific value (e.g., comparative case studies to generate hypotheses about complex mechanistic pathways).

  • Improve Measurement includes 13 actions, such as pragmatic approaches for objective data collection and those that capture lived experiences—an essential measurement component to understand when disparities might be addressed or exacerbated through implementation research and practice.

  • Provide Guidance for Specifying Mechanisms includes 15 actions reflecting mostly tools/aids to improve researchers’ approach to examining mechanisms (e.g., a list of questions and criteria for articulating mechanisms).

  • Increase Focus on Theorizing includes 12 ways to capitalize on developing, incorporating, and refining theory into mechanistic research to better characterize mechanisms (e.g., make theory explicit in the strategy design phase).

The emergent actions related to Engagement and Growing the Field provide further priorities for action.

  • Engaging the Policy and Practice Community includes 12 actions or methods for understanding the perspective of these potential partners (e.g., cognitive walkthroughs, plain language, Implementation Mapping [64, 65]) and questions about when to include whom and how (e.g., compare “ground up” elucidation of mechanisms to the “top down” or theory-driven approach).

  • Engaging Funders and the Need for New Funding includes 17 actions to garner interest and expertise (e.g., mock study sections) and inspire novel use of new grant mechanisms (e.g., administrative supplements, trainee funding mechanisms).

  • Build Capacity includes 17 actions to offer clarification/guidance (e.g., how to understand conceptual/theoretical misalignment between strategies, mechanisms, and outcomes) and avenues to build the field’s capacity (e.g., postdoctoral training grants).

  • Emphasize Dissemination includes 17 actions like specific manuscript ideas, ways to engage journals to support mechanism-focused manuscripts, forums to host this dialogue, and other methods for generating broader interest beyond academia. Such methods are intended to foster iterative and collaborative advancements in mechanism research across interdisciplinary groups.

Discussion

This paper articulates opportunities to advance the study of implementation mechanisms in a research agenda organized by priorities for the field and specific actions to advance those priorities. Actions range from those that can be acted upon now by way of shifting the research paradigm (e.g., always articulate mechanisms when designing implementation strategies) to those that may need targeted funding and specialized knowledge/expertise (e.g., conduct sufficiently powered, multilevel tests of mechanisms with multidisciplinary input). What follows is a discussion of each priority area, highlighting actions (denoted A#, corresponding to Table 5) or exemplars organized by goals (denoted G# in Table 5). These actions were articulated by the MNoE (a group of experts) as ways to address challenges identified in their prior concept mapping work.

Accumulating knowledge

With 100+ discrete implementation strategies and behavior change techniques from which to choose [12,13,14,15,16], balanced with evidence that rarely will a single strategy suffice in realizing sustained and robust change [68, 69], accumulating basic knowledge about how strategies work is crucial. Although the MNoE acknowledged that a starting place could be to curate a list of implementation mechanisms, they also emphasized that there is a risk in overreliance on static lists and frameworks at the expense of theorizing or broader critical thinking [70, 71] (A22.4), particularly where evidence for strategy functioning and causal processes is thin. To this end, the MNoE prioritized knowledge synthesis across completed studies (G1) and coordination of future studies (G2). Specifically, the MNoE prioritized accumulating knowledge to yield practical information such as: (i) which strategies are needed for specific types of interventions across most contexts (e.g., ‘practice & feedback’ needed for evidence-based psychotherapy implementation) (A1.8); (ii) which strategies hold promise in addressing certain barriers across diverse operationalizations [72, 73] (A1.12) (e.g., educational training to address knowledge deficits); (iii) whether strategy-mechanism pairings are universal, or if and how pathways vary across contexts (e.g., service system, level of actor, community, culture) or strategy operationalization (i.e., form versus function [74, 75]) (A1.7).

Not only are individual studies needed to test strategy pathways to yield this information (P1.5), which could be done in practical and efficient simulation studies (A2.5), but evidence syntheses are needed to curate this practical information (A1.1, 1.2, 1.3, 1.6, 1.8, 1.9, 1.10, 1.11, 1.12). These possible actions are ripe for those interested in secondary data analysis. Alternatively, meta-laboratories (meta-labs) [76] offer an approach to testing implementation strategies at scale with the possibility of pooling samples for mediation analyses (A2.3). Meta-labs can harness practical implementation efforts in health systems, for example, where different operationalizations of commonly deployed strategies can be examined using harmonized implementation process, service, and patient-level health outcomes contained in electronic medical records. Grimshaw and colleagues are pioneering the meta-lab by convening subject matter experts to accumulate evidence about audit and feedback [76, 77]. It is unclear whether existing grant funding mechanisms can accommodate the infrastructure necessary for multi-study, global coordination, and data sharing in such efforts (A19.5).
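To make the pooled mediation idea concrete, the sketch below simulates the kind of analysis a meta-lab might run: a product-of-coefficients test of whether a strategy’s effect on an outcome travels through a hypothesized mechanism, with samples pooled across sites. This is an illustration only, not an analysis from any study cited here; the site counts, effect sizes, and variable names are invented assumptions.

```python
import numpy as np

# Illustrative sketch only: product-of-coefficients mediation on data
# pooled across hypothetical sites. All data and effects are simulated.
rng = np.random.default_rng(42)

n_sites, n_per_site = 8, 150
x_all, m_all, y_all = [], [], []
for _ in range(n_sites):
    x = rng.integers(0, 2, n_per_site).astype(float)       # strategy (1) vs. control (0)
    m = 0.5 * x + rng.normal(0, 1, n_per_site)             # a-path: strategy activates mechanism
    y = 0.4 * m + 0.1 * x + rng.normal(0, 1, n_per_site)   # b-path plus a small direct effect
    x_all.append(x); m_all.append(m); y_all.append(y)
x, m, y = map(np.concatenate, (x_all, m_all, y_all))       # pool sites into one sample

def ols(X, y):
    """Least-squares slope coefficients, with an intercept column prepended."""
    X = np.column_stack([np.ones(len(y)), X])
    return np.linalg.lstsq(X, y, rcond=None)[0][1:]

a = ols(x[:, None], m)[0]                # strategy -> mechanism
b = ols(np.column_stack([m, x]), y)[0]   # mechanism -> outcome, adjusting for strategy
indirect = a * b                         # mediated effect; true value here is 0.5 * 0.4

# Percentile bootstrap confidence interval for the indirect effect
idx = np.arange(len(y))
boot = []
for _ in range(500):
    s = rng.choice(idx, size=len(idx), replace=True)
    ab = ols(x[s][:, None], m[s])[0] * ols(np.column_stack([m[s], x[s]]), y[s])[0]
    boot.append(ab)
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect = {indirect:.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")
```

A confidence interval excluding zero would support mechanism activation; in practice a meta-lab would also model site-level clustering (e.g., multilevel mediation), which this minimal sketch omits.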

To accumulate knowledge efficiently, the MNoE recommended a mechanism-focused study repository for sharing information, evidence, and methods (A2.2). A repository could be used to share measures of mechanisms for cross-study testing and comparison; report impact/effect of strategies with how and why data; and provide diverse exemplar studies, especially those that engage community/practice partners. Web-based resources for implementation science are mounting (e.g., measure repositories [78, 79]), but to our knowledge, few living repositories or systematic reviews exist, perhaps because living reviews are a relatively novel methodology [80] expedited into action by the COVID-19 pandemic [81, 82].

Finally, the MNoE prioritized drawing on other disciplines (G3) and collaborating with experts from other disciplinary backgrounds (G4), such as scholars who study mechanisms using a multilevel perspective (A3.1). There are dozens of fields in which one entity helps another do something differently (A3.2) (e.g., governance, natural resources, education, health promotion) to integrate evidence-based interventions and strive for equity. The MNoE cautioned against our field ‘recreating the methodological wheel,’ and underscored the utility of multidisciplinary workgroups (A4.1) and workshops (A4.2). The MNoE prioritized actions to make implementation science more accessible (e.g., 1-page documents such as an SBAR: Situation, Background, Assessment, Recommendation [83] that conveys the importance of studying implementation mechanisms) to support bidirectional learning and springboard convenings. A recent commentary expressed concern that our field borrows superficially from others when interdisciplinarity or trans-disciplinarity is warranted [20]. Funders have recently made deep interdisciplinary collaboration a priority through opportunities such as the National Cancer Institute Implementation Science Centers [84], whose Research Program Cores bring together numerous disciplines in a Methods Unit to test, refine, and disseminate new approaches [85] throughout 5-year awards [86].

Methods and design

The MNoE asserted the importance of overcoming design challenges (e.g., multiple multi-level mechanisms) and innovating methods (e.g., to address the time-varying nature of mechanism activation) specific to the study of mechanisms. They prioritized activities to guide selection and refinement of study designs (G5), enable measurement of pertinent and feasible data (G6), and leverage strengths of different research methods (G7) to enable establishing strategy mechanisms. For instance, much like the overview of designs that emerged from an NIH working session in 2014 [52], guidance is needed regarding when to use different designs and methods specifically for the purpose of establishing implementation mechanisms (A5.1). The MNoE suggested mechanism activation may offer an earlier signal along the causal pathway to indicate whether a strategy is working as hypothesized (A6.3). Designing trials for early signal testing demands methodological guidance regarding what constitutes reasonable levels of evidence (go/no-go indicators) (A6.4), how to time mechanism measurement or measure intermediate outcomes (A6.5), and how to pivot if the signal is not detected, particularly in a grant-funded study where adapting/changing the implementation strategy (i.e., independent variable) could be deemed a protocol deviation [58]. Fortunately, methods experts are beginning to apply adaptive trial designs that directly answer this call [87]. The MNoE also acknowledged the power of qualitative methods [88,89,90,91] to inform theory development and surface candidate mechanisms (A7.1) and to offer formative evidence for why a strategy did not work as intended (A7.2). The MNoE highlighted that qualitative methods provide richness, unique insights, and critical perspectives of those with lived experience [57, 89,90,91,92]. 
Engagement with diverse partners will yield more specific, contextualized, and experientially-informed hypotheses of how strategies are working (A7.3) that may be more acceptable and appropriate for a given context and innovation compared to researcher-derived hypotheses. For example, a secondary analysis of a large implementation trial of measurement-based care revealed no significant mediators from the quantitative data but identified important candidate mechanisms from qualitative analyses [93].

Conceptualization and measurement

In general, great strides have been made to enhance the quality, access, and utility of measurement in implementation science through systematic reviews, guidance documents, and web-based repositories [78, 79, 94]. The MNoE prioritized actions specific to studying mechanisms to develop grounded and generalizable measures (G8), recommend best practices regarding measurement (G9), and clarify ongoing measurement challenges (G10). The MNoE articulated the need to deploy measurement methods that allow for multiple, real-time assessments to detect changes that unfold over time (A8.2), as mechanisms are hypothesized to be activated at varying rates by population and context. The MNoE elevated the possible use of passive data collection approaches for continuous monitoring of mechanisms (A9.2), ecological momentary assessment (EMA), or lower-burden, near-continuous assessments to track changes in mechanisms and determinants (A9.3). As an example, EMA was used to identify predictors of noncompliance with event-based reporting of tobacco use [95]. Although this example is implementation-adjacent, it reveals how underused approaches like EMA can overcome measurement challenges critical to studying mechanisms such as timing (e.g., multiple, repeated measures) and self-report (e.g., bias, memory).

Strategy, mechanism, determinant, outcome linkages

The MNoE was initially organized to include a subset of scholars who focused on understanding the linkages between strategies, mechanisms, determinants, and outcomes [30]. Recognizing that strategies are too often disconnected from determinants [96, 97] and overpromising outcomes [69], the MNoE articulated the role of mechanisms in the causal pathway in terms of how a strategy exerts its effects on target outcomes by overcoming barriers [98]. The MNoE prioritized defining mechanisms as distinct from determinants and establishing reporting standards for mechanisms research (G11) to support deployment of cross-context and multilevel approaches (G12). The MNoE remarked on this as critical “foundational work” for scientific and practical progress to be made. For instance, the MNoE encouraged consideration of which strategies (from compilations such as Expert Recommendations for Implementing Change (ERIC) [13] and Effective Practice and Organization of Care (EPOC) [99]) have evidence of activating specific mechanisms to resolve particular barriers and achieve specific outcomes. Such foundational knowledge of discrete strategies would be instrumental in designing a practical implementation plan, but no synthesis or repository exists to our knowledge (A11.1), although a 2016 review does offer preliminary evidence on a subset of strategy-mediator pairings [21]. One activity to contribute this knowledge may be the “salvage strategy” [100, 101] in which journals or conferences feature implementation failures and invite exploration of mechanism activation or lack thereof [17] (A12.9). The MNoE also prioritized using theory to guide articulation of putative mechanisms (A12.3) and the examination of mechanisms across diverse contexts to explore how mechanisms might be activated differently or over a different timeframe across contexts, populations, or interventions (A12.4). 
Moreover, the MNoE acknowledged the potential to hyperfocus on intrapersonal mechanisms of behavior change, which has a mounting evidence base [16, 72, 102]. To complement this individually focused work, the MNoE explicitly prioritized exploring mechanisms at aggregate levels of analysis that are less studied (e.g., community or policy levels), but where structures should be targeted to improve equitable outcomes (A12.6) [32, 50, 52, 58, 75, 87, 103, 104].

Theory, causality, and context

Because implementation science is a convergence of many disciplines, there are relevant classic theories (e.g., from social psychology, business, economics, education, anthropology) that articulate mechanisms [105]. Most utilized, however, are frameworks, which lack the theoretical underpinnings that depict relationships among constructs and enable prediction through propositions, leaving instead a list of measurable factors organized by conceptual coherence, as in the case of the Theoretical Domains Framework [106] and the updated Consolidated Framework for Implementation Research [107]. Kislov et al. [108] wrote about the importance of theorizing as a process that enables implementation scientists to bidirectionally inform and learn from empirical data: testing and advancing generalizable knowledge at the mid-range level while developing and refining grand theories. More recently, Meza and colleagues [109] attempted to make theorizing more accessible to researchers; although they use theorizing about determinants as their use case, they name mechanisms as a critical component of the causal chains that explain how an implementation initiative succeeds. Toward this goal, the MNoE prioritized activities that would incorporate theory (G13) through examples and guidance (G14). Actions included differentiating causal theory from program theory (A13.1), modifying implementation science “grand” theories to better represent mechanisms (A13.2), and making the notion of timing more explicit in theories of change (A13.6). Consistent with the above-mentioned calls to prioritize theory, the MNoE prioritized guidance on choosing relevant theories for study planning (A14.4), fully integrating theory in an implementation study of mechanisms (A14.3), and clarifying how theory is used to articulate mechanisms (A14.2).

Beyond the five priority clusters initially identified in the concept mapping of challenges stymying the field, two new priority clusters of actions emerged through MNoE discussions: Engagement and Growing the Field. These priorities reflect critical areas of work to advance the study of implementation mechanisms. The Engagement cluster represents actions that, if prioritized early, would amplify the impact of actions in other clusters. Growing the Field actions are foundational: they underpin the work of the other clusters, which might not otherwise be possible.

Engagement

In terms of Engagement, the MNoE thought it critical to engage the policy and practice community, as well as funders of implementation science. The MNoE emphasized that the policy and practice communities are critical to establishing mechanisms, yet this area of science can feel obscure and pedantic to those communities. Funders were identified as a separate target for engagement because many of the prioritized actions do not fit neatly within traditional funding mechanisms.

The MNoE articulated priorities for engaging policy and practice partners in mechanism identification, validation, and testing (G15) and in using methods to obtain practice-based data and confirm theory (G16). The MNoE recommended plain-language mechanism definitions and de-jargonized questions for identifying mechanisms with community partners to help scientific teams learn from their perspectives (A15.1). Plain language was repeatedly emphasized because the term “mechanisms” itself may limit idea generation or perceptions of applicability as it tends to surface mechanical or biological underpinnings (A15.2, 15.3). The MNoE saw the policy and practice communities, broadly construed, as central to unearthing putative mechanisms and generated actions for facilitating their engagement, including motivating them to study mechanisms (A15.5), supporting them to collect and track data on mechanisms (A15.7), providing feedback (A15.8), and constructing causal pathways (A15.9). For instance, group model building presents a directed approach to engaging participants in articulating implementation mechanisms [110]. There are several more general frameworks, models, and approaches that can guide this kind of policy and practice community engagement, including community-based participatory research [111], community partnered participatory research [112], participatory action research [113], integrated knowledge translation [114], and user-centered design [67, 115].

The MNoE articulated goals for engaging funders including emphasizing the study of mechanisms as a priority (G17), growing mechanism expertise (G18), and considering new funding models to support mechanism-focused research (G19). The MNoE suggested that it might be important to clarify, or confirm, that mechanism-focused research can lead to more parsimonious and efficient implementation approaches and reproducibility (A17.5). To this end, the MNoE surfaced the possibility of using scientific administrative supplements for mechanism data testing (A17.1) and making the study of mechanisms an explicit priority in funding opportunities (A17.2). To ensure mechanism evaluation fits within grant budget limits, the MNoE suggested deprioritizing patient and clinical outcomes when the intervention’s efficacy and/or effectiveness is robust and adaptation is minimal (A17.3).

The MNoE highlighted the importance of ensuring that grant reviewers are familiar with implementation mechanisms and can critically review grant proposals on these topics. To grow the capacity of reviewers (and the extramural community more broadly), the MNoE proposed specialized training for reviewers or the reviewer pipeline (A18.1), including conference workshops (A18.2) and mock study sections that center applications proposing implementation mechanisms research (A18.3). The MNoE envisioned a guideline document that would support assessing a study proposal’s plan to evaluate implementation mechanisms and scaffold learning key elements for mechanisms testing for those writing grant applications (A18.4).

Finally, the MNoE articulated several ideas for funding opportunities or suggested elements to emphasize within planned/existing funding opportunities. These included funding a coordinating center to harmonize measures, create the infrastructure for data collection, and integrate findings across numerous studies examining implementation strategy mechanisms (A19.5). The MNoE also wondered about the possibility of mechanism evaluation occurring during a follow-up (e.g., renewal) grant funding period, leveraging the longitudinal nature of the evaluation and the need to engage multiple partners (A19.4). In addition to large cross-study or longer initiatives, the MNoE suggested small and nimble grant opportunities that allow for discrete strategy testing and the need to pivot if the strategy “signal” is not detected (A19.1).

Growing the field

Throughout the Deep Dive, the MNoE called for multi-pronged efforts to grow the field. The MNoE recommended resources for evaluating mechanisms that could scaffold scientists’ efforts (G20) as well as more robust training that would help scholars grow new skillsets in the study of implementation mechanisms (G21). The MNoE prioritized guidance and resources on topics such as how to test a strategy causal pathway (A20.1), how to choose the most appropriate outcome for a given mechanism (A20.2), how to isolate a mechanism from other factors in a causal pathway (A20.3, 20.4), how to disentangle the intervention from implementation strategies (A20.5), when to adapt the intervention versus modify the implementation strategy (A20.6), and when to change the strategy for the context versus change the context using the strategy based on our understanding of mechanisms (A20.8). With respect to this last topic, many scholars see contextual targets that, if changed, offer greater societal benefit (e.g., consideration of social determinants of health; addressing structural racism) as being inappropriate targets for implementation scientists, unless the evidence-based intervention itself is directed at those higher levels. Yet, implementing within existing structures can exacerbate inequities. These are critical questions whose answers would have serious practical implications if, indeed, empirical guidance can be curated. Moreover, these questions are faced by numerous research teams, making the investment in generating such guidance even more valuable. These are the types of empirical evidence and associated resources that might come from larger investments to support the study of mechanisms, such as center grant awards, from which the scientific field and practice community stand to benefit.

The MNoE also generated several actions that were characterized as training-like approaches to build capacity. These included efforts like brief, recorded, didactic sessions regarding definitional issues surrounding mechanisms (A21.1), as well as more process-oriented training on, for example, how to specify causal chains (A21.2) and how to regularly reflect on why an implementation strategy is or is not working throughout the course of a study (A21.3). A team at the IMPACT Center has begun to produce videos aligned with these actions with funding from the US National Institute of Mental Health (P50MH126219) [116]. Acknowledging that videos might not be sufficient, this team has also offered in-person workshop training followed by office hours and one year of expert consultation around causal pathway diagramming [117]. Multipronged training and consultation will be critical for capacity building in new areas like the study of implementation mechanisms. More novel actions were also shared, including a workgroup to support a series of training grants focused on the study of mechanisms (A21.4) and a data summit in which underutilized data from grants could be made available for secondary analysis, pairing datasets with postdoctoral researchers under a shared mentoring model. The sentiment was that the expertise required to advance the study of mechanisms is sparse and that approaches extending its reach to new teams and data sources would be critical.

Although several of the above suggestions function as dissemination, the MNoE articulated four specific dissemination-related goals: produce focused manuscripts (G22); partner with journals to generate new paper types (G23); establish forums for dialog (G24); and generate broad interest using strategies that reach community partners (G25). The MNoE articulated numerous manuscript ideas that would be helpful, such as a “Mechanisms Made Too Simple” piece inspired by Curran’s article [118]. They also imagined new paper types, such as one centered on “learning from failure with wisdom,” which would essentially unpack implementation failures with a mechanistic lens. An example of such a commentary, written by researchers who were not members of the original research team regarding a recently published null trial, suggests this approach is fruitful [17]; yet another approach is to ensure that implementers have opportunities to share “salvage strategies” that make the most of opportunities to retain rigor when unexpected events threaten to derail studies that could shed light on mechanisms [100, 101]. Finally, the MNoE underscored the importance of clarifying the “why” behind the study of mechanisms, particularly given the importance of learning from and supporting the policy and practice community. As they discussed dissemination, members surfaced a marketing problem: not all would agree that the study of mechanisms can advance both science and practice, and some members believed this reductionist approach is misaligned with the very nature of implementation [20].

Limitations

Importantly, the MNoE may not be representative of those who could contribute to and/or stand to benefit from this work. Although we made efforts to engage researchers from outside the United States (US; e.g., open attendance during a SIRC breakout; international representation in MNoE paper writing groups), the inputs and outputs of this research agenda largely reflect a US perspective. Indeed, parallel and complementary work from scholars in the United Kingdom (UK) includes an ontology of mechanisms of action in behavior change interventions that begins to address several aspects of the Research Agenda [119]. We hope readers with different perspectives will consider building from the US and UK work, for example, by writing a commentary to further the dialogue and/or pursuing research that advances some of the priorities discussed above. Moreover, although some MNoE members identify as more clinically or practically oriented researchers, the MNoE did not include policy and practice community members. Thus, it is likely that new actions across the priority clusters would have emerged if different groups had been engaged in generating this content. Also, the focus of this research agenda is on implementation strategy mechanisms, or the processes through which strategies exert their effects to achieve outcomes [30]. This focus overlooks contextual mechanisms, such as those surfaced through realist reviews [120]. Although consistent with prior work by our team [19], this focus can limit the field’s ability to explain how and why implementation occurs.

Conclusion

Implementation science needs to further expand from what works to how and why certain strategies work, for whom, when, and in which contexts [121]. This research agenda outlines a roadmap of concrete actions for advancing the study of mechanisms. To carry out this research agenda, concerted and strategic effort is needed. There are numerous training forums that grow implementation research capacity [122]. We hope some will highlight the priorities articulated herein, bring together transdisciplinary experts with mechanism-specific expertise, and contribute to the study of implementation mechanisms.

Availability of data and materials

N/A.

Notes

  1. MNoE members from other countries were invited, but unable to attend due to COVID restrictions.

Abbreviations

SIRC:

Society for Implementation Research Collaboration

MNoE:

Mechanisms Network of Expertise

References

  1. Shelton RC, Chambers DA, Glasgow RE. An extension of RE-AIM to enhance sustainability: Addressing dynamic context and promoting health equity over time. Front Public Health. 2020;8:1–8.

  2. Gustafson P, Abdul Aziz Y, Lambert M, Bartholomew K, Rankin N, Fusheini A, et al. A scoping review of equity-focused implementation theories, models and frameworks in healthcare and their application in addressing ethnicity-related health inequities. Implement Sci. 2023;18:51.

  3. Pullmann MD, Dorsey S, Duong MT, Lyon AR, Muse I, Corbin CM, et al. Expect the unexpected: A qualitative study of the ripple effects of children’s mental health services implementation efforts. Implementation Research and Practice. 2022;3:26334895221120796.

  4. Dadich A, Vaughan P, Boydell K. The unintended negative consequences of knowledge translation in healthcare: A systematic scoping review. Health Sociol Rev. 2023;32:75–93.

  5. Woodward EN, Singh RS, Ndebele-Ngwenya P, Melgar Castillo A, Dickson KS, Kirchner JE. A more practical guide to incorporating health equity domains in implementation determinant frameworks. Implementation Science Communications. 2021;2:61.

  6. Gaias LM, Arnold KT, Liu FF, Pullmann MD, Duong MT, Lyon AR. Adapting strategies to promote implementation reach and equity (ASPIRE) in school mental health services. Psychol Sch. 2022;59:2471–85.

  7. Building health equity into implementation strategies and mechanisms. Available from: https://reporter.nih.gov/search/0xJrdiFN8EeuIS8eEAKZCA/project-details/10597777. Cited 2023 Aug 9

  8. Flottorp SA, Oxman AD, Krause J, Musila NR, Wensing M, Godycki-Cwirko M, et al. A checklist for identifying determinants of practice: A systematic review and synthesis of frameworks and taxonomies of factors that prevent or enable improvements in healthcare professional practice. Implement Sci. 2013;8:1–11.

  9. Squires JE, Graham ID, Santos WJ, Hutchinson AM, Backman C, Bergström A, et al. The Implementation in Context (ICON) framework: a meta-framework of context domains, attributes and features in healthcare. Health Res Pol Syst. 2023;21:81.

  10. Birken SA, Powell BJ, Presseau J, Kirk MA, Lorencatto F, Gould NJ, et al. Combined use of the Consolidated Framework for Implementation Research (CFIR) and the Theoretical Domains Framework (TDF): A systematic review. Implement Sci. 2017;12:2.

  11. Powell BJ, McMillen JC, Proctor EK, Carpenter CR, Griffey RT, Bunger AC, et al. A compilation of strategies for implementing clinical innovations in health and mental health. Med Care Res Rev. 2012;69:123–57.

  12. Michie S, Richardson M, Johnston M, Abraham C, Francis J, Hardeman W, et al. The behavior change technique taxonomy (v1) of 93 hierarchically clustered techniques: Building an international consensus for the reporting of behavior change interventions. Ann Behav Med. 2013;46:81–95.

  13. Powell BJ, Waltz TJ, Chinman MJ, Damschroder LJ, Smith JL, Matthieu MM, et al. A refined compilation of implementation strategies: Results from the Expert Recommendations for Implementing Change (ERIC) project. Implement Sci. 2015;10:1–14.

  14. Kok G, Gottlieb NH, Peters GY, Mullen PD, Parcel GS, Ruiter RAC, et al. A taxonomy of behaviour change methods: An Intervention Mapping approach. Health Psychol Rev. 2016;10:297–312.

  15. McHugh S, Presseau J, Luecking CT, Powell BJ. Examining the complementarity between the ERIC compilation of implementation strategies and the behaviour change technique taxonomy: a qualitative analysis. Implement Sci. 2022;17:56.

  16. Marques M, Wright A, Johnston M, West R, Hastings J, Zhang L, et al. The Behaviour Change Technique Ontology: Transforming the Behaviour Change Technique Taxonomy v1. PsyArXiv; 2023. Available from: https://psyarxiv.com/vxypn/. Cited 2023 Aug 8

  17. Geng EH, Baumann AA, Powell BJ. Mechanism mapping to advance research on implementation strategies. PLoS Med. 2022;19:e1003918.

  18. Lewis CC, Klasnja P, Powell BJ, Lyon AR, Tuzzio L, Jones S, et al. From classification to causality: advancing understanding of mechanisms of change in implementation science. Front Public Health. 2018;6:1–6.

  19. Lewis CC, Boyd MR, Walsh-Bailey C, Lyon AR, Beidas R, Mittman B, et al. A systematic review of empirical studies examining mechanisms of implementation in health. Implement Sci. 2020;15:1–25.

  20. Beidas RS, Dorsey S, Lewis CC, Lyon AR, Powell BJ, Purtle J, et al. Promises and pitfalls in implementation science from the perspective of US-based researchers: learning from a pre-mortem. Implement Sci. 2022;17:55.

  21. Williams NJ. Multilevel mechanisms of implementation strategies in mental health: Integrating theory, research, and practice. Admin Pol Mental Health Mental Health Serv Res. 2016;43:783–98.

  22. Baker R, Comosso-Stefinovic J, Gillies C, Shaw EJ, Cheater F, Flottorp S, et al. Tailored interventions to address determinants of practice. Cochrane Database Syst Rev. 2015;4:1–118.

  23. Lewis CC, Marti CN, Scott K, Walker MR, Boyd M, Puspitasari A, et al. Standardized versus tailored implementation of measurement-based care for depression in community mental health clinics. PS. 2022;73:1094–101.

  24. Lewis CC, Stanick C, Lyon A, Darnell D, Locke J, Puspitasari A, et al. Proceedings of the Fourth Biennial Conference of the Society for Implementation Research Collaboration (SIRC) 2017: implementation mechanisms: what makes implementation work and why? part 1. Implement Sci. 2018;13(Suppl 2):1–5.

  25. Guastaferro K, Collins LM. Optimization methods and implementation science: an opportunity for behavioral and biobehavioral interventions. Implement Res Pract. 2021;2:26334895211054364.

  26. McHugh SM, Riordan F, Curran GM, Lewis CC, Wolfenden L, Presseau J, et al. Conceptual tensions and practical trade-offs in tailoring implementation interventions. Frontiers in Health Services. 2022;2. Available from: https://www.frontiersin.org/articles/10.3389/frhs.2022.974095. Cited 2022 Nov 19

  27. Riordan F, Kerins C, Pallin N, Albers B, Clack L, Morrissey E, et al. Characterising processes and outcomes of tailoring implementation strategies in healthcare: a protocol for a scoping review. HRB Open Research; 2022. Available from: https://hrbopenresearch.org/articles/5-17. Cited 2022 Nov 19

  28. Landes SJ, Kerns SEU, Pilar MR, Walsh-Bailey C, Yu SH, Byeon YV, et al. Proceedings of the Fifth Biennial Conference of the Society for Implementation Research Collaboration (SIRC) 2019: where the rubber meets the road: the intersection of research, policy, and practice - part 1. Implement Sci. 2020;15(Suppl 3):1–5.

  29. Vejnoska SF, Mettert K, Lewis CC. Mechanisms of implementation: An appraisal of causal pathways presented at the 5th biennial Society for Implementation Research Collaboration (SIRC) conference. Implement Res Pract. 2022;3:26334895221086270.

  30. Lewis CC, Powell BJ, Brewer SK, Nguyen AM, Schriger SH, Vejnoska SF, et al. Advancing mechanisms of implementation to accelerate sustainable evidence-based practice integration: protocol for generating a research agenda. BMJ Open. 2021;11:e053474.

  31. Proctor EK, Luke D, Calhoun A, McMillen JC, McCrary S, Padek M. Sustainability of evidence-based healthcare: research agenda, methodological advances, and infrastructure support. Implement Sci. 2015;10:1–13.

  32. Powell BJ. Challenges related to the study of implementation mechanisms: A concept mapping approach. San Diego, CA: Conference presentation at the Society for Implementation Research Collaboration Biennial Conference; 2022.

  33. Vennix JA. Group model building. System Dynamics. 1996;2:123–32.

  34. Hovmand PS, Rouwette EAJA, Andersen DF, Richardson GP, and Kraus A. Scriptapedia 4.0.6. Cambridge: The 31st International Conference of the System Dynamics Society; 2013. Available from: https://proceedings.systemdynamics.org/2013/proceed/papers/P1405.pdf. Cited 2024 Sept 6.

  35. Cantrill JA, Sibbald B, Buetow S. Delphi and nominal group techniques in health services research. Int J Pharm Pract. 1996;4:67–74.

  36. American Society for Quality. What is multivoting? 2023. Available from: https://asq.org/quality-resources/multivoting. Cited 2023 Aug 8

  37. Ertmer PA, Glazewski KD. Developing a research agenda: contributing new knowledge via intent and focus. J Comput High Educ. 2014;26:54–68.

  38. Kane M, Trochim WMK. Concept mapping for planning and evaluation. Thousand Oaks, CA: Sage; 2007.

  39. Albers B, Metz A, Burke K, Bührmann L, Bartley L, Driessen P, et al. The mechanisms of implementation support - findings from a systematic integrative review. Res Soc Work Pract. 2022;32:259–80.

  40. Miech EJ, Rattray NA, Flanagan ME, Damschroder L, Schmid AA, Damush TM. Inside help: An integrative review of champions in healthcare-related implementation. SAGE Open Med. 2018;6:2050312118773261.

  41. Zamboni K, Baker U, Tyagi M, Schellenberg J, Hill Z, Hanson C. How and under what circumstances do quality improvement collaboratives lead to better outcomes? A systematic review. Implement Sci. 2020;15:27.

  42. Kilbourne AM, Geng E, Eshun-Wilson I, Sweeney S, Shelley D, Cohen DJ, et al. How does facilitation in healthcare work? Using mechanism mapping to illuminate the black box of a meta-implementation strategy. Implement Sci Commun. 2023;4:53.

  43. Ivers NM, Grimshaw JM, Jamtvedt G, Flottorp S, O’Brien MA, French SD, et al. Growing literature, stagnant science? Systematic review, meta-regression and cumulative analysis of audit and feedback interventions in health care. J Gen Intern Med. 2014;29:1534–41.

  44. Akiba CF, Powell BJ, Pence BW, Nguyen MXB, Golin C, Go V. The case for prioritizing implementation strategy fidelity measurement: benefits and challenges. Transl Behav Med. 2022;12:335–42.

  45. Akiba CF, Powell BJ, Pence BW, Muessig K, Golin CE, Go V. “We start where we are”: a qualitative study of barriers and pragmatic solutions to the assessment and reporting of implementation strategy fidelity. Implement Sci Commun. 2022;3:117.

  46. Akiba CF, Go VF, Powell BJ, Muessig K, Golin C, Dussault JM, et al. Champion and audit and feedback strategy fidelity and their relationship to depression intervention fidelity: A mixed method study. SSM - Mental Health. 2023;3:100194.

  47. Brookman-Frazee L, Stahmer AC. Effectiveness of a multi-level implementation strategy for ASD interventions: study protocol for two linked cluster randomized trials. Implement Sci. 2018;13:66.

  48. Brookman-Frazee L, Chlebowski C, Suhrheinrich J, Finn N, Dickson KS, Aarons GA, et al. Characterizing shared and unique implementation influences in two community services systems for autism: applying the EPIS framework to two large-scale autism intervention community effectiveness trials. Adm Policy Ment Health. 2020;47:176–87.

  49. Rothman AJ, Sheeran P. What is slowing us down? Six challenges to accelerating advances in health behavior change. Ann Behav Med. 2020;54:948–59.

  50. Luke DA, Powell BJ, Paniagua-Avila A. Bridges and mechanisms: Integrating systems science thinking into implementation research. Ann Rev Public Health. 2024;45:7.

  51. Swedberg R. Can you visualize theory? On the use of visual thinking in theory pictures, theorizing diagrams, and visual sketches. Sociol Theory. 2016;34:250–75.

  52. Brown CH, Curran G, Palinkas LA, Aarons GA, Wells KB, Jones L, et al. An overview of research and evaluation designs for dissemination and implementation. Annu Rev Public Health. 2017;38:1–22.

  53. Mercer SL, DeVinney BJ, Fine LJ, Green LW, Dougherty D. Study designs for effectiveness and translation research: Identifying trade-offs. Am J Prev Med. 2007;33:139–54.

  54. Mazzucca S, Tabak RG, Pilar M, Ramsey AT, Baumann AA, Kryzer E, et al. Variation in research designs used to test the effectiveness of dissemination and implementation strategies: A review. Front Public Health. 2018;6:1–10.

  55. Hwang S, Birken SA, Melvin CL, Rohweder CL, Smith JD. Designs and methods for implementation research: advancing the mission of the CTSA program. J Clin Transl Sci. 2020;4:159–67.

  56. Institute of Medicine. Assessing the Use of Agent-Based Models for Tobacco Regulation. Washington, DC: The National Academies Press; 2015. Available from: https://doi.org/10.17226/19018

  57. Bonell C, Warren E, Melendez-Torres G. Methodological reflections on using qualitative research to explore the causal mechanisms of complex health interventions. Evaluation. 2022;28:166–81.

  58. Frank HE, Kemp J, Benito KG, Freeman JB. Precision implementation: an approach to mechanism testing in implementation research. Adm Policy Ment Health. 2022;49:1084–94.

  59. Glaser J, Laudel G. The Discovery of Causal Mechanisms: Extractive Qualitative Content Analysis as a Tool for Process Tracing. Forum Qualitative Sozialforschung / Forum: Qualitative Social Research. 2019;20. Available from: https://www.qualitative-research.net/index.php/fqs/article/view/3386. Cited 2023 Aug 9

  60. Stone AA, Schneider S, Smyth JM. Evaluation of pressing issues in ecological momentary assessment. Annu Rev Clin Psychol. 2023;19:107–31.

  61. Kerwer M, Chasiotis A, Stricker J, Günther A, Rosman T. Straight From the scientist’s mouth—plain language summaries promote laypeople’s comprehension and knowledge acquisition when reading about individual research findings in psychology. Collabra: Psychology. 2021;7:18898.

  62. Jones L, Wells K. Strategies for academic and clinician engagement in community-participatory partnered research. JAMA. 2007;297:407–10.

  63. London RA, Glass RD, Chang E, Sabati S, Nojan S. “We Are About Life Changing Research”: Community Partner Perspectives on Community-Engaged Research Collaborations. Journal of Higher Education Outreach and Engagement. 2022;26. Available from: https://openjournals.libs.uga.edu/jheoe/article/view/2512. Cited 2023 Aug 9

  64. Fernandez ME, ten Hoor GA, van Lieshout S, Rodriguez SA, Beidas RS, Parcel G, et al. Implementation mapping: Using intervention mapping to develop implementation strategies. Front Public Health. 2019;7:1–15.

  65. Fernandez ME, Powell BJ, Ten Hoor GA. Editorial: Implementation Mapping for selecting, adapting and developing implementation strategies. Front Public Health. 2023;11:1–4.

  66. Lyon AR, Coifman J, Cook H, McRee E, Liu FF, Ludwig K, et al. The Cognitive Walkthrough for Implementation Strategies (CWIS): a pragmatic method for assessing implementation strategy usability. Implement Sci Commun. 2021;2:78.

  67. Graham AK, Wildes JE, Reddy M, Munson SA, Taylor CB, Mohr DC. User-centered design for technology-enabled services for eating disorders. Int J Eat Disord. 2019;52:1095–107.

  68. Oxman AD, Thomson MA, Davis DA, Haynes B. No magic bullets: A systematic review of 102 trials of interventions to improve professional practice. Can Med Assoc J. 1995;153:1424–31.

  69. Shojania KG, Grimshaw JM. Still no magic bullets: pursuing more rigorous research in quality improvement. Am J Med. 2004;116:778–80.

  70. Walsh Bailey C, Wiltsey Stirman S, Helfrich CD, Moullin J, Nilsen P, Oladunni O, et al. The hazards of overreliance on theories, models, and frameworks and how the study of mechanisms can offer a solution. San Diego, CA: Society for Implementation Research Collaboration Conference; 2022.

  71. Connors EH, Martin JK, Aarons GA, Barwick M, Bunger AC, Bustos TE, et al. Proceedings of the Sixth Conference of the Society for Implementation Research Collaboration (SIRC) 2022: from implementation foundations to new frontiers. Implement Res Pract. 2023;4:S1–185.

  72. Johnston M, Carey RN, Connell Bohlen LE, Johnston DW, Rothman AJ, de Bruin M, et al. Development of an online tool for linking behavior change techniques and mechanisms of action based on triangulation of findings from literature synthesis and expert consensus. Transl Behav Med. 2021;11:1049–65.

  73. Waltz TJ, Powell BJ, Fernández ME, Abadie B, Damschroder LJ. Choosing implementation strategies to address contextual barriers: Diversity in recommendations and future directions. Implement Sci. 2019;14:1–15.

  74. Perez Jolles M, Lengnick-Hall R, Mittman BS. Core functions and forms of complex health interventions: A patient-centered medical home illustration. J Gen Intern Med. 2019;34:1032–8.

  75. Lengnick-Hall R, Stadnick NA, Dickson KS, Moullin JC, Aarons GA. Forms and functions of bridging factors: specifying the dynamic links between outer and inner contexts during implementation and sustainment. Implement Sci. 2021;16:34.

  76. Grimshaw JM, Ivers N, Linklater S, Foy R, Francis JJ, Gude WT, et al. Reinvigorating stagnant science: implementation laboratories and a meta-laboratory to efficiently advance the science of audit and feedback. BMJ Qual Saf. 2019;28:416–23.

  77. The Audit & Feedback MetaLab. Available from: https://www.ohri.ca/auditfeedback/. Cited 2023 Aug 9

  78. Implementation Outcome Repository. Available from: https://implementationoutcomerepository.org. Cited 2023 Aug 9

  79. Systematic Reviews of Methods to Measure Implementation Constructs. Available from: https://journals.sagepub.com/topic/collections-irp/irp-1-systematic_reviews_of_methods_to_measure_implementation_constructs/irp. Cited 2023 Aug 9

  80. Living systematic reviews | Cochrane Community. Available from: https://community.cochrane.org/review-development/resources/living-systematic-reviews. Cited 2023 Aug 9

  81. Maguire BJ, Guérin PJ. A living systematic review protocol for COVID-19 clinical trial registrations. Wellcome Open Res. 2020;5:60.

  82. Negrini S, Ceravolo MG, Côté P, Arienti C. A systematic review that is “rapid” and “living”: A specific answer to the COVID-19 pandemic. J Clin Epidemiol. 2021;138:194–8.

  83. Compton J, Copeland K, Flanders S, Cassity C, Spetman M, Xiao Y, et al. Implementing SBAR across a large multihospital health system. Joint Comm J Qual Pat Safety. 2012;38:261–8.

  84. Oh A, Vinson CA, Chambers DA. Future directions for implementation science at the National Cancer Institute: Implementation Science Centers in Cancer Control. Transl Behav Med. 2021;11:669–75.

  85. Lewis CC, Hannon PA, Klasnja P, Baldwin L-M, Hawkes R, Blackmer J, et al. Optimizing Implementation in Cancer Control (OPTICC): protocol for an implementation science center. Implement Sci Commun. 2021;2:44.

  86. Oh AY, Emmons KM, Brownson RC, Glasgow RE, Foley KL, Lewis CC, et al. Speeding implementation in cancer: The National Cancer Institute’s Implementation Science Centers in Cancer Control. J Natl Cancer Inst. 2023;115:131–8.

  87. Kilbourne AM, Almirall D, Eisenberg D, Waxmonsky J, Goodrich DE, Fortney JC, et al. Protocol: Adaptive Implementation of Effective Programs Trial (ADEPT): cluster randomized SMART trial comparing a standard versus enhanced implementation strategy to improve outcomes of a mood disorders program. Implement Sci. 2014;9:1–14.

  88. Ramanadhan S, Revette AC, Lee RM, Aveling EL. Pragmatic approaches to analyzing qualitative data for implementation science: an introduction. Implement Sci Commun. 2021;2:1–10.

  89. National Cancer Institute. Qualitative methods in implementation science. Bethesda, Maryland; 2019. Available from: https://cancercontrol.cancer.gov/sites/default/files/2020-09/nci-dccps-implementationscience-whitepaper.pdf

  90. Gertner AK, Franklin J, Roth I, Cruden GH, Haley AD, Finley EP, et al. A scoping review of the use of ethnographic approaches in implementation research and recommendations for reporting. Implement Res Pract. 2021;2:1–13.

  91. Hamilton AB, Finley EP. Qualitative methods in implementation research: An introduction. Psychiatry Res. 2019;280:112516.

  92. Lemire S, Kwako A, Nielsen SB, Christie CA, Donaldson SI, Leeuw FL. What is this thing called a mechanism? Findings from a review of realist evaluations. N Dir Eval. 2020;167:73–86.

  93. Lewis CC, Boyd MR, Marti CN, Albright K. Mediators of measurement-based care implementation in community mental health settings: results from a mixed-methods evaluation. Implement Sci. 2022;17:71.

  94. Rabin BA, Lewis CC, Norton WE, Neta G, Chambers D, Tobin JN, et al. Measurement resources for dissemination and implementation research in health. Implement Sci. 2016;11:1–9.

  95. Kendall AD, Robinson CSH, Diviak KR, Hedeker D, Mermelstein RJ. Introducing a real-time method for identifying the predictors of noncompliance with event-based reporting of tobacco use in ecological momentary assessment. Ann Behav Med. 2023;57:399–408.

  96. Bosch M, van der Weijden T, Wensing M, Grol R. Tailoring quality improvement interventions to identified barriers: A multiple case analysis. J Eval Clin Pract. 2007;13:161–8.

  97. Wensing M, Grol R. Knowledge translation in health: How implementation science could contribute more. BMC Med. 2019;17:1–6.

  98. Lewis CC, Klasnja P, Lyon AR, Powell BJ, Lengnick-Hall R, Buchanan G, et al. The mechanics of implementation strategies and measures: advancing the study of implementation mechanisms. Implement Sci Commun. 2022;3:114.

  99. EPOC Taxonomy. Available from: https://epoc.cochrane.org/epoc-taxonomy. Cited 2023 Aug 9

  100. Hoagwood KE, Chaffin M, Chamberlain P, Bickman L, Mittman B. Implementation salvage strategies: maximizing methodological flexibility in children's mental health research. In: 4th Annual NIH Conference on the Science of Dissemination and Implementation: Policy and Practice; 2011.

  101. Dunbar J, Hernan A, Janus E, Davis-Lameloise N, Asproloupos D, O’Reilly S, et al. Implementation salvage experiences from the Melbourne diabetes prevention study. BMC Public Health. 2012;12:1–9.

  102. Carey RN, Connell LE, Johnston M, Rothman AJ, de Bruin M, Kelly MP, et al. Behavior change techniques and their mechanisms of action: a synthesis of links described in published intervention literature. Ann Behav Med. 2019;53:693–707.

  103. Crable EL, Lengnick-Hall R, Stadnick NA, Moullin JC, Aarons GA. Where is “policy” in dissemination and implementation science? Recommendations to advance theories, models, and frameworks: EPIS as a case example. Implement Sci. 2022;17:80.

  104. Purtle J, Moucheraud C, Yang LH, Shelley D. Four very basic ways to think about policy in implementation science. Implement Sci Commun. 2023;4:111.

  105. Nilsen P. Making sense of implementation theories, models and frameworks. Implement Sci. 2015;10:1–13.

  106. Cane J, O’Connor D, Michie S. Validation of the theoretical domains framework for use in behaviour change and implementation research. Implement Sci. 2012;7:1–17.

  107. Damschroder LJ, Reardon CM, Widerquist MAO, Lowery J. The updated consolidated framework for implementation research based on user feedback. Implement Sci. 2022;17:75.

  108. Kislov R, Pope C, Martin GP, Wilson PM. Harnessing the power of theorising in implementation science. Implement Sci. 2019;14:1–8.

  109. Meza RD, Moreland JC, Pullmann MD, Klasnja P, Lewis CC, Weiner BJ. Theorizing is for everybody: Advancing the process of theorizing in implementation science. Frontiers in Health Services. 2023;3. Available from: https://www.frontiersin.org/articles/10.3389/frhs.2023.1134931. Cited 2023 Mar 19

  110. Northridge ME, Metcalf SS. Enhancing implementation science by applying best principles of systems science. Health Res Pol Syst. 2016;14:1–8.

  111. Wallerstein N, Duran B. Community-based participatory research contributions to intervention research: The intersection of science and practice to improve health equity. Am J Public Health. 2010;100:S40–6.

  112. Wells K, Jones L. “Research” in community-partnered, participatory research. JAMA. 2009;302:320–1.

  113. Baum F, MacDougall C, Smith D. Participatory action research. J Epidemiol Community Health. 2006;60:854.

  114. Nguyen T, Graham ID, Mrklas KJ, Bowen S, Cargo M, Estabrooks CA, et al. How does integrated knowledge translation (IKT) compare to other collaborative research approaches to generating and translating knowledge? Learning from experts in the field. Health Res Pol Syst. 2020;18:35.

  115. Lyon AR, Bruns EJ. User-centered redesign of evidence-based psychosocial interventions to enhance implementation: Hospitable soil or better seeds? JAMA Psychiat. 2019;76:3–4.

  116. Kaiser Permanente Washington Health Research Institute. Causal Pathway Diagrams. 2022. Available from: https://vimeo.com/740549106

  117. Klasnja P, Meza RD, Pullmann MP, Mettert KD, Hawkes R, Palazzo L, et al. Getting cozy with causality: A causal pathway diagramming method to enhance implementation precision. Implement Res Pract. 2024;5:1–14.

  118. Curran GM. Implementation science made too simple: a teaching tool. Implement Sci Commun. 2020;1:1–3.

  119. Schenk PM, Wright AJ, West R, Hastings J, Lorencatto F, Moore C, et al. An ontology of mechanisms of action in behaviour change interventions. Wellcome Open Res. 2024;8:337.

  120. Salter KL, Kothari A. Using realist evaluation to open the black box of knowledge translation: a state-of-the-art review. Implement Sci. 2014;9:115.

  121. Hamilton AB, Mittman BS. Implementation science in health care. In: Brownson RC, Colditz GA, Proctor EK, editors. Dissemination and implementation research in health: Translating science to practice. 2nd ed. New York: Oxford University Press; 2018. p. 385–400.

  122. Viglione C, Stadnick NA, Birenbaum B, Fang O, Cakici JA, Aarons GA, et al. A systematic review of dissemination and implementation science capacity building programs around the globe. Implement Sci Commun. 2023;4:34.

Acknowledgements

The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health.

In addition to federal funding, we are grateful to have received funding support from the Society for Implementation Research Collaboration, which helped make our Deep Dive meetings possible. Moreover, we would like to thank attendees at SIRC's 2019 conference who provided input on the challenges associated with studying implementation mechanisms. Finally, we wish to thank each member of the MNoE, some of whom contributed to the development of the R13 grant proposal (Gregory Aarons, Rinad Beidas, Aaron Lyon, Brian Mittman, Byron Powell, Bryan Weiner, Nate Williams), and the rest of whom participated in one or more Deep Dive meetings, between-event virtual sessions, or paper writing groups; see Additional File 1. Gracelyn Cruden began this work at her prior institution, Oregon Social Learning Center, and completed it at her current institution.

Funding

This research was supported by funding from the Agency for Healthcare Research and Quality (R13HS025632), National Institute of Mental Health (P50MH126219, R01MH111950, R01MH111981, K01MH128769, R25MH080916, K01MH113806, K01MH128761), and National Cancer Institute (P50CA244432, R01CA262325, and P50CA244690).

Author information

Contributions

CCL, HEF, BK, AS, GC, and BJP contributed to the conceptualization of the manuscript, engaged in the coding, and participated in data interpretation. CCL drafted the introduction, results, and discussion. HEF and GC drafted the method section. ARL and BA reviewed preliminary results and contributed to revisions to the results table. BJP and CCL worked the manuscript through several cycles of review by all coauthors. All authors (CCL, HEF, BK, AS, GC, BJP, GAA, RSB, BSM, BJW, NJW, MF, SM, MP, LS, AW, CWB, SWS) reviewed, edited, and approved the final content of the manuscript.

Corresponding author

Correspondence to Cara C. Lewis.

Ethics declarations

Ethics approval and consent to participate

This study was reviewed and approved by Kaiser Permanente Washington Health Research Institute’s IRB and was deemed Not Human Subjects Research.

Consent for publication

N/A.

Competing interests

Drs. Lewis and Weiner receive royalties from Springer Publishing. Dr. Beidas is principal at Implementation Science & Practice, LLC. She receives royalties from Oxford University Press, consulting fees from United Behavioral Health and OptumLabs, and serves on the advisory boards for Optum Behavioral Health, AIM Youth Mental Health Foundation, and the Klingenstein Third Generation Foundation outside of the submitted work. Dr. Aarons is a Co-Editor-in-Chief, Dr. Beidas is an Associate Editor, and Drs. Powell and Weiner are on the Editorial Board of Implementation Science, none of whom will play a role in the editorial process of this manuscript.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License, which permits any non-commercial use, sharing, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if you modified the licensed material. You do not have permission under this licence to share adapted material derived from this article or parts of it. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by-nc-nd/4.0/.

About this article

Cite this article

Lewis, C.C., Frank, H.E., Cruden, G. et al. A research agenda to advance the study of implementation mechanisms. Implement Sci Commun 5, 98 (2024). https://doi.org/10.1186/s43058-024-00633-5

Keywords