
Organizing the dissemination and implementation field: who are we, what are we doing, and how should we do it?


Two decades into the field's tenure, dissemination and implementation (D&I) scientists have begun a process of self-reflection, illuminating a missed opportunity to bridge the gap between research and practice, one of the field's foundational objectives. In this paper, we assert that the research-to-practice gap has persisted in part because of an inadequate characterization of roles, functions, and processes within D&I. We aim to address this issue, and the rising tension between D&I researchers and practitioners, by proposing a community-centered path forward that is grounded in equity.

We identify key players within the field and characterize their unique roles using the translational science spectrum, a model originally developed in the biomedical sciences to help streamline the research-to-practice process, as a guide. We argue that the full translational science spectrum, from basic science research, or “T0,” to translation to community, or “T4,” readily applies within D&I and that in using this framework to clarify roles, functions, and processes within the field, we can facilitate greater collaboration and respect across the entire D&I research-to-practice continuum. We also highlight distinct opportunities (e.g., changes to D&I scientific conference structures) to increase regular communication and engagement between individuals whose work sits at different points along the D&I translational science spectrum that can accelerate our efforts to close the research-to-practice gap and achieve the field’s foundational objectives.



Though still in its infancy, the field of dissemination and implementation science (D&I) [1, 2] faces challenges related to the growing gap between the science and practice of implementation [1, 3, 4]. D&I is the scientific study of translating research findings and evidence-based interventions into everyday practice; in the current state of the D&I literature, this often means that a practice developed by one group of actors is implemented into the everyday practice of others [5]. A premortem by Beidas and colleagues [4] highlighted several factors stagnating the field, including failure to close the evidence-to-practice gap [6,7,8,9], insufficient impact, and inability to align timelines and priorities with partners [1]. This commentary aims to establish further clarity regarding who “we” are as a field (the collective “we” of those engaged in D&I work), what we are doing, and how we can collectively work to achieve shared goals of improved population health in D&I.

In clarifying key components of D&I, important lessons can be drawn from more established fields. For example, when reflecting on disciplines such as mathematics and physics, one notes the emergence of two broad areas of scholarship—theoretical and applied—within these fields. These areas fill distinct but important roles within their disciplines. We posit that D&I science could be similarly divided into theoretical and applied scholarship. In this paper, we elaborate on the functions of these two areas of scholarship, as well as the functions of professionals working in the large and ever-growing field of implementation practice.

While many have noted that D&I aims “to promote the adoption and integration of evidence-based practices, interventions, and policies into routine health care and public health settings to improve the impact on population health” [10], specificity in how to achieve this outcome has been elusive. In this article, we propose that the field must first define the actors and audiences across the implementation spectrum and how each group connects with the others. The field can then strengthen the infrastructures that facilitate these connections. We aim to address the rising tension between implementation scientists, implementation support practitioners, delivery systems [11], and communities by proposing a path forward that is community-oriented and grounded in equity, thereby upholding every actor's place at the D&I table. We draw on principles well established in the field of translational science to better align D&I toward both improved ideas and real-world impact. Our mental model as authors is that success for D&I would be defined as impact at the community or population level. We recognize that not everyone working in D&I holds this mental model, but we believe that even those whose focus is not on population impact can work collectively toward these outcomes and impact practice [12].

Who are we?

To date, much of the discussion around the direction of D&I has been researcher-centric [13]. To promote greater equity within the discipline (i.e., to reduce disparities in whose voices are heard within the field of D&I), we would like to expand the existing discourse to include the entire spectrum of professionals who work in implementation, including communities, delivery systems, implementation support practitioners, intermediaries, non-implementation science researchers (e.g., interventionists), and applied and theoretical D&I researchers. Including the entire implementation workforce in a description of the field provides opportunities to see where practitioners have not been empowered to exert influence and to change these inequities. While D&I professionals are likely to fill more than one role at a time or during their careers and may hold perspectives that are therefore representative of a number of these D&I actors, we would like to re-center the current conversation within D&I around implementation support practitioners and delivery systems specifically to uphold our commitment to those most directly affected by D&I efforts.

Communities and individuals impacted by the change

Communities and the individuals who comprise them play a critical role in the success or failure of efforts to implement evidence-based or informed programs and practices (EBPs) within a particular setting [14,15,16,17]. Aligned with this principle, there has been a shifting focus from community-based to community-led research methods across academic disciplines [18, 19]. Funding agencies have also begun to recognize the need for greater community involvement in research, with current directives to engage community partners across the research spectrum [20]. As suggested by others, strengthening relationships between communities and individuals working at all levels of implementation is essential to closing the evidence-to-practice gap and upholding equity in D&I [21].

Practitioners—implementation support practitioners and delivery systems

Implementation has been happening for the entirety of human history. While several scientific fields (e.g., political science, medicine) began formally investigating processes of D&I in the mid-to-late twentieth century—thereby laying the foundation for current research in this area—the distinct field of D&I only emerged in the past few decades, prompted by repeatedly observed barriers to the successful implementation of EBPs [5, 22].

“Implementation practitioners” comprise two distinct groups: implementation support practitioners [23, 24] (e.g., administrators, policy-makers), who are involved in planning, engagement, co-creation, strategy selection, capacity building, monitoring, and evaluation; and delivery systems (e.g., front-line managers at organizations implementing an EBP), who are responsible for implementing the actual practices with professionals, organizations, and the public [11]. Identifying professionals engaged in implementation practice can be difficult because terminology is inconsistent; for example, over 30 job titles are associated with implementation support practitioner roles (see Fig. 1). “Delivery systems” are often unaware of the D&I field or of their role as end-users. Implementation researchers' ability to appropriately identify and connect with delivery systems and implementation support practitioners is key to closing the evidence-to-practice gap and improving impact [4].

Fig. 1 Professional job titles of individuals working directly in implementation or implementation support, as identified through the Center for Implementation. (In preparation for an event about the roles of implementation support practitioners, an open call was sent to members of an online community of professionals supporting implementation, asking for current or previous job titles that included an implementation component.)


Globally, there are several intermediary organizations serving to translate findings from D&I to support the implementation of EBPs by delivery systems and implementation support practitioners (e.g., the Collaborative for Implementation Practice; Center for Evidence and Implementation in Australia; Impact Center at the University of North Carolina; Center for Effective Services in Ireland; the Nigerian Implementation Science Alliance). These organizations employ implementation support practitioners and bridge the implementation research-to-practice divide by providing training in implementation-related skills and creating tools to support the selection of appropriate implementation strategies. For example, one intermediary offers a mini-course introducing implementation that has enrolled over 10,000 individuals. Millions of research, government, and philanthropic dollars are being invested in these organizations [25,26,27,28]. As implementation researchers and intermediaries, the authors regularly hear from organizations, communities, and individuals that they struggle to access supports in implementation science to address their needs in implementing evidence. The demand for this type of work often outpaces the supply, and researchers and funders alike state a clear need for additional resources linking implementation science and practice [29,30,31,32,33].


To better clarify the full spectrum of implementation researchers, we refer to researchers whose work primarily centers on the advancement of implementation ideas (e.g., theory, methods, or framework (TMF) development) as theoretical implementation scientists, and to those whose work primarily centers on the direct use of implementation concepts to achieve better clinical or programmatic outcomes as applied implementation scientists. Scientists may work on both theoretical and applied projects but tend to focus their programs of research on one or the other and may even identify as one or the other.

Non-D&I researchers are also becoming increasingly interested in D&I, as evidenced by the growing number of D&I training institutes globally (e.g., HIV, Infectious Disease and Global Health Implementation Research Institute (HIGH IRI); University College Cork Implementation Science Training Institute; University of Nairobi Implementation Science Fellowship; Training Institute for Dissemination and Implementation Research in Health (TIDIRH)) [34]. Non-D&I researchers are individuals from distinct substantive areas (e.g., HIV, cancer prevention) who are interested in applying D&I to their work but have limited training in this area. These researchers often aim to draw from the TMFs and evidence from D&I to design, implement, and scale EBPs. They may benefit from increased collaboration with individuals who have worked more squarely in D&I.

What are we doing?

We, the paper's authors, entered the field of D&I with the goal of bridging the research-to-practice gap to improve the lives of people in our areas of scholarship (HIV, mental health). Yet our substantively distinct bodies of applied D&I research have unfolded in such a way that we are all currently involved in a range of theoretical implementation research. This journey has not been without difficulty: the further we moved from our applied work and from what grounded our science, the less impact we felt we were having. While we found theoretical research important, our roles and functions within D&I felt less clear. This lack of clarity in our professional self-concept ultimately helped us recognize that D&I is not monolithic. Through conversation, we found that articulating the spectrum from theoretical to applied D&I helped us regain the clarity we needed to continue advancing our science. We believe these realizations could also benefit other D&I professionals.

Leveraging translational science to find clarity

There is extensive literature on moving research findings into practice [35], but the translation of D&I knowledge into practice has received much less attention [1]. Moreover, there is insufficient understanding of which actors are involved at which stages along this spectrum, how each stage contributes to the field, and how these stages, and the actors at each of them, can connect and achieve shared goals. In Fig. 2, we draw on the translational spectrum to address these limitations. The traditional translational spectrum aims to streamline the “bench to bedside” approach and defines a continuum from basic science (stage T0) to public health science (stage T4) [36]. D&I science has long been placed in the T3–T4 segments of the traditional translational spectrum [36]. However, we argue that the full translational spectrum, from T0 through T4, is applicable to D&I. This distinction is often at the core of the tension observed within the field and is where our personal struggles with our shifting identities and relationship with D&I research emerged.

Fig. 2 The translational spectrum applied to implementation science

In the traditional translational spectrum, T0, “pre-clinical research,” includes bench science and aims to define mechanisms, targets, and strategies for intervention at a general level. In D&I, theoretical implementation scientists work on the development of TMFs and the elicitation, description, and modeling of mechanisms. Many of the foundational papers that guide implementation research to date stem from work at this stage [37,38,39,40,41,42,43]. T1, “translation to humans,” includes phase 1 clinical trials and proof-of-concept science and aims to develop new methods of diagnosis, treatment, and prevention in highly controlled settings. In D&I, theoretical and applied implementation researchers focus on translating theoretical constructs (i.e., TMFs) to actual people and developing methods to test these constructs. Examples of this type of research include measurement of implementation domains such as context (e.g., the Organizational Readiness for Change measure) [44] and implementation outcomes (e.g., the NoMAD measure from Normalization Process Theory) [45]. T2, “translation to patients,” includes phase 2–3 clinical trials and aims to develop clinical applications and evidence-based guidelines for a given disease. In D&I, applied implementation researchers focus on identifying implementation constructs relevant to a specific situation, intervention, context, or population where the researchers aim to understand how best to implement. Traditional randomized controlled trial designs are often used at this stage. Individuals working at this stage may test bundled strategies, interrogate the “active ingredients” in strategies [46], or test strategies in varied contexts.

An interesting phenomenon occurs in the T3–T4 range. Acknowledging the contributions of researchers and practitioners, we see a split whereby researchers continue to serve as the primary actors in one branch of the translational spectrum, while practitioners become the primary actors in another. T3, “translation to practice,” includes comparative effectiveness trials and clinical outcome studies and aims to evaluate real-world effectiveness. In D&I, implementation support practitioners come into a principal role. Individuals working in this capacity use the results of T0–T2 to plan implementation projects, sometimes in the form of quality improvement-type projects. In parallel, T3 applied implementation researchers primarily monitor or evaluate implementation projects' real-world effectiveness; this could involve research using pragmatic or naturalistic methods whereby researchers partner with healthcare delivery systems or organizations to better understand real-world implementation or effectiveness outcomes. T4 involves population-level outcomes research and monitoring improvements in morbidity and mortality to impact policy or system change. In D&I, implementation support practitioners and delivery systems scale EBPs up and out. Implementation researchers working at stage T4 define the implementation workforce, develop surveillance systems, and evaluate the effects of evidence-informed implementation on project successes. Intermediaries are prime partners in this work. Additional work is needed to establish clear evidence about what is and is not working on a broad scale and in what contexts [42, 47].

Defining the translational spectrum for D&I facilitates the process of identifying a “home base” for individuals involved in D&I science, thereby improving self-concept clarity and making clear how individuals can foray into upstream and downstream segments to better link their research with that of others. In keeping with findings from workplace self-concept clarity literature [48, 49], when we claim our places in the spectrum, we can improve our effectiveness and avoid burnout [50]. Specifically, we can improve our capacity to clearly generate research questions, identify colleagues, and expand the impact of our work.

How should we do it?

As has been noted by others [21, 51, 52], there is a significant disconnect between individuals working in distinct roles within the field of D&I, particularly between those operating at the two ends of the D&I translational spectrum. By interacting more often and intentionally across the entirety of the D&I process, we as a field could develop significant synergy and produce actionable solutions more quickly to achieve shared goals.

Asking and answering the right question

Fundamental respect for the work of actors at every level of the implementation spectrum, fostered by regular communication, is essential to resolving our identity crises, achieving our shared goals, and upholding equity within the field [21]. One fundamental way for theoretical implementation scientists to demonstrate respect for implementation practitioners is to ask research questions that implementation practitioners want answered [52]. Implementation practitioners have critical theoretical questions that arise while implementing programs and policies in their specific contexts. For example, implementation practitioners regularly assess organizational readiness for change before altering or implementing a new program or policy (as recommended in the implementation science literature). Yet when these assessments suggest that sites are not ready to implement the intended change, implementation science offers little guidance about how best to address the issue. A common suggestion is to prioritize “ready” sites [53]. This approach is likely to perpetuate existing inequities or disparities, as “ready” sites are often those least in need of additional resources and supports, and it leaves “non-ready” sites with no plan for reaching a sufficient level of readiness. What strategies can increase readiness? Another example involves the need for a more concrete understanding of the effects of adaptation. While the field might agree that adaptation is often important to the scale-up and scale-out of EBPs, many adaptation tools [54, 55] are designed for researchers rather than for practitioners seeking guidance on whether their proposed adaptations will influence the effectiveness of the original EBP. How can D&I measures be made more accessible to implementation practitioners? These are just two examples of many.

Working with existing implementation efforts

Evaluating the existing processes and successes of implementation practitioners can also galvanize efforts, improve the impact of D&I, and uphold equity in D&I. Delivery systems are continually implementing “the thing” and have been for years. Connecting with existing implementation efforts and studying the effectiveness of implementation strategies actively used by delivery systems is critical to supporting the ongoing work of these individuals [2, 21, 56]. In many ways, this can move science more quickly toward a clearer understanding of what works, when, and for whom, and improve the likelihood of establishing sustainable practices and policies that are feasible, acceptable, and appropriate [23, 24]. This approach is also consistent with the principles of community-based participatory research, including respect for lived experience and tailoring interventions to the needs of the community [57, 58].

Fostering increased communication

Increased communication among actors across the D&I translational spectrum is critical, as previously noted [3, 52, 59]. To again draw from the successes of other fields, the International AIDS Society comprises over 13,000 members worldwide and “unite(s) scientists, policymakers and activists to galvanize the scientific response, build global solidarity and enhance human dignity for all people living with and affected by HIV” [60]. The Society hosts two conferences that rotate annually with a shifting focus between research and practice. Through this model, individuals working at all stages of the HIV implementation science spectrum can engage in, learn from, and contribute to dialogue with others who hold distinct perspectives and roles in the discipline, thereby improving equity in whose voices are centered and uplifted in global agenda-setting efforts. The field of D&I could benefit from an organization akin to the International AIDS Society and from the agenda-setting practices and conference structures this Society employs [61,62,63].

Developing tools to directly support real-world D&I

Tools that facilitate the translation of D&I into practice are also critical to achieving shared goals [1]. Again, the field of D&I can look to adjacent fields to learn how they have successfully scaled. For example, the Institute for Healthcare Improvement (IHI), whose mission is to improve health and healthcare worldwide, has scaled the use of quality improvement methods. Over 30 years, they have worked in 42 countries and have had over 7 million online course enrollments [64]. Part of IHI’s model has been to develop practical and easy-to-use improvement tools. A critique of implementation science is that existing frameworks are complicated and difficult to use [3, 4]. If the field of D&I learned from the success of IHI and developed tools that help professionals operationalize implementation science in practice, it would support the broader use of D&I to improve outcomes.

Aligning funding mechanisms and priorities

Funding agencies should increase requirements and supports for community inclusion and implementation throughout the research process. Researchers currently prioritize funding agency policies and expectations, which may not allow enough time for building sustainable community relationships and co-creation of work. A shift in funding agencies’ research calls and approach to awarding research dollars is necessary to build capacity for long-term academic-community partnerships [65,66,67]. Implementation science-related funding calls from the National Institutes of Health, UK Research and Innovation, the Global Alliance for Chronic Diseases, the South African Medical Research Council, and other funding agencies could more intentionally include requirements for this type of work.


Key actions are needed for the field of D&I to self-actualize: (1) Uphold everyone’s place at the implementation table while centering the wants and needs of those most directly affected by implementation efforts; (2) Clarify where on the translational spectrum work is being done by whom and where the gaps in both sufficient volume of work and translation of that work lie; and (3) Facilitate regular communication across the spectrum, from theoretical implementation scientists to implementation practitioners and vice versa. Ideally, this work should be done with researchers and practitioners around the globe. If these three tasks are accomplished, we as a field will be able to reverse the tides and bridge the implementation research-to-practice gap, instead of letting it continue to grow.

Availability of data and materials

Not applicable.



Abbreviations

D&I: Dissemination and implementation

HIV: Human immunodeficiency virus

AIDS: Acquired immunodeficiency syndrome


  1. Westerlund A, Sundberg L, Nilsen P. Implementation of implementation science knowledge: the research-practice gap paradox. Worldviews Evidence-Based Nurs. 2019;16(5):332–4.

    Article  Google Scholar 

  2. Metz A, Jensen T, Farley A, Boaz A. Is implementation research out of step with implementation practice? Pathways to effective implementation support over the last decade. Implement Res Pract. 2022;3:263348952211055.

    Article  Google Scholar 

  3. Rapport F, Smith J, Hutchinson K, et al. Too much theory and not enough practice? The challenge of implementation science application in healthcare practice. J Eval Clin Pract. 2022;28(6):991–1002.

    Article  PubMed  Google Scholar 

  4. Beidas RS, Dorsey S, Lewis CC, et al. Promises and pitfalls in implementation science from the perspective of US-based researchers: learning from a pre-mortem. Implement Sci. 2022;17(1):1–15.

    Article  Google Scholar 

  5. Eccles MP, Mittman BS. Welcome to implementation science. Implement Sci. 2006;1(1):1–3.

    Article  PubMed Central  Google Scholar 

  6. Taylor SP, Kowalkowski MA, Beidas RS. Where is the implementation science? An opportunity to apply principles during the COVID-19 pandemic. Clin Infect Dis. 2020:6–8. .

  7. Lyon AR, Comtois KA, Kerns SEU, Landes SJ, Lewis CC. Closing the science–practice gap in implementation before it widens. In: Albers B, Shlonsky A, Mildon R, eds. Implementation Science 3.0. Springer International Publishing; 2020:295–313. .

  8. Ploeg J, Davies B, Edwards N, Gifford W, Miller PE. Factors influencing best-practice guideline implementation: Lessons learned from administrators, nursing staff, and project leaders. Worldviews Evidence-Based Nurs. 2007;4(4):210–9.

    Article  Google Scholar 

  9. Bernhardt JM, Mays D, Kreuter MW. Dissemination 2.0: closing the gap between knowledge and practice with new media and marketing. J Health Commun. 2011;16(SUPPL. 1):32–44.

    Article  PubMed  Google Scholar 

  10. National Cancer Institute. Implementation Science. 2020. Accessed 12 Sept 2023. .

  11. Wandersman A, Duffy J, Flaspohler P, et al. Bridging the gap between prevention research and practice: The interactive systems framework for dissemination and implementation. Am J Community Psychol. 2008;41(3–4):171–81.

    Article  PubMed  Google Scholar 

  12. Boulton R, Sandall J, Sevdalis N. The Cultural Politics of ‘Implementation Science.’ J Med Humanit. 2020;41(3):379–94.

    Article  PubMed  PubMed Central  Google Scholar 

  13. Jensen TM, Metz AJ, Farley AB, Disbennett ME. Developing a practice-driven research agenda in implementation science : Perspectives from experienced implementation support practitioners. Published online. 2023.

    Article  Google Scholar 

  14. Iwelunmor J, Blackstone S, Veira D, et al. Toward the sustainability of health interventions implemented in sub-Saharan Africa: a systematic review and conceptual framework. Implement Sci. 2016;11(1). .

  15. Baptiste S, Manouan A, Garcia P, Etya’ale H, Swan T, Jallow W. Community-led monitoring: when community data drives implementation strategies. Curr HIV/AIDS Rep. 2020;17(5):415–21.

    Article  PubMed  PubMed Central  Google Scholar 

  16. Anderson KA, Dabelko-Schoeny H, Koeuth S, Marx K, Gitlin LN, Gaugler JE. The use of community advisory boards in pragmatic clinical trials: The case of the adult day services plus project. Home Health Care Serv Q. 2021;40(1):16–26.

    Article  PubMed  Google Scholar 

  17. Ramanadhan S, Davis M, Donaldson ST, Miller E, Minkler M. Participatory Approaches in Dissemination and Implementation Science. In: Brownson RC, Colditz GA, Proctor EK, editors. Dissemination and Implementation Research in Health: Translating Science to Practice. New York: Oxford University Press; 2023. p. 212.

    Chapter  Google Scholar 

  18. Ricalde MCA, Annoni J, Bonney R, et al. Understanding the Impact of Equitable Collaborations between Science Institutions and Community-Based Organizations: Improving Science through Community-Led Research. Bioscience. 2022;72(6):585–600.

    Article  Google Scholar 

  19. Fernandez ME, Ten Hoor GA, van Lieshout S, et al. Implementation mapping: using intervention mapping to develop implementation strategies. Front Public Heal. 2019;7(JUN):1–15.

    Article  Google Scholar 

  20. National Institutes of Health. All of Us: Local Community and/or Participant Advisory Boards (C/PABs). 2023. Accessed 13 Sept 2023. .

  21. Brownson RC, Kumanyika SK, Kreuter MW, Haire-Joshu D. Implementation science should give higher priority to health equity. Implement Sci. 2021;16(1):28.

    Article  PubMed  PubMed Central  Google Scholar 

  22. Morris ZS, Wooding S, Grant J. The answer is 17 years, what is the question: understanding time lags in translational research. J R Soc Med. 2011;104(12):510–20.

    Article  PubMed  PubMed Central  Google Scholar 

  23. Bührmann L, Driessen P, Metz A, et al. Knowledge and attitudes of implementation support practitioners—findings from a systematic integrative review. PLoS One. 2022;17(5 May):1–25.

    Article  CAS  Google Scholar 

  24. Albers B, Metz A, Burke K. Implementation support practitioners- A proposal for consolidating a diverse evidence base. BMC Health Serv Res. 2020;20(1):1–10.

    Article  Google Scholar 

  25. PCORI dissemination and implementation funding initiatives. Accessed 29 Feb 2024. .

  26. United States Agency for International Development. USAID’s Implementation Science Investment. Accessed 29 Feb 2024. .

  27. National Heart, Lung and BI. Implementation Science Branch. Accessed 29 Feb 2024. .

  28. Zurynski Y, Smith CL, Knaggs G, Meulenbroeks I. Funding research translation: how we got here and what to do next. Aust N Z J Public Health. 2021;45(5):420–3.

    Article  PubMed  Google Scholar 

  29. Holmes B, Hamilton AB. Three opportunities to boost implementation science at a critical time of need. Heal Published online. 2021.

    Article  Google Scholar 

  30. Planning team for the Pathways to Prevention (P2P) Workshop on Achieving Health Equity in Preventive Services and the Office for Disease Prevention portfolio analysis team. We Need More Implementation Science To Improve Health Equity in Clinical Preventive Services. Director’s Messages. Published October 14, 2022. Accessed 29 Feb 2024.

  31. Implementation Science Takes Off at Brown. 2023. Accessed 29 Feb 2024. .

  32. Davis R, D’Lima D. Building capacity in dissemination and implementation science: a systematic review of the academic literature on teaching and training initiatives. Implement Sci. 2020;15(1):97.

    Article  PubMed  PubMed Central  Google Scholar 

  33. Boyce CA, Barfield W, Curry J, et al. Building the next generation of implementation science careers to advance health equity. 2019;29:77–82. .

  34. Osanjo GO, Oyugi JO, Kibwage IO, et al. Building capacity in implementation science research training at the University of Nairobi. Implement Sci. 2016;11(1):1–9.


  35. Straus SE, Tetroe J, Graham ID. Defining knowledge translation. CMAJ. 2009;181(3–4):165–8.


  36. National Center for Advancing Translational Sciences. Transforming Translational Science. Fall 2017. Accessed 24 Mar 2020.

  37. Proctor E, Silmere H, Raghavan R, et al. Outcomes for implementation research: conceptual distinctions, measurement challenges, and research agenda. Adm Policy Ment Health Ment Health Serv Res. 2011;38(2):65–76.


  38. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: A consolidated framework for advancing implementation science. Implement Sci. 2009;4(1):1–15.


  39. Aarons GA, Hurlburt M, Horwitz SMC. Advancing a conceptual model of evidence-based practice implementation in public service sectors. Adm Policy Ment Health Ment Health Serv Res. 2011;38(1):4–23.


  40. Tabak RG, Khoong EC, Chambers DA, Brownson RC. Bridging research and practice: Models for dissemination and implementation research. Am J Prev Med. 2012;43(3):337–50.


  41. Walsh-Bailey C, Tsai E, Tabak RG, et al. A scoping review of de-implementation frameworks and models. Implement Sci. 2021;16(1):1–18.


  42. Nilsen P, Bernhardsson S. Context matters in implementation science: A scoping review of determinant frameworks that describe contextual determinants for implementation outcomes. BMC Health Serv Res. 2019;19(1):1–21.


  43. Powell BJ, Waltz TJ, Chinman MJ, et al. A refined compilation of implementation strategies: Results from the Expert Recommendations for Implementing Change (ERIC) project. Implement Sci. 2015;10(1):1–14.


  44. Shea CM, Jacobs SR, Esserman DA, Bruce K, Weiner BJ. Organizational readiness for implementing change: A psychometric assessment of a new measure. Implement Sci. 2014;9(1):1–15.


  45. Finch TL, Girling M, May CR, et al. Improving the normalization of complex interventions: part 2 - validation of the NoMAD instrument for assessing implementation work based on normalization process theory (NPT). BMC Med Res Methodol. 2018;18(135):1–13.


  46. Desveaux L, Nguyen MD, Ivers NM, et al. Snakes and ladders: A qualitative study understanding the active ingredients of social interaction around the use of audit and feedback. Transl Behav Med. 2023;13(5):316–26.


  47. Edwards N, Barker PM. The importance of context in implementation research. J Acquir Immune Defic Syndr. 2014;67:S157–62.


  48. Gray CE, McIntyre KP, Mattingly BA, Lewandowski GW Jr. Interpersonal Relationships and the Self-Concept. Springer International Publishing; 2020.

  49. Wu P, Liu T, Li Q, Yu X, Liu Z, Tian S. Maintaining the working state of firefighters by utilizing self-concept clarity as a resource. BMC Public Health. 2024;24(1):1–11.


  50. Balundė A, Paradnikė K. Resources linked to work engagement: the role of high performance work practices, employees’ mindfulness, and self-concept clarity. Soc Inq into Well-Being. 2016;2(2):55–62.


  51. Harvey G, Rycroft-Malone J, Seers K, et al. Connecting the science and practice of implementation – applying the lens of context to inform study design in implementation research. Front Health Serv. 2023;3:1–15.


  52. Tabak RG, Padek MM, Kerner JF, et al. Dissemination and Implementation Science Training Needs: Insights From Practitioners and Researchers. Am J Prev Med. 2017;52(3):S322–9.


  53. Atkins BR, Allred S, Hart D. Philanthropy’s Rural Blind Spot. Stanford Soc Innov Rev. Published online 2021.

  54. Stirman SW, Baumann AA, Miller CJ. The FRAME: An expanded framework for reporting adaptations and modifications to evidence-based interventions. Implement Sci. 2019;14(1):1–10.


  55. Miller CJ, Barnett ML, Baumann AA, Gutner CA, Wiltsey-Stirman S. The FRAME-IS: a framework for documenting modifications to implementation strategies in healthcare. Implement Sci. 2021;16(1):36.


  56. Boaz A. Lost in co-production: to enable true collaboration we need to nurture different academic identities. LSE; 2021. p. 1–4.

  57. Seifer S. Walking the Talk: Achieving the Promise of Authentic Partnerships. Partnersh Perspect. 2007;IV(I):1–12.


  58. Ramanadhan S, Davis MM, Armstrong R, et al. Participatory implementation science to increase the impact of evidence-based cancer prevention and control. Cancer Causes Control. 2018;29(1):363–9.


  59. Shelton RC, Brownson RC. Enhancing impact: a call to action for equitable implementation science. Prev Sci. Published online 2023.


  60. International AIDS Society. Home Page. Accessed 2 Oct 2023.

  61. Cahn P, McClure C. Beyond the first 25 years: The International AIDS Society and its role in the global response to AIDS. Retrovirology. 2006;3(1):2004–6.


  62. Kort R. 5th International AIDS Society Conference on HIV Pathogenesis, treatment and prevention: summary of key research and implications for policy and practice - operations research. J Int AIDS Soc. 2010;13(Suppl 1):1–6.


  63. Gayle H, Wainberg MA. Impact of the 16th International Conference on AIDS: can these conferences lead to policy change? Retrovirology. 2007;4:2–3.


  64. IHI Marks 30 Years of Quality Improvement in Health Care Worldwide. BusinessWire. Published 28 Oct 2021.

  65. Elwood WN, Corrigan JG, Morris KA. NIH-Funded CBPR: self-reported community partner and investigator perspectives. J Community Health. 2019;44(4):740–8.


  66. Teufel-Shone NI, Schwartz AL, Hardy LJ, et al. Supporting new community-based participatory research partnerships. Int J Environ Res Public Health. 2019;16(1):1–12.


  67. Minkler M, Blackwell AG, Thompson M, Tamir HB. Community-based participatory research: implications for public health funding. Am J Public Health. 2003;93(8):1210–3.




Acknowledgements

We would like to thank Drs. Cory Bradley and Donny Gerke for their contributions to the early conceptualization of this paper, and colleagues who took the time to review and provide feedback prior to submission.


Funding

GB was supported by the National Institute of Mental Health grant T32MH019960 at Washington University (PI: Leopoldo J. Cabassa) during a portion of manuscript development. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institute of Mental Health.

Author information

Authors and Affiliations



Contributions

GB, LF, and JM equally participated in the conception, drafting, and revising of the manuscript, and they have approved the manuscript as submitted. GB, LF, and JM agree to be personally accountable for their own contributions and to ensure that questions related to the accuracy or integrity of any part of the work, even ones in which the author was not personally involved, are appropriately investigated, resolved, and the resolution documented in the literature.

Corresponding author

Correspondence to Gretchen J. R. Buchanan.

Ethics declarations

Ethics approval and consent to participate

Not applicable.

Consent for publication

Not applicable.

Competing interests

Author JM is the Director of The Center for Implementation and previously led the implementation team at the Knowledge Translation Program, St. Michael’s Hospital. Several examples are drawn from direct experience in these roles. LF and GB declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit the Creative Commons website. The Creative Commons Public Domain Dedication waiver applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article


Cite this article

Buchanan, G.J.R., Filiatreau, L.M. & Moore, J.E. Organizing the dissemination and implementation field: who are we, what are we doing, and how should we do it? Implement Sci Commun 5, 38 (2024).
