  • Systematic review
  • Open access

A systematic review of dissemination and implementation science capacity building programs around the globe

Abstract

Background

Research centers and programs focused on dissemination and implementation science (DIS) training, mentorship, and capacity building have proliferated in recent years. There has yet to be a comprehensive inventory of DIS capacity building programs (CBPs) cataloging information about activities, infrastructure, and priorities, as well as opportunities for shared resources, collaboration, and growth. The purpose of this systematic review is to provide the first inventory of DIS CBPs and describe their key features and offerings.

Methods

We defined DIS CBPs as organizations or groups with an explicit focus on building practical knowledge and skills to conduct DIS for health promotion. CBPs were included if they had at least one capacity building activity other than educational coursework or training alone. A multi-method strategy was used to identify DIS CBPs. Data about the characteristics of DIS CBPs were abstracted from each program’s website. In addition, a survey instrument was developed and fielded to gather in-depth information about the structure, activities, and resources of each CBP.

Results

In total, 165 DIS CBPs met our inclusion criteria and were included in the final CBP inventory. Of these, 68% are affiliated with a United States (US) institution and 32% are internationally based. Only one CBP was identified in a low- and middle-income country (LMIC). Of the US-affiliated CBPs, 55% are embedded within a Clinical and Translational Science Award program. Eighty-seven CBPs (53%) responded to a follow-up survey. Of those who completed a survey, the majority used multiple DIS capacity building activities, the most common being Training and Education (n=69, 79%), followed by Mentorship (n=58, 67%), Consultation (n=58, 67%), provision of DIS Resources and Tools (n=57, 66%), Professional Networking (n=54, 62%), Technical Assistance (n=46, 52%), and Grant Development Support (n=45, 52%).

Conclusions

To our knowledge, this is the first study to catalog DIS programs and synthesize learnings into a set of priorities and sustainment strategies to support DIS capacity building efforts. There is a need for formal certification, accessible options for learners in LMICs, opportunities for practitioners, and opportunities for mid/later stage researchers. Similarly, harmonized measures of reporting and evaluation would facilitate targeted cross-program comparison and collaboration.


Background

Interest in dissemination and implementation science (DIS) has grown exponentially over the last 15 years in the USA and internationally [1,2,3]. This growth can be attributed, in part, to increased investment from funding agencies. Of the 27 Institutes and Centers at the National Institutes of Health (NIH) in the USA, 18 participate in the Dissemination and Implementation Research in Health Program Announcement (PAR-18-017). The NIH Fogarty International Center has championed multiple “implementation science alliances” for specific initiatives such as preventing mother-to-child HIV transmission and adolescent HIV prevention and treatment [4, 5]. Other funders such as the Patient Centered Outcomes Research Institute and William T. Grant Foundation have specific RFAs requesting proposals that apply DIS methods [6, 7]. Clinical and Translational Science Award (CTSA) programs funded by the NIH’s National Center for Advancing Translational Sciences are now required to develop specialized programs to promote DIS as a prerequisite for funding (PAR-18-940). CTSAs require multidisciplinary capacity building services to guide researchers to effectively implement and disseminate evidence-based solutions, and for hubs to work together to tailor these solutions for different environments [8].

Capacity building is “a general term for a process of individual and institutional development which leads to higher levels of skills and greater ability to perform useful research” [2, 9]. DIS capacity building requires a multi-pronged approach in which activities such as training, mentoring, technical assistance, pilot funding, learning collaboratives, and the provision of tools and other resources work synergistically to support the development of skills across diverse DIS competencies (e.g., applying frameworks for multi-level evaluation, stakeholder analysis). Capacity building also entails developing a group or collaborative of experts who regularly interact, engage in peer mentoring, mentor emerging and junior researchers, generate ideas, and collaborate on grant applications and publications. Examples of activities include targeted consultation, technical assistance for research teams, and the provision of educational materials and operational toolkits to guide researchers in systematically applying DIS. DIS capacity building programs across diverse institutions have been serving this role around the globe but may use different terms, such as knowledge translation, knowledge exchange, and quality improvement, to encompass DIS efforts.

Interest in and demand for DIS capacity building has also gained traction outside of the USA. Funding for DIS research has expanded internationally across agencies such as the World Health Organization, the National Institute for Health and Care Research (UK), and the United States Agency for International Development [10,11,12]. These DIS programs and opportunities also support networking and gathering for individuals passionate about DIS in global settings and provide a diverse set of activities and resources to support DIS capacity building.

To meet the interest and concomitant investment in DIS, several programs focused on DIS training, mentorship, and capacity building have been developed [2, 13]. Even with a multitude of DIS offerings, the demand for DIS training and mentoring far exceeds available opportunities. For instance, the NIH-funded Training Institute for Dissemination and Implementation Research in Cancer (TIDIRC) program received 266 applicants for 35 slots in its first year and the NIH, NIDA, and VA-funded Implementation Research Institute routinely receives five applications for every one fellowship slot [14]. As such, finding ways to accelerate the pace of DIS capacity building and training has been recognized as an international priority, with experts and organizations (e.g., the NIH) concluding that we need greater access to capacity building for all levels of DIS researchers and practitioners [15].

Despite the rapid growth in DIS interest, demand, investment, and opportunity, to our knowledge there has yet to be a systematic review of DIS programs cataloging information about activities, infrastructure, and priorities [2]. Davis & D’Lima reviewed academic literature to catalogue teaching and training initiatives in DIS ranging from short training courses to intensive institutes. Our work expands on their review by systematically searching the internet for DIS capacity building programs inclusive of but not limited to training institutes. Also, because of the diversity of DIS research and practice areas (e.g., cancer, infectious disease, behavioral and school health) and the often-siloed nature of distinct academic disciplines, there is a need to examine areas of overlap and redundancy across DIS programs as well as explore opportunities for shared resources, collaboration, and growth [16]. We acknowledge and build on the work of Darnell and colleagues (2017) that reviewed and characterized existing DIS resource initiatives and identified multiple and overlapping “non-interactive” DIS resources (e.g., resource libraries, archived talks) [16]. In this systematic review, we identify and describe existing DIS infrastructure to build a comprehensive inventory of capacity building programs and activities. We highlight areas for opportunity, collaboration, and growth to boost discrete DIS capacity building efforts across programs and allow for greater synergy and collaboration.

Methods

We used a multi-method strategy to identify DIS capacity building programs (from here on referred to as DIS CBPs). The phases of our search, review, survey, data collection, and synthesis process are summarized below and in Fig. 1. The search followed the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines (see the Additional File for a PRISMA checklist). The search commenced in August 2020 and the review finished in January 2022. The study was reviewed for approval by the UC San Diego Health Sciences Institutional Review Board (T28074).

Fig. 1 Systematic review phases

Phase 1—DIS capacity building program search strategy

To identify all DIS CBPs, we used Google searches for “dissemination and implementation” and nine synonyms that are common outside of the USA (e.g., diffusion, knowledge transfer, improvement science) with and without the word “program” along with 13 synonyms (e.g., institute, center, collaborative) [17]. See Table 1 for a full list of search terms with the corresponding results. The first 50 search results were extracted from each unique search. Due to known shortcomings that result in an inability to reproduce Google search results for systematic reviews, we used Apify, an online web-scraping tool, that allowed us to automatically extract the search result title, website, and description thus limiting the number of searches run [18, 19]. We also crosschecked the DIS Program Search Strategy by confirming inclusion of a starter list of DIS CBPs that was created by the coauthors of this paper based on their expert knowledge.
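The combinatorial construction of the queries can be sketched as follows. The term lists below are illustrative subsets containing only the synonyms named in the text; the full lists (10 DIS terms and 14 program terms, yielding 140 unique searches) appear in Table 1:

```python
from itertools import product

# Illustrative subsets; Table 1 holds the full lists of 10 DIS terms
# ("dissemination and implementation" plus nine synonyms) and 14 program
# terms ("program" plus 13 synonyms).
dis_terms = ["dissemination and implementation", "diffusion",
             "knowledge transfer", "improvement science"]
program_terms = ["program", "institute", "center", "collaborative"]

# One Google query per (DIS term, program term) pair.
queries = [f'"{d}" {p}' for d, p in product(dis_terms, program_terms)]
print(len(queries))  # 16 with these subsets; the full lists yield 10 x 14 = 140
```

With the first 50 results scraped per query via Apify, the full 140 searches produce the 7000 records screened in Phase 2.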

Table 1 Dissemination and implementation science capacity building program search terms

Phase 2—DIS program website screening using inclusion criteria

A three-tiered screening process was used to identify the initial list of DIS CBPs. First, trained research assistants screened resultant webpages from Google (n=7000) to identify DIS programs meeting eligibility criteria. CBPs were defined as an entity (e.g., organization, program or center) with at least one capacity building activity (e.g., consultation, technical assistance, networking events, journal club meetings) with an explicit focus or goal of building practical knowledge and skills to conduct DIS for public or population health work. CBPs affiliated with academic organizations including Clinical and Translational Science Award Program, governmental and funding entities, the United States Veterans Health Administration, or not for profits or collaboratives focusing on education, business, or technology with a stated emphasis on public or population health were also included.

Programs were included if (a) the focus was on DIS or one of the search terms listed in Table 1 (e.g., knowledge translation); (b) the organization offered at least one capacity building activity other than coursework, training, or a static educational material (e.g., an implementation science resource guide) as our intention was to identify CBPs that provide more comprehensive, multi-component capacity building activities rather than focusing solely on training or education; and (c) the program’s focus was in service of public or population health. All programs that were provisionally included were discussed to verify they were classified appropriately according to these criteria.

Second, as part of quality assessment, the study coordinator independently reviewed all programs in depth to confirm eligibility and screened for duplicate programs. The review team (which included a study coordinator, research assistants, a doctoral student, and faculty researchers) met biweekly to refine inclusion and exclusion criteria and come to consensus about specific programs. Lastly, two faculty researchers each independently reviewed 50% of the included programs to confirm program inclusion, then checked each other's decisions on their respective halves, with a select number tagged for full team discussion.

Phase 3—Targeted searches for additional DIS programs

To validate the initial list of programs and ensure an exhaustive list was compiled, we consulted 6 experts in the DIS field to identify missing programs. The DIS experts consulted were nationally and internationally recognized DIS researchers who were connected to our research team or identified via snowball sampling. Experts represented multiple perspectives including academic researchers with international reputation in DIS, government funders, and DIS experts with experience leading DIS training and capacity building. In addition to DIS expert nomination of missing programs, searches were conducted via NIH Reporter. This consisted of targeted searches on 2020 funding awards for DIS programs funded by NIH with known DIS components including CTSA and multi-project research applications (e.g., U01, P50s). Results were screened for duplicates and eligibility.

Phase 4—Website abstraction for all programs and data cleaning

The full website for each CBP was reviewed for capacity building information and activities. Information was abstracted by research assistants using an iteratively developed abstraction form. Abstracted information was reviewed by BR and NAS after decisions about program inclusion were made following the reliability assessments described in the Quality assurance section below. Fields for abstraction included locational information, primary website, DIS concentration, and DIS capacity building activities (see Table 2). All identified programs were categorized with uniform descriptors to characterize their DIS focus and capacity building activities. Table 3 describes the agreed-upon definitions for the capacity building activities. In addition, we contacted 5 academic institutions, each with multiple DIS programs, to clarify whether to list their programs separately or as a single institutional center. All indicated that their programs should be listed individually.

Table 2 Results from systematic review of DIS programs
Table 3 Dissemination and implementation science capacity building activities

Phase 5—DIS capacity building survey distribution and findings

To gather additional information about each CBP, we emailed the primary contact identified for each program a Capacity Building Survey requesting detailed information about various aspects of the program’s capacity building efforts, with up to three reminder emails. The survey was structured following the domains of the Washington University Network of Dissemination and Implementation Research model [1]. Model domains include inputs (e.g., funding model, human resources), activities (e.g., training, mentorship), outputs (e.g., grant outcomes, academic outcomes), and long-term public health outcomes guided by the Translational Science Benefits Model [20]. The survey also included the following elements: the year the CBP was established; the CBP’s primary contact; member characteristics; community partners; financial resources; types of activities; measurement and evaluation; and D&I competencies or frameworks for program evaluation (see Additional file 2 for the full survey). Outputs also included whether CBPs specialized in a DIS product or resource, with respondents able to elaborate in a text response. Results of the survey were analyzed by summarizing responses, displaying frequencies, and synthesizing themes. The number of respondents varied across survey items, so denominators reflect those who responded to the respective item.
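Because the number of respondents varied by item, each percentage must be computed over respondents to that item rather than over all completed surveys. A minimal sketch with hypothetical response data (the actual survey fields differ):

```python
# Hypothetical item-level responses; None marks a skipped item.
responses = [
    {"training": True,  "mentorship": True},
    {"training": True,  "mentorship": None},   # skipped the mentorship item
    {"training": False, "mentorship": True},
]

def item_summary(responses, item):
    """Count of 'yes' answers, item-specific denominator, and proportion."""
    answered = [r[item] for r in responses if r[item] is not None]
    yes = sum(answered)
    return yes, len(answered), yes / len(answered)

# "training" has denominator 3; "mentorship" has denominator 2.
print(item_summary(responses, "training"))
print(item_summary(responses, "mentorship"))
```

This is why, for example, percentages in the Training breakdown use the number of CBPs answering that item rather than the full 87.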

Quality assurance assessment

We developed a rigorous process of multiple checks in which included and excluded programs were reviewed several times by independent team members. First, trained research assistants established inter-rater agreement on a subsample of 100 search results, with at least 90% agreement serving as the threshold for concordance, before initiating independent review. Second, the study coordinator reviewed all included sites for accuracy of inclusion and exclusion and screened for duplicates. Next, two senior team members conducted a tertiary review in which each independently reviewed 50% of programs and then crosschecked the other's decisions, with a select number flagged for discussion. The refined list was circulated to a purposive sample of DIS experts for review and identification of missing sites. Lastly, programs were checked for accuracy and validity through a website check and abstraction of key data from websites.
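The concordance check can be sketched as a simple percent-agreement computation; the ratings below are hypothetical, standing in for the 100-record subsample:

```python
def percent_agreement(reviewer_a, reviewer_b):
    """Proportion of records on which two reviewers made the same call."""
    if len(reviewer_a) != len(reviewer_b):
        raise ValueError("Reviewers must rate the same set of records")
    matches = sum(a == b for a, b in zip(reviewer_a, reviewer_b))
    return matches / len(reviewer_a)

# Hypothetical include/exclude calls on a 10-record subsample (True = include).
a = [True, True, False, False, True, False, True, True, False, True]
b = [True, True, False, True,  True, False, True, True, False, True]

agreement = percent_agreement(a, b)   # reviewers disagree on one record
meets_threshold = agreement >= 0.90   # if True, independent review may begin
```

Percent agreement is the simplest concordance statistic; chance-corrected measures such as Cohen's kappa are a common alternative when include/exclude base rates are skewed.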

Results

DIS capacity building program search results (Phases 1–3)

The first 50 search results were extracted from each unique search, resulting in 140 searches and 7000 search results. After removing duplicates (n=174) and results not meeting the eligibility criteria (n=6130), 696 CBPs were retained (see Fig. 2). The secondary review for inclusion and duplicates narrowed the sample to 186 CBPs. The tertiary review by faculty researchers excluded an additional 69 CBPs. The refined list of 117 programs was circulated to eight DIS experts, who nominated 36 additional programs. Targeted searches of funding mechanisms yielded 162 additional programs. In total, through expert nomination and funding searches, an additional 203 programs were identified. Of these 203 CBPs, 151 did not meet criteria and 4 were duplicate sites. This phase resulted in 48 additional DIS CBPs, for a total of 165 CBPs.
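The flow of counts reported above (and in Fig. 2) can be checked arithmetically; this sketch encodes only the numbers stated in the text:

```python
# Phase 1: 140 unique searches, first 50 results extracted from each.
initial_results = 140 * 50                       # 7000 search results
# Phase 2: remove duplicates (174) and ineligible results (6130).
after_screening = initial_results - 174 - 6130   # 696 CBPs retained
after_secondary = 186                            # secondary review
after_tertiary = after_secondary - 69            # tertiary review -> 117 CBPs
# Phase 3: expert nominations and funding searches combined (as reported).
supplemental = 203
supplemental_included = supplemental - 151 - 4   # 48 additional CBPs
total = after_tertiary + supplemental_included   # 165 CBPs in the inventory
```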

Fig. 2 PRISMA diagram for DIS program systematic review

DIS program characteristics (Phase 4)

One hundred sixty-five DIS CBPs are included in the final list displayed in Table 2. One hundred twelve (68%) are in the USA and 53 (32%) are internationally based. Program characteristics were abstracted from program websites and from responses of those CBPs who completed the follow-up survey (Table 2). One hundred thirty-one CBPs had a concentration in Dissemination and Implementation Science (DIS) (79%), 46 had a concentration in Quality Improvement (28%), 27 in Knowledge Translation (16%), 5 in Policy (3%), and 4 in Community Engagement (2%). Table 2 is organized alphabetically by host institution, with CBPs that responded to the survey highlighted at the top. Activities identified from website review included the following: Conferences/Workshops, Consultation, Data Analysis, Database, Fellowships, Framework/Tool Development, Funding, Guideline Development, Internships, Mentorship, Research, Seminars/Webinars, Training/Courses, Training Materials, Video Channel, and Work Placement.

Capacity building survey results (Phase 5)

Ninety-two (56%) of the 165 CBPs invited responded to the survey invitation and 5 of 92 indicated that they were no longer active. Thus, 87 (53%) CBPs completed the survey. Of these, 62 (80%) were based in the USA and 56 (71%) were affiliated with an academic institution (see Table 4). Survey results indicated that 37 (43%) CBPs do not serve a specific population. Of those that specify a population of focus, 31 (36%) reported serving Adults, 29 (33%) reported Clinical, 27 (31%) Urban, 27 (31%) Women, 27 (31%) Older adults, and 23 (26%) General community. See Table 4 for a full summary of the Capacity Building Survey responses, and Additional file 1 displays qualitative responses to the question about program-specific DIS products or resources.

Table 4 DIS capacity building program (CBP) survey results

DIS program inputs (Phase 5)

A plurality of CBPs (34, 39%) reported having 0–5 faculty with formal positions; similarly, 42 (48%) reported having 0–5 staff with formal positions. A few CBPs reported high numbers of faculty or staff: 5 (6%) had 21 or more faculty positions, 3 (3%) had 50 or more, and 1 (1%) had more than 50 staff positions. Regarding membership, 21 (34%) reported that their program had more than 200 members, whereas 22 (36%) reported 0–50 members. In terms of CBP funding, 33 (45%) described their funding sources as long term/ongoing, while 21 CBPs (29%) had both short-term (project-specific or start-up) and long-term funds. Specific sources of funding were research/program grants (45, 52%), CTSA funding (25, 29%), and internal institutional funds (35, 40%).

DIS program activities (Phase 5)

Sixty-nine (79%) provided Training and Education, 58 (67%) provided Mentorship, 57 (66%) offered Resources and Tools, and 58 (67%) provided Consultation. CBPs provided many types of DIS Training, including Webinars/Seminars (63, 91%), Coursework (43, 62%), and Invited Guest Speakers (41, 59%). Twenty-eight (49%) CBPs do not charge fees for DIS Consultations, and 23 (40%) reported that consultation fees varied depending on the situation. Virtual/in-person Networking Events were the most common (35, 65%) type of DIS Professional Networking opportunity offered. Of the CBPs that offered DIS grant development support, 28 (62%) held virtual/in-person training. Most DIS Technical Assistance was offered virtually/in person (35, 76%) or via recorded video tutorials/training (23, 50%). Educational materials were the most common type of DIS Resource and Tool, offered by 49 (86%) CBPs.

DIS program outputs and outcomes (Phase 5)

Outputs included whether CBPs specialized in a DIS product or resource. Of the 47 (63%) CBPs that reported developing a DIS product, responses ranged across courses (10, 21%) like the Healthcare Delivery Science Course, DIS models (14, 30%) like the Iowa Model for Evidence-Based Practice, and DIS frameworks (10, 21%) such as StrategEase. In terms of short-term outcomes and evaluation, 44 (60%) CBPs reported using DIS competencies to guide activities. Forty-eight CBPs (55%) evaluated their impact using Productivity measures, while Member Engagement (37%), Member Satisfaction (36%), and Training Effectiveness (35%) were also popular evaluation metrics. Thirty-five CBPs (49%) reported measuring productivity once a year and 16 (22%) every 6 months.

Long-term outcomes include using Translational Science Benefits (TSB) Indicators or categories for evaluation [20, 21]. Twenty-seven percent of respondents (n=20) reported that they use the TSB indicators for evaluation. Of these, 13 (65%) reported using Community & Public Health (e.g., health care delivery, accessibility, life expectancy), 11 (55%) Economic (e.g., cost effectiveness, cost savings, societal cost of illness), 11 (55%) Policy and Legislative (e.g., committee participation, policies, expert testimony), and 9 (45%) Clinical and Medical indicators (e.g., drugs or diagnostic guidelines).

Discussion

This systematic review identified 165 national and international DIS CBPs, most having more than two relevant DIS capacity building activities. As a result of this work, an interactive, searchable, online resource with an inventory of the programs will be made available to the DIS community to facilitate ongoing capacity building, multidisciplinary engagement, and minimization of duplicated effort across DIS CBPs. We also describe the key features of DIS CBPs as reported through the Capacity Building Survey. CBPs in the review described diverse funding models drawing on several sources: most have funding from research and program grants, as well as internal institutional grants and CTSA funding. The majority highlighted a robust infrastructure with faculty holding formal program supervisory or operational roles. Three longstanding CBPs (the Quality Enhancement Research Initiative (QUERI) National Program, the National Cancer Institute Implementation Science Centers in Cancer Control (ISC3), and the Institute for Healthcare Improvement) reported having more than 100 faculty with roles supporting their DIS CBPs, underscoring the immense human resources needed to build, deliver, and sustain large-scale DIS CBPs compared with most respondents, who reported fewer than 5 faculty. These CBPs are certainly outliers but nonetheless highlight that infrastructural funding is indispensable to the operations and delivery of CBPs.

There is overlap across capacity building activities and initiatives. Nearly all offer educational webinars or seminars in DIS consistent with exponential increases globally in virtual educational offerings, particularly since the COVID-19 pandemic. One conservative estimate suggests between 2020 and 2021 the number of webinars (not DIS-specific) grew by 162% and attendees increased 251% compared with before the pandemic [22]. Given that online seminars are often open to international and public audiences, there is a need to coordinate and streamline virtual programming so that there are fewer, more focused and targeted skill-building sessions to conserve resources. Anecdotally from our own center at UC San Diego, members highlight feeling overwhelmed and inundated with online seminars and unable to identify which might be most important, relevant, or applicable to their work.

Mentorship and consultation were frequently used capacity building strategies, underscoring the critical importance of individualized, tailored guidance when applying DIS approaches. Although most CBPs offer consultation, these approaches may be difficult to grow at scale and may be unable to accommodate growing interest among trainees, staff, faculty, and D&I practitioners. Compared with the other capacity building strategies, reported by 70–94% of CBPs, relatively fewer reported offering technical assistance (65%) as one of their CBP activities. Understanding why technical assistance is offered by fewer CBPs might be an important question for future study, considering that technical assistance often includes providing hands-on support to community partners and other stakeholders, an area critical for operationalizing DIS [23].

Few CBPs reported offering doctoral or PhD programs with a focus on DIS. As DIS evolves and expands, there is arguably a greater need for formalized coursework and programmatic offerings within higher-education degree programs, as compared with traditional single seminars, workshops, and virtual offerings. DIS users may also want more formal training activities (e.g., those that offer a certification of completion) so they can be considered credentialed implementation scientists and note this in their professional curriculum vitae.

Furthermore, few programs endorsed offering DIS training within clinical professional degree programs, illustrating a gap in bringing DIS approaches into clinical practice. Clinical researchers and practitioners tend to have interest in DIS and use DIS approaches in their research, and would likely benefit from the integration of DIS coursework into their degree programs [24, 25]. There are numerous resources available online for training and education in DIS, but many of these opportunities may be best suited for early career investigators or trainees who are still forming their careers and research interests. There is also a need to support mid to later stage investigators who may not have time to independently study DIS and integrate it into their research, and who may require more individualized and tailored guidance. These findings align with the Davis & D’Lima [2] review of training institutes: there appear to be fewer opportunities for later-stage investigators, practitioners, policy makers, and community partners despite a growing number of degree or certificate programs [26], such as the University of Washington’s PhD in global health metrics and implementation science [27] or the University of California San Francisco Implementation Science Certificate program [28].

This review also revealed a dearth of programs across LMICs. Only one CBP, the Nigerian Implementation Science Alliance, was identified in sub-Saharan Africa, and we did not locate programs in South America, Central America, or Eastern Europe. It is also noteworthy that DIS CBPs require immense resources, time, and personnel to plan, market, and deliver training, networking, consultation, and other DIS opportunities for researchers and practitioners; institutions based in LMICs may have fewer disposable resources to devote to DIS capacity building. Moreover, with the range and diversity of DIS offerings, it is increasingly difficult to know whether CBPs are effective in producing measurable impact and ultimately improving population health. Shared outcomes and success metrics across DIS CBPs would not only advance programs’ ability to advocate for funding but would also facilitate broad evidence synthesis to identify programs with a high degree of output and impact. Davis & D’Lima likewise found that standardized metrics are needed for reporting key elements of DIS training content and structure as well as for evaluating CBPs [2].

Strengths of this systematic review include the multi-phase, multi-method, rigorous approach to the identification, screening, and abstraction of CBPs, triangulating web-based sources with external expert reviewers. No single method would have captured all CBPs, as some programs have dated or incomplete websites. Additionally, we deliberately included terms to capture international DIS programs (e.g., knowledge translation, knowledge exchange). Our processes to check the validity and active operation of programs through thorough website review and individual emails to program contacts strengthened our results.

Although we received responses from just over half of identified programs, the response rate was still higher than the average response rate for online surveys (i.e., 44%) [29]. We were also able to cross-check characteristics abstracted from the websites against program self-report data from the survey. Limitations include the possible exclusion of search terms that may be more common in non-English-speaking countries or countries outside North America and Europe. Given our supplementary search of NIH funding mechanisms, it is also possible that we more easily identified US-based programs. However, we made substantial efforts to include non-US resources, including outreach to DIS experts outside North America and, when possible, translation of websites found through the Google searches. We also focused almost exclusively on DIS programs; therefore, this list is not an exhaustive reflection of improvement science or quality improvement programs. Moreover, we conducted the searches and reviews in English and are thereby limited to programs with readily translatable webpages. Programs recommended by our expert review panel also had a web presence; programs within larger organizations such as hospitals and research institutes may have DIS capacity building activities not publicly described on websites, and we may not have been able to identify programs without a web presence.

Conclusions

We identified 165 CBPs from 13 countries. This systematic review is the first to comprehensively catalog CBPs around the world and describe their key features and offerings. Our team’s primary next step is to build a searchable online platform featuring each CBP’s descriptive profile to allow DIS researchers and practitioners to collaborate and continually add and update program information. Despite the quantity of DIS programs, several opportunities remain to further enhance and streamline DIS capacity building efforts. Key priorities include sustainment strategies such as advocating for extramural and internal funding to support program infrastructure and capacity building operations rather than relying on traditional research study grants to indirectly support operational activities. There is also a need for formal certification, low-cost, accessible options for learners in LMICs, opportunities for practitioners/non-researchers (e.g., DIS and medical degrees), and opportunities for mid/later stage researchers.

Availability of data and materials

The datasets used during the current study are available from the corresponding author upon request. There are also plans to make the data freely available online in a searchable database.

Abbreviations

CBP: Capacity building program

DIS: Dissemination and implementation science

NIH: National Institutes of Health

LMIC: Low- and middle-income country

TIDIRC: Training Institute for Dissemination and Implementation Research in Cancer

References

  1. Brownson RC, Proctor EK, Luke DA, Baumann AA, Staub M, Brown MT, et al. Building capacity for dissemination and implementation research: one university’s experience. Implement Sci. 2017;12(1):104.


  2. Davis R, D’Lima D. Building capacity in dissemination and implementation science: a systematic review of the academic literature on teaching and training initiatives. Implement Sci. 2020;15(1):97.


  3. Shelton RC, Lee M, Brotzman LE, Wolfenden L, Nathan N, Wainberg ML. What is dissemination and implementation science?: an introduction and opportunities to advance behavioral medicine and public health globally. Int J Behav Med. 2020;27(1):3–20.


  4. Aarons GA, Sommerfeld DH, Chi BH, Ezeanolue EE, Sturke R, Guay L, et al. Concept mapping of PMTCT implementation challenges and solutions across 6 sub-Saharan African countries in the NIH-PEPFAR PMTCT Implementation Science Alliance. JAIDS J Acquir Immune Defic Syndr. 2016;72(2):S202–6.


  5. Aarons GA, Reeder K, Sam-Agudu NA, Vorkoper S, Sturke R. Implementation determinants and mechanisms for the prevention and treatment of adolescent HIV in sub-Saharan Africa: concept mapping of the NIH Fogarty International Center Adolescent HIV Implementation Science Alliance (AHISA) initiative. Implement Sci Commun. 2021;2(1):53.


  6. PCORI. Who & What We Fund. [cited 2022 Jun 21]. Available from: https://www.pcori.org/funding-opportunities/what-who-we-fund

  7. William T. Grant Foundation. Research grants on improving the use of research evidence. [cited 2022 Jun 21]. Available from: https://wtgrantfoundation.org/grants/research-grants-improving-use-research-evidence

  8. Begg MD, Crumley G, Fair AM, Martina CA, McCormack WT, Merchant C, et al. Approaches to preparing young scholars for careers in interdisciplinary team science. J Invest Med. 2014;62(1):14–25.


  9. Rabin BA, Viglione C, Brownson RC. A glossary for dissemination and implementation research. In: Dissemination and implementation research in health: translating science to practice. 3rd ed. New York: Oxford University Press; 2023.

  10. World Health Organization. Research for Implementation. [cited 2022 Jun 21]. Available from: https://tdr.who.int/our-work/research-for-implementation

  11. United States Agency for International Development. USAID’s implementation science investment [Internet]. [cited 2022 Jun 21]. Available from: https://www.usaid.gov/news-information/fact-sheets/usaids-implementation-science-investment#:~:text=Definition%20of%20Implementation%20Science&text=Implementation%20science%20takes%20innovative%20approaches,practices%20for%20real%20world%20application.

  12. National Institute for Health and Care Research. Funding opportunities. [cited 2022 Jun 21]. Available from: https://www.nihr.ac.uk/researchers/funding-opportunities/

  13. Proctor EK, Chambers DA. Training in dissemination and implementation research: a field-wide perspective. Transl Behav Med. 2017;7(3):624–35.


  14. Meissner HI, Glasgow RE, Vinson CA, Chambers D, Brownson RC, Green LW, et al. The U.S. training institute for dissemination and implementation research in health. Implement Sci. 2013;8(1):12.


  15. Osanjo GO, Oyugi JO, Kibwage IO, Mwanda WO, Ngugi EN, Otieno FC, et al. Building capacity in implementation science research training at the University of Nairobi. Implement Sci. 2015;11(1):30.


  16. Darnell D, Dorsey CN, Melvin A, Chi J, Lyon AR, Lewis CC. A content analysis of dissemination and implementation science resource initiatives: what types of resources do they offer to advance the field? Implement Sci. 2017;12(1):137.


  17. Colquhoun H, Leeman J, Michie S, Lokker C, Bragge P, Hempel S, et al. Towards a common terminology: a simplified framework of interventions to promote and integrate evidence into health practices, systems, and policies. Implement Sci. 2014;9(1):781.


  18. Apify.com. (Web Scraping Tool). Accessed Aug 2020.

  19. Piasecki J, Waligora M, Dranseika V. Google Search as an additional source in systematic reviews. Sci Eng Ethics. 2017. [cited 2022 Jun 21]. Available from: http://link.springer.com/10.1007/s11948-017-0010-4

  20. Luke DA, Sarli CC, Suiter AM, Carothers BJ, Combs TB, Allen JL, et al. The translational science benefits model: a new framework for assessing the health and societal benefits of clinical and translational sciences: translational science benefits model. Clin Transl Sci. 2018;11(1):77–84.


  21. WUSTL. Translational Science Benefits Model. [cited 2022 Jun 21]. Available from: https://translationalsciencebenefits.wustl.edu/about-the-model-2/

  22. BusinessWire. Global report shows use of webinars triples, driving digital-first engagement across industries. 2021; Available from: https://www.businesswire.com/news/home/20210602005035/en/Global-Report-Shows-Use-of-Webinars-Triples-Driving-Digital-First-Engagement-Across-Industries#:~:text=The%20number%20of%20webinars%20grew,number%20of%20webinars%20in%202021.


  23. Mangosing D, Kumalo-Sakutukwa G, Bourdeau B, Rebchook G, Lightfoot M, Myers JJ. Supporting community partners in reducing HIV-related health disparities: technical assistance across a spectrum of intensity. Inq J Health Care Organ Provis Financ. 2022;59:004695802210814.


  24. Rudd BN, Davis M, Beidas RS. Integrating implementation science in clinical research to maximize public health impact: a call for the reporting and alignment of implementation strategy use with implementation outcomes in clinical research. Implement Sci. 2020;15(1):103.


  25. Stevens ER, Shelley D, Boden-Albala B. Unrecognized implementation science engagement among health researchers in the USA: a national survey. Implement Sci Commun. 2020;1:39.


  26. Huebschmann AG, Johnston S, Davis R, Kwan B, Geng E, Haire-Joshu D, et al. Promoting transparency, rigor, and sustainment in implementation science capacity building programs: a multi-method scoping review. Implement Res Pract. 2022;3. In press.

  27. Department of Global Health, University of Washington. PhD in Global Health Metrics and Implementation Science. [cited 2022 Jun 21]. Available from: https://globalhealth.washington.edu/academic-programs/phd-global-health-metrics-and-implementation-science

  28. Department of Epidemiology & Biostatistics. FAQs for Implementation Science Certificate. [cited 2022 Jun 21]. Available from: https://epibiostat.ucsf.edu/faqs-implementation-science-certificate

  29. Wu MJ, Zhao K, Fils-Aime F. Response rates of online surveys in published research: a meta-analysis. Comput Hum Behav Rep. 2022;7:100206.



Acknowledgements

The authors wish to thank the following DISC Research Interns who supported website screening and review: Nicholas Lee, Jillian Abasta, Kasey Yu, and Melanie Aguilar. The authors also thank the expert reviewers who supported the identification of missing programs.

Funding

This project received financial support from the University of California San Diego Altman Clinical and Translational Research Institute Dissemination and Implementation Science Center (DISC) and infrastructural support from the Altman Clinical Translational Research Institute (UL1 TR001442).

Author information

Contributions

CV, BR, NAS, and JAC conceived of the study design. CV, OF, JAC, and BB participated in the systematic review. CV, NAS, BR, and JAC developed the program survey. BR and NAS developed the review protocol and structured the manuscript. JAC extracted the initial set of results from Google. BR and NAS reviewed the final results. CV, OF, and BB participated in summary and analysis. CV drafted the manuscript and BB, JAC, OF, NAS, GAA, LBF, and BR wrote and revised manuscript sections. All authors reviewed and revised the manuscript and approved the final version.

Corresponding author

Correspondence to Clare Viglione.

Ethics declarations

Ethics approval and consent to participate

The study was reviewed for approval by the UC San Diego Health Sciences Institutional Review Board (T28074).

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1. Survey Responses About DIS Product/Resource Expertise.

Additional file 2. D&I Science Capacity Building Survey.

Additional file 3. PRISMA 2020 Checklist.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article


Cite this article

Viglione, C., Stadnick, N.A., Birenbaum, B. et al. A systematic review of dissemination and implementation science capacity building programs around the globe. Implement Sci Commun 4, 34 (2023). https://doi.org/10.1186/s43058-023-00405-7
