- Short report
- Open Access
Unrecognized implementation science engagement among health researchers in the USA: a national survey
Implementation Science Communications volume 1, Article number: 39 (2020)
- The Correction to this article has been published in Implementation Science Communications 2020 1:65
Implementation science (IS) has the potential to serve an important role in encouraging the successful uptake of evidence-based interventions. The current state of IS awareness and engagement among health researchers, however, is relatively unknown.
To determine IS awareness and engagement among health researchers, we performed an online survey of health researchers in the USA in 2018. Basic science researchers were excluded from the sample. Engagement in and awareness of IS were measured with multiple questionnaire items that both directly and indirectly ask about IS methods used. Unrecognized IS engagement was defined as participating in research using IS elements and not indicating IS as a research method used. We performed simple logistic regressions and tested multivariable logistic regression models of researcher characteristics as predictors of IS engagement.
Of the 1767 health researchers who completed the survey, 68% stated they would be able to describe IS, yet only 12.7% self-identified as using IS methods. Of the researchers not self-identifying as using IS methods, 86.4% reported using IS elements “at least some of the time”: 47.9% reported using process/implementation evaluation, 89.2% reported using IS measures, 27.3% reported using IS frameworks, and 75.6% reported investigating or examining ways to integrate interventions into routine health settings. IS awareness significantly reduced the likelihood of all measures of unrecognized IS engagement (aOR 0.13, 95% CI 0.07 to 0.27, p < 0.001).
Overall, awareness of IS is high among health researchers, yet there is also a high prevalence of unrecognized IS engagement. Efforts are needed to further disseminate what constitutes IS research and increase IS awareness among health researchers.
Over the past 15 years, the field of implementation science (IS) has made great strides to raise awareness of IS and to establish methods and frameworks that provide for rigorous and meaningful implementation research [1, 2]. IS has been defined as “the scientific study of methods to promote the systematic uptake of research findings and other evidence-based practices into routine practice and, hence, to improve the quality and effectiveness of health services”. Its appropriate and rigorous use can promote the dissemination and increase the effectiveness of interventions in real-world settings [4, 5].
As efforts to advance IS have emerged, resources have been developed to increase awareness of the importance of the field and to build an understanding of the qualities that make an implementation study rigorous and “good” [1, 2, 6]. The scope of IS is broad and can be challenging for uninitiated investigators to define, as it encompasses a range of methods both unique to IS and derived from other disciplines. This breadth can make IS research difficult to identify and to differentiate from other types of research.
More researchers are beginning to incorporate implementation concepts into their research. Indeed, many non-IS research funding opportunities now expect funding proposals to incorporate components of implementation. However, using implementation elements without an awareness that they are part of an established set of methods may jeopardize the rigor of the implementation research performed. Further, while not all researchers are expected to become IS experts, a lack of awareness of IS methods may impede collaboration between researchers during implementation-focused research. As a result, ensuring that researchers who engage in implementation research are aware of the IS methods included in their research is pivotal to impactful implementation research.
As the field of IS advances, the engagement and collaboration of health researchers across disciplines will serve an important role in the successful implementation of evidence-based interventions. Current levels of IS engagement and the use of implementation methods among health researchers are not clear. To address this knowledge gap, we performed a survey of health researchers to measure awareness of and engagement in IS research.
The survey was distributed from January to March 2018 by e-mail to health researchers who had received federal funding (including all NIH institutes, as well as the Centers for Disease Control and Prevention [CDC], Agency for Healthcare Research and Quality [AHRQ], Health Resources and Services Administration [HRSA], Administration for Children and Families [ACF], and U.S. Department of Veterans Affairs [VA], but not the Patient-Centered Outcomes Research Institute [PCORI]) in the past 5 years. Basic science researchers and non-research grant recipients (e.g., recipients of small business grants or conference grants) were excluded from the sample. The sampling frame consisted of participant e-mails obtained from NIH RePORTER. A simple random sample of researchers received the survey. The New York University institutional review board approved the study.
Data were collected via an online questionnaire examining participant demographics, current research practices, and perceptions of IS. Survey item development was guided by expert opinion and behavioral models [10,11,12]. The survey questions were pilot tested with a sample of health researchers from a variety of fields. The survey collected quantitative data including responses on a Likert scale and categorical responses. The survey was distributed by email via Qualtrics, and all responses were anonymous. The relevant survey measures can be found in the appendix.
Defining engagement in implementation science
Engagement in IS was measured with multiple questionnaire items that directly and indirectly asked about the use of IS methods. Measures of IS engagement included using an IS framework; performing a process/implementation evaluation; conducting research integrating an intervention into routine settings; and incorporating measures of acceptability, adoption, appropriateness, feasibility, fidelity, implementation cost, penetration, and sustainability into an existing research design. A researcher was considered to have performed IS elements if they indicated that they performed one or more IS elements at least “sometimes,” reported performing or collaborating on an implementation study in the past five years, and/or self-identified as using “implementation science” methods. This approach to defining engagement was chosen because researchers may not be familiar with IS terminology and methods even if they are using elements of IS in their research. A cut-off of “sometimes” was selected to capture researchers using IS elements even if IS does not represent the majority of their research. Participants were asked to report the methods they use in their research from a list of research methods; researchers could select multiple methods. Unrecognized IS engagement was thus defined as participating in research using an IS element while not indicating IS as a research method used. Similarly, IS awareness was assessed by asking participants whether they would be able to describe IS to a colleague. All measures were self-reported. See supplementary materials for the text of survey questions pertaining to IS engagement.
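The classification rule described above can be sketched as follows. This is an illustrative reconstruction, not the authors’ actual scoring code; the variable names (`uses_element_sometimes`, `did_implementation_study`, `self_identifies_is`) are hypothetical stand-ins for the survey items defined in the supplementary materials.

```python
# Illustrative sketch of the engagement classification described above.
# Variable names are hypothetical; the survey instrument defines the items.

def engaged_in_is_elements(uses_element_sometimes: bool,
                           did_implementation_study: bool,
                           self_identifies_is: bool) -> bool:
    """A respondent counts as performing IS elements if ANY criterion holds."""
    return (uses_element_sometimes
            or did_implementation_study
            or self_identifies_is)

def unrecognized_is_engagement(uses_element_sometimes: bool,
                               did_implementation_study: bool,
                               self_identifies_is: bool) -> bool:
    """Unrecognized engagement: performs IS elements but does not
    name IS as a research method used."""
    return (engaged_in_is_elements(uses_element_sometimes,
                                   did_implementation_study,
                                   self_identifies_is)
            and not self_identifies_is)

# A researcher who sometimes uses IS elements but does not self-identify
# as using IS methods is classified as unrecognized engagement.
print(unrecognized_is_engagement(True, False, False))   # True
print(unrecognized_is_engagement(True, False, True))    # False
```

The key design point in the definition is the disjunction: any one of the three criteria is sufficient to count as performing IS elements, which is why the reported engagement prevalence is much higher than the self-identification rate.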
Surveys that were less than 85% complete were excluded from analyses. All surveys were examined for inconsistencies, and invalid responses were treated as missing values, resulting in slightly different denominators across analyses. We performed descriptive data analysis and multivariable logistic regressions, controlling for participant demographics, to compare the characteristics of health researchers who use IS to those who do not report its use and to assess which researcher characteristics are associated with unrecognized IS engagement. Results are reported as adjusted odds ratios (aOR) with 95% confidence intervals (95% CI). All analyses were performed in Stata (version 14, College Station, TX).
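The study’s aORs come from multivariable logistic regression run in Stata, which is not reproduced here. As a simpler illustration of the underlying quantity, an unadjusted odds ratio and its Wald 95% confidence interval can be computed from a 2×2 table; the counts below are hypothetical, not the study’s data.

```python
import math

def odds_ratio_ci(a: int, b: int, c: int, d: int, z: float = 1.96):
    """Unadjusted odds ratio and Wald 95% CI for a 2x2 table:
         exposed:   a events, b non-events
         unexposed: c events, d non-events
    The CI is computed on the log-odds scale and exponentiated."""
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of ln(OR)
    lo = math.exp(math.log(or_) - z * se_log_or)
    hi = math.exp(math.log(or_) + z * se_log_or)
    return or_, lo, hi

# Hypothetical example: among 100 IS-aware and 100 unaware researchers,
# 10 vs. 50 show unrecognized engagement.
or_, lo, hi = odds_ratio_ci(10, 90, 50, 50)
print(f"OR {or_:.2f}, 95% CI {lo:.2f} to {hi:.2f}")
# prints "OR 0.11, 95% CI 0.05 to 0.24"
```

An adjusted OR differs from this unadjusted sketch in that it is the exponentiated regression coefficient from a model that also includes the demographic covariates.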
The survey was distributed to 7259 health researchers. Nearly 30% (2051) of those sampled started the survey, and 1767 completed at least 85% of it, for an overall response rate of 24.3%. Non-completers differed significantly from survey completers: they were more likely to have only a master’s degree (6.1% vs. 3.5%), less likely to self-identify as using IS methods (3.8% vs. 12.7%), and less likely to report RCTs (22.3% vs. 43.0%), cohort studies (20.3% vs. 29.4%), or epidemiology (12.6% vs. 23.8%) as a method used.
The characteristics of respondents who completed the survey are presented in Table 1. Respondent demographics were generally representative of the NIH-funded population. Participants were geographically diverse within the USA. Institution types included academic (87.4%), public (19.1%), non-profit (14.0%), and private (3.4%). As their highest degree received, 69.4% had a PhD alone, 12.3% had an MD and master’s degree, 9.9% had an MD alone, and 3.6% and 3.5% had an MD-PhD or a master’s degree alone, respectively. The most commonly reported research methods were RCTs (43.0%), cohort studies (29.4%), epidemiology (23.8%), qualitative research (17.9%), and statistics (17.5%). IS method use was reported by 12.7% of participants.
Implementation science awareness and engagement
Although only 12.7% of the population self-identified as using IS methods, 93.8% reported at least sometimes using elements of IS (Table 2). Of the researchers not identifying as using IS methods, 86.4% at least sometimes use elements of IS in their research. Nearly half (47.9%) of researchers reported using process/implementation evaluations, 89.2% reported using IS measures, 27.3% reported using IS frameworks, and 75.6% reported developing or testing ways to integrate interventions into routine health settings. More than two-fifths (43.7%) of respondents reported they performed or collaborated on a study examining the translation of an intervention into routine settings in the past five years. Nearly two-thirds (63.9%) of researchers not self-identifying as using IS methods stated they would be able to describe IS to a colleague.
Characteristics associated with unrecognized IS engagement
Researcher characteristics associated with unrecognized IS engagement are presented in Table 3. IS awareness significantly reduced the likelihood of all measures of unrecognized IS engagement (aOR 0.13, 95% CI 0.07 to 0.27, p < 0.001). IS awareness decreased unrecognized process/implementation evaluations (aOR 0.14, 95% CI 0.06 to 0.33, p < 0.001), use of IS measures (aOR 0.14, 95% CI 0.07 to 0.29, p < 0.001), use of IS frameworks (aOR 0.23, 95% CI 0.08 to 0.68, p < 0.001), and research integrating an intervention into routine settings (aOR 0.16, 95% CI 0.08 to 0.33, p < 0.001).
Compared to academic institutions, research at a public institution was consistently associated with a decreased likelihood of unrecognized IS engagement (aOR 0.54, 95% CI 0.36 to 0.81, p < 0.01), including process/implementation evaluations (aOR 0.62, 95% CI 0.40 to 0.97, p = 0.034), use of IS measures (aOR 0.54, 95% CI 0.36 to 0.81, p < 0.01), and research integrating an intervention into routine settings (aOR 0.54, 95% CI 0.36 to 0.82, p < 0.01).
This study demonstrated that the majority of health researchers are aware of IS, with more than two-thirds of the population stating they would be able to describe IS to a colleague; however, comprehensive understanding of IS may not be universal. Despite the high level of self-reported awareness of IS, there may be a general misunderstanding of the scope of IS. An overwhelming majority of health researchers reported at least sometimes using elements of IS; however, when asked directly about the methods they use, only one-tenth of researchers self-identified as using IS. It is not expected that all researchers would or should identify as IS researchers; however, the gap between those identifying as IS researchers and those reporting IS use is larger than would be ideal. The disparity indicates there may be many researchers engaging in IS who are not aware their methods fit under the umbrella of IS research, who consider the IS methods used as belonging to another field of research, or who do not consistently use enough IS methods to consider their work IS. This use of IS elements without identifying them as methods in the field of IS may jeopardize the rigor of the implementation research performed.
As a field, IS not only seeks to bring attention to the need for real-world relevance in research, but, through its frameworks and methods, it also seeks to improve the rigor and transparency of the methods used to examine implementation [1, 14,15,16,17,18]. Many implementation studies in the published literature still have weak study designs and lack the rigor necessary to successfully answer important implementation research questions [19, 20]. The potential for the perpetuation of poor practices in implementation research is particularly important as many non-IS health researchers are now expected to incorporate components of implementation into their research. A lack of sufficient awareness of IS methods and training among health researchers could explain some of the shortcomings seen in implementation research. Increasing awareness of IS methods among non-IS researchers who engage in implementation research may lead to more impactful implementation research.
Over the past two decades, considerable progress has been made in conceptualizing what constitutes IS, and many resources to define and explain IS have been developed [2, 19]. Our study results, however, confirm previous observations that considerable confusion persists about the terminology and scope of IS [18, 21, 22]. The discordance between researchers using elements of IS and those acknowledging the use of IS methods may be partly explained by confusion regarding what separates IS from other research methodologies. The scope of IS is broad and incorporates many methods and measures familiar to researchers in a variety of other disciplines. Therefore, some health researchers may have been exposed to, and may be using, elements of IS as part of research in other fields (e.g., quality improvement).
As many IS resources have been made available only recently, the observed low levels of self-identification as using IS methods may be a result of a lag between IS resource development and dissemination to health researchers. Due to the disconnect between IS element use and the acknowledgement of IS engagement, further efforts are likely needed to disseminate IS to researchers across disciplines. To support these efforts, additional research is needed to determine whether health researchers are aware of and utilizing the currently available IS resources, as well as whether available IS resources provide adequate and sufficiently clear information to be useful for potential IS researchers.
The high prevalence of IS element use reported here is at odds with the presentation of these elements in the published literature, where even basic IS outcomes are sparsely reported [24,25,26,27,28]. The discordance between reported use of IS methods and what is published in the literature may in part result from inconsistency in the IS terminology used. Implementation studies are conducted across a broad range of disciplines and topical areas, and the terminology used to describe similar constructs often varies significantly (e.g., “fidelity” is also reported as “delivered as intended,” “adherence,” “integrity,” and “quality of program delivery”) [23, 29]. Therefore, measuring the use of IS in the literature may underreport the use of these measures. The absence of IS elements in the published literature may also be due to a lack of incentive to publish IS measures, which are often viewed as secondary outcomes by researchers and publishers alike. Increasing researcher awareness of IS, its methods, and its terminology may serve to unify implementation research and increase its impact.
The results of this study support calls for the improvement of researcher training in IS [31,32,33,34]. While numerous IS resources are available, it has been acknowledged that innovative solutions are needed for disseminating such knowledge to researchers. Effective training in IS is essential for the success of IS research [31, 32], and the dissemination of IS knowledge may reduce unrecognized IS engagement and consequently improve the effectiveness and impact of implementation research.
Our study had several limitations. First, the generalizability of our study may be limited by selection bias from the sampling frame used. NIH RePORTER is limited to researchers who have had a successful grant submission; therefore, the survey data may not be generalizable to researchers using other non-public sources of funding, more junior researchers, or those who have been unsuccessful in obtaining funding. Similarly, NIH RePORTER predominantly contains USA researchers, and the study results may not be generalizable to researchers outside of the USA. Second, this study was likely affected by response bias due to the nature of the survey topic. The survey invitation purposefully did not include terms associated with IS, and approximately one-quarter of researchers who started the survey did not complete it, with a number of researchers expressing (through personal correspondence with the author) frustration and disinterest in completing the survey because it was not relevant to them or their research. It is therefore likely that survey completion was greater among researchers who were already aware of and engaging in IS. Similarly, the overall response rate was relatively low, and the estimates reported may not be representative of the sampling frame as a whole. However, the distribution and variety of reported methods indicate that the group that completed the survey still represents a diverse group of health researchers likely to be generally representative of the target population. Finally, while pilot tested, the survey measures of IS engagement have not been validated. Our results are also based on self-report of elements of IS rather than actual practice or understanding of IS, which is likely to lead to an overestimation of the number of researchers engaging in implementation research. Additionally, the survey did not measure the quality of research being performed by those with unrecognized IS engagement, and more research is needed to assess actual IS practices in this population.
Overall, awareness of IS is high among health researchers, yet there is also a high prevalence of unrecognized IS engagement. Efforts need to be made to further disseminate what constitutes IS research and increase IS awareness among health researchers.
Availability of data and materials
The datasets used and/or analyzed during the current study are available from the corresponding author on reasonable request.
Abbreviations
aOR: Adjusted odds ratio
95% CI: 95% confidence interval
Brownson RC, Colditz GA, Proctor EK. Dissemination and implementation research in health: Translating science to practice. New York: Oxford University Press; 2017.
Darnell D, Dorsey CN, Melvin A, Chi J, Lyon AR, Lewis CC. A content analysis of dissemination and implementation science resource initiatives: what types of resources do they offer to advance the field? Implement Sci. 2017;12(1):137.
Eccles MP, Mittman BS. Welcome to implementation science. Implement Sci. 2006;1(1):1.
Glasgow RE, Emmons KM. How can we increase translation of research into practice? Types of evidence needed. Annu Rev Public Health. 2007;28:413–33.
Glasgow RE, Lichtenstein E, Marcus AC. Why don’t we see more translation of health promotion research to practice? Rethinking the efficacy-to-effectiveness transition. Am J Public Health. 2003;93(8):1261–7.
Geng EH, Peiris D, Kruk ME. Implementation science: Relevance in the real world without sacrificing rigor. PLoS Med. 2017;14(4):e1002288.
National Institutes of Health. Search Funding Opportunities and Notices. 2018. Available from: https://grants.nih.gov/searchguide/Search_Guide_Results.cfm?noticestoo=0&rfastoo=0.
Edmondson AC, Roloff KS. Overcoming barriers to collaboration: psychological safety and learning in diverse teams. In: Team effectiveness in complex organizations. New York: Routledge; 2008. p. 217–42.
US Department of Health & Human Services. NIH RePORTER. 2018. Available from: https://projectreporter.nih.gov/reporter.cfm.
Prochaska JO, DiClemente CC. The transtheoretical approach. In: Handbook of psychotherapy integration, vol. 2; 2005. p. 147–71.
Edwards RW, Jumper-Thurman P, Plested BA, Oetting ER, Swanson L. Community readiness: Research to practice. J Community Psychol. 2000;28(3):291–307.
Rogers EM. Diffusion of Innovations, 5th Edition. New York: Free Press; 2003.
Lauer M. Trends in Diversity within the NIH-funded Workforce. Extramural Nexus; 2018. Available from: https://nexus.od.nih.gov/all/2018/08/07/trends-in-diversity-within-the-nih-funded-workforce/.
Green LW, Glasgow RE. Evaluating the relevance, generalization, and applicability of research: issues in external validation and translation methodology. Eval Health Prof. 2006;29(1):126–53.
Hoffmann TC, Glasziou PP, Boutron I, Milne R, Perera R, Moher D, et al. Better reporting of interventions: template for intervention description and replication (TIDieR) checklist and guide. BMJ. 2014;348:g1687.
Sturke R, Harmston C, Simonds RJ, Mofenson LM, Siberry GK, Watts DH, et al. A multi-disciplinary approach to implementation science: the NIH-PEPFAR PMTCT implementation science alliance. J Acquir Immune Defic Syndr. 2014;67(Suppl 2):S163–7.
Pinnock H, Barwick M, Carpenter CR, Eldridge S, Grandes G, Griffiths CJ, et al. Standards for Reporting Implementation Studies (StaRI) Statement. BMJ. 2017;356:i6795.
Proctor EK, Powell BJ, McMillen JC. Implementation strategies: recommendations for specifying and reporting. Implement Sci. 2013;8(1):139.
Evensen AE, Sanson-Fisher R, D'Este C, Fitzgerald M. Trends in publications regarding evidence-practice gaps: A literature review. Implement Sci. 2010;5:11.
Johnson LG, Armstrong A, Joyce CM, Teitelman AM, Buttenheim AM. Implementation strategies to improve cervical cancer prevention in sub-Saharan Africa: a systematic review. Implement Sci. 2018;13:28.
Remme JHF, Adam T, Becerra-Posada F, D'Arcangues C, Devlin M, Gardner C, et al. Defining Research to Improve Health Systems. PLoS Med. 2010;7(11):e1001000.
Ciliska D, Robinson P, Horsley T, Ellis P, Brouwers M, Gauld M, et al. Diffusion and dissemination of evidence-based dietary strategies for the prevention of cancer. Curr Oncol. 2006;13(4):130–40.
Proctor E, Silmere H, Raghavan R, Hovmand P, Aarons G, Bunger A, et al. Outcomes for implementation research: conceptual distinctions, measurement challenges, and research agenda. Adm Policy Ment Health. 2011;38(2):65–76.
Gresham FM, Gansle KA, Noell GH, Cohen S. Treatment integrity of school-based behavioral intervention studies: 1980–1990. School Psychol Rev. 1993;22:254–72.
Peterson L, Homer AL, Wonderlich SA. The integrity of independent variables in behavior analysis. J Appl Behav Anal. 1982;15(4):477–92.
Gresham FM, MacMillan DL, Beebe-Frankenberger ME, Bocian KM. Treatment integrity in learning disabilities intervention research: Do We Really Know How Treatments Are Implemented? Learn Disabil Res Pract. 2000;15(4):198–205.
Wheeler JJ, Baggett BA, Fox J, Blevins L. Treatment integrity: a review of intervention studies conducted with children with autism. Focus Autism Other Dev Disabl. 2006;21(1):45–54.
McIntyre LL, Gresham FM, DiGennaro FD, Reed DD. Treatment integrity of school-based interventions with children in the journal of applied behavior analysis 1991-2005. J Appl Behav Anal. 2007;40(4):659–72.
McKibbon KA, Lokker C, Wilczynski NL, Ciliska D, Dobbins M, Davis DA, et al. A cross-sectional study of the number and frequency of terms used to refer to knowledge translation in a body of health literature in 2006: a Tower of Babel? Implement Sci. 2010;5(1):16.
Brunner JW, Sankaré IC, Kahn KL. Interdisciplinary priorities for dissemination, implementation, and improvement science: frameworks, mechanics, and measures. Clin Transl Sci. 2015;8(6):820–3.
Tabak RG, Padek MM, Kerner JF, Stange KC, Proctor EK, Dobbins MJ, et al. Dissemination and implementation science training needs: insights from practitioners and researchers. Am J Prev Med. 2017;52(3s3):S322–s9.
Brownson RC, Samet JM, Chavez GF, Davies MM, Galea S, Hiatt RA, et al. Charting a future for epidemiologic training. Ann Epidemiol. 2015;25(6):458–65.
Straus SE, Sales A, Wensing M, Michie S, Kent B, Foy R. Education and training for implementation science: our interest in manuscripts describing education and training materials. Implement Sci. 2015;10(1):136.
Brazier JE, Roberts J. The estimation of a preference-based measure of health from the SF-12. Med Care. 2004;42(9):851–9.
This study was funded by the New York University Dean’s Doctoral Research Scholarship.
Ethics approval and consent to participate
The New York University institutional review board approved the study.
Competing interests
The authors declare that they have no competing interests.
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Cite this article
Stevens, E.R., Shelley, D. & Boden-Albala, B. Unrecognized implementation science engagement among health researchers in the USA: a national survey. Implement Sci Commun 1, 39 (2020). https://doi.org/10.1186/s43058-020-00027-3
Keywords
- Implementation science
- Health research