Metrics to evaluate implementation scientists in the USA: what matters most?

Abstract

Background

Implementation science has grown rapidly as a discipline over the past two decades. An examination of how publication patterns and other scholarly activities of implementation scientists are weighted in the tenure and promotion process is needed given the unique and applied focus of the field.

Methods

We surveyed implementation scientists (mostly from the USA) to understand their perspectives on the following matters: (1) factors weighted in tenure and promotion for implementation scientists, (2) how important these factors are for success as an implementation scientist, (3) how impact is defined for implementation scientists, (4) top journals in implementation science, and (5) how these journals are perceived with regard to their prestige. We calculated univariate descriptive statistics for all quantitative data, and we used Wilcoxon signed-rank tests to compare the participants’ ratings of various factors. We analyzed open-ended qualitative responses using content analysis.

Results

One hundred thirty-two implementation scientists completed the survey (response rate = 28.9%). Four factors were rated as more influential for tenure and promotion decisions than for overall success as an implementation scientist: number of publications, quality of publication outlets, success in obtaining external funding, and record of excellence in teaching. Six factors were rated as more important for overall success as an implementation scientist than for tenure and promotion: presentations at professional meetings, involvement in professional service, impact of the implementation scientist’s scholarship on the local community and/or state, impact of the implementation scientist’s scholarship on the research community, the number and quality of the implementation scientist’s community partnerships, and the implementation scientist’s ability to disseminate their work to non-research audiences. Participants most frequently defined and described impact as changing practice and/or policy. This expert cohort identified Implementation Science as the top journal in the field.

Conclusions

Overall, there was a significant mismatch between the factors experts identified as being important to academic success (e.g., tenure and promotion) and the factors needed to be a successful implementation scientist. Findings have important implications for capacity building, although they are largely reflective of the promotion and tenure process in the USA.

Background

As the field of implementation science grows and coalesces, there is a concomitant growing cadre of implementation scientists in academia. Understanding how implementation scientists are evaluated in the tenure and promotion process is important for the long-term viability of the field.

In the USA, decisions about tenure and promotion are typically made based upon the internal and external evaluation of faculty members [1]. In research-focused institutions, faculty typically are judged on the number and size of funded grants and the number and placement of publications [2, 3]. Despite the known challenges with common metrics (e.g., journal impact factors, h-index) [4,5,6,7], these are frequently used as guideposts [8, 9]. These traditional metrics may be even more salient when a discipline is less known to reviewers, such as implementation science.

In addition to needing to meet traditional metrics of academia, implementation scientists must also attend to additional activities aligned with tenets of the field, including the use of participatory design [10] and community-academic partnerships [11], the ability to disseminate work to non-research audiences [12], and changes to practice and/or policy [13]. Needing to align with two sets of metrics—one to meet tenure and promotion and one to achieve success in the field of implementation science—may create challenges for implementation scientists. Other fields (e.g., health services researchers, health equity scholars) have encountered similar challenges, including the perception that community-engaged scholarship is not valued in the tenure and promotion review process [14,15,16,17].

To address these matters and to provide guidance to the field and tenure and promotion committees, we surveyed implementation science experts to understand their perspectives on how publication patterns and other scholarly activities of implementation scientists are weighted in the tenure and promotion process. We also explored whether these factors are weighted differently for tenure and promotion versus overall success as an implementation scientist. It is important to note that the authors work in the USA and designed a survey that is mostly reflective of the tenure and promotion process in the USA.

Methods

Participants

We purposively recruited survey respondents from an international group of implementation science experts. Our list of experts was compiled from (1) individuals listed as Implementation Science editors, associate editors, and editorial board members; (2) the AcademyHealth National Institutes of Health (NIH) Annual Conference on the Science of Dissemination and Implementation in Health Committee and Scientific Advisory Board; (3) the NIH Implementation Research Institute core faculty, expert faculty, and fellows; (4) the NIH Mentored Training for Dissemination and Implementation Research in Cancer faculty and fellows; (5) Knowledge Translation Canada experts; (6) the NIH Dissemination and Implementation Research in Health (DIRH) Review Committee; (7) the NIH Training Institute for Dissemination and Implementation Research in Health faculty mentors; (8) the Society for Implementation Research Collaboration (SIRC) Network of Expertise Established Investigators; and (9) the principal investigators of NIH DIRH funded R01s (as of January 2020). The initial recruitment email was sent to 457 potential participants.

Procedure and measures

The University of Pennsylvania’s Institutional Review Board approved the study procedures. Potential participants received an email from the senior author (RB) inviting them to participate in a brief (i.e., 15–30 min) online survey through REDCap (see Additional file 1 for the full survey). Questions were adapted from previous surveys on faculty evaluation [18]. Specifically, we queried about (1) factors weighted in tenure and promotion for implementation scientists (10 items rated on a 1–3 scale, with higher scores indicating greater influence), (2) how important these factors are for success as an implementation scientist (10 items rated on a 1–3 scale, with higher scores indicating greater importance), (3) how impact is defined for implementation scientists (2 open-ended questions), (4) top journals in implementation science (open-ended question), and (5) how the prestige of these journals is perceived (on a 0–9 scale, with higher scores indicating greater prestige). We also examined the impact factors of the journals with the highest frequencies of implementation science papers. Data collection occurred from April 15, 2020, to May 15, 2020. Individuals received up to three reminder emails, sent weekly after the initial invitation. All participants provided informed consent electronically.

The methods informing the survey section on top journals in implementation science and perceived prestige of these journals were based on a similar study in health services research by Brooks, Walker, and Szorady [19], which involved program chairs rating the level of achievement of faculty who published in specific journals in health care administration. We adapted their survey prompt, replacing “health care administration” with “implementation science.” Participants rated the perceived prestige of 24 journals obtained via bibliometric methods (see Additional file 2 for methods used to generate the list of journals). For all journals reported below, the study team pulled the impact factors from journal websites as of November 1, 2021.

Data analyses

Quantitative data were analyzed with IBM SPSS Statistics version 28. First, we calculated univariate descriptive and frequency statistics. Next, we compared how participants weighted each of the 10 factors (see Additional file 1) for tenure and promotion versus overall success as an implementation scientist using Wilcoxon signed-rank tests (ordinal, item-level data). Finally, open-ended survey responses were managed in Excel and analyzed by two reviewers independently (BM and MP, or BM and RB) using conventional content analysis involving five steps: reading the data in its entirety, developing codes to reflect the data, coding the data, reviewing the data and codes a second time, and establishing consensus between the coders through discussion [20].
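
To make the paired comparison concrete, the sketch below reproduces the analysis described above in Python with pandas and scipy rather than SPSS. The data and column names (tenure_rating, success_rating) are hypothetical placeholders for the survey’s 1–3 ratings, so this is a minimal illustration of the method, not the authors’ code.

    # Minimal sketch of the Wilcoxon signed-rank comparison described above.
    # The ratings below are fabricated placeholders on the survey's 1-3 scale.
    import pandas as pd
    from scipy.stats import wilcoxon

    # Each row is one participant's paired ratings of a single factor.
    df = pd.DataFrame({
        "tenure_rating":  [3, 2, 3, 1, 2, 3, 2],  # influence on tenure/promotion
        "success_rating": [2, 3, 3, 2, 3, 3, 3],  # importance for success in the field
    })

    # Univariate descriptive statistics, as in the first analytic step.
    print(df.describe())

    # Paired, ordinal comparison; the default zero_method ("wilcox")
    # discards pairs whose two ratings are identical.
    stat, p = wilcoxon(df["tenure_rating"], df["success_rating"])
    print(f"W = {stat:.1f}, p = {p:.3f}")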

Results

Participant characteristics

A total of 132 implementation science experts completed the survey (28.9% response rate). See Table 1 for participant characteristics.

Table 1 Participant characteristics (n = 132)

Factors weighted in tenure and promotion decisions

As summarized in Table 2, participants rated the same list of 10 factors on two separate questions, allowing us to compare each factor’s influence on tenure and promotion decisions with its importance for being a successful implementation scientist. Every factor showed significantly different ratings between the two areas. Four factors were rated as more influential for tenure and promotion decisions than they were important for being a successful implementation scientist: number of publications, quality of publication outlets, success in obtaining external funding, and record of excellence in teaching. Six factors were rated as more important for overall success as an implementation scientist than they were influential for tenure and promotion decisions: presentations at professional meetings, involvement in professional service, impact of the implementation scientist’s scholarship on the local community and/or state, impact of the implementation scientist’s scholarship on the research community, the number and quality of the implementation scientist’s community partnerships, and the implementation scientist’s ability to disseminate their work to non-research audiences. Most notably, 65.9% of participants rated community partnerships as having major importance for being a successful implementation scientist, whereas only 12.9% rated them as having a major influence on tenure and promotion decisions.

Table 2 Perceived degree of influence/importance of various factors on tenure and promotion decisions for implementation scientists versus the overall success of implementation scientists

Seventy-five participants shared additional factors perceived as important for evaluating implementation scientists for tenure and promotion. Figure 1 displays the final codes from the content analysis of these open-ended responses. The most frequently described factor was mentoring or training the next generation of implementation scientists. As one participant noted, “Given the state of the field, it is important to have the ability to build capacity in the field through mentorship.” Other factors included collaboration (e.g., ability to conduct team science across disciplines), leadership (e.g., leadership in professional or practice organizations that disseminate evidence), quality of research (e.g., methodological rigor of work), national or international impact (e.g., impact on national policy), expertise (e.g., methodological strength in a specific area), and citation metrics (e.g., h-index).

Fig. 1 Additional factors reported as important for evaluating implementation scientists on their performance (n = 75)

Defining and describing impact

Content analysis of 106 open-ended responses about how best to define impact revealed eight codes (Fig. 2). The same eight codes, plus one additional code, emerged from 118 open-ended responses about a situation when the participant’s work had an impact (Fig. 3). Table 3 displays the definition and an example response for each code. Changing practice and/or policy was the most frequently coded response, reported by the majority of participants for both questions. Of note, six participants expressed uncertainty about their work having an impact, and six participants noted that determining whether work has an impact is difficult because it takes a long period of time. As one participant shared, “You do not know at the time; you may feel your work could have potential, but it takes time to see any impact - this is generally over years.”

Fig. 2 Coded definitions of impact of an implementation scientist’s work (n = 106)

Fig. 3 Coded descriptions of participants’ own work having an impact (n = 118)

Table 3 Code definitions and examples from content analysis of impact questions

Journal endorsements and ratings

When asked to report the top three journals that publish implementation science papers, almost all participants (97.8%) named Implementation Science. The next most frequently named journal was Administration and Policy in Mental Health and Mental Health Services Research (20.5%). The journals that were named by ≥ 10 participants as one of the top three are displayed in Table 4 with their impact factors.

Table 4 Top journals that publish implementation science (selected by ≥ 10 participants as one of the top three) with impact factors

The participants’ perceived achievement ratings of faculty who published an implementation science paper in each of the journals are displayed in Table 5. Implementation Science received the highest achievement rating, which was significantly higher than the second highest rating for the Journal of General Internal Medicine, t(131) = 7.831, p < .001.
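
For readers unfamiliar with the test statistic reported above, a paired t-test compares each participant’s two ratings directly. The brief sketch below illustrates this in Python with scipy, using fabricated 0–9 prestige ratings rather than the study data.

    # Minimal sketch of a paired t-test on two journals' prestige ratings.
    # Values are fabricated placeholders on the survey's 0-9 scale.
    from scipy.stats import ttest_rel

    implementation_science = [9, 8, 9, 7, 9, 8, 9, 9]
    jgim = [7, 7, 8, 6, 8, 7, 8, 8]  # Journal of General Internal Medicine

    t, p = ttest_rel(implementation_science, jgim)
    print(f"t({len(jgim) - 1}) = {t:.3f}, p = {p:.4f}")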

Table 5 Achievement ratings of faculty members who published an implementation science paper in selected journals (0 = lowest achievement, 9 = highest achievement), with impact factors

Discussion

We surveyed primarily US-based implementation scientists to understand how various factors are weighted within the tenure and promotion process for implementation scientists. Our results indicate that traditional academic metrics such as quantity and quality of scholarly publications and external funding are perceived as more influential for tenure and promotion decisions than they are important for being a successful implementation scientist. Although these metrics were still rated as very important for success as an implementation scientist, additional factors were also rated highly, such as community partnerships, impact, and dissemination to non-scientific audiences. These findings suggest that implementation scientists may experience tension between conducting high-quality implementation research, which takes considerable time and effort, and meeting the criteria for promotion and tenure. This tension has been noted in other fields [14,15,16,17]. If academic promotion is meant to reflect success in a field, then standards for promotion need to incorporate these additional metrics [6, 25]. Fortunately, community-engaged scholarship is emerging as a more influential factor in tenure and promotion decisions at some institutions [26,27,28]. There are also resources available for faculty seeking promotion or tenure based on community-engaged scholarship and for review committee members evaluating community-engaged scholars [29].

In addition to the factors described above, implementation science is fundamentally centered on impact or implementation success. However, the field lacks a commonly used definition for this outcome. Kilbourne et al. [30] define implementation success as “achieving behavioral or clinical improvement in a population when interventions were implemented in multiple settings and scaled up and sustained after the original research on the intervention ended” (p.S783). Similar to our findings, the authors note that impact or success may not be visible for years after the initial implementation study. In addition, work that advances the conceptual and methodological foundation of the field takes time. Overall, determining more proximal metrics of impact and developing a methodology to evaluate implementation success may be worthwhile for implementation scientists in academia.

There are several tools that implementation scientists and evaluating institutions (e.g., universities, funders) can use to systematically assess and report impact. One example is the Translational Science Benefits Model (TSBM) [31], which includes 30 specific and observable indicators of clinical, community, economic, and policy benefits. Another example is the International School on Research Impact Assessment (ISRIA), which is intended to assist organizations in conducting effective research impact assessments for any scientific domain [32]. Structured CV templates that include research translation activities could also address existing inconsistencies in reporting impact [33].

Respondents provided the most frequent and highest endorsement ratings for the journal Implementation Science, the flagship journal of the field. Our participant sampling strategy targeted editors, editorial board members, and authors of articles in this journal, which may have influenced our results. However, similar findings have been reported elsewhere, with Implementation Science leading other rankings of journals for publishing implementation research [34, 35]. A small number of highly regarded journals in the field could limit publication opportunities for implementation scientists. Encouragingly, there has recently been a substantial increase in new journals focused on implementation research (e.g., Implementation Research and Practice, Implementation Science Communications, Global Implementation Research and Applications), as well as numerous special issues on implementation science in discipline-specific journals. This trend likely points to a changing landscape for implementation research, with improved visibility and impact.

This study has limitations. First, this study largely reflects academic practice in the USA, and our findings likely do not apply to many other countries with different tenure and promotion processes. Second, our survey relied only on expert input from people who identify as implementation scientists and whose work has earned recognition in the implementation science field. While this ensured our sample had a high familiarity with implementation research, it is possible that rankings of promotion criteria importance would differ in a broader sample, which could include many researchers whose work aligns closely with implementation science, but who use different terminology to describe their work. Third, 47% of respondents were full professors, meaning they have successfully navigated the academic promotion process, and their survey responses may not generalize to implementation scientists with different experiences related to promotion. Fourth, we did not collect detailed information about the participants' work setting, so we do not know if our sample is skewed toward a particular focus (e.g., behavioral health). Survey respondents likely work at institutions with varying criteria and standards for tenure and promotion. Fifth, less than half of the participants reported prior experience serving on a tenure and promotion committee for an implementation scientist. However, the pattern of results remained largely unchanged when excluding participants without this prior experience from analyses (Additional file 3). Sixth, questions in the survey were largely theoretical and asked respondents to reflect broadly on factors of importance; future work might expand on this using candidate vignettes (e.g., sample CVs and scholarly statistics), which may provide more objective assessments of how different candidates are evaluated. Seventh, while our response rate was consistent with prior studies employing similar methodology [33, 34] as well as other online surveys [36], it was overall relatively low; our response rate may have been further hampered by timing, during the start of the COVID-19 pandemic. Eighth, our sample was predominantly White. Ninth, we did not ask respondents about their expertise in other fields that may experience similar challenges (e.g., health equity, community-based participatory research methods). Finally, Implementation Science Communications and Implementation Research and Practice were endorsed as highly influential, but do not yet have impact factors.

Conclusions

This study suggests that implementation scientists often experience a tension between what they must achieve for tenure and promotion and what they must achieve to be impactful and successful as implementation scientists. Our findings highlight the need for implementation scientists to adopt a more structured and systematic method for reporting impact and research translation activities more broadly; in turn, academic institutions and funders are called to recognize and credit scholarly activities that impact practice or policy.

Availability of data and materials

The dataset analyzed during the current study is available from the corresponding author on reasonable request.

Abbreviations

CV:

Curriculum vitae

DIRH:

Dissemination and Implementation Research in Health

ISRIA:

International School on Research Impact Assessment

MT-DIRC:

Mentored Training for Dissemination and Implementation Research in Cancer

NIH:

National Institutes of Health

SIRC:

Society for Implementation Research Collaboration

TIDIRH:

Training Institute for Dissemination and Implementation Research in Health

TSBM:

Translational Science Benefits Model

References

  1. Moher D, Naudet F, Cristea IA, Miedema F, Ioannidis JPA, Goodman SN. Assessing scientists for hiring, promotion, and tenure. PLoS Biol. 2018;16:e2004089. https://doi.org/10.1371/journal.pbio.2004089.

  2. Hammarfelt B. Recognition and reward in the academy: valuing publication oeuvres in biomedicine, economics and history. Aslib J Inf Manag. 2017;69:607–23. https://doi.org/10.1108/AJIM-01-2017-0006.

  3. Rice DB, Raffoul H, Ioannidis JPA, Moher D. Academic criteria for promotion and tenure in biomedical sciences faculties: cross sectional analysis of international sample of universities. BMJ. 2020;369:m2081. https://doi.org/10.1136/bmj.m2081.

  4. Holden G, Rosenberg G, Barker K. Bibliometrics. Soc Work Health Care. 2005;41:67–92. https://doi.org/10.1300/J010v41n03_03.

  5. Koltun V, Hafner D. The h-index is no longer an effective correlate of scientific reputation. PLoS One. 2021;16:e0253397. https://doi.org/10.1371/journal.pone.0253397.

  6. Cassil A. Disrupting the status quo: redesigning academic incentives to prioritize social impact in health services research: AcademyHealth; 2021. https://academyhealth.org/publications/2021-09/reimagining-academic-incentives-and-rewards-health-services-research. Accessed 6 Dec 2021.

  7. San Francisco declaration on research assessment. DORA n.d. https://sfdora.org/read/. Accessed 6 Dec 2021.

  8. Schimanski LA, Alperin JP. The evaluation of scholarship in academic promotion and tenure processes: past, present, and future. F1000Res. 2018;7:1605. https://doi.org/10.12688/f1000research.16493.1.

  9. McKiernan EC, Schimanski LA, Muñoz Nieves C, Matthias L, Niles MT, Alperin JP. Use of the journal impact factor in academic review, promotion, and tenure evaluations. eLife. 2019;8:e47338. https://doi.org/10.7554/eLife.47338.

  10. Ramanadhan S, Davis MM, Armstrong R, Baquero B, Ko LK, Leng JC, et al. Participatory implementation science to increase the impact of evidence-based cancer prevention and control. Cancer Causes Control. 2018;29:363–9. https://doi.org/10.1007/s10552-018-1008-1.

  11. Pellecchia M, Mandell DS, Nuske HJ, Azad G, Wolk CB, Maddox BB, et al. Community–academic partnerships in implementation research. J Community Psychol. 2018;46:941–52. https://doi.org/10.1002/jcop.21981.

  12. Bubela T, Nisbet MC, Borchelt R, Brunger F, Critchley C, Einsiedel E, et al. Science communication reconsidered. Nat Biotechnol. 2009;27:514–8. https://doi.org/10.1038/nbt0609-514.

  13. Bauer MS, Damschroder L, Hagedorn H, Smith J, Kilbourne AM. An introduction to implementation science for the non-specialist. BMC Psychol. 2015;3:32. https://doi.org/10.1186/s40359-015-0089-9.

  14. Calleson DC, Jordan C, Seifer SD. Community-engaged scholarship: is faculty work in communities a true academic enterprise? Acad Med J Assoc Am Med Coll. 2005;80:317–21. https://doi.org/10.1097/00001888-200504000-00002.

  15. Michener L, Cook J, Ahmed SM, Yonas MA, Coyne-Beasley T, Aguilar-Gaxiola S. Aligning the goals of community-engaged research: why and how academic health centers can successfully engage with communities to improve health. Acad Med. 2012;87:285–91. https://doi.org/10.1097/ACM.0b013e3182441680.

  16. Nokes KM, Nelson DA, McDonald MA, et al. Faculty perceptions of how community-engaged research is valued in tenure, promotion, and retention decisions. Clin Transl Sci. 2013;6:259–66. https://doi.org/10.1111/cts.12077.

  17. Marrero DG, Hardwick EJ, Staten LK, et al. Promotion and tenure for community-engaged research: an examination of promotion and tenure support for community-engaged research at three universities collaborating through a Clinical and Translational Science Award. Clin Transl Sci. 2013;6:204–8. https://doi.org/10.1111/cts.12061.

  18. O’Meara KA. Encouraging multiple forms of scholarship in faculty reward systems: have academic cultures really changed? 2006. p. 77–95. https://doi.org/10.1002/ir.173.

  19. Brooks CH, Walker LR, Szorady R. Rating journals in health care administration: the perceptions of program chairpersons. Med Care. 1991;29:755–65. https://www.jstor.org/stable/3766103.

  20. Hsieh H-F, Shannon SE. Three approaches to qualitative content analysis. Qual Health Res. 2005;15:1277–88. https://doi.org/10.1177/1049732305276687.

  21. Davis R, D’Lima D. Building capacity in dissemination and implementation science: a systematic review of the academic literature on teaching and training initiatives. Implement Sci. 2020;15:97. https://doi.org/10.1186/s13012-020-01051-6.

  22. Levine R, Russ-Eft D, Burling A, Stephens J, Downey J. Evaluating health services research capacity building programs: implications for health services and human resource development. Eval Program Plann. 2013;37:1–11. https://doi.org/10.1016/j.evalprogplan.2012.12.002.

  23. Kislov R, Waterman H, Harvey G, Boaden R. Rethinking capacity building for knowledge mobilisation: developing multilevel capabilities in healthcare organisations. Implement Sci. 2014;9:166. https://doi.org/10.1186/s13012-014-0166-0.

  24. Glasgow RE, Estabrooks PE. Pragmatic applications of RE-AIM for health care initiatives in community and clinical settings. Prev Chronic Dis. 2018;15:E02. https://doi.org/10.5888/pcd15.170271.

  25. Lal S, Urquhart R, Cornelissen E, Newman K, Van Eerd D, Powell BJ, et al. Trainees’ self-reported challenges in knowledge translation, research and practice. Worldviews Evid Based Nurs. 2015;12:348–54. https://doi.org/10.1111/wvn.12118.

  26. Able H, Blanchard LW, Corbie-Smith GM, Friedman BG, Muller EL, Rhodes TE. 2016 provost’s task force on engaged scholarship in promotion and tenure; 2016.

  27. Margolis DJ, Bellini L, Bowles AW. Annual COAP update; 2021.

  28. 2021-2022 promotion guidelines: to associate and full professor. Department of Psychiatry, University of Toronto; n.d. https://www.psychiatry.utoronto.ca/2021-2022-promotion-guidelines-associate-and-full-professor. Accessed 6 Dec 2021.

  29. Jordan CM, Joosten YA, Leugers RC, Shields SL. The community-engaged scholarship review, promotion, and tenure package: a guide for faculty and committee members. Metrop Univ. 2009;20:66–86.

  30. Kilbourne AM, Glasgow RE, Chambers DA. What can implementation science do for you? Key success stories from the field. J Gen Intern Med. 2020;35:783–7. https://doi.org/10.1007/s11606-020-06174-6.

  31. Translational science benefits model. Washington University in St. Louis; n.d. https://translationalsciencebenefits.wustl.edu/. Accessed 6 Dec 2021.

  32. Adam P, Ovseiko PV, Grant J, Graham KEA, Boukhris OF, Dowd A-M, et al. ISRIA statement: ten-point guidelines for an effective process of research impact assessment. Health Res Policy Syst. 2018;16:8. https://doi.org/10.1186/s12961-018-0281-5.

  33. Boland L, Brosseau L, Caspar S, Graham ID, Hutchinson AM, Kothari A, et al. Reporting health research translation and impact in the curriculum vitae: a survey. Implement Sci Commun. 2020;1:20. https://doi.org/10.1186/s43058-020-00021-9.

  34. Norton WE, Lungeanu A, Chambers DA, Contractor N. Mapping the growing discipline of dissemination and implementation science in health. Scientometrics. 2017;112:1367–90. https://doi.org/10.1007/s11192-017-2455-2.

  35. Mielke J, Brunkert T, Zullig LL, Bosworth HB, Deschodt M, Simon M, et al. Relevant journals for identifying implementation science articles: results of an international implementation science expert survey. Front Public Health. 2021;9:458. https://doi.org/10.3389/fpubh.2021.639192.

  36. Sammut R, Griscti O, Norman IJ. Strategies to improve response rates to web surveys: a literature review. Int J Nurs Stud. 2021;123:104058. https://doi.org/10.1016/j.ijnurstu.2021.104058.


Acknowledgements

We are grateful to Drs. Ross Brownson, David Chambers, and Kimberly Hoagwood for reviewing the initial survey draft and providing feedback.

Funding

BP was supported by the National Institute of Mental Health through K01MH113806.

Author information

Authors and Affiliations

Authors

Contributions

RB and DA conceptualized the study. BM, CBW, RS, BP, KO, MP, EBH, and RB contributed to the study design. BM created the survey draft. CBW, RS, BP, KO, MP, EBH, and RB reviewed the survey draft and provided feedback. BM, CBW, RS, KO, MP, EBH, YVB, and RB supported a team of undergraduate coders in reviewing which journals publish implementation science. BM, MP, and RB coded the open-ended survey responses. BM led the data analyses, interpretation, and writing of the manuscript. YVB, CBW, RS, KO, MP, EBH, and RB assisted with writing the manuscript. All authors reviewed and provided feedback for this manuscript. The final version of this manuscript was approved by all authors.

Corresponding author

Correspondence to Brenna B. Maddox.

Ethics declarations

Ethics approval and consent to participate

The University of Pennsylvania Institutional Review Board (IRB) approved this study on January 27, 2020 (Protocol Number 834876). Participants provided informed consent electronically.

Consent for publication

Not applicable.

Competing interests

Dr. Beidas is the principal at Implementation Science & Practice, LLC. She receives royalties from Oxford University Press and consulting fees from United Behavioral Health and OptumLabs and serves on the advisory boards for Optum Behavioral Health, AIM Youth Mental Health Foundation, and the Klingenstein Third Generation Foundation outside of the submitted work. All other authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1.

Online Survey.

Additional file 2.

Details about Journals Listed in Survey.

Additional file 3.

Supplementary Analyses with the Subset of Participants with Experience Participating on Tenure and Promotion Committees.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Maddox, B.B., Phan, M.L., Byeon, Y.V. et al. Metrics to evaluate implementation scientists in the USA: what matters most?. Implement Sci Commun 3, 75 (2022). https://doi.org/10.1186/s43058-022-00323-0

