Pragmatic considerations and approaches for measuring staff time as an implementation cost in health systems and clinics: key issues and applied examples

Abstract

Background

As the field of implementation science wrestles with the need for system decision-makers to anticipate the budget impact of implementing new programs, there has been a push to report implementation costs more transparently. For this purpose, the method of time-driven activity-based costing (TDABC) has been heralded as a pragmatic advance. However, a recent TDABC review found that conventional methods for estimating staff time remain resource-intensive and called for simpler alternatives. Our objective was to conceptually compare conventional and emerging TDABC approaches to measuring staff time.

Methods

Our environmental scan of TDABC methods identified several categories of approaches for staff time estimation; across these categories, staff time was converted to cost as a pro-rated fraction of salary/benefits. Conventional approaches used a process map to identify each step of program delivery and estimated the staff time used at each step in one of 3 ways: (a) uniform estimates of time needed for commonly occurring tasks (self-report), (b) retrospective “time diary” (self-report), or (c) periodic direct observation. In contrast, novel semi-automated electronic health record (EHR) approaches “nudge” staff to self-report time for specific process map step(s)—serving as a contemporaneous time diary. Also, novel EHR-based automated approaches include timestamps to track specific steps in a process map. We compared the utility of these TDABC approach categories according to the 5 R’s model that measures domains of interest to system decision-makers: relevance, rapidity, rigor, resources, and replicability, and we include two illustrative case examples.

Results

The 3 conventional TDABC staff time estimation methods are highly relevant to settings but have limited rapidity, variable rigor, are rather resource-intensive, and have varying replicability. In contrast to conventional TDABC methods, the semi-automated and automated EHR-based approaches have high rapidity, similar rigor, similar replicability, and are less resource-intensive, but have varying relevance to settings.

Conclusions

This synthesis and evaluation of conventional and emerging methods for staff time estimation by TDABC provides the field of implementation science with options beyond the current approaches. The field remains pressed to innovatively and pragmatically measure costs of program delivery that rate favorably across all of the 5 R’s domains.

Background

The field of implementation science (IS) has made great progress in identifying critical approaches to translate evidence-based programs (EBP) into practice [1, 2]. Despite this progress in guiding the implementation of an EBP into a given health setting, persistent dissemination challenges remain: (1) “scaling up” to varied settings within a health system is inconsistent, (2) “scaling out” across different health systems remains rare, and (3) sustaining these changes is difficult. When system-level decision makers lack information on the cost of implementing and sustaining EBPs, this lack deters dissemination and sustainment [3,4,5]. Some IS frameworks, including the Veterans Affairs Quality Enhancement Research Initiative (VA QUERI) roadmap, seek to guide the scaling up of EBPs by considering different types of implementation costs within three project phases: (1) pre-implementation, (2) implementation, and (3) sustainment [6]. During the pre-implementation and implementation phases, key cost considerations are (1) the “capacity” for delivering the EBP, including the cost of staff time for both EBP delivery and the implementation strategy of staff training, and (2) comparing the costs of alternate implementation strategies. In the sustainment phase, the focus shifts to estimating the staff time needed to continue delivering the EBP and implementation strategies, as well as other ongoing system costs such as program materials [5].

Recent reviews of cost assessment approaches for IS and improvement science have specified the need to track the staff time required for both EBP delivery and for implementation strategies used [5, 7,8,9]. Drilling down into the staff time costs for both EBP delivery and implementation strategies is important because (1) staff time is a major source of costs for EBP delivery; (2) staff time is a costly element of certain implementation strategies, such as technical assistance and training; and (3) other types of costs, such as program materials, are more straightforward to track. The method of time-driven activity-based costing (TDABC) has been heralded as a relatively pragmatic approach to estimate the staff time required for these different tasks; accordingly, the use of TDABC in IS research has accelerated recently [3,4,5].

As developed by Kaplan et al. [3], TDABC methods specify costs across several steps of EBP implementation. A central aspect of TDABC is to create a process map that allocates the time for each staff actor to complete each process map step, inclusive of both EBP delivery and implementation strategies used [5]. However, a recent review of TDABC by Keel et al. [4] concluded that current approaches for staff time estimation in each step of a TDABC process map remain resource-intensive and called for the development of simpler and more rapid approaches with less resource burden [4]. Accordingly, the field would benefit from more pragmatic staff time estimation approaches, with balanced attention to rigorous and reliable data collection methods [10].
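
The core TDABC arithmetic, converting the minutes each staff actor spends at a process-map step into a pro-rated fraction of salary plus benefits, can be sketched as follows. All roles, step times, and pay figures below are hypothetical, and the 2,080 paid hours per year is an illustrative assumption rather than a value from the article:

```python
# Minimal TDABC sketch: cost per process-map step as a pro-rated
# fraction of salary + benefits. All figures are illustrative.

def cost_per_minute(annual_salary: float, annual_benefits: float,
                    paid_hours_per_year: float = 2080.0) -> float:
    """Capacity cost rate: (salary + benefits) per paid minute."""
    return (annual_salary + annual_benefits) / (paid_hours_per_year * 60.0)

# Process map: (step name, staff role, minutes per occurrence)
process_map = [
    ("check-in", "medical_assistant", 4.0),
    ("coaching call", "health_coach", 22.0),
    ("documentation", "health_coach", 6.0),
]

# Hypothetical annual salary/benefits by role
rates = {
    "medical_assistant": cost_per_minute(42_000, 12_000),
    "health_coach": cost_per_minute(55_000, 16_000),
}

# Cost attributed to each step, and the per-occurrence total
step_costs = {step: minutes * rates[role]
              for step, role, minutes in process_map}
total_cost = sum(step_costs.values())
```

The same calculation applies whether the minutes at each step come from uniform self-report estimates, a time diary, observation, or EHR timestamps; only the data source for `minutes` changes.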

Thus, there is a need to contrast the conventional methods of staff time estimation with some novel and emerging electronic health record (EHR)-based methods that could address some of the current challenges. The purpose of this brief methodology report is to compare distinct categories of conventional and emerging TDABC approaches to staff time estimation according to the 5 R’s model [11] for pragmatism that measures domains of interest to researchers and health system decision-makers: relevance, rapidity, rigor, resources, and replicability. In contrast to recent reviews and commentaries that only considered conventional approaches to staff time estimation by TDABC [4, 5, 7,8,9,10], this paper also considers innovative automated and semi-automated EHR-based approaches, and compares these different approaches on each of the 5 R’s domains. We also provide two illustrative case study examples that delineate why different staff time estimation approaches may be selected. This environmental scan of emerging pragmatic methods for staff time estimation provides the field of IS with options beyond the current standards of observation or asynchronous reporting, and presses the field to identify additional non-intrusive, real-time approaches to assessing implementation costs.

Methods

We conducted an environmental scan, including a literature search, for articles measuring the cost of staff time to implement healthcare-related EBPs. We searched PubMed using the following search terms: (“implementation cost” or “time-driven activity-based cost” or “micro-cost”) and (“health*” or “clinic*”). The literature search was limited to articles in English over the past 5 years. Articles’ references were hand-searched for additional articles. We also queried an online community of EHR users (Epic UserWeb) and colleagues with experience in EHR approaches to time capture: a clinical informatics nurse research scientist and two physician informaticists.

Our intent was not to conduct a systematic review, but to use this environmental scan to identify existing categories of staff time estimation approaches, and to compare the relative pragmatism of these approaches using the 5 R’s model perspective [11] (Table 1). While not exclusive to IS, the 5 R’s was selected because it was developed to increase the pragmatism of health research and is an accepted model of pragmatic health research domains [11,12,13]. The 5 R’s framework’s emphasis on relevance, rigor, and replicability is complementary to the approach that Cidav et al. took to track TDABC according to the Proctor et al. framework [12] by specifying who/what/when/how often/for how long an individual delivers an implementation strategy, but it also adds an explicit emphasis on rapid, low-resource-burden approaches [5].

Table 1 Application of the 5 R’s to evaluate cost assessment approaches

Approaches to staff time estimation were evaluated from the perspective of system-level decision makers. Decision makers did not participate in the review process, but we considered their perspective on how a new EBP would impact their budget. Using a content analysis approach, two authors (KT and AH) reviewed the distinct approaches to staff time estimation in each article and placed them in categories named for common time capture terms [14]. We evaluated these categories of approaches from the 5 R’s model perspective [11] (Table 1), providing more favorable ratings if they (1) rated high in relevance to stakeholders, rapidity and recursiveness, rigor, and replicability, and (2) required few resources.

Results

From our environmental scan, we identified several categories of approaches to estimate the staff time spent implementing EBPs as a part of TDABC [4, 5, 7, 15]. These categories of staff time estimation approaches are applicable to EBP program delivery by managers, supervisors, and staff, as well as implementation strategies (e.g., training to deliver the EBP, and other time spent preparing for the program); time spent evaluating the program; and indirect time costs of the program on patients and care givers [5, 7, 15].

With the caveat that the approaches used to capture staff time were not always clearly described in our literature search, and a given study sometimes used more than one category of staff time estimation approach in concert [4], the most common conventional approaches reported were self-report using a time-reporting template or “time diary” [16,17,18] and uniform self-report estimates of time spent on certain activities [5, 7, 19,20,21]. Some studies also reported a category of direct observation [22,23,24]. Using our pre-specified search terms, we found one study reporting use of an automated EHR-related approach [23]. Our broad environmental scan also identified other articles using semi-automated or automated EHR-based approaches for staff time estimation, including recommendations for their use and reporting [25, 26]. These categories of TDABC approaches are summarized in Table 2 from a 5 R’s model perspective.

Table 2 Comparison of current categories of TDABC approaches to staff time estimation

Self-report/observation categories

We identified numerous articles using conventional self-report or observation approaches to estimate staff time [5, 7,8,9, 28]. As described above, these began with a process map to identify each step of EBP delivery and then estimated the staff time required at each step of the process map using one of these approaches: (a) uniform estimate of time needed for a commonly occurring task, (b) retrospective self-report in a “time diary”, or (c) periodic direct observation. However, these approaches are somewhat resource-intensive, especially observation. Further, it may not be feasible to capture costs with these approaches during the sustainment phase, when there are no grant funds to support observations and/or compilation of self-report data.

Automated/semi-automated EHR-based approaches

For programs in settings that have EHRs, recent approaches have emerged to partly or fully automate the data collection. Semi-automated approaches may include hard stops built into a specific EHR note type that “nudge” a user to input their time, which essentially embeds a contemporaneous time diary into the note. Incorporating a contemporaneous time diary into the clinic note allows staff to review their charting to guide the time estimate they report, and completing the time diary in “real time” may lessen recall bias. In contrast, fully automated approaches require no action by the staff. Seven categories of fully automated EHR-based approaches to staff time estimation are possible. These 7 categories are not mutually exclusive and include time spent: (1) documenting care provided, including time within specific types of encounters (such as documenting an anticoagulation visit encounter, as in the second case example below); (2) placing or refilling prescriptions; (3) managing the EHR inbox, including patient messages; (4) managing orders as part of the team; (5) providing direct patient care; (6) working during scheduled work hours; and (7) working outside of scheduled work hours [25].

With automated approaches, data are collected in real time (e.g., by the EHR), which avoids recall bias and reduces the resource burden. However, depending on the EHR vendor or other software used to track time, there are limitations in what activities can be tracked, the accuracy of the tracking estimates, and the timeliness of retrieving the data. For example, when clinicians are multitasking and leave EHR windows open, time estimates may be inflated. As it relates to relevance, if the encounter type that is tracked is not specific to the EBP and it also captures other tasks, it may not be fully relevant and the rigor of measurement is decreased. Regarding resources, some automated EHR-based approaches require assistance from the EHR vendor and/or local analysts/informaticists at the outset to determine what data to collect and how to access them. In addition, although collected in real time, the data may not be accessible in real time—data access may also require help from the EHR vendor or a local analyst/informaticist. After the initial set up, there are some benefits to EHR approaches, notably that when programs reach a sustainment phase [3], an automated method that was set up in the EHR can continue to provide reports of the staff time needed for a certain type of clinical encounter.
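
One common way to mitigate the open-window inflation described above is to derive active time from the gaps between audit-log events and cap any long idle gap. The log format and the 5-minute idle threshold in this sketch are illustrative assumptions, not a feature of any particular EHR vendor:

```python
# Sketch: active minutes per encounter from EHR audit-log timestamps,
# truncating long idle gaps so an open-but-idle window does not
# inflate the estimate. Log rows and threshold are invented.
from datetime import datetime, timedelta

IDLE_CAP = timedelta(minutes=5)  # gaps longer than this count as idle

def active_minutes(timestamps):
    """Sum inter-event gaps, capping each gap at IDLE_CAP."""
    events = sorted(timestamps)
    total = timedelta()
    for prev, curr in zip(events, events[1:]):
        total += min(curr - prev, IDLE_CAP)
    return total.total_seconds() / 60.0

# Three clicks close together, then a 30-min idle gap, then one more:
log = [datetime(2022, 1, 5, 9, 0),
       datetime(2022, 1, 5, 9, 2),
       datetime(2022, 1, 5, 9, 3),
       datetime(2022, 1, 5, 9, 33)]
# Gaps of 2 + 1 + 30 min become 2 + 1 + 5 = 8 active minutes
```

The choice of idle threshold is a validity decision in its own right, which is one reason to vet automated estimates against observation, as discussed below.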

Case examples

To further illustrate the tradeoffs of these different approaches, we provide an overview of the approaches employed in two real-world research case examples [26]. In the first case, the study used both self-report and semi-automated approaches for capturing time spent, whereas in the second case, the authors used both automated approaches and direct observation to develop a complete workflow process map and to validate the automated timing calculations. In Table 3, we provide the rationale for the approach selected in each case example, and at least one alternative approach that could have been used.

Table 3 Rationale for use of specific TDABC approaches in the case example pilot trials

The first case example is from a pilot type 2 hybrid implementation/effectiveness trial studying the delivery of an evidence-based physical activity coaching intervention in a primary care clinic [26]. Staff time costs included the following: (1) an implementation strategy of training existing staff to serve as coaches; (2) time spent delivering the 6 intervention telephone calls to each patient; and (3) time for the implementation strategy of coaches providing technical assistance to patients to share their physical activity data (FitBit©). Approaches to capture time varied across the different elements of the program (Table 3). For time spent training, a conventional self-report time diary was used per the staff employer’s preference, in order to allocate the time spent to the research grant for this one-time session. To optimally capture the time spent in each counseling session, a semi-automated EHR-based approach was used to avoid recall bias: a brief, required contemporaneous time diary was embedded within the behavioral coaching note template in the EHR (Epic Systems). This embedded time collection template can easily be replicated in Epic Systems and other commonly used EHRs by creating a “required field” for time that must be documented before closing the note. In contrast to an alert that fires and interrupts workflow, this approach only nudges staff if the template was left incomplete when signing the encounter. During the pre-implementation phase, the coaches noted this approach fit their workflow and was minimally burdensome.

The second example is a program evaluation of the staff costs of delivering care at an anticoagulation clinic for various phenotypes of patients—those who needed minimal adjustment to their treatment regimen and those who needed frequent adjustments [23]. As they sought to compare variable costs across patients in the existing anticoagulation clinic where baseline training had already occurred, the authors did not assess staff training costs. Instead, they used direct observation to detail a process map of each step in the workflow for a patient to engage with the anticoagulation clinic staff. This included multiple steps for in-person visits, from the time of check-in until the time of check-out, and the time spent by nurses and pharmacists between in-person clinic visits. Using a proprietary internal database, the authors captured automated data for the time spent by each member of the clinical team in each step of the process map workflow. They also used a subset of direct observation assessments to validate these automated measurements of staff time. Using TDABC, they calculated the costs of the staff time in each step of the workflow, and then differentiated the costs for patients who were well-controlled and not well-controlled. Although this internal database was proprietary to their system, other EHRs, including the Epic Systems EHR [25], also have this capacity to track staff time.
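
The final step of this second case, summing the TDABC cost of each workflow step per visit and contrasting patient phenotypes, can be sketched as follows. The step times and per-minute rates are invented for illustration and do not come from the cited study:

```python
# Sketch: contrast TDABC visit costs for two patient phenotypes.
# Rows are (phenotype, workflow step, minutes, cost per minute);
# all values are hypothetical.
from collections import defaultdict

visit_steps = [
    ("well_controlled",     "check_in",          3.0,  0.45),
    ("well_controlled",     "pharmacist_review", 5.0,  1.10),
    ("not_well_controlled", "check_in",          3.0,  0.45),
    ("not_well_controlled", "pharmacist_review", 14.0, 1.10),
    ("not_well_controlled", "nurse_follow_up",   10.0, 0.70),
]

# Roll up step-level costs into a per-phenotype visit cost
cost_by_phenotype = defaultdict(float)
for phenotype, _step, minutes, rate in visit_steps:
    cost_by_phenotype[phenotype] += minutes * rate
```

Because the time data are captured automatically per step, this comparison can be re-run continuously during sustainment without new observation effort.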

Discussion

This brief methodologic commentary compares several approaches to capturing the portion of implementation costs related to staff time—an important element of implementation according to the VA QUERI Roadmap [6] and other IS process models. In particular, approaches to capturing staff time are critical to transparently report to system decision-makers the time required to implement and sustain a program. Overall, the comparison of these approaches in Table 2 may be considered as a balance of data optimization (i.e., rigorous/reliable) and efficiency in terms of a rapid return of relevant findings with low-resource requirements. In terms of rigor/reliability, the observational approaches are most accurate, followed by the automated and semi-automated EHR-based approaches, and then the retrospective time diary approaches, which are particularly prone to recall biases. In terms of efficiency, the semi-automated/automated EHR approaches stand out for their rapidity and for the limited resources needed after their initial set-up, followed by self-report. Observational approaches are the slowest and most time-consuming.

It is interesting to further consider the relative merits of these approaches from the VA QUERI Roadmap perspective, which dictates that estimates of staff time are most critical to assess in the Sustainment phase. Conventional self-report time diary and observational approaches are typically too burdensome for use in the Sustainment phase; however, the conventional self-report uniform estimate approaches could be pragmatic in this phase, as could the semi-automated or automated EHR-based approaches. In contrast, during the pre-implementation planning phase of an EBP, estimation may be the only approach available if decision makers need data on the time required for alternate implementation strategies before these tasks have been pilot-tested. In sum, advances are needed in terms of highly rapid, rigorous, and low-resource time capture approaches, and the semi-automated and automated approaches described here provide innovative steps toward that goal.

Strengths of this report include its summary of key emerging EHR-based semi-automated and automated approaches to capturing time and the concrete case study examples (Table 3). Further, the 5 R’s model provided a systematic basis on which to evaluate the pragmatism of different approaches. In addition, reporting staff time as a cost is consistent with the recommendations from the Consolidated Health Economic Evaluation Reporting Standards (CHEERS) [29] to “describe the methods for valuing each resource in terms of its unit cost.” However, depending on the EHR approach used, the automated approach may be challenged to separately report the distinct resource costs for the intervention and the implementation strategy, as others have recommended [5]. Although beyond the scope of this brief review, those applying these different approaches to staff time estimation should keep in mind the CHEERS recommendations to specify which staff are included (e.g., clinical staff, contracted coaches) and from what perspective (e.g., clinical health system staff, research staff) [29].

A limitation is that our focused environmental scan of conventional self-report/observation approaches to staff time estimation and EHR-based semi-automated and automated approaches did not include all potential approaches relevant for IS, such as automated assessments by radiofrequency identification (RFID) tags or readers. In addition, our comparisons according to the 5 R’s model are necessarily subjective. A future systematic review would expand and add rigor to this environmental scan. Automated EHR-based approaches have been used internally by health systems more often than in IS research; thus, there are some key limitations in terms of sparse prior reporting of details and validation of these approaches [25]. However, some of the described EHR approaches have been validated against direct observation, demonstrating that > 80% of the time the estimates are within 3 min of each other [27]. When used for research, it is reasonable to initially vet the accuracy of automated EHR-based approaches as compared to observation [25], as was done in case example 2—this is particularly important for complex processes that are prone to interruptions.
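
The validation described above amounts to a simple agreement check between automated and observed times. A sketch with invented minute values, using the 3-minute tolerance reported in the cited validation work:

```python
# Sketch: share of automated/observed time pairs agreeing within a
# tolerance (3 min, per the validation threshold cited above).
# The paired minute values are invented for illustration.

def share_within_tolerance(automated, observed, tol_min=3.0):
    """Fraction of paired estimates whose difference is <= tol_min."""
    pairs = list(zip(automated, observed))
    close = sum(1 for a, o in pairs if abs(a - o) <= tol_min)
    return close / len(pairs)

auto_min = [12.0, 8.5, 20.0, 15.0, 9.0]  # automated EHR estimates
obs_min = [11.0, 10.0, 26.0, 14.5, 8.0]  # direct observation
# 4 of the 5 pairs differ by <= 3 min, i.e., a share of 0.8
```

Reporting this share alongside the automated estimates gives decision makers a concrete sense of how much to trust the low-resource method.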

Conclusions

We summarized the strengths and limitations of different conventional and EHR-based semi-automated and automated approaches to measuring staff time as a cost for IS studies, with an emphasis on the 5 R’s model as an index of factors that are important to stakeholders. This is critical to allow decision-makers to consider the feasibility of implementing and sustaining programs, based on the estimates of staff time required. Going forward, the field should continue to identify additional methods of estimating staff time (and other implementation costs) that are rigorous and replicable, and also relevant, rapid, and low-resource enough to be measured in an EBP sustainment phase.

Availability of data and materials

Not applicable.

Abbreviations

CHEERS:

Consolidated Health Economic Evaluation Reporting Standards

EHR:

Electronic health record

EBP:

Evidence-based programs

IS:

Implementation science

TDABC:

Time-driven activity-based costing

VA:

Veterans Affairs

5 R’s model:

Relevance, rapidity, rigor, resources, and replicability model

References

  1. Kilbourne AM, Elwy AR, Sales AE, Atkins D. Accelerating research impact in a learning health care system: VA’s quality enhancement research initiative in the choice act era. Med Care. 2017;55(7 Suppl 1):S4–12.

  2. Aarons GA, Sklar M, Mustanski B, Benbow N, Brown CH. “Scaling-out” evidence-based interventions to new populations or new health care delivery systems. Implement Sci. 2017;12(1):111.

  3. Kaplan RS, Witkowski M, Abbott M, Guzman AB, Higgins LD, Meara JG, et al. Using time-driven activity-based costing to identify value improvement opportunities in healthcare. J Health Manag. 2014;59(6):399–412.

  4. Keel G, Savage C, Rafiq M, Mazzocato P. Time-driven activity-based costing in health care: a systematic review of the literature. Health Policy. 2017;121(7):755–63.

  5. Cidav Z, Mandell D, Pyne J, Beidas R, Curran G, Marcus S. A pragmatic method for costing implementation strategies using time-driven activity-based costing. Implement Sci. 2020;15(1):28.

  6. Kilbourne AM, Goodrich DE, Miake-Lye I, Braganza MZ, Bowersox NW. Quality enhancement research initiative implementation roadmap: toward sustainability of evidence-based practices in a learning health system. Med Care. 2019;57(Suppl 3):S286–93.

  7. Roberts SLE, Healey A, Sevdalis N. Use of health economic evaluation in the implementation and improvement science fields-a systematic literature review. Implement Sci. 2019;14(1):72.

  8. Smith JD, Hasan M. Quantitative approaches for the evaluation of implementation research studies. Psychiatry Res. 2020;283:112521.

  9. Eisman AB, Kilbourne AM, Dopp AR, Saldana L, Eisenberg D. Economic evaluation in implementation science: making the business case for implementation strategies. Psychiatry Res. 2020;283:112433.

  10. Wagner TH, Yoon J, Jacobs JC, So A, Kilbourne AM, Yu W, et al. Estimating costs of an implementation intervention. Med Decis Making. 2020;40(8):959–67.

  11. Peek CJ, Glasgow RE, Stange KC, Klesges LM, Purcell EP, Kessler RS. The 5 R’s: an emerging bold standard for conducting relevant research in a changing world. Ann Fam Med. 2014;12(5):447–55.

  12. Proctor EK, Powell BJ, McMillen JC. Implementation strategies: recommendations for specifying and reporting. Implement Sci. 2013;8:139.

  13. Glasgow RE. What does it mean to be pragmatic? Pragmatic methods, measures, and models to facilitate research translation. Health Educ Behav. 2013;40(3):257–65.

  14. Hsieh HF, Shannon SE. Three approaches to qualitative content analysis. Qual Health Res. 2005;15(9):1277–88.

  15. Ritzwoller DP, Glasgow RE, Sukhanova AY, Bennett GG, Warner ET, Greaney ML, et al. Economic analyses of the Be Fit Be Well program: a weight loss program for community health centers. J Gen Intern Med. 2013;28(12):1581–8.

  16. Hoeft TJ, Wilcox H, Hinton L, Unützer J. Costs of implementing and sustaining enhanced collaborative care programs involving community partners. Implement Sci. 2019;14(1):37.

  17. Kim KE, Tangka FKL, Jayaprakash M, Randal FT, Lam H, Freedman D, et al. Effectiveness and cost of implementing evidence-based interventions to increase colorectal cancer screening among an underserved population in Chicago. Health Promot Pract. 2020;21(6):884–90.

  18. Jordan N, Graham AK, Berkel C, Smith JD. Costs of preparing to implement a family-based intervention to prevent pediatric obesity in primary care: a budget impact analysis. Prev Sci. 2019;20(5):655–64.

  19. Nguyen HN, Sammer MB, Bales B, Cano MC, Trout AT, Dillman JR, et al. Time-driven activity-based cost comparison of three imaging pathways for suspected midgut volvulus in children. J Am Coll Radiol. 2020;17(12):1563–70.

  20. Collins CI, Hasan TF, Mooney LH, Talbot JL, Fouraker AL, Nelson KF, et al. Subarachnoid hemorrhage “fast track”: a health economics and health care redesign approach for early selected hospital discharge. Mayo Clin Proc Innov Qual Outcomes. 2020;4(3):238–48.

  21. Boyce-Fappiano D, Ning MS, Thaker NG, Pezzi TA, Gjyshi O, Mesko S, et al. Time-driven, activity-based cost analysis of radiation treatment options for spinal metastases. JCO Oncol Pract. 2020;16(13):e271–9.

  22. Simeon K, Sharma M, Dorward J, Naidoo J, Dlamini N, Moodley P, et al. Comparative cost analysis of point-of-care versus laboratory-based testing to initiate and monitor HIV treatment in South Africa. PLoS One. 2019;14(10):e0223669.

  23. Bobade RA, Helmers RA, Jaeger TM, Odell LJ, Haas DA, Kaplan RS. Time-driven activity-based cost analysis for outpatient anticoagulation therapy: direct costs in a primary care setting with optimal performance. J Med Econ. 2019;22(5):471–7.

  24. Laviana AA, Ilg AM, Veruttipong D, Tan HJ, Burke MA, Niedzwiecki DR, et al. Utilizing time-driven activity-based costing to understand the short- and long-term costs of treating localized, low-risk prostate cancer. Cancer. 2016;122(3):447–55.

  25. Sinsky CA, Rule A, Cohen G, Arndt BG, Shanafelt TD, Sharp CD, et al. Metrics for assessing physician activity using electronic health record log data. J Am Med Inform Assoc. 2020;27(4):639–43.

  26. Huebschmann AG, Glasgow RE, Leavitt IM, Chapman K, Rice JD, Lockhart S, et al. Integrating a physical activity coaching intervention into diabetes care: a mixed methods evaluation of a pilot pragmatic trial. Transl Behav Med. 2022; in press.

  27. Hribar MR, Read-Brown S, Goldstein IH, Reznick LG, Lombardi L, Parikh M, et al. Secondary use of electronic health record data for clinical workflow analysis. J Am Med Inform Assoc. 2018;25(1):40–6.

  28. Lopetegui M, Yen PY, Lai A, Jeffries J, Embi P, Payne P. Time motion studies in healthcare: what are we talking about? J Biomed Inform. 2014;49:292–9.

  29. Husereau D, Drummond M, Petrou S, Carswell C, Moher D, Greenberg D, et al. Consolidated Health Economic Evaluation Reporting Standards (CHEERS)--explanation and elaboration: a report of the ISPOR Health Economic Evaluation Publication Guidelines Good Reporting Practices Task Force. Value Health. 2013;16(2):231–50.

Acknowledgements

None.

Funding

This work was supported by the National Cancer Institute grant (P50CA244688) award to Dr. Glasgow, and Drs. Huebschmann and Gritz are also partly supported by this grant. Dr. Trinkley is also supported by the National Heart Lung and Blood Institute K12 Training grant (K12HL137862). The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health.

Author information

Contributions

AH and RG conceptualized the idea for this brief commentary report. KT and MG provided important insights into the approach. KT conducted the literature search and queried colleagues with experience using electronic health record approaches to assess staff time. AH and KT collaboratively drafted the manuscript. All authors contributed to the interpretation of the findings and read and approved the final manuscript.

Corresponding author

Correspondence to Amy G. Huebschmann.

Ethics declarations

Ethics approval and consent to participate

Not applicable.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

Cite this article

Huebschmann, A.G., Trinkley, K.E., Gritz, M. et al. Pragmatic considerations and approaches for measuring staff time as an implementation cost in health systems and clinics: key issues and applied examples. Implement Sci Commun 3, 44 (2022). https://doi.org/10.1186/s43058-022-00292-4

Keywords

  • Costing
  • Costs and cost analysis
  • Implementation
  • Time-driven activity-based costing
  • Program delivery
  • Health workforce
  • Staff time estimation