Challenges and recommendations for collecting and quantifying implementation costs in practice: a qualitative interview study

Abstract

Background

The cost of implementation is typically not accounted for in published economic evaluations, which determine the relative value for money of health innovations and are important for allocating scarce resources. Despite key papers outlining relevant implementation costs, these costs continue to be under-reported in the literature and are often not considered in practice. This study sought to understand and outline current practices for capturing the costs associated with implementation efforts, with examples from the digital health setting.

Methods

A qualitative study of semi-structured interviews with purposefully sampled experts in implementation science, health economics and/or digital health was conducted. The interview guide was informed by a literature review and was pilot tested. Interviews were digitally recorded and transcribed. A hybrid inductive/deductive framework analysis was conducted using thematic analysis to elicit key concepts related to the research question.

Results

Interviews were conducted with sixteen participants with specialist expertise in implementation science (n = 8), health economics (n = 6), and/or digital health (n = 8). Five participants were experienced in more than one field. Four key themes were elicited from the data: difficulty identifying and collecting implementation cost data; variation in approaches for collecting implementation cost data; the value of implementation costs; and collaboration enables implementation costing. Broadly, while interviewees recognised implementation costs as important, only some costs were considered in practice, likely due to perceived ill-defined boundaries and inconsistencies in terminology. A variety of methods were used to collect and estimate implementation costs; the most frequent approach was staff time tracking. Multidisciplinary collaboration facilitated this process, but the burden of collecting the necessary data was also highlighted.

Conclusions

In current practice, standardised methods are not commonly used for data collection or estimation of implementation costs. Improved data collection through standardised practices may support greater transparency and confidence in implementation cost estimates. Although participants had industry exposure, most were also academic researchers and findings may not be representative of non-academic industry settings.


Introduction

It is essential to estimate the cost of healthcare models, services, or interventions to support the appropriate allocation of healthcare-related resources [1]. Economic evaluations are employed to conduct this work but often exclude the costs associated with implementation strategies [2, 3]. Implementation strategies require additional resourcing to facilitate the adoption of healthcare interventions. Under-resourcing can lead to failed implementation, while over-resourcing can lead to inefficient use of scarce healthcare resources [4]. One challenge when applying economic evaluations to implementation strategies relates to obtaining accurate implementation cost estimates [5, 6] in an efficient way [7]. Despite key papers outlining relevant costs and sources for cost data [8,9,10], implementation costs continue to be under-reported in the literature [11] and are often not considered in practice [12]. Tools to assist the collection of appropriate implementation cost data may improve reporting, and in turn the impact of economic evaluations in implementation science [4, 5]. However, research to date suggests that it has been difficult for a tool to be both comprehensive and practical while also minimising the need for prior implementation science knowledge [13, 14].

Implementation costing is a transdisciplinary issue requiring expertise from fields including implementation science and health economics. Collaboration between fields may help to overcome previously documented difficulties related to tracking implementation costs, variation in resource needs across implementation stages, and reluctance to share financial information, which have contributed to the lack of economic evaluations in implementation science studies [15]. Implementation scientists and health economists share a motivation to collaborate to improve methodological rigour and real-world impact; however, such collaboration is currently underutilised [15, 16].

Digital health is one field in which there are increasing attempts to measure the costs and cost-effectiveness of digital interventions [17]. Digital health innovations include telehealth, electronic reminder systems, artificial intelligence (AI) and machine learning, and may provide better value care by reducing human error, improving clinical outcomes, facilitating care coordination, improving practice efficiency, or tracking data over time [18]. Improvements in efficiency, time or effort may also translate into cost savings [19, 20]. However, the costs associated with implementation strategies for digital health innovations have been under-reported in the literature, and often excluded from economic evaluations [11, 21, 22]. For example, a 2023 review identified only nine studies reporting the costs associated with the implementation of clinical decision support systems, most of which did not contain enough information to discern data collection practices [23]. Specific challenges to costing implementation in the digital health setting include inconsistencies in the concept of implementation costs and a lack of methodological guidance suited to the digital health context [23].

The aim of this study was to understand and outline current practices for capturing the costs associated with implementation efforts, with examples from the digital health setting. It is intended that the findings will contribute to ongoing research in this field to establish effective and efficient data collection practices for estimating costs associated with implementation efforts.

Methods

Study design

We conducted a qualitative study using semi-structured interviews to document how the implementation of digital health innovations has been costed in hospital settings. The study sought to understand processes, experiences, opinions, and feelings attributed to this phenomenon. To achieve this we adopted a qualitative exploratory and descriptive approach using a hybrid inductive/deductive framework analysis [24]. This approach is consistent with that recommended for healthcare research and multidisciplinary teams [25]. The Consolidated Criteria for Reporting Qualitative Research Checklist (COREQ) was used for transparent and complete reporting of methods (see Additional file 1) [26]. Ethical approval was obtained from Metro South Human Research Ethics Committee (HREC/2022/QMS/81677).

Study participants and recruitment

A purposive sampling approach was used to recruit stakeholders from the academic, government, clinical or health service sectors who had experience working in the fields of implementation science, health economics and/or digital health. Potential participants were identified through existing collaborative research networks, publicly available hospital and university staff directories, and key academic publications in the fields of implementation science and health economics. We examined publicly available biographies of potential participants, including academic staff biographies, to confirm experience in relevant fields. Researcher TD emailed an invitation to participate and a study information sheet. Interviews were arranged for those who expressed interest in participating and provided consent. While recruitment mostly occurred at the local and national level, international participants were also eligible for inclusion. An estimated sample size of 5 to 25 participants was preliminarily established to provide depth in data collection, with the final sample size determined based on both pragmatic considerations and thematic saturation of data [27]. Thematic saturation was considered reached when no new themes were noted after four successive interviews (the stopping criterion) [28]. A total of sixteen participants were interviewed.

Data collection

Semi-structured interviews were conducted from May to November 2022, using an interview guide (Additional file 2). The interview guide was informed by a literature search and internally tested through discussion and mock interviewing (TD, BA, and HC) [29]. It was then piloted in interviews with a sample group of implementation scientists and health economists with experience in digital health (n = 4), representative of the study population. The interview guide was further refined after the pilot. Final topics included how participants defined implementation costs; how they differ from intervention costs; why and how implementation costs are recorded; and the importance of doing so. The pilot data was used in the final analysis.

Depending on the participant’s preference and proximity to the research team, individual interviews were conducted either in person or virtually using the Zoom videoconferencing platform. One researcher (TD) conducted all interviews, which lasted between 30 and 45 min and were audio recorded with consent. On occasion, a second researcher (MF) was present in the interviews. Both TD and MF were female PhD students with a research focus on implementation science, health economics and digital health. They received training in semi-structured interviewing and guidance from BA, an implementation scientist with extensive experience in qualitative research. The interviewer(s) were not known to participants prior to this study; however, the wider research team (HC, BA, SM) was known to some participants. Participants did not have the opportunity to review, comment on or change their answers after the interview had taken place. No repeat interviews were conducted. Research notes and reflective memos were recorded by TD before and after each interview, and throughout data collection and analysis.

Data collection and data analysis were iterative. Initially, six interviews were conducted and analysed, and themes were identified. A further six interviews were then conducted and analysed. This iterative process continued until the research team agreed that both data saturation and inductive thematic saturation had been achieved [30]. The point of data saturation was defined as when four successive interviews were conducted and no new themes emerged (the stopping criterion) [28].
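As an aside for readers who wish to operationalise this stopping rule, the sketch below shows one hypothetical way it could be checked once each interview has been summarised as a set of coded themes. The function name and theme labels are illustrative assumptions and not part of the study's analysis, which was conducted in NVivo.

```python
def saturation_reached(themes_per_interview, window=4):
    """Return True when the most recent `window` interviews introduced
    no theme that had not already appeared in earlier interviews."""
    if len(themes_per_interview) <= window:
        return False
    earlier = set().union(*themes_per_interview[:-window])
    recent = set().union(*themes_per_interview[-window:])
    return recent <= earlier

# Hypothetical coding summaries: one set of theme labels per interview
coded = [
    {"terminology", "boundaries"},
    {"staff time tracking", "burden of collection"},
    {"collaboration"},
    {"value to decision makers"},
    {"burden of collection", "collaboration"},  # no new themes from here on
    {"terminology", "staff time tracking"},
    {"value to decision makers"},
    {"boundaries"},
]
print(saturation_reached(coded))  # True: the last four interviews added nothing new
```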

Data analysis

We followed the procedure described by Gale and colleagues for using framework analysis in multidisciplinary health research teams [25]. Table 1 outlines how we performed Gale’s seven stages of framework analysis in the context of this study [24]. In addition, the costs and resources associated with implementation strategies described by participants were mapped to the Expert Recommendations for Implementing Change (ERIC) framework [31].

Table 1 Framework data analysis

The research team that analysed the data included an implementation scientist (BA) and a health economist (HC), which reflected the participant population and allowed discipline nuances to be captured. Regular meetings were held during data analysis to discuss codes, subthemes, and themes to ensure coherence and consistency with the data and to maximise rigour. Research notes were not subjected to thematic analysis but assisted in understanding and developing codes, subthemes, and themes [32]. Data analysis was conducted using NVivo software (release 1.6.1).

Results

Participant characteristics

Sixty-two professionals with experience in implementation science (IS), health economics (HE) or digital health (DH) were invited to participate in the study. Interviews were conducted with sixteen consenting participants: five implementation scientists, two health economists, four digital health specialists and five with experience across more than one of these fields (Fig. 1). Participants worked across a range of healthcare disciplines, clinical areas and settings including nursing, surgery, maternal health, nutrition and dietetics, pharmacy, heart disease, lung cancer, clinical excellence, information systems, and digital health including telehealth and AI. Most participants worked in academia (n = 14) and were located in the same geographical region as the research team (n = 9) (Table 2).

Fig. 1 Participants’ experience in implementation science, health economics and digital health

Table 2 Participant characteristics

Themes

Four major themes, each containing three or four subthemes, were derived from the data (Fig. 2); they are explained in more detail below and summarised in Table 3. Three deductive themes (difficulty identifying and collecting implementation cost data; influences on approaches for collecting implementation cost data; and the value of implementation costs) were developed a priori from the interview guide based on the research question. One theme (collaboration enables implementation costing) emerged inductively from the data.

Fig. 2 Thematic analysis coding tree

Table 3 Themes, subthemes and quotes. (NVivo File V6.2)

Difficulty identifying and collecting implementation cost data

Across the fields of implementation science, health economics and digital health, terminology differed and caused confusion, specifically when identifying appropriate implementation cost data. In digital health, “they [digital health solutions] typically get kind of deployed, as in I want to put this system in and then I want to use it right? So that that business of putting it in and using it is what we describe as implementation. …deployed/implementation, we use them interchangeably” [DH]. In implementation science, the process of implementation was considered broader and included considerations of context at the patient, provider, system and/or policy levels. A common language across fields was lacking, yet was perceived to be important when costing implementation.

Participants found the boundaries of implementation difficult to delineate, which added to the difficulty of identifying appropriate implementation costs (Table 3: quote 1.2.(a)). For example, some participants were unsure when to start costing implementation (e.g., whether to include project planning activities) and when to end costing implementation (e.g., whether to include evaluation activities) (Table 3: quote 1.2.(b)-(d)). Although the bounds of implementation were unclear to participants, certain activities and associated costs (both implementation and non-implementation costs) were often discussed in phases. The phases were not linear but were discussed in a logical order, from pre to post, while acknowledging the cyclical nature of implementation projects. To better understand the presence and types of costs across the implementation continuum, we arranged these activities and associated costs into three phases, defined as pre-implementation, peri-implementation or post-implementation, as seen in Fig. 3.

Fig. 3 Implementation phases and associated activities

There was wide variation in how participants conceptualised implementation costs, which contributed to difficulties in identifying implementation cost data. Some participants believed, “intervention costs would include how the intervention was implemented” [HE] while others felt it was mainly implementation strategies, “for us, the implementation cost was largely the facilitation” [IS]. Other ways implementation costs were described included short-term costs, ‘extra’ costs, specific personnel, and pieces of work (Table 3: quote 1.3.(a)-(g)).

Participants found some implementation costs and intervention costs easier to identify than others (Table 3: quote 1.4.(a)/(b)). Implementation strategies were the most common implementation cost mentioned by participants. We mapped the mentioned implementation strategies to the ERIC framework [31]. The most frequently mentioned ERIC clusters of implementation strategies were ‘use evaluative and iterative strategies’, followed by ‘develop stakeholder interrelationships’. Implementation costs which fell outside the scope of this framework included the cost of a project manager and costs associated with workflow alterations. A project manager’s role ranged from conducting some implementation strategies to completing the administrative duties needed to progress the project. As this role appeared to facilitate implementation, participants considered it to be an implementation cost. Participants mentioned that it was important to understand current clinical workflows and how they may be impacted by the introduction of the intervention, “or you’re never going to get your clinicians to do anything” [HE]. Consequently, the need for workflow alterations may be considered within the scope of implementation processes and costed accordingly.

Physical space was mentioned by some health economists as a potential implementation cost. Space would not be included in their costing analysis if it was not a “big item” [IS/HE/DH] or if “slack is built into the system” [HE] allowing meetings to be conducted without considerations for opportunity costs or renting the space (which would have been costed). Only one participant specifically mentioned opportunity costs for implementation. The health economist explained challenges for analysis when, “trying to capture opportunity costs. And so the implicit question that we're always struggling to figure out is what would be the opportunity costs of someone not doing this or doing this” [HE]. Other participants incidentally mentioned potential opportunity costs (Table 3: quote 1.4.(c)).

Participants identified costs relating to the intervention itself as non-implementation costs. In the case of digital health interventions, this included costs relating to cybersecurity and the digital backbone, including hardware, software, ongoing management of a database, and ongoing technical management.

Areas where participants found it difficult to attribute costs to implementation included the use of existing resources, because either no additional cost was incurred or the extra labour required to cost implementation had “limited benefit unless there's some bigger picture” [HE/DH]. Costing labour associated with implementation was challenging when staff had to differentiate between their implementation and clinical (or regular) duties (Table 3: quote 1.4.(d)). Intangible costs including soft skills, personal reflection time, existing relationships, level of authority, and mental load were highlighted as contributing to implementation but challenging to cost (Table 3: quote 1.4.(e)).

Influences on approaches for collecting implementation cost data

The results demonstrated that implementation cost estimates from data collected in practice could be influenced by several important factors. These included the wide variation in approaches to cost implementation (Table 3: quote 2.1.(a)), the burdensome nature of collection, and the availability of implementation data collection tools.

Implementation cost estimates were frequently obtained via staff time tracking methods, where staff time was documented against specified activities and salaries were applied to calculate the cost associated with each activity (Table 3: quote 2.1.(b)). Although labour intensive, this method was not seen as complex and could provide contextual insight specific to the site. However, some participants mentioned that it was at times difficult to delineate time spent on roles and responsibilities relating to the implementation activity from usual job duties. Within this approach there was variation in practices. Detailed approaches captured all activities and personnel involved in the implementation, while other, simplified approaches estimated wages only for the key personnel involved. Some project managers developed an activity template and had staff complete it prospectively with their own time allocations. Others estimated staff time and did not request staff to complete it themselves.
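To make the arithmetic behind this approach concrete, the sketch below shows a minimal, hypothetical calculation in which logged hours for each implementation activity are multiplied by a role-specific hourly rate. The activities, roles, and rates are illustrative assumptions, not data from this study.

```python
# Hypothetical staff time log: (implementation activity, staff role, hours recorded)
time_log = [
    ("clinician training sessions", "nurse educator", 6.0),
    ("workflow mapping workshop", "project manager", 4.5),
    ("stakeholder meetings", "project manager", 3.0),
    ("audit and feedback", "clinical nurse", 2.0),
]

# Hypothetical hourly rates by role (salary plus on-costs), in local currency
hourly_rate = {
    "nurse educator": 62.0,
    "project manager": 75.0,
    "clinical nurse": 55.0,
}

# Cost per activity = sum over log entries of hours x hourly rate for the role
activity_cost = {}
for activity, role, hours in time_log:
    activity_cost[activity] = activity_cost.get(activity, 0.0) + hours * hourly_rate[role]

for activity, cost in activity_cost.items():
    print(f"{activity}: {cost:,.2f}")
print(f"Total implementation labour cost: {sum(activity_cost.values()):,.2f}")
```

A detailed version of this calculation would log every person involved, whereas the simplified approaches described above would restrict the log to key personnel only.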

Other approaches included estimating implementation costs from expert opinion (Table 3: quote 2.1.(c)), usually drawing on experience from similar projects. Some participants expressed that the amount of available funding determined the extent of implementation costs incurred (Table 3: quote 2.1.(d)). Economic evaluations were mentioned, although implementation costs were not frequently included in these types of analyses.

Participants discussed ways in which they retrieved information to value implementation resource use. Publicly available information, including pay rates and awards, could be used as a resource. However, instead of using this resource, most participants contacted relevant teams within the organisation, for example the finance team, to obtain the salaries of the personnel involved. Navigating large organisations to obtain this information was at times difficult. Most participants combined contacting relevant teams within the organisation with primary collection of information.

The collection of appropriate data to estimate implementation costs was seen as a burdensome task. This was particularly true for collectors who were not part of the implementation project team (for example, clinical staff using the intervention) and when tracking staff time, as it required personnel to be involved in collecting the data themselves (Table 3: quote 2.2.(a)). Suggested strategies to encourage data collection included: utilising incentives, involving collectors in the design of data collection, and building data collection into other required tasks (Table 3: quote 2.2.(b)). Achieving high accuracy and precision through frequent and comprehensive data collection was also seen as burdensome (Table 3: quote 2.3.(a)).

No standardised implementation cost data collection tools were mentioned by participants; however, several attributes were considered important for successful collection of implementation cost data. Participants expressed a desire for practical, pragmatic, and simple tools for local implementers (Table 3: quote 2.4.(a)). A checklist-like format was suggested, as well as aligning the input with data that are already collected for another purpose (Table 3: quote 2.4.(b)). Staff time tracking was aided when implementation activities were clearly defined in advance, which was commonly achieved through a purpose-built template. Participants expressed the importance of having a few clear categories for collecting the required information (Table 3: quote 2.4.(c)). Other considerations included flexibility of tools and capture formats to suit local teams (Table 3: quote 2.4.(d)), as well as ease of integration with statistical analysis software. Recording the required data digitally was favoured by participants, using programs that were available and familiar, including MS Excel, RedCap and Qualtrics (Table 3: quote 2.4.(e)/(f)).
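The sketch below illustrates what such a purpose-built template might look like: a small set of pre-defined activity categories and a handful of columns, written to a CSV file so it can be opened in familiar programs such as MS Excel. The column names and categories are hypothetical examples rather than a tool used by participants.

```python
import csv

# A few clear, pre-defined categories help staff log time prospectively (hypothetical examples)
activity_categories = [
    "training and education",
    "stakeholder engagement",
    "audit and feedback",
    "workflow redesign",
    "other (specify in notes)",
]

columns = ["date", "staff_role", "activity_category", "hours", "notes"]

with open("implementation_time_log.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=columns)
    writer.writeheader()
    # Staff add one row per block of implementation-related time, for example:
    writer.writerow({
        "date": "2022-06-01",
        "staff_role": "project manager",
        "activity_category": "stakeholder engagement",
        "hours": 1.5,
        "notes": "site steering committee meeting",
    })

print("Template categories:", ", ".join(activity_categories))
```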

The value of implementation costs

Capturing implementation costs was perceived to be important for demonstrating the value of the intervention, particularly to decision makers tasked with continuing or scaling the intervention. Implementation cost estimates were used to show that the intervention was either cost saving or was justified by other benefits including improved patient experience, patient safety, and clinical outcomes (Table 3: quote 3.1.(a)). It was also suggested that implementation cost estimates can be used to inform the future scalability of the intervention (Table 3: quote 3.1.(b)). Including implementation cost estimates in grant proposals or business cases can also assist informed financial decision making, including decisions about when to proceed with pilot implementation projects (Table 3: quote 3.1.(c)).

For some participants, the lack of research and knowledge of implementation costs within implementation science contributed to their decision to cost implementation (Table 3: quote 3.2.(a)/(b)). These participants purposefully gathered implementation cost data to address this under-researched area of implementation science. Implementation costs could be used in determining the value of implementation strategies, along with the effectiveness of implementation. While it was accepted that implementation strategies are necessary for successful implementation, costing was still important to demonstrate that funds are being used appropriately. Some funding bodies required ongoing reporting of spending, in which implementation cost estimates were included. For others, there was no requirement to cost implementation, even though they believed it was important (Table 3: quote 3.2.(c)). Even when not a requirement, some participants would still report implementation cost estimates as part of their project management practice. Some included implementation cost estimates in disseminated reports or publications to assist others who may want to replicate the project in their own institutions.

The study design, outcomes and audience of the evaluation can affect the value of implementation costs. Implementation projects were often underpowered, with limited data available for meaningful analysis beyond descriptive analysis. In addition, if the primary objective was not achieved, further analysis (including implementation costing) was not typically performed (Table 3: quote 3.3.(a)). Costing implementation was a challenge when funding was not available for a long enough period for rigorous evaluation or to remunerate absent expertise, including health economics (Table 3: quote 3.3.(b)). The value of evaluating implementation costs differed between audiences (Table 3: quote 3.3.(c)). It was considered important to implementation scientists and health economists; however, those with experience in implementing digital health initiatives in health services did not share this perception (Table 3: quote 3.3.(d)).

Collaboration enables implementation costing

Collaboration across disciplines facilitated the overall implementation process as well as costing of implementation, even if there was a lack of expertise in implementation science within the collaboration. Some participants had not been aware of implementation science prior to starting implementation projects but had been using analogous approaches in the past (Table 3: quote 4.1.(a)). Most participants mentioned that multidisciplinary collaborations provided a rich range of perspectives which supported the implementation project. Collaboration from the beginning of a project, particularly during study design, was most beneficial and often sought out (Table 3: quote 4.2.(a)). Participants expressed that the implementation costing was aided when the required information was planned in advance with advice from health economists and those collecting the implementation cost data, typically the implementers (Table 3: quote 4.2.(b)). At times, this served a dual purpose for defining roles and responsibilities for implementers (Table 3: quote 4.3.(a)).

Discussion

Sixteen experts in implementation science, health economics and/or digital health shared their experience of costing implementation, with examples from digital health projects. Interviewees recognised implementation costs as important and could easily identify some implementation costs (including implementation strategies, project managers and workflow alterations) and separate out non-implementation costs relating to the intervention itself. Other costs were difficult to delineate and capture in practice, which was likely attributable to inconsistencies in terminology and the perceived ill-defined boundaries of implementation phases. In practice, reasons why implementation activities may not be costed include a failure to identify them, a perception that they did not require reporting, or that they were not considered important, all perpetuating the reported lack of awareness regarding costing implementation.

Our findings are consistent with a recent review of economic evaluations of implementation science outcomes in low- and middle-income countries, which found large heterogeneity across 23 papers in how implementation resource use was conceptualised and costed [33]. Implementation strategies can be used in all phases of an implementation project [34], which likely contributes to inconsistencies in terminology and the perceived ill-defined boundaries of implementation phases that impact capturing implementation costs in practice. Process mapping has been recommended to cost implementation and could circumvent the aforementioned issues [35, 36]. A similar approach is embedded in the cost of implementing new strategies (COINS) tool [13]. COINS maps costs to the Stages of Implementation Completion (SIC) framework [37]. However, the level of detail required by COINS may not be suitable for some projects [14]. Simpler approaches involve summarising the project into phases and outlining the activities within each phase, but these are yet to be incorporated into data collection tools [10]. We found that outlining phases and associated activities (Fig. 3) improved clarity when identifying potential implementation costs in our analysis. Current evidence suggests implementation costing is facilitated by activity identification and categorisation to the level of detail required by the project.

Micro- or activity-based costing approaches have been suggested for costing implementation, and this was reflected in our findings as the most common approach [11, 13]. These approaches have the potential to generate precise estimates [2, 38] and may facilitate sensitivity analyses for translating costs to other contexts by incorporating contextual differences in resource use and resource unit costs [39]. However, as noted in the literature and by our participants, micro- or activity-based costing can be labour intensive and burdensome [7, 8]. Data collection approaches using an onsite database (or electronic health record-based) approach are likely to alleviate the burden [35]. This approach may not be viable for all projects because it requires pre-existing infrastructure [35]. In our study, participants suggested that utilising data already collected for another purpose could reduce the burden. However, it is important to note that reducing the burden of data collection may lead to trade-offs in both accuracy and precision [40]. The term ‘accuracy’ refers to how close data are to their true value, while ‘precision’ is concerned with the granularity of data, which may be useful in presenting disaggregated findings or subgroup analyses. Both precision and accuracy trade-offs need to be balanced against an acceptable level of research burden on a case-by-case basis [39].
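As an illustration of how disaggregated micro-costing data can support such sensitivity analyses when translating costs to another context, the sketch below rescales selected unit costs (for example, local wage rates) around a base-case estimate. All quantities, unit costs, and the ±20% range are hypothetical assumptions.

```python
# Hypothetical disaggregated micro-costing data: resource quantities and local unit costs
resources = {
    "project manager hours":    {"quantity": 120, "unit_cost": 75.0},
    "clinician training hours": {"quantity": 40,  "unit_cost": 60.0},
    "printed materials":        {"quantity": 500, "unit_cost": 0.40},
}

def total_cost(resources, unit_cost_scaling=None):
    """Sum quantity x unit cost, optionally rescaling selected unit costs
    (for example, to reflect wage rates in a different setting)."""
    scaling = unit_cost_scaling or {}
    return sum(
        item["quantity"] * item["unit_cost"] * scaling.get(name, 1.0)
        for name, item in resources.items()
    )

base = total_cost(resources)
low = total_cost(resources, {"project manager hours": 0.8})   # wages 20% lower
high = total_cost(resources, {"project manager hours": 1.2})  # wages 20% higher
print(f"Base case: {base:,.0f}; one-way sensitivity range: {low:,.0f} to {high:,.0f}")
```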

Micro-costing methods for collecting implementation cost data include direct observation, time diaries/activity logs, targeted questionnaires, key informant interviews, and onsite database (or electronic health record-based) approaches [35]. A review of micro-costing data collection tools used for health interventions noted that the tools were developed specifically for their respective studies, although some of the standardised comprehensive templates could be (and have been) generalised for public use [39]. Standardised tools promote transparency and confidence in cost estimates [39]. The use of standardised tools to cost implementation has previously been recommended [5, 7]. Participants in the current study highlighted the need for practical implementation costing tools. A standardised tool which is pragmatic, flexible and simple to use for collecting and estimating implementation costs may improve the quality of implementation cost estimates and subsequent evaluations.

Participants described multidisciplinary collaboration as a facilitator for costing implementation. The importance of collaboration from an early stage was mentioned in our study and is consistent with previous work recommending multidisciplinary input, including from implementation scientists and health economists, during the research design phase [15]. Despite this knowledge, collaboration between implementation scientists and health economists is currently lacking [15]. In our study, costing implementation was hindered by the use of discipline-specific terminology. For example, opportunity costs were only specifically mentioned by a health economist, while other participants unknowingly dismissed potential opportunity costs. Differences in the language used between disciplines may contribute to difficulties in collaborating, and efforts should be made to develop a common understanding.

Recommendations

We propose four recommendations, based on the findings of this study, to support the effective and efficient collection and estimation of implementation costs:

1. A set of discrete cost categories should be developed prior to the collection of implementation cost data. Ideally, the categories should reflect the project/study’s implementation effort to allow for meaningful analysis of the data collected. For example, categories could reflect the implementation strategies used, in which case an implementation terminology framework like ERIC could be applied [31]. Categories could otherwise reflect resource type; for example, labour, equipment, space, supplies, and travel [9]. Other categories for cost data specific to implementation science have been suggested elsewhere [8]. A minimal illustration of this grouping is sketched after this list.

2. Efforts should be made to reduce the burden of implementation cost data collection. This may be achieved through use of existing databases or by leveraging data that have been collected for other purposes. However, researchers should be mindful of the accuracy and precision trade-offs that may result from these efforts.

3. Assumptions made during the implementation costing process should be reported clearly and transparently. The use of standardised checklists may assist with this. Currently, there is a checklist for the conduct and reporting of cost analyses of implementation strategies which combines elements of Proctor’s framework for reporting dissemination and implementation (D&I) strategies, the CHEERS checklist for economic evaluations, and Chapel and Wang’s review of cost data collection tools [9]. Another checklist is under development for the conduct, reporting, and appraisal of micro-costing studies in healthcare, which may also be applicable to implementation costing studies [41]. Further research to determine best practice reporting for implementation cost estimates is required.

4. Increased collaboration across disciplines, particularly between implementation scientists and health economists, is likely to promote a common understanding and facilitate implementation costing.
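As a minimal sketch of recommendation 1 (all item names, categories, and amounts below are hypothetical), tagging each collected cost item with one of a small set of pre-defined categories, whether resource types as shown here or ERIC strategy clusters, makes it straightforward to summarise the collected data meaningfully.

```python
from collections import defaultdict

# Hypothetical cost items, each tagged with a pre-defined resource-type category
cost_items = [
    {"item": "implementation facilitator salary", "category": "labour",    "cost": 18_000},
    {"item": "clinician time in training",        "category": "labour",    "cost": 6_500},
    {"item": "laptops for training sessions",     "category": "equipment", "cost": 3_200},
    {"item": "travel to regional sites",          "category": "travel",    "cost": 1_100},
]

totals = defaultdict(float)
for entry in cost_items:
    totals[entry["category"]] += entry["cost"]

for category, cost in sorted(totals.items()):
    print(f"{category}: {cost:,.0f}")
print(f"Total implementation cost: {sum(totals.values()):,.0f}")
```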

Limitations

This study is limited by the data collected in the sixteen interviews conducted. In qualitative research, data quality does not always improve with quantity, and the sample size in this study is in line with other semi-structured interview studies, which typically range from 5 to 25 participants [27]. Furthermore, data saturation and inductive thematic saturation were achieved after the third round of iterative data collection, suggesting comprehensiveness of the sample. Collecting data past the point of saturation can be useful for providing rich quotes and greater researcher awareness of the issues under investigation [27]. This study did not collect data past saturation.

The majority of participants were academics, and therefore the results may reflect how implementation costs are estimated and data collected in academia more than in frontline healthcare operations, although half of the academic participants were working in hospital systems at the time of the study. Exploring operational implementation outside of academic research may provide additional insights or considerations when costing implementation. Embedding relevant experience into the research team may be useful to ensure that nuances from non-academic language are not overlooked. Due to the location of most of the participants, the results of this study will likely be most applicable to the Australian context. Stronger global representation of implementation costing practices may provide additional insights, particularly because there is no implementation costing standard in the literature.

The framework analysis method is typically used to compare between groups; however, this was a challenge because several participants had expertise in more than one field and/or worked across sectors. The multidisciplinary and cross-sectoral perspectives were valuable in this study and further highlighted the importance of collaboration. Although rigid comparisons could not be made in this study, the framework analysis technique remained useful for identifying specific perspectives.

Conclusion

Challenges were identified in the implementation costing process, mostly relating to identifying and collecting implementation cost data. In current practice, standardised methods are not commonly used for data collection or estimation of implementation costs. Staff time tracking was the most frequently used method to cost implementation, but there was variation in this approach. There is a need for pragmatic tools to facilitate the collection of implementation cost data in practice. Improved data collection practices would promote transparency and confidence in implementation cost estimates and may lead to greater reporting of implementation costs in the literature.

Availability of data and materials

The datasets used and/or analysed during the current study are available from the corresponding author on reasonable request.

Abbreviations

AI:

Artificial intelligence

COINS:

Cost of implementing new strategies

COREQ:

Consolidated Criteria for Reporting Qualitative Research Checklist

IT:

Information technology

ERIC:

Expert Recommendations for Implementing Change framework

References

  1. Drummond MF, Sculpher MJ, Torrance GW, Stoddart GL. Methods for the economic evaluation of healthcare programs. 3rd ed. USA: Oxford University Press; 2005.
  2. Hoomans T, Severens JL. Economic evaluation of implementation strategies in health care. Implement Sci. 2014;9:168.
  3. O’Leary MC, Hassmiller Lich K, Frerichs L, Leeman J, Reuland DS, Wheeler SB. Extending analytic methods for economic evaluation in implementation science. Implement Sci. 2022;17(1):27.
  4. Eisman AB, Kilbourne AM, Dopp AR, Saldana L, Eisenberg D. Economic evaluation in implementation science: making the business case for implementation strategies. Psychiatry Res. 2020;283:112433.
  5. Dopp AR, Mundey P, Beasley LO, Silovsky JF, Eisenberg D. Mixed-method approaches to strengthen economic evaluations in implementation research. Implement Sci. 2019;14(1):2.
  6. Eisman AB, Quanbeck A, Bounthavong M, Panattoni L, Glasgow RE. Implementation science issues in understanding, collecting, and using cost estimates: a multi-stakeholder perspective. Implement Sci. 2021;16(1):75.
  7. Wagner TH. Rethinking how we measure costs in implementation research. J Gen Intern Med. 2020;35(Suppl 2):870–4.
  8. Gold HT, McDermott C, Hoomans T, Wagner TH. Cost data in implementation science: categories and approaches to costing. Implement Sci. 2022;17(1):11.
  9. Michaud TL, Pereira E, Porter G, Golden C, Hill J, Kim J, et al. Scoping review of costs of implementation strategies in community, public health and healthcare settings. BMJ Open. 2022;12(6):e060785.
  10. Ritzwoller DP, Sukhanova A, Gaglio B, Glasgow RE. Costing behavioral interventions: a practical guide to enhance translation. Ann Behav Med. 2009;37(2):218–27.
  11. Cidav Z, Mandell D, Pyne J, Beidas R, Curran G, Marcus S. A pragmatic method for costing implementation strategies using time-driven activity-based costing. Implement Sci. 2020;15(1):28.
  12. Proctor E, Silmere H, Raghavan R, Hovmand P, Aarons G, Bunger A, et al. Outcomes for implementation research: conceptual distinctions, measurement challenges, and research agenda. Adm Policy Ment Health. 2011;38(2):65–76.
  13. Saldana L, Chamberlain P, Bradford WD, Campbell M, Landsverk J. The Cost of Implementing New Strategies (COINS): a method for mapping implementation resources using the Stages of Implementation Completion. Child Youth Serv Rev. 2014;39:177–82.
  14. Hoeft TJ, Wilcox H, Hinton L, Unutzer J. Costs of implementing and sustaining enhanced collaborative care programs involving community partners. Implement Sci. 2019;14(1):37.
  15. Barnett ML, Dopp AR, Klein C, Ettner SL, Powell BJ, Saldana L. Collaborating with health economists to advance implementation science: a qualitative study. Implement Sci Commun. 2020;1:82.
  16. Roberts SLE, Healey A, Sevdalis N. Use of health economic evaluation in the implementation and improvement science fields—a systematic literature review. Implement Sci. 2019;14(1):72.
  17. Gentili A, Failla G, Melnyk A, Puleo V, Tanna GLD, Ricciardi W, et al. The cost-effectiveness of digital health interventions: a systematic review of the literature. Front Public Health. 2022;10:787135.
  18. World Health Organization (WHO). WHO guideline: recommendations on digital interventions for health system strengthening. Geneva: WHO; 2019.
  19. Alotaibi YK, Federico F. The impact of health information technology on patient safety. Saudi Med J. 2017;38(12):1173–80.
  20. Jandoo T. WHO guidance for digital health: what it means for researchers. Digit Health. 2020;6:2055207619898984.
  21. Bowser DM, Henry BF, McCollister KE. Cost analysis in implementation studies of evidence-based practices for mental health and substance use disorders: a systematic review. Implement Sci. 2021;16(1):26.
  22. Waltz TJ, Powell BJ, Chinman MJ, Smith JL, Matthieu MM, Proctor EK, et al. Expert Recommendations for Implementing Change (ERIC): protocol for a mixed methods study. Implement Sci. 2014;9:39.
  23. Donovan T, Abell B, Fernando M, McPhail SM, Carter HE. Implementation costs of hospital-based computerised decision support systems: a systematic review. Implement Sci. 2023;18(1):7.
  24. Ritchie J, Spencer L. Qualitative data analysis for applied policy research. In: Bryman A, Burgess RG, editors. London and New York: Routledge; 1994.
  25. Gale NK, Heath G, Cameron E, Rashid S, Redwood S. Using the framework method for the analysis of qualitative data in multi-disciplinary health research. BMC Med Res Methodol. 2013;13:117.
  26. Tong A, Sainsbury P, Craig J. Consolidated criteria for reporting qualitative research (COREQ): a 32-item checklist for interviews and focus groups. Int J Qual Health Care. 2007;19(6):349–57.
  27. Townsend PK. Saturation and run off: how many interviews are required in qualitative research? Human Resource Management. 2013:17.
  28. Francis JJ, Johnston M, Robertson C, Glidewell L, Entwistle V, Eccles MP, et al. What is an adequate sample size? Operationalising data saturation for theory-based interview studies. Psychol Health. 2010;25(10):1229–45.
  29. Kallio H, Pietila AM, Johnson M, Kangasniemi M. Systematic methodological review: developing a framework for a qualitative semi-structured interview guide. J Adv Nurs. 2016;72(12):2954–65.
  30. Saunders B, Sim J, Kingstone T, Baker S, Waterfield J, Bartlam B, et al. Saturation in qualitative research: exploring its conceptualization and operationalization. Qual Quant. 2018;52(4):1893–907.
  31. Waltz TJ, Powell BJ, Matthieu MM, Damschroder LJ, Chinman MJ, Smith JL, et al. Use of concept mapping to characterize relationships among implementation strategies and assess their feasibility and importance: results from the Expert Recommendations for Implementing Change (ERIC) study. Implement Sci. 2015;10(1):109.
  32. Braun V, Clarke V. One size fits all? What counts as quality practice in (reflexive) thematic analysis? Qual Res Psychol. 2021;18(3):328–52.
  33. Malhotra A, Thompson RR, Kagoya F, Masiye F, Mbewe P, Mosepele M, et al. Economic evaluation of implementation science outcomes in low- and middle-income countries: a scoping review. Implement Sci. 2022;17(1):76.
  34. Nathan N, Powell BJ, Shelton RC, Laur CV, Wolfenden L, Hailemariam M, et al. Do the Expert Recommendations for Implementing Change (ERIC) strategies adequately address sustainment? Front Health Serv. 2022;2:905909.
  35. Cronin JR, Gritz M, Eisman AB, Panattoni L, Ritzwoller DP, Wagner N, et al. A costing guidebook for implementation scientists. The Colorado Implementation Science Center for Cancer Control; 2023.
  36. Antonacci G, Lennox L, Barlow J, Evans L, Reed J. Process mapping in healthcare: a systematic review. BMC Health Serv Res. 2021;21(1):342.
  37. Saldana L. The stages of implementation completion for evidence-based practice: protocol for a mixed methods study. Implement Sci. 2014;9(1):43.
  38. Drummond M, Sculpher M, Claxton K, Stoddart G, Torrance G. Methods for economic evaluation of health care programmes. USA: Oxford University Press; 2015.
  39. Chapel JM, Wang G. Understanding cost data collection tools to improve economic evaluations of health interventions. Stroke Vasc Neurol. 2019;4(4):214–22.
  40. Wagner TH, Yoon J, Jacobs JC, So A, Kilbourne AM, Yu W, et al. Estimating costs of an implementation intervention. Med Decis Making. 2020;40(8):959–67.
  41. Ruger JP, Reiff M. A checklist for the conduct, reporting, and appraisal of microcosting studies in health care: protocol development. JMIR Res Protoc. 2016;5(4):e195.


Acknowledgements

The authors gratefully thank Manasha Fernando for support during the conduct of the interviews. The authors gratefully thank the Queensland University of Technology and the Australian Government Research Training Program Scholarship for supporting this work.

Funding

No specific funding was obtained to conduct this research.

Author information

Authors and Affiliations

Authors

Contributions

TD, BA, HC and SM conceived the study. TD recruited participants, conducted the interviews and drafted the manuscript. TD, BA and HC analysed the data. TD, BA, HC and SM revised and edited the manuscript. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Thomasina Donovan.

Ethics declarations

Ethics approval and consent to participate

Ethical approval was obtained from Metro South Human Research Ethics Committee (HREC/2022/QMS/81677). Oral informed consent was obtained from all participants at the beginning of each interview and recorded. Each participant also verbally consented to the audio recording of the interview. The interviewees were interviewed on a voluntary basis.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.


About this article


Cite this article

Donovan, T., Carter, H.E., McPhail, S.M. et al. Challenges and recommendations for collecting and quantifying implementation costs in practice: a qualitative interview study. Implement Sci Commun 5, 114 (2024). https://doi.org/10.1186/s43058-024-00648-y


Keywords