Letter to the Editor
Implementation Science Communications volume 3, Article number: 14 (2022)
To the Editor:
Economic evaluations of implementation strategies remain few and far between despite the importance of economic considerations in disseminating implementation strategies. The piece “Moving Beyond Aim 3” addresses this gap by presenting a model of transdisciplinary team science for implementation scientists and health economists that can increase the number of economic evaluations. As the authors state, health economists are often approached to conduct “aim 3” of an implementation study, on the costs of implementation; however, “aim 3” is rarely considered a high priority, as reflected by the study resources devoted to it. The authors argue that transdisciplinary science moves beyond traditional cross-disciplinary collaborations, which merely combine methods and theories from different disciplines in a hybrid approach.
In this transdisciplinary team science model, the development phase occurs when researchers from different disciplines agree to collaborate and develop a shared understanding of the problem and of the group’s goals. In this phase, the authors highlight the challenges of collaborating with health economists, who tend to prioritize publishing in economics journals over cross-disciplinary collaboration. However, this description applies mainly to health economists trained in and/or employed by departments of economics. Health economists with expertise in evaluating applied health care interventions, and with an orientation toward collaboration across disciplines, are often trained in and/or employed by multidisciplinary departments in schools of public health, public policy, medicine, and pharmacy. As such, implementation scientists may have more success seeking out health economists who do not have a single-discipline focus. As the authors suggest, it is important to identify health economists with interests in implementation science, and they highlight Clinical and Translational Science Award (CTSA) programs as places to find them.
The authors also describe a conceptualization phase in their team science model, in which the study team formulates research questions and designs and the integration of multiple disciplines occurs. The authors further discuss participation of health economists in evaluating implementation costs and other economic outcomes such as quality of life. What could be added to the description of the conceptualization phase, however, is the often-overlooked expertise of health economists in quantitative methods to answer the central question of whether an implementation strategy is effective. While some implementation studies employ a randomized design to estimate the unbiased effect of implementation strategies versus usual care on outcomes, many more use nonrandomized designs and retrospective analysis of observational data. Many health economists have expertise in econometric methods to estimate the causal impact of interventions or programs. Methods such as difference-in-differences, instrumental variables, and propensity score matching use econometric or statistical techniques to limit bias in nonrandomized studies that compare outcomes between intervention and control groups, since underlying differences between these groups can lead to misattributing differences in outcomes to the intervention. For example, one implementation study evaluated the impact of a quality improvement program for maternal and newborn care on newborn outcomes for earlier- versus later-adopting hospitals [1]. It used a difference-in-differences design to compare outcomes before and after the program was implemented in the two groups of hospitals, since hospitals that adopted the program are more similar to each other than to hospitals that never adopted it.
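The difference-in-differences logic described above can be sketched with simulated data. All hospital outcome values below are hypothetical and chosen only to illustrate the calculation, not drawn from the cited study.

```python
# Minimal difference-in-differences sketch on simulated hospital outcomes.
# All numbers are hypothetical, for illustration only.
from statistics import mean

# Mean newborn outcome scores (higher = better) for two hospital groups,
# before and after the earlier-adopting group implements the program.
pre_adopter  = [62.0, 64.0, 63.0]   # earlier adopters, pre-period
post_adopter = [70.0, 71.0, 69.0]   # earlier adopters, post-period
pre_control  = [61.0, 63.0, 62.0]   # later adopters, pre-period
post_control = [64.0, 66.0, 65.0]   # later adopters, post-period

# Change over time within each group
change_adopter = mean(post_adopter) - mean(pre_adopter)   # 7.0
change_control = mean(post_control) - mean(pre_control)   # 3.0

# The difference-in-differences estimate nets out the shared time trend,
# attributing the remainder to the program (under the parallel-trends assumption).
did_estimate = change_adopter - change_control
print(did_estimate)  # 4.0
```

The key identifying assumption, as in any difference-in-differences analysis, is that the two hospital groups would have followed parallel outcome trends absent the program.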
Another implementation study compared internal and external facilitation to implement a nurse-led intervention on transitions in care, using propensity score matching to identify a comparison group of facilities that did not implement the intervention and comparing patient outcomes between intervention and comparison facilities [2]. In these and many other examples, the study designs and statistical methods allowed the impact of implementation interventions on outcomes to be estimated by reducing the bias that arises when the hospitals, providers, or patient groups receiving an intervention differ in many ways from those not receiving it. Therefore, health economists need not be constrained to “aim 3”; their expertise can be integrated into answering key implementation science questions.
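As a toy illustration of the matching step, the sketch below greedily pairs each intervention facility with the unmatched comparison facility whose propensity score is closest. All facility IDs and scores are hypothetical, and the scores are assumed to have been estimated beforehand (e.g., from a logistic regression of intervention adoption on facility characteristics).

```python
# Toy one-to-one nearest-neighbor matching on precomputed propensity scores.
# Facility IDs and scores are hypothetical, for illustration only.

# (facility_id, propensity_score) pairs
intervention = [("A", 0.72), ("B", 0.41), ("C", 0.55)]
candidates   = [("X", 0.70), ("Y", 0.40), ("Z", 0.58), ("W", 0.90)]

matches = {}
available = dict(candidates)
for fid, score in sorted(intervention, key=lambda t: t[1]):
    # Greedily pick the unmatched comparison site with the closest score
    best = min(available, key=lambda c: abs(available[c] - score))
    matches[fid] = best
    del available[best]

print(matches)  # {'B': 'Y', 'C': 'Z', 'A': 'X'}
```

Applied studies typically refine this idea with caliper thresholds and covariate balance checks after matching; the greedy pairing here shows only the core logic of constructing a comparison group.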
Overall, the authors present reasonable approaches toward fostering better collaborations between health economists and implementation scientists on multidisciplinary teams. Key questions about the costs and resource use of implementation strategies, and their downstream impacts on health system costs and other economic outcomes, remain important for informing the diffusion of real-world implementation strategies. Whether adoption of this model by multidisciplinary teams leads to greater inclusion of economists and economic evaluations in implementation science remains to be seen. Limited research resources will always force the prioritization of research questions, and consensus must still be reached on the inclusion of economic outcomes relative to other study outcomes. Efforts to increase economic evaluation can be bolstered by health care organizations, policymakers, and journals asking for more information on the economic implications of implementation strategies. As the evidence base of implementation science continues to grow, building capacity for economic evaluations is critical.
Jean Yoon, PhD, MHS
References
1. Zamboni K, Singh S, Tyagi M, et al. Effect of collaborative quality improvement on stillbirths, neonatal mortality and newborn care practices in hospitals of Telangana and Andhra Pradesh, India: evidence from a quasi-experimental mixed-methods study. Implement Sci. 2021;16:4. https://doi.org/10.1186/s13012-020-01058-z.
2. Leonard C, Lawrence E, McCreight M, et al. Implementation and dissemination of a transition of care program for rural veterans: a controlled before and after study. Implement Sci. 2017;12:123. https://doi.org/10.1186/s13012-017-0653-1.
This work was supported by the VA Quality Enhancement Research Initiative 20-009.
The author declares that she has no competing interests.
Cite this article
Yoon, J. Letter to the Editor. Implement Sci Commun 3, 14 (2022). https://doi.org/10.1186/s43058-022-00266-6