Building implementation capacity in health care and welfare through team training—study protocol of a longitudinal mixed-methods evaluation of the building implementation capacity intervention
Implementation Science Communications volume 2, Article number: 129 (2021)
To ensure the provision of high-quality, safe, and cost-effective health and welfare services, managers and professionals are required to introduce and ensure the routine use of clinical guidelines and other evidence-based interventions. Despite this, they often lack training and support in implementation. This project aims to investigate how a team training intervention, with the goal of building implementation capacity, influences participants’ implementation knowledge and skills, as well as how it influences implementation activities and implementation capacity within participating health and welfare organizations. Furthermore, the aim is to investigate how the organizations’ contexts influence the intervention outcomes.
The building implementation capacity (BIC) intervention builds on the behavior change wheel, which considers implementation as a matter of behavior change. The intervention will be provided to teams of managers and professionals working in health and welfare organizations and seeking support to implement a guideline- or evidence-based intervention. The intervention consists of a series of interactive workshops that provide the participating teams with the knowledge and skills to apply a systematic implementation model. A longitudinal mixed-methods evaluation, including interviews, surveys, and document analysis, will be applied over 24 months. The normalization process theory measure will be used to assess how the intervention influences implementation activities in practice and implementation capacity in the teams and the wider organizations.
This project has an ambition to add to the knowledge concerning how to promote the uptake of research findings into health care by building implementation capacity through team training in implementation. The project’s uniqueness is that it is designed to move beyond individual-level outcomes and evaluate implementation activities and implementation capacity in participating organizations. Further, the intervention will be evaluated over 24 months to investigate long-term outcomes of implementation training.
Contribution to the literature
Implementation of evidence-based interventions in health and welfare organizations requires managers and professionals to change behavior. Despite this, training in implementation, and in the methods that best support behavior change in the workforce, is scarce.
This project will provide insights into how team training may support implementation activities in practice, as well as the building of implementation capacity.
The study will measure processes and outcomes of the training concerning implementation of specific evidence-based interventions and implementation capacity.
The study will evaluate implementation capacity 24 months after the training and, thereby, contribute to understanding of implementation training’s long-term outcomes.
To ensure the quality, safety, and cost-effectiveness of services delivered by health and welfare organizations, managers and professionals are continuously required to implement changes (i.e., continuously achieving sustainable change in routine practice), such as introducing and ensuring the routine use of clinical guidelines and other evidence-based interventions (EBIs). Despite this, development of skills in implementation is usually not a part of managers’ and professionals’ professional education nor the trainings in which they participate as part of their continuous development [1, 2]. As a consequence, a need to build implementation capacity has been emphasized as an important part of increasing the uptake of research findings into health and welfare organizations. Most implementation trainings have targeted researchers, doctoral- and master-level students [3,4,5,6,7,8,9], or individual professionals through university courses, webinars, or a combination of workshops and webinars. However, there are still limited efforts to build evidence-based knowledge and skills in implementation among managers and professionals. Consequently, a need for capacity-building interventions aimed at professionals and managers has been identified [10, 11].
Leadership is considered a crucial factor in implementation [12,13,14], and managers have a key role in fostering a supportive implementation climate. Line managers (i.e., the managers who work closest to the professionals responsible for providing direct services) are usually responsible for implementation in practice, as well as for overseeing implementation. However, implementation is a team effort [18, 19]. Thus, it has been suggested that training in implementation should be provided to teams of professionals alongside their managers, rather than targeting individuals.
The benefit of any training will ultimately depend on whether acquired knowledge, attitudes, and skills are transferred to job-related activities. The literature on transfer of training has identified three large categories of factors that influence whether what is learned from training is transferred into behaviors. These include trainee characteristics (i.e., cognitive ability, self-efficacy, motivation, and perceived utility of training), training design (i.e., behavioral modelling, error management, and realistic training environments), and work environment (i.e., transfer climate, support, opportunity to perform, and follow-up). These factors are important to consider when designing and delivering training interventions. However, the providers of training interventions usually have little influence on the work environment where the learned skills are supposed to be transferred. This is problematic because even training programs that are designed and delivered effectively will fail to produce positive outcomes if the work environment does not encourage the use of targeted behaviors. Support from managers and peers has been identified as one of the most important work environment factors to promote transfer of training. Thus, team-training interventions provide an opportunity not only to improve the skills of the participating individuals but also to impact the environment where the implementation takes place.
Evaluations of training initiatives have often focused on individual-level outcomes, such as satisfaction with the training and improvements in knowledge and skills. Although a few exceptions exist, long-term evaluations are scarce, and there is a lack of studies that have evaluated the impact on organizational outcomes. There is a need for longitudinal evaluations of training initiatives to increase the understanding of the extent to which training in implementation leads to sustainable outcomes.
The building implementation capacity (BIC) intervention is a team training intervention with the goal of improving work teams’ implementation knowledge and skills. The intervention builds upon the idea that implementation concerns behavior change and proposes that implementation is more likely to be successful if tailored to the specific context in which change should happen [24, 25], as opposed to having generic implementation plans.
Aware that implementation is a recurring challenge, the BIC intervention strives to develop the team’s capacity to tailor implementation. Specifically, the intervention sets out to develop a set of dynamic capabilities (i.e., capabilities that relate to the organization’s ability to manage change), which enable an organization to integrate clinical guidelines and other EBIs continuously and systematically, with the purpose of improving services and patient outcomes [26, 27]. The consequence is that participating teams should be able to concretize what the implementation implies, mapping which behaviors need to change among which individuals, and to develop an implementation plan with fit-for-purpose implementation strategies that align with the targeted individuals’ needs. One key difference between the BIC intervention and many other implementation-training efforts is that the participating organizations seek support in implementing something (the implementation case) relevant to them, and the BIC intervention’s focus is to build capacity for implementation. Thus, participating teams work with different implementation cases rather than focusing on a common implementation case.
An evaluation of the intervention’s first version found positive outcomes in terms of participants perceiving the intervention as useful and relevant, and it increased participants’ knowledge about implementation. The evaluation provided further insights on how parts of the training were transferred to implementation activities in practice, as well as an understanding of how the BIC intervention could be enhanced. This input has now been used to improve the BIC intervention. Although the previous evaluation showed some impact on individuals’ self-reported knowledge and skills in implementation, it is still unknown whether the intervention can impact organizational capacities, what type of additional support organizations require to integrate the use of the implementation model, and to what extent the implementation model is used in future implementations.
Aim and research questions
This project aims to investigate how the BIC intervention influences participants’ implementation knowledge and skills, as well as how it influences implementation activities and implementation capacity within participating health and welfare organizations. Furthermore, the aim is to investigate how the organizations’ contexts influence the intervention outcomes.
The following research questions (RQ) will be examined.
1. To what extent does the BIC intervention increase participating teams’ implementation knowledge and skills?
2. What type of support is requested and provided to participating teams in addition to the BIC intervention?
3. How is the acquired knowledge from the BIC intervention transferred into implementation activities in practice?
4. How does the BIC intervention influence the organizational implementation capacity?
5. How do the participating organizations’ contexts affect their ability to achieve the intended implementation and apply the knowledge and skills gained through the BIC intervention in future implementations?
The normalization process theory (NPT) will guide the evaluation of the BIC intervention. The NPT provides understanding of how practices become normalized and how work routines are created [28, 29]. In the current project, NPT will be used to understand the normalization of the clinical guidelines and EBIs on which the teams are working in their implementation cases, as well as the normalization of the BIC implementation model’s use as a tool to support future implementation. NPT is concerned with explaining what work people do, or need to do, with regard to implementing new practices, which is conceptualized in a set of four core constructs: coherence, which concerns the sense-making work that people do individually and collectively to operationalize new practices; cognitive participation, which mirrors the relational work that people do to build and sustain a community of practice; collective action, which is the operational work that people perform to enact a set of practices; and reflexive monitoring, which includes the appraisal work people conduct to assess and understand the ways that a new set of practices affects them and others. Due to its explanatory nature, NPT can be useful to assess the prerequisites for implementation, as well as to evaluate the progress of implementation. The application of NPT in this study will be operationalized using the normalization process theory measure (NoMAD) [30, 31].
The intervention evaluation will be conducted with a parallel convergent mixed-methods longitudinal study design over 24 months. The evaluation’s focus will be on assessing changes in knowledge, skills, and normalization of the specific implementation on which the organizations are working, as well as normalization of the BIC implementation model use in future implementations (i.e., implementation capacity). A process evaluation will be conducted to investigate how contextual factors influence intervention outcomes. The evaluation will be divided into one basic and one profound part. The basic evaluation will include all participating teams, and the profound evaluation will include a subset of teams for a more in-depth investigation of how the training influences activities and implementation capacity in the organizations. The Template for Intervention Description and Replication (TIDieR) guideline will be used to describe the intervention (Additional file 1), and the Consolidated Standards of Reporting Trials (CONSORT) extension for randomized pilot and feasibility trials will be adapted according to existing recommendations and used to report the intervention’s evaluation. The Consolidated Criteria for Reporting Qualitative Research (COREQ) checklist will be used for reporting the qualitative data.
Setting, participants, and recruitment
The project will be conducted in Region Stockholm and municipalities within Stockholm County. Region Stockholm is Sweden’s largest health care provider region, serving a population of more than two million. The region is responsible for the health care provided to its citizens, including primary care, acute hospital care, and psychiatric care. Stockholm County consists of 26 municipalities, which are responsible for social services. A regional research and development unit will provide the intervention.
The BIC intervention is offered to teams of professionals and managers. A team consists of a manager and two to five colleagues. Participating organizations are recruited through information and marketing of the intervention via different channels reaching organizations in Region Stockholm. Participation is voluntary, and teams choose to register for the intervention themselves. As the intervention is offered twice a year, participants are continuously included in the intervention. The study population will include organizations participating in the BIC intervention and providing informed consent to participate in the research project. We estimate that approximately 40 teams, with 120–200 participants in total, will take part in the intervention and provide data for RQs 1 and 2. A purposively chosen subsample of these organizations will be included in a profound evaluation. The organizations will be chosen to achieve a maximum variation sample in terms of organization type and size, as well as a variation in implementation cases (i.e., the clinical guidelines or EBIs being implemented). In these organizations, staff not directly participating in the intervention will also be invited to participate in the study to assess implementation of the methods/guidelines on which the teams are working during the BIC intervention, as well as the implementation capacity in the organizations. We estimate that approximately 10 organizations will be included in the profound evaluation and contribute by responding to all stipulated RQs. As such, study participants include (1) participants attending the BIC intervention and (2) eligible staff at the participating organizations (i.e., the colleagues of those attending the BIC intervention).
The BIC intervention
The intervention development started in 2013 and included a review of the literature on training initiatives in implementation science. This search provided information about implementation approaches with scientific support, including behavioral approaches (e.g., the behavior change wheel [BCW]), strategies that were tailored to contextual barriers, and models for the implementation stages. The literature was also searched for scientifically supported training designs. This search provided information that was used for developing the intervention’s pedagogical base, including the theory of experiential learning, the transfer of training research, and team learning [41, 42]. In a second step, interviews were conducted with local health-care stakeholders to investigate training needs and desired training outcomes, as well as contextual factors that influence opportunities to participate in training.
An intervention prototype was developed based on the information provided by the literature search and the interviews. This was discussed and revised in a workshop with national experts (i.e., researchers, consultants, and practitioners) in implementation, change management, and health and social care. A pilot test of the intervention in 24 teams provided information that was used to make improvements to the intervention design, mainly clarifications and simplification of the intervention content. The first version of the BIC intervention was evaluated in 2016–2017, and based on the results, further changes and refinements were made. These included clarifications of the content and materials used and the addition of practical training components to improve learning and to facilitate transfer of the acquired knowledge to implementation activities in practice.
The revised BIC intervention consists of a series of four workshops (three hours each) over a period of approximately 3 months in which the participants will acquire evidence-based knowledge of implementation through short interactive lectures on implementation research, which are interspersed with the participants’ work with their own implementation case. The workshops will be delivered face-to-face in a large room where participants sit together with their own team (approximately 10 teams in total) to facilitate discussion and work with the implementation case. Between workshops, participants are expected to anchor the work they do within the BIC intervention at their workplace and collect comments on the work from colleagues. Participants will receive continuous feedback on their planning and implementation work from both workshop leaders and other participants during the workshops and between workshops. Approximately 3 months after the last workshop, teams will be invited to an additional refill workshop in which all steps in the implementation model are repeated. During this workshop, the teams will also have an opportunity to receive feedback and support on their current implementation phase and potential issues that have occurred. For an overview of the delivery and content in the workshops, see Additional file 2.
The intervention will be delivered by a group of professionals working in a research and development unit in Region Stockholm. The unit specializes in providing support concerning implementation and evaluation of EBIs to health and welfare organizations in the region. All workshop leaders are trained in implementation practice, as well as in the specific BIC-intervention model.
The BIC intervention is based on an implementation model adapted from the BCW and determinants of practice. The BCW approaches implementation as a matter of behavior change and provides a system for designing behavior change interventions. When seen as a matter of behavior change, implementation normally indicates a new behavior should occur and, most often, an old behavior should cease. Therefore, the first step of the BCW is to understand the problem that the intervention aims to solve and, then, to choose and specify target behaviors of the intervention. Thereafter, a crucial part of the BCW is tailoring the intervention by analyzing what needs to change to enable the new behavior(s). At the core of the BCW lies the COM-B model, which is used to analyze what needs to change. The COM-B emphasizes that people need capability (C), opportunity (O), and motivation (M) to perform a behavior (B). In the BIC intervention, COM-B is complemented by Flottorp et al.’s checklist for identifying determinants of practice. Based on an analysis of what needs to change, among individuals or in the environment, suitable implementation strategies (behavior change techniques) are finally identified, and their delivery planned [37, 44].
The BIC implementation model (see Fig. 1 for a graphic overview of the implementation model) includes six steps in which participating teams should (1) describe what they wish to accomplish with the implementation; (2) identify and specify target behavior(s); (3) for each behavior, analyze what is needed for behavior change to occur (using the COM-B model and determinants of practice); (4) choose implementation strategies (e.g., education and reminders) based on the analysis in step 3; (5) apply implementation strategies; and (6) monitor occurrence of the target behavior (i.e., implementation fidelity). The arrows in the middle of the model highlight that monitoring target behavior will likely result in a need to go back in the process and refine the implementation plan. The implementation model’s steps and the activities to support the teams to learn these steps compose the core components of the intervention. Learning activities have been carefully designed to enable teams to achieve learning outcomes and, thus, also constitute core components of the BIC intervention. The use of implementation cases relevant to the teams and the continuous feedback that is provided by workshop leaders throughout the intervention allows tailoring of the intervention to teams’ specific needs.
All participants will receive a workbook in which the different steps are outlined. The workbook serves as a supporting document that didactically directs teams through the implementation model. The teams will also be provided with an implementation plan template that briefly outlines the steps in the implementation model. All material (in Swedish) is available upon request to the first author.
Fidelity to the BIC intervention will be assessed by evaluating whether the delivered intervention adheres to the content, frequency, duration, and coverage described in the planned intervention. The workshop leaders will keep notes about modifications of the planned intervention regarding changes in content, frequency, or duration of the training components as outlined in Additional file 2. Coverage will be assessed by taking notes on attendance for all participants in each workshop.
To examine the development of sustainable implementation capacity, the participating organizations will be followed for 2 years (20 teams) or 1 year (20 teams), because teams will enter the intervention at different times between 2022 and 2023. As there are many teams participating in the BIC intervention, we will undertake two levels of data collection.
Basic evaluation will cover all organizations participating in the BIC intervention. The purpose of the basic evaluation is to evaluate the extent to which participants increase their implementation knowledge and skills and understand their needs for further support in implementation (i.e., RQ 1 and 2).
Profound evaluation will include 10 purposively selected organizations. These selected organizations will serve as case studies and provide a better understanding of how the BIC intervention works in different organizations, for example, in health care versus social services, in large versus small organizations, and in organizations working at a more strategic level versus those working in a more practice-oriented manner. This profound evaluation will include, in addition to the data collection occurring within the basic evaluation, a more in-depth evaluation of the extent to which learnings from the intervention are transferred to behaviors in practice, the organizations’ implementation capacities after attending the intervention, and the ways in which the organizational context influences these outcomes (i.e., RQ 3–5).
Data collection tools and methods
The evaluation will be based on a combination of qualitative and quantitative methods, including surveys, individual interviews, focus group discussions, and document analyses. Below is a description of the different types of data that will be collected to answer each RQ. An overview of the data collection is shown in Table 1.
Implementation knowledge and skills (RQ1) will be assessed at baseline and directly after the intervention using a survey administered to all intervention participants (basic evaluation). In addition, knowledge and skills will be evaluated at the team level to assess the extent to which participating teams are able to apply the BIC implementation model. The participating teams in the intervention are required to create plans for their implementation case and for fictitious implementations. Their plans will provide information about how they will execute implementation of their implementation case. The fictitious cases will be used to assess participants’ knowledge in implementation and their ability to use the implementation model. This evaluation will provide an additional and more objective account of their learning, compared to the self-reported knowledge assessed through surveys.
Requested implementation support (RQ2)
Documentation outlining requested support outside the workshops will be collected through a structured logbook kept by the workshop leaders. These logbooks will provide information on extent and type of support requested by BIC intervention participants. To understand the need for additional support outside the BIC intervention workshops, individual interviews will also be conducted with workshop leaders.
Use of acquired knowledge for implementation activities in practice (RQ 3)
To assess how the acquired knowledge is transferred to implementation activities in practice, organization staff (i.e., intervention participants and their colleagues) included in the profound evaluation will be invited to respond to the Swedish version of the normalization process theory measure (S-NoMAD). S-NoMAD is designed for adaptation to the EBI being implemented. Thus, respondents will answer questions concerning the normalization of the specific implementation case that the teams work on during the BIC intervention. The measure will be administered at three time points: directly after the intervention and at 6- and 12-month follow-ups.
Implementation capacity (RQ4)
To assess how the BIC intervention influences organizational implementation capacity, organizational staff included in the profound evaluation will retake the S-NoMAD, now adapted so that the questions refer to the normalization of using the BIC implementation model (rather than the normalization of a specific implementation case). Measures will be conducted at 6-, 12-, and 24-month follow-ups.
To understand further how the implementation model taught in the BIC intervention has been used in the participating organizations—in relation to the implementation case that the teams have been working on during the intervention and other implementation efforts—individual interviews and focus group discussions will be conducted with intervention participants and eligible staff working at all participating organizations in the profound evaluation. The interviews will be conducted at two time points: 6 months and 18 months after completion.
Understanding organizational context (RQ5)
To explore how the participating organizations’ contexts affect their ability to achieve the intended implementation and apply the knowledge and skills gained through the BIC intervention in future implementations, individual interviews and focus group discussions will be conducted at 6- and 18-month follow-ups. Informants will include participants and their colleagues from all the organizations included in the profound evaluation. The focus will be to understand under which circumstances the BIC intervention builds implementation capacity through an investigation of contextual factors.
Qualitative data will be analyzed using qualitative content analysis in the software NVivo. The quantitative analysis will include descriptive statistics, chi-square tests, and, when appropriate, Fisher’s exact tests. Multilevel modeling will be used to assess changes over time because individuals are nested within teams. Data will be analyzed using R statistical software.
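The planned multilevel analysis can be sketched as a random-intercept model in which a fixed effect of measurement wave captures change over time while a random intercept per team accounts for the nesting of individuals within teams. The sketch below uses Python with statsmodels rather than the R software named in the protocol, and all variable names (`knowledge`, `time`, `team`) and the simulated data are illustrative assumptions, not study instruments or results.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulate illustrative data: 20 teams x 5 members x 3 measurement waves,
# with a true gain of 0.4 points per wave and team-level clustering.
rng = np.random.default_rng(0)
rows = []
for team in range(20):
    team_effect = rng.normal(0, 0.5)  # random intercept shared by the team
    for person in range(5):
        for t in range(3):
            rows.append({
                "team": team,
                "time": t,
                "knowledge": 3.0 + 0.4 * t + team_effect + rng.normal(0, 0.3),
            })
df = pd.DataFrame(rows)

# Random-intercept model: fixed effect of time (change per wave),
# grouping by team to account for the nested structure.
model = smf.mixedlm("knowledge ~ time", df, groups=df["team"])
result = model.fit()
print(result.summary())
```

The same model in R would typically be `lmer(knowledge ~ time + (1 | team))`; richer specifications (e.g., random slopes for time) follow the same pattern.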
Although many empirical studies and theoretical frameworks highlight the importance of managers and staff for successful implementation, there is limited knowledge of how they can be supported and trained in conducting implementation practice. This project has the ambition to add to the knowledge concerning how to promote the uptake of research findings into health care by building implementation capacity through implementation training.
The project makes five main contributions to the research on training to build implementation capacity. First, manager and staff engagement and activities are crucial in implementation efforts. Therefore, the BIC intervention applies a team training approach. This is novel to implementation training, which has traditionally been delivered to individuals, for example, through university courses [3,4,5,6,7,8,9]. This project will provide insights into how team training may support implementation activities in practice, as well as the building of implementation capacity. Second, we aim to move beyond individual-level outcomes and evaluate how the BIC intervention influences implementation capacity and activities within the participating organizations. This will provide information on whether acquired knowledge and skills are transferred to implementation activities in practice.
Third, the project provides a scientific evaluation of a specific implementation case in the participating organizations. For this purpose, we will adopt a newly validated and translated measure: S-NoMAD. The use of S-NoMAD will enable this evaluation, despite the participating teams having different implementation cases. Thus, the project will use an instrument for assessing the impact of training initiatives across implementation cases and settings, which can provide useful information for future research and evaluations of training initiatives.
Fourth, evaluations of training interventions are often limited to short-term outcomes. Consequently, the extent to which training interventions may lead to more sustainable outcomes, such as maintained implementation capacity, is not known. In addition, a 2-year follow-up focused on quantifying implementation capacity is, to our knowledge, uncommon. Therefore, we will use S-NoMAD to evaluate the sustained use of the implementation model, upon which the BIC intervention builds, to investigate the normalization of its use in the implementation of clinical guidelines and EBIs beyond the cases during the intervention. Measuring implementation capacity at different points in time will facilitate an understanding of the institutionalization of implementation capacity, here operationalized as the normalization of using the BIC implementation model. Last, one major problem in implementation is transferability between different contexts. The BIC intervention targets a wide range of different organizations, which, in combination with the investigation of influencing contextual factors, will contribute to understanding under which circumstances the BIC intervention can lead to implementation activities in practice and to increased organizational implementation capacity.
Availability of data and materials
Transcripts from individual interviews and focus group discussions are not publicly available. However, data are available from the authors upon reasonable request, after ensuring that the integrity and confidentiality of respondents can be maintained. Quantitative data sets are available from the corresponding author on reasonable request.
Abbreviations
BCW: Behavior change wheel
BIC: Building implementation capacity
COM-B: Capability, opportunity, motivation-behavior
NPT: Normalization process theory
S-NoMAD: Swedish version of the normalization process theory measure
Mosson R, Hasson H, Wallin L, von Thiele Schwarz U. Exploring the role of line managers in implementing evidence-based practice in social services and older people care. Br J Soc Work. 2017;47(2):542–60.
Leeman J, Calancie L, Kegler MC, Escoffery CT, Herrmann AK, Thatcher E, et al. Developing theory to guide building practitioners’ capacity to implement evidence-based interventions. Health Educ Behav. 2017;44(1):59–69.
Straus SE, Sales A, Wensing M, Michie S, Kent B, Foy R. Education and training for implementation science: our interest in manuscripts describing education and training materials. Implement Sci. 2015;10:136.
Meissner HI, Glasgow RE, Vinson CA, Chambers D, Brownson RC, Green LW, et al. The U.S. training institute for dissemination and implementation research in health. Implement Sci. 2013;8:12.
Chambers DA, Proctor EK, Brownson RC, Straus SE. Mapping training needs for dissemination and implementation research: lessons from a synthesis of existing D&I research training programs. Transl Behav Med. 2017;7(3):593–601.
Tabak RG, Padek MM, Kerner JF, Stange KC, Proctor EK, Dobbins MJ, et al. Dissemination and implementation science training needs: insights from practitioners and researchers. Am J Prev Med. 2017;52(3 Suppl 3):S322–S9.
Carlfjord S, Roback K, Nilsen P. Five years’ experience of an annual course on implementation science: an evaluation among course participants. Implement Sci. 2017;12:101.
Proctor EK, Landsverk J, Baumann AA, Mittman BS, Aarons GA, Brownson RC, et al. The implementation research institute: training mental health implementation researchers in the United States. Implement Sci. 2013;8:105.
Moore JE, Rashid S, Park JS, Khan S, Straus SE. Longitudinal evaluation of a course to build core competencies in implementation practice. Implement Sci. 2018;13:106.
Chambers DA, Pintello D, Juliano-Bult D. Capacity-building and training opportunities for implementation science in mental health. Psychiatry Res. 2020;283:112511.
Davis R, D’Lima D. Building capacity in dissemination and implementation science: a systematic review of the academic literature on teaching and training initiatives. Implement Sci. 2020;15:97.
Gifford W, Davies B, Edwards N, Griffin P, Lybanon V. Managerial leadership for nurses' use of research evidence: an integrative review of the literature. Worldviews Evid Based Nurs. 2007;4(3):126–45.
Øvretveit J. Improvement leaders: what do they and should they do? A summary of a review of research. BMJ Qual Saf. 2010;19(6):490–2.
Reichenpfader U, Carlfjord S, Nilsen P. Leadership in evidence-based practice: a systematic review. Leadersh Health Serv. 2015;28(4):298–316.
Ehrhart MG, Aarons GA, Farahnak LR. Assessing the organizational context for EBP implementation: the development and validity testing of the Implementation Climate Scale (ICS). Implement Sci. 2014;9:157.
Mosson R. Leading implementation in health and social care: the line manager in the spotlight. Stockholm: Karolinska Institutet; 2018.
Birken SA, Lee S-YD, Weiner BJ. Uncovering middle managers' role in healthcare innovation implementation. Implement Sci. 2012;7:28.
Weiner BJ. A theory of organizational readiness for change. Implement Sci. 2009;4:67.
Weiner BJ, Amick H, Lee S-YD. Conceptualization and measurement of organizational readiness for change: a review of the literature in health services research and other fields. Med Care Res Rev. 2008;65(4):379–436.
Aguinis H, Kraiger K. Benefits of training and development for individuals and teams, organizations, and society. Annu Rev Psychol. 2009;60:451–74.
Grossman R, Salas E. The transfer of training: what really matters. Int J Train Dev. 2011;15(2):103–20.
Park JS, Moore JE, Sayal R, Holmes BJ, Scarrow G, Graham ID, et al. Evaluation of the “Foundations in Knowledge Translation” training initiative: preparing end users to practice KT. Implement Sci. 2018;13:63.
Mosson R, Augustsson H, Bäck A, Åhström M, von Thiele Schwarz U, Richter A, et al. Building implementation capacity (BIC): a longitudinal mixed methods evaluation of a team intervention. BMC Health Serv Res. 2019;19:287.
Baker R, Camosso-Stefinovic J, Gillies C, Shaw EJ, Cheater F, Flottorp S, et al. Tailored interventions to address determinants of practice. Cochrane Database Syst Rev. 2015;4:CD005470.
Wensing M. The Tailored Implementation in Chronic Diseases (TICD) project: introduction and main findings. Implement Sci. 2017;12:5.
Evans JM, Brown A, Baker GR. Organizational knowledge and capabilities in healthcare: deconstructing and integrating diverse perspectives. SAGE Open Med. 2017;5:1–10.
Kislov R, Wilson P, Boaden R. The ‘dark side’ of knowledge brokering. J Health Serv Res Policy. 2017;22(2):107–12.
May CR, Cummings A, Girling M, Bracher M, Mair FS, May CM, et al. Using Normalization Process Theory in feasibility studies and process evaluations of complex healthcare interventions: a systematic review. Implement Sci. 2018;13:80.
May CR, Mair F, Finch T, MacFarlane A, Dowrick C, Treweek S, et al. Development of a theory of implementation and integration: normalization process theory. Implement Sci. 2009;4:29.
Rapley T, Girling M, Mair FS, Murray E, Treweek S, McColl E, et al. Improving the normalization of complex interventions: part 1-development of the NoMAD instrument for assessing implementation work based on normalization process theory (NPT). BMC Med Res Methodol. 2018;18:133.
Finch TL, Girling M, May CR, Mair FS, Murray E, Treweek S, et al. Improving the normalization of complex interventions: part 2-validation of the NoMAD instrument for assessing implementation work based on normalization process theory (NPT). BMC Med Res Methodol. 2018;18:135.
Creswell JW, Plano Clark VL. Designing and conducting mixed methods research. 2nd ed. Los Angeles, Calif.; London: SAGE; 2011.
Hoffmann TC, Glasziou PP, Boutron I, Milne R, Perera R, Moher D, et al. Better reporting of interventions: template for intervention description and replication (TIDieR) checklist and guide. BMJ. 2014;348:g1687.
Eldridge SM, Chan CL, Campbell MJ, Bond CM, Hopewell S, Thabane L, et al. CONSORT 2010 statement: extension to randomised pilot and feasibility trials. BMJ. 2016;355:i5239.
Lancaster GA, Thabane L. Guidelines for reporting non-randomised pilot and feasibility studies. Pilot Feasibility Stud. 2019;5:114.
Tong A, Sainsbury P, Craig J. Consolidated criteria for reporting qualitative research (COREQ): a 32-item checklist for interviews and focus groups. Int J Qual Health Care. 2007;19(6):349–57.
Michie S, Atkins L, West R. The behaviour change wheel: a guide to designing interventions. London: Silverback Publishing; 2014.
Grimshaw JM, Eccles MP, Lavis JN, Hill SJ, Squires JE. Knowledge translation of research findings. Implement Sci. 2012;7:50.
Aarons GA, Hurlburt M, Horwitz SM. Advancing a conceptual model of evidence-based practice implementation in public service sectors. Adm Policy Ment Health. 2011;38:4–23.
Kolb DA. Experiential learning: experience as the source of learning and development. Englewood Cliffs: Prentice Hall; 1984.
Gifford W, Davies B, Tourangeau A, Lefebre N. Developing team leadership to facilitate guideline utilization: planning and evaluating a 3-month intervention strategy. J Nurs Manag. 2011;19:121–32.
Greenhalgh T. How to implement evidence-based healthcare. Oxford: Wiley Blackwell; 2018.
Flottorp SA, Oxman AD, Krause J, Musila NR, Wensing M, Godycki-Cwirko M, et al. A checklist for identifying determinants of practice: a systematic review and synthesis of frameworks and taxonomies of factors that prevent or enable improvements in healthcare professional practice. Implement Sci. 2013;8:35.
Michie S, van Stralen MM, West R. The behaviour change wheel: a new method for characterising and designing behaviour change interventions. Implement Sci. 2011;6:42.
Carroll C, Patterson M, Wood S, Booth A, Rick J, Balain S. A conceptual framework for implementation fidelity. Implement Sci. 2007;2:40.
Augustsson H, Richter A, Hasson H, von Thiele Schwarz U. The need for dual openness to change: a longitudinal study evaluating the impact of employees’ openness to organizational change content and process on intervention outcomes. J Appl Behav Sci. 2017;53(3):349–68.
Elo S, Kyngas H. The qualitative content analysis process. J Adv Nurs. 2008;62(1):107–15.
Elf M, Nordmark S, Lyhagen J, Lindberg I, Finch T, Åberg AC. The Swedish version of the Normalization Process Theory Measure S-NoMAD: translation, adaptation, and pilot testing. Implement Sci. 2018;13:146.
Funding
Open access funding provided by Karolinska Institute. This project is funded by the Swedish Research Council for Health, Working Life and Welfare (FORTE; grant number 2020-01184).
Ethics approval and consent to participate
This study has been reviewed by the Swedish Ethical Review Authority, which determined that ethical approval was not required (Ref no: 2020-03601). Nevertheless, all participants will be treated in accordance with research ethics guidelines. Written informed consent will be obtained from all study participants, and individuals who decline will not be included in the data set used for analyses.
Consent for publication
Not applicable.
Competing interests
The authors declare that they have no competing interests.
Cite this article
Augustsson, H., Costea, VA., Eriksson, L. et al. Building implementation capacity in health care and welfare through team training—study protocol of a longitudinal mixed-methods evaluation of the building implementation capacity intervention. Implement Sci Commun 2, 129 (2021). https://doi.org/10.1186/s43058-021-00233-7
Keywords
- Implementation science
- Behavior change
- Implementation capacity
- Knowledge translation
- Process evaluation
- Tailored implementation