
Developing a co-production strategy to facilitate the adoption and implementation of evidence-based colorectal cancer screening interventions for rural health systems: a pilot study

Abstract

Background

Evidence-based colorectal cancer screening (CRCS) interventions have not been broadly adopted in rural primary care settings. Co-production of implementation strategies through a bundled approach may be promising in closing this gap by helping rural healthcare practitioners select and implement the best fitting CRCS interventions to the local context. This paper describes the process and outcomes of co-development and delivery of the bundled implementation strategy to improve adoption and implementation of CRCS interventions with two rural clinics.

Methods

We used a bundle of implementation strategies with a core focus on academic-clinical partnership development (strategy 1) and Plan-Do-Study-Act cycles (strategy 2) to identify clinical partner interests/preferences on delivery methods and content needed to facilitate intervention identification and implementation that improves CRCS. We also developed an implementation blueprint for each clinic (strategy 3) through an online blueprinting process based on adapted “Putting Public Health Evidence in Action” (PPHEA) training curriculum. Clinic physicians and staff (n = 7) were asked to evaluate the bundled approach based on overall reactions and perceptions of innovation characteristics using 5-point Likert scale. After completing the bundled approach, we collected implementation outcomes and limited intervention effectiveness of the CRCS evidence-based interventions (EBIs) developed through the process.

Results

Our co-production strategy yielded a prototype online blueprinting process consisting of 8 distance-learning PPHEA modules that guide selection and implementation of EBIs tailored to CRCS. Modules were delivered to clinic participants with minor adaptations, using PDSA cycle to improve quality of module contents and formats. Overall, participants in both clinics reported positive reactions toward the bundled approach. Both clinics reported improvements in how they perceived the characteristics of the innovation (the bundled approach) to tailor selected CRCS EBIs. As a result of the bundled strategies, each clinic selected and adopted specific EBI(s) with the varying degrees of implementation and CRCS outcomes.

Conclusions

The bundle of implementation strategies used were feasible and acceptable in rural primary care practices to facilitate the use of EBIs to improve CRCS.


Background

Colorectal cancer (CRC) is the fourth most common cancer in the USA [1]. In 2022, approximately 150,000 individuals in the U.S. were estimated to be diagnosed with CRC, and over 50,000 were expected to die from it. CRC also carries the second-highest cancer-related cost in the USA: projections for 2010–2020 put total annual medical costs for CRC at $14.1 billion, and average Medicare health care spending for patients with newly diagnosed CRC ranges from $40,000 to $80,000 depending on the stage [2, 3]. CRC is among the few cancers preventable through screening. The United States Preventive Services Task Force (USPSTF) gives CRC screening its highest rating (“A” grade) for a preventive care screening [4]. Individuals aged 45 to 75 without symptoms or family history can prevent CRC or detect it at earlier stages through visual examinations (e.g., colonoscopy) or high-sensitivity stool tests (e.g., fecal immunochemical test). Yet the CRC screening (CRCS) rate in the USA is not optimal. According to 2020 Behavioral Risk Factor Surveillance System data, 69.7% of Americans aged 50–75 met the USPSTF recommendation [5], below the 74.4% goal of Healthy People 2030 [6] and the 80% goal of the National Colorectal Cancer Roundtable [7].

Rural communities experience much lower CRCS rates than their urban counterparts. Multiple studies have documented rural-urban disparities in CRCS, reporting that rural patients were less likely to receive a screening recommendation from physicians and less likely to have completed screening or a stool-based test compared to their urban counterparts [8,9,10,11]. For example, a recent survey study conducted in rural Nebraska found that rural primary care patients were less likely to have CRCS than urban patients (74.4% vs. 88.1%, p < 0.001) [10]. This rural-urban gap is exacerbated when combined with racial and ethnic factors (e.g., rural areas with higher proportions of Hispanic populations have lower CRCS rates) and is even larger in states with lower overall screening rates than in states with higher rates [12].

A plethora of evidence-based intervention (EBI) strategies and programs to promote CRCS are available. The Community Guide recommends more than 15 intervention strategies to increase community demand (e.g., one-on-one education or client reminders), increase community access (e.g., reducing structural barriers by providing navigation services), or increase provider delivery (e.g., provider assessment and feedback) [13,14,15,16]. The National Cancer Institute has also introduced 22 research-tested intervention programs (RTIPs) for promoting CRCS, eight of which targeted rural and low-income populations [17]. More recently, direct mailing of stool-based CRCS tests has been shown to be highly effective when combined with other strategies (e.g., education, navigation, or client reminders) [15, 18,19,20,21].

Despite this strong evidence and wide range of options, adopting and implementing CRCS EBIs remains challenging for rural health practitioners [22]. Major challenges for rural systems include locating the most up-to-date evidence [8], determining the resources and system changes needed for implementation, and then selecting and initiating the EBIs that best fit the practice organization’s context [23, 24]. This is not an inconsequential issue, because uncertainty about “fit” can lead to poor outcomes in the implementation and sustainability of EBIs. Uncertainty can also place heavy demands on rural health systems that are chronically overburdened and under-resourced, further increasing concerns about applying resources to strategies that ultimately do not fit the local system.

To address these challenges in rural practices, this study introduced the co-production of implementation strategies to facilitate CRCS EBI uptake in rural health systems. Co-production (also called co-design) is “a collaboration between researchers and end users (rural health systems) from the onset, in question framing, research design and delivery, and influencing strategy, with implementation and broader dissemination strategies part of its design from gestations” [25]. Co-production is often operationalized as a participatory approach that includes an ongoing, engaged clinical-academic partnership to facilitate the movement of EBIs from research to practice [26,27,28].

Participatory approaches also tend to bundle implementation strategies to help facilitate the translation of EBIs to practice. Using the Expert Recommendations for Implementing Change (ERIC) taxonomy, we bundled three implementation strategies: (1) developing an academic-clinical partnership, (2) creating a formal implementation blueprint through an online blueprinting process, and (3) using plan-do-study-act (PDSA) cycles to identify clinical partner interests and preferences on delivery methods, content refinement, and system change processes that improve CRCS EBI adoption, implementation, and sustainability [29]. In this study, we describe the process of our co-production (participatory) approach, including the development and delivery of the bundled implementation strategies, and test the feasibility and acceptability of this bundled approach with two rural clinics.

Methods

Study setting and participants

The study was conducted with two rural primary care clinics that are part of an accountable care organization (ACO) located in a rural county with a population of 34,914 (2019 US Census estimate) and Rural-Urban Commuting Area (RUCA) codes of “4 = micropolitan area core with primary flow within an urban cluster of 10,000 to 49,999” or “5 = micropolitan high commuting with primary flow 30% or more to a large urban cluster” [30]. The participating ACO consisted of a 116-bed regional referral center and six primary care clinics, all located in the same county. Through initial meetings, we identified CRCS as one of the ACO’s priority areas, as the ACO has participated in a value-based payment program for commercially insured patients since 2018.

Among the six clinics, two expressed interest in participating in the study. Clinic A provides essential primary care services to the community through six providers (three physicians, an Advanced Practice Registered Nurse [APRN], and two Physician Assistants [PAs]) and an additional 17 staff members. Clinic B is the largest primary care practice in the ACO network, providing services through 11 providers (six physicians, two APRNs, a PA, and two residents) with the support of 22 non-medical staff. Both clinics participated in the co-production approach from July 2019 to March 2022 (see Additional file 1 for project milestones).

Three to five representatives from each clinic participated in the study. Following guidance from the literature on systems-based approaches [26], we recommended that each clinic form a team composed of at least one “decision-maker” (e.g., a lead physician or manager), one “doer” who carries out implementation plans (e.g., clinical care coordinators or frontline staff), and one “supporter” who provides additional support to the team (e.g., data specialists or other administrative staff). The ACO leadership team was also invited to, and engaged in, providing feedback on the process.

Study design

This study applied the principles of participatory action research, which uses “reflection, data collection, and action that aims to improve health and reduce inequalities through involving the people who, in turn, take actions to improve their own health” [31]. In our study, we involved key partners (rural ACO clinic providers) in identifying problems (CRCS), reflecting on past and current approaches to promote CRCS, and taking action with a new approach (e.g., co-produced strategies).

Intervention (bundled implementation strategies)

Our co-production approach used a bundle of three implementation strategies (Fig. 1) to help rural clinic partners locate, select, and implement CRCS EBIs in their practices.

Fig. 1

A bundled implementation strategy to facilitate evidence-based interventions to promote colorectal cancer screening

Academic-practice partnership

Following the approach of Estabrooks and colleagues [26], we integrated an academic-practice partnership through ongoing interactions with our clinical partners (the ACO) across problem prioritization, strategy selection, adaptation, trials, evaluation, and decision-making. The role of academic members in the partnership is to increase resources by acting as “knowledge brokers” who can summarize existing EBIs, support health systems in prioritizing across available EBIs, and gather and report on system processes and outcomes to inform adaptation and sustainability. The role of clinical members is to bring together staff, organizational knowledge, experience, and culture by acting as “system experts.” The partnership enables locating and selecting the EBIs that best align with practice needs and capacities, and determining the system changes necessary for implementation.

Developing a blueprint for EBI implementation

A blueprint is a formal implementation plan or protocol that includes (1) the aim/purpose of the implementation, (2) the scope of the change, (3) the timeframe and milestones, and (4) appropriate performance/process measures [29]. Once developed, the blueprint can be used and updated to guide the implementation effort over time. To facilitate development of the blueprint, we adapted the “Putting Public Health Evidence in Action” (PPHEA) curriculum developed by the Cancer Prevention and Control Research Network [32]. The PPHEA curriculum provides eight publicly available, ready-to-use training sessions and tools that guide each step required to adapt, implement, and evaluate EBIs across various public health programs [33,34,35]. While these applications of the PPHEA show promise, initial applications used a relatively intensive face-to-face process that is unlikely to fit busy rural primary care practices. To increase the usability of this approach, we developed an online blueprinting process that uses distance-learning modules to deliver the PPHEA training. The online module development team consisted of an academic team, including an implementation scientist (PE), a health services researcher (JK), a distance-learning instructional designer (AM), and a research assistant (AA), as well as rural health system experts from the two clinics.

Plan-Do-Study-Act (PDSA) cycle

We used the PDSA cycle as a key approach within our bundled implementation strategy [36]. In the planning stage, we conducted two qualitative focus groups (n = 8) to assess clinic representatives’ opinions on the most suitable methods for receiving information on EBI characteristics and implementation blueprints, systems-change strategies to facilitate implementation, and tools to identify intervention strategies that are both efficient and effective (Plan). Each focus group session lasted about an hour and was conducted in person in each clinic’s conference room. Based on information from the planning sessions, we delivered a prototype distance-learning module of the PPHEA training to clinic participants (Do). Upon completing each session (n = 8 sessions), participants provided survey feedback assessing their perceptions of the bundled approach and potential adaptations for the next session (Study). Based on this feedback, modifications were made to subsequent modules (Act). After delivering all eight sessions, academic facilitators continued to hold monthly meetings with clinic participants to facilitate implementation of the EBIs selected through the training (Additional File 1).

Evaluation plan and measures

Participants’ reactions and perceptions

We adapted the post-training evaluation measures in the PPHEA training guide [34]. After each module, we administered an online survey asking about participants’ reactions regarding (1) overall satisfaction, (2) knowledge enhancement, (3) relevance to the job, (4) time investment, and (5) credibility of information. All items were rated on a 5-point Likert scale from 1 = strongly disagree to 5 = strongly agree. After completing the first and last modules, we evaluated participants’ perceptions of the bundled approach using Rogers’ Innovation Characteristics Measures [37], which included Relative Advantage (7 items), Compatibility (5 items), Simplicity (6 items), Trialability (3 items), and Observability (4 items). We adapted the questionnaires from three existing studies [38,39,40]. See Additional File 2 for the detailed survey instruments. Participants received a $10 gift card for completing each survey and an additional $20 gift card for completing all eight surveys.

Adoption, implementation, and outcomes of EBIs

Adoption was defined as “Yes” if any of the activities or plans of the selected EBIs were initiated at the clinic, based on monthly facilitation meeting notes recorded by a research team staff member. Implementation was measured as the proportion of activities or plans completed out of all activities or plans developed in the formal implementation blueprint document. For example, if five activities or plans were developed in the blueprint and three were completed, we recorded a 60% implementation rate. CRCS outcome data were only partially available at both clinics. Clinic A provided overall CRCS rates for fiscal years 2020 and 2021 based on an annual performance report developed for commercially insured patients (about 50% of the entire clinic population). Clinic B provided the number and proportion of CRCS-eligible patients who completed screening during the flu vaccination season (August to February of 2020 and 2021).
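The implementation-rate measure above can be sketched as a simple proportion. This is an illustrative sketch only (not the study’s actual analysis code); the function name and the activity labels are hypothetical placeholders.

```python
# Hypothetical sketch of the paper's implementation-rate measure:
# proportion of blueprint activities completed out of all activities planned.
def implementation_rate(activities):
    """activities: dict mapping activity name -> True if completed."""
    if not activities:
        return 0.0
    completed = sum(1 for done in activities.values() if done)
    return completed / len(activities)

# The paper's worked example: five planned activities, three completed -> 60%.
# Activity names are invented placeholders, not from the study blueprint.
blueprint = {
    "design postcard": True,
    "mail postcards": True,
    "follow-up calls": True,
    "update reminder flag": False,
    "audit and feedback": False,
}
print(f"{implementation_rate(blueprint):.0%}")  # prints 60%
```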

Analysis

We used descriptive statistics, including means, standard deviations, frequencies, and percentages, to analyze the quantitative data. Qualitative data (from the initial focus groups) were analyzed through inductive and deductive development and organization of thematic codes. The research team (JK and AA) developed a coding structure that included key conceptual domains and participant perspectives. Minor modifications were made iteratively until the model was saturated. Facilitation notes were carefully reviewed and summarized. Data were analyzed using SAS version 9.3 and NVivo qualitative analysis software (QSR NVivo 11).

Results

Development of the distance-learning modules (PPHEA training) tailored to CRCS

Through the initial focus groups, we identified clinic participants’ interests and preferences regarding content tailored to CRCS and delivery methods using synchronous (real-time video conferencing) and asynchronous (pre-recorded lecture video) distance-learning technologies. Based on these initial preferences, we converted the original eight PPHEA sessions into pre-recorded online video sessions followed by online discussion forums or live-streamed conference videos/calls facilitated by the academic team. Following the PPHEA training facilitator’s guide, we retained all core content in each training session and customized context and supplemental materials (e.g., handouts or tools) specific to CRCS EBIs. This resulted in the integration of six EBI strategies recommended by the Community Guide (small media, client reminders, one-on-one education, provider assessment and feedback, provider reminders, and reducing structural barriers) and three packaged programs introduced by the Research-Tested Intervention Programs (Flu-FIT/FOBT, Community Cancer Screening Program, and FIT & Colonoscopy Outreach), as well as recent evidence on mailed stool-based approaches and multi-component strategies [18,19,20,21]. Original and modified PPHEA module contents are presented in Additional File 3. We used a free, open-source learning management system (LMS), “Moodle,” in an instance maintained by the University of Nebraska Medical Center, to upload module contents and communicate with learners via online discussion forums. For real-time video conferencing, we used “Zoom” or “Webex.”

Delivery of distance-learning modules

The modules were delivered to clinic participants at monthly intervals from October 2019 to August 2020, except for the 2 months affected by the COVID-19 outbreak. The team composition grew relatively organically within each clinic and differed between the two clinics. Clinic A’s team consisted of three primary care providers (two physicians and an APRN) and a nurse clinical manager from the ACO administrative team. Clinic B’s team included a physician, a clinical data coordinator, a nurse care coordinator, a referral/schedule coordinator, and a care manager. The two clinical teams also showed different learning styles. Clinic A used a “group learning” approach (viewing online lectures together in a reserved conference room, followed by a live-streamed discussion). In contrast, clinic B used a “hybrid learning” approach (individuals viewed the online lectures separately to cover the material before a group meeting and video conference facilitated by the academic partners). After completing the first module (defining EBIs), clinic A provided constructive feedback on the video lecture presentation quality and content (e.g., too monotonous; less dynamic). Clinic A also requested to skip the session on community assessment and move directly to the module that included CRCS EBI examples. In response, we improved the video presentation quality and skipped module 2 (community assessment). As a result, clinic A completed seven sessions, while clinic B completed all eight sessions as planned (see Table 1).

Table 1 Original and adapted plan for the module delivery

Participants’ reactions

Despite some negative feedback from clinic A on the first session, participants in both clinics reported overall high mean scores (most scores 4 points or higher on a scale of 1 = “strongly disagree” to 5 = “strongly agree”) on the five reaction items: overall satisfaction with the session, knowledge enhancement on CRCS interventions, relevance to the job, worth the time invested, and credibility of information (see Table 2).

Table 2 Participants’ reaction to the implementation strategy

Participants’ perception of innovation (i.e., bundled implementation strategy) characteristics

Both clinics reported improvements in their perceptions of the bundled implementation strategy after completing all the distance-learning sessions, although the differences varied by characteristic domain and clinic. In clinic A, the largest improvement was in Relative Advantage (Diff = 1.43), followed by Trialability (Diff = 1.34). In clinic B, the largest improvement was in the perceived Compatibility of the bundled approach (Diff = 0.72) (see Table 3).

Table 3 Participants’ perceptions toward innovation characteristics (before and after)

Adoption, implementation, and outcomes of the selected EBIs

After completing all the modules, both clinics developed a specific plan to implement CRCS EBIs (Table 4). Clinic A chose a combination of small media and client reminder interventions, using mailed postcards informing patients about CRCS followed by telephone reminders. Clinic B developed a plan to adapt the Flu-FIT/Flu-FOBT program, in which an injection nurse recommends CRCS to patients who visit the clinic to receive the flu vaccine. Both clinics developed a formal implementation blueprint for the selected EBIs that included specific goals and activities, persons responsible, resources, progress, and indicators of completion.

Table 4 Adoption, implementation, and outcomes of the selected EBIs

Upon completion of the online blueprinting process through the distance-learning PPHEA modules, the academic team continued to meet with each clinic team monthly to facilitate the adoption and implementation of the selected EBIs (October 2020 through March 2022). As shown in Table 4, both clinics adopted the EBIs, with varying degrees of implementation. Due to the surge in patient care needs during the COVID-19 pandemic and staff turnover, clinic A delayed implementation by about 6 months and implemented only 58% of the planned intervention activities. Between July 2021 and January 2022, 34 postcards (about 35% of the total number of mailings initially planned) were sent and 24 follow-up calls were completed. This resulted in one colonoscopy referral and one FIT-DNA test ordered. According to the gap report, overall CRCS increased from 71 to 77% between FY20 and FY21; however, it is not clear whether this increase was solely attributable to the implemented EBIs. Clinic B achieved 100% implementation of the Flu-CRC program. During the pilot implementation in year 1 (August 2020–February 2021), 977 patients visited clinic B for a flu shot. Of these, 163 were due for CRCS and were recommended by injection nurses to schedule or order screening tests; 29 of these patients completed CRCS within 6 months. In year 2 (August 2021–February 2022), 1175 patients came for a flu shot and 214 were due for CRCS; of these, 38 completed CRCS within 6 months.
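The clinic B arithmetic above can be checked with a short calculation. This is a hedged sketch using only the counts reported in the text; the function name is ours, not from the study.

```python
# Hypothetical helper: share of CRCS-due flu-shot patients who completed
# screening within 6 months, using the counts reported for clinic B.
def completion_proportion(due_for_crcs, completed):
    return completed / due_for_crcs

year1 = completion_proportion(due_for_crcs=163, completed=29)   # year 1 counts from the text
year2 = completion_proportion(due_for_crcs=214, completed=38)   # year 2 counts from the text
print(f"Year 1: {year1:.1%}  Year 2: {year2:.1%}")  # both round to 17.8%
```

Notably, the completion proportion among CRCS-due patients was nearly identical across the two flu seasons, even though the absolute numbers grew.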

Discussion

The selection and implementation of CRCS EBIs in rural primary care clinics are critical given the geographic disparities in cancer screening and outcomes. Equally important is understanding how these clinics perceive the co-production of implementation strategies intended to facilitate the uptake of CRCS EBIs. In this study, we began by building an academic-practice partnership spanning problem prioritization (CRCS), strategy selection, adaptation, and implementation. We developed implementation blueprints by providing training and education through the adapted PPHEA modules, specifically designed for, and delivered to, rural primary care practitioners. We also used a rapid improvement cycle (PDSA) to make iterative changes to the implementation strategy based on clinic staff feedback. Our approach was perceived positively by clinic participants and resulted in the adoption of EBIs in each clinic, with varying levels of implementation. Our project provides preliminary evidence that the bundled implementation strategy is feasible and acceptable in rural primary care practices.

While these are preliminary pilot findings from two rural clinics, they align with research suggesting the likely success of implementation strategies that bundle activities such as academic-practice partnerships, implementation blueprints, and quality improvement strategies with regular feedback and iterative adaptations [41, 42]. Adams and colleagues (2018) surveyed key informants at federally qualified health centers (FQHCs) in eight states to examine which EBIs were used to promote CRCS and which implementation strategies were employed. They found that FQHCs used multiple implementation strategies (an average of 10, range 2–19) as “bundles” depending on the implementation stage. Examples of such strategies include identifying barriers to implementing evidence-based approaches, consistently monitoring the implementation process and modifying it as appropriate, distributing CRC guidelines and EBI materials to providers, and developing a formal implementation blueprint. In particular, the Adams et al. work highlighted the potential gains of training health system staff on the process of developing and executing implementation strategies (e.g., developing a formal implementation blueprint or conducting group educational meetings for providers), consistent with previous studies underscoring the need for more support and guidance for EBI implementation [43, 44]. Our findings suggest a feasible and acceptable way to work with these health systems, especially in rural areas, to provide guidance and resources for selecting and implementing EBIs to promote CRCS.

It is worth noting that our co-production approach was well received by our rural primary care practitioners, with most participants reporting satisfaction with, and a relative advantage of, the bundled implementation strategy. Specifically, participants from both clinics reported notable improvements in relative advantage and compatibility, indicating a perceived benefit of our bundled approach over other implementation strategies and a better fit with the clinics’ needs and capacity. While previous studies focus on the volume of system-level implementation strategies (e.g., more strategies correlate with higher screening rates) [41, 42], our findings add to this work by suggesting that the number of strategies may matter less than having strategies that align well with the local context. However, the degree to which this relates to successful implementation of EBIs and CRCS outcomes will need to be tested in a larger trial.

Interestingly, the two clinic teams showed varying levels of implementation. Beyond the difficulties clinic A faced after losing a lead physician in the midst of the pandemic, clinic A did not engage any support staff or “doers” in the module training process, which could explain its delayed and partial implementation. This echoes the literature highlighting the importance of involving interdisciplinary staff (practice manager, frontline office delivery staff, data specialist) in the implementation team, depending on the scope and complexity of the project [26]. Daly et al.’s study also underscored the importance of engaging office support staff in encouraging CRCS as a key system-level strategy in community health centers serving medically vulnerable patients. Additionally, some of our rural primary care participants wanted less education on community assessment and definitions and quicker access to the modules providing example CRCS EBIs. An alternative approach may be to combine modules (e.g., module 1: defining EBIs and module 2: community assessment) or to trim modules that are not core components (e.g., module 8: communication). Future studies may consider engaging program delivery staff more fully in developing blueprint processes [42] and adjusting the pace and content of module delivery to local needs and context while minimizing changes to the core components of the training.

While only a pilot, the process supported both clinics in identifying and initiating implementation of evidence-based approaches to increase CRC screening. Both clinics selected EBIs that had been included as examples, underscoring the need for preliminary work to ensure that a range of examples is provided to fit the differing resources across rural clinics. Interestingly, our rural clinic partners preferred EBIs that include colonoscopy as the major test option and stool-based tests as alternatives, rather than a stool-based test as the single main intervention (e.g., direct mailing of FIT). Similar preferences were found in another rural-focused study reporting that rural clinics are more likely to prefer colonoscopy alone, or both colonoscopy and stool tests [10, 45]. This may be due to rural practitioners’ uncertainty about the effectiveness of stool-based tests, or concerns about the out-of-pocket cost of a “diagnostic” colonoscopy performed after a positive stool-based test, which is more expensive than a screening colonoscopy. Future programs may need to include up-to-date scientific evidence on the effectiveness of stool-based tests as well as more accurate and transparent cost information for each screening modality.

Of course, as a pilot project, there are several limitations. These include the small number of clinics and staff involved, and the fact that both clinics came from the same region, which limits generalizability. We did not include patient or community representatives in our co-production process, so their perspectives on the best-fitting CRCS EBIs for the community’s needs are missing. Finally, our quantitative data are limited to descriptive analyses rather than inferential statistics because of the small sample size. Nevertheless, our qualitative data indicated that this participatory approach fits well with clinics’ resources, time, and interests. Future work will include testing the newly developed modules in a broader range of rural clinics to determine the utility of this co-production of bundled implementation strategies in supporting EBI selection, adoption, implementation, and sustainability to promote CRCS.

Conclusions

Little is known about the co-production of implementation strategies for evidence-based CRCS in rural primary care clinics. We used a bundle of strategies (developing an academic-clinical partnership, forming an implementation blueprint, and implementing quality improvement strategies to provide regular feedback and iterative adaptations) to help rural clinics identify the EBIs that best fit their practice context. We developed eight distance-learning modules to build an online blueprint for CRCS EBI selection and implementation, combined with monthly live-streamed conferences to allow for CRCS tailoring. After completing all the modules, participants in the two rural clinics reported positive reactions toward the bundled approach. Both clinics reported improvements in how they perceived the characteristics of the bundled approach for tailoring selected CRCS EBIs. Through this process, both clinics developed and adopted EBIs, with varying degrees of implementation and modest increases in CRCS outcomes. Our preliminary data showed that the bundled implementation strategy, with modifications for the local context, was feasible and acceptable in rural primary care practices to facilitate the use of evidence-based approaches to improve CRCS.

Availability of data and materials

The datasets used and/or analyzed during the current study are available from the corresponding author on reasonable request.

Abbreviations

ACO:

Accountable care organization

APRN:

Advanced Practice Registered Nurse

CRC:

Colorectal cancer

CRCS:

Colorectal cancer screening

EBI:

Evidence-based intervention

PA:

Physician assistant

PCP:

Primary care provider

PDSA:

Plan-Do-Study-Act

PPHEA:

Putting Public Health Evidence in Action

USPSTF:

United States Preventive Services Task Force


Acknowledgements

We acknowledge our rural practice partners, who work selflessly and tirelessly to improve the health and well-being of rural residents.

Authors’ information

Jungyoon Kim received her PhD in Health Policy and Administration at the Pennsylvania State University. Currently, Dr. Kim is an Assistant Professor in the Department of Health Services Research and Administration at the University of Nebraska Medical Center, College of Public Health. Her research interests include organizational and system level change in healthcare and public health settings, particularly regarding the adoption and implementation of new policies or evidence-based practices.

Funding

The study was funded by a Great Plains IDeA CTR pilot grant through the University of Nebraska Medical Center (UNMC), supported by the National Institute of General Medical Sciences (1 U54 GM115458). The study was also funded by the College of Public Health at the University of Nebraska Medical Center. The funding source had no role in the design, conduct, or reporting of the study or in the decision to submit the article for publication.

Author information

Contributions

JK conceptualized the study, delivered implementation strategies, analyzed and interpreted the data, and was a major contributor in writing the manuscript. PE provided mentorship for the study, delivered implementation strategies, and critically reviewed and revised the manuscript. AA coordinated delivery sessions, developed materials, and collected and analyzed data. AM contributed to the development of the distance-learning modules for the implementation strategies. KA analyzed and interpreted survey data on participants’ reactions and perceptions toward the implementation strategies. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Jungyoon Kim.

Ethics declarations

Ethics approval and consent to participate

The study was approved by the University of Nebraska Medical Center Institutional Review Board (IRB # 227-19-EP).

Consent for publication

The University of Nebraska Medical Center Institutional Review Board waived the consent form for the study.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1.

Project Milestone

Additional file 2.

Survey Instruments

Additional file 3.

Original and Modified PPHEA Modules with video links

Additional file 4.

TIDIeR checklist

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article


Cite this article

Kim, J., Estabrooks, P., Aggarwal, A. et al. Developing a co-production strategy to facilitate the adoption and implementation of evidence-based colorectal cancer screening interventions for rural health systems: a pilot study. Implement Sci Commun 3, 131 (2022). https://doi.org/10.1186/s43058-022-00375-2


Keywords