Acceptability, appropriateness, and feasibility of Rural School Support Strategies for behavioral interventions: a mixed methods evaluation over two years of a hybrid type 3 implementation-effectiveness trial

Abstract

Background

Positive Behavioral Interventions and Supports (PBIS) is a framework for implementing evidence-based interventions for preventing behavioral issues and improving climate in schools. The implementation of school-wide PBIS with fidelity is complex, requiring leadership commitment, teaming, and coordination of systems for tracking behaviors and consequences. Putting these components in place while ensuring alignment with the values and needs of the school community can be difficult for schools with fewer resources, such as rural schools. Implementation supports are needed, including strategies such as technical assistance, but it is unclear whether lower-cost modalities such as virtual support are acceptable, appropriate, and feasible and whether perceptions vary throughout the implementation process.

Methods

A type 3 hybrid implementation-effectiveness trial is taking place in 40 Idaho schools, testing a bundle of implementation supports selected to meet the needs of schools in rural areas. Supports include technical assistance from an implementation support practitioner (ISP), didactic trainings, virtual learning sessions, and an online resource portal. Surveys and interviews in the first 2 years of implementation (fall 2019 to spring 2021) explored outcomes of acceptability, appropriateness, and feasibility regarding the implementation supports among more than 150 school stakeholders.

Results

Evaluations showed high acceptability and appropriateness of the PBIS concepts and training. The 20 schools receiving additional implementation support rated the technical assistance and support from the project’s ISPs as the most acceptable and appropriate resource. Reasons for acceptability were the relationship built with the ISP, the ISP’s expertise, and the ISP’s status as a “neutral party.” Although in-person support from the ISP was preferred, remote support was acceptable and increased feasibility of attendance. Virtual learning sessions were acceptable for learning and collaboration, particularly in the second year of implementation, once ISPs had developed closer relationships with school teams.

Conclusions

School staff found training, technical assistance, and virtual learning sessions to be acceptable and appropriate. Virtual formats of training and technical assistance were rated as less acceptable than in-person formats but increased the feasibility of attendance. In-person support was preferred during initial implementation, and virtual support became more acceptable thereafter.

Trial registration

This trial was prospectively registered on ClinicalTrials.gov (NCT03736395), on November 9, 2018.

Background

The Every Student Succeeds Act of 2015 mandated that US public schools use evidence-based practices not only to foster student academic achievement but also when providing mental health programming and other student services [1]. Although many universal interventions for improving student mental health and social-emotional learning have been shown effective in research [2,3,4], in practice, evidence-based programming that broadly supports students’ behavioral health and social-emotional needs is often not provided in schools or is provided with low fidelity [5,6,7]. Relative to schools in urban settings, rural schools can face an even larger research-to-practice gap due to resource constraints, staff turnover, and inadequate funding [8, 9].

Positive Behavioral Interventions and Supports (PBIS) is a framework for implementing evidence-based interventions for preventing problem behavior and supporting student needs in schools [10]. While not a manualized intervention itself, PBIS is defined as a framework with a series of core features and best-practice guidance and tools to tailor those core features to fit a particular school context [11,12,13]. PBIS focuses on changing organizational systems and using school-based teaming, continued training, leadership involvement, coaching, and data-based decision making to effect change. PBIS is effective for improving student prosocial behavior and social-emotional functioning; reducing bullying, office discipline referrals, and suspensions [14,15,16]; and improving school climate [17] and academic achievement [18]. The foundation of PBIS is the first tier—universal prevention—which involves schoolwide practices of establishing and teaching clear and consistent behavioral expectations and consequences for problem behavior, as well as rewards or acknowledgements for engaging in desired behaviors [13]. The adoption of PBIS is common in US schools but has been disproportionately concentrated in urban and suburban schools, with rural schools estimated to represent about 20% of schools implementing PBIS nationwide [19]. This does not reflect lesser need in rural areas: youth in rural areas have less access to mental health resources [20] and twice the rate of suicide [21, 22] compared to their urban peers. Identifying effective supports for rural schools is critical to scaling PBIS equitably, and studies of the implementation process are necessary to understand how best to facilitate effective scale-up in rural schools.

Implementation of evidence-based interventions (EBIs) in schools, particularly those with a universal focus such as school-wide PBIS, is dependent on a multitude of factors at the individual and organizational levels [8, 10, 18, 23, 24]. As noted in several publications about PBIS scale-up, literature and frameworks in implementation science have facilitated the understanding of PBIS implementation, particularly in defining the temporal nature of organizational change [25, 26]. Similarly, the vast knowledge base within the education sector regarding the application of multi-tiered systems of support, such as PBIS, is also a fertile ground for advancing the field of implementation science [27]. In particular, the use of multifaceted strategies to aid implementation of complex interventions is still a developing area [28, 29], as is the literature around scale-up of universal EBIs for promoting social-emotional wellness and mental health in schools [26, 27].

Various implementation strategies have been operationalized and identified as helpful for organizations (primarily healthcare organizations) implementing EBIs through the Expert Recommendations for Implementing Change (ERIC) project [30,31,32]. The ERIC compilation’s strategies have also been translated for application in schools [33]. As described by Cook and colleagues, as well as in reviews of implementation supports including technical assistance (TA) [34] and active implementation support [35], there is much left to learn about how support strategies work and how they can be tailored and applied to various implementation scenarios. Part of what makes an implementation support strategy useful and translatable outside of research trials is its ability to be used by implementers (feasibility) and also how well it is received by implementers (appropriateness and acceptability). Components of acceptability, appropriateness, and feasibility were originally proposed as upstream implementation outcomes for EBIs themselves [31], but the same concepts can be applied to study the perceptions of implementation support strategies [36,37,38].

PBIS implementation practices dictate several components that are crucial to successful implementation, including the formation of implementation teams, high-quality training, the frequent use of data to make decisions, and ongoing action planning and feedback [11,12,13]. However, to support PBIS implementation with fidelity, additional supports are often needed [6, 26]. A recent systematic review that included 29 studies of the determinants of PBIS sustainment identified resources (including staffing, training, time, and funding) and ongoing training and support (including TA) among the top seven most important factors [39]. However, literature discussing how these supports should be tailored to rural schools, including which strategies are perceived as acceptable, appropriate, and feasible, is not well established. Rural school contexts, in particular, warrant special consideration due to features such as geographic isolation, limited resources, skepticism of outsiders, and poverty [40]. Importantly, identifying the features of implementation supports that make them acceptable, appropriate, and feasible for rural schools is essential for determining which strategies will be more likely to effect change and support scale-up.

Broadly, the selection and tailoring of implementation strategies to fit the needs of organizations in different contexts, including which strategies are most helpful when and for whom, is an area of growing study [39, 41, 42]. In particular, little is established about how implementation strategies should be tailored to rural schools, where proximity to implementation resources tends to be more limited. This study examines perceptions of acceptability, appropriateness, and feasibility of a bundle of supports for facilitating PBIS implementation in rural schools. Specifically, we assessed the following: (1) perceptions of acceptability, appropriateness, and feasibility of in-person and virtual trainings among school implementation team members in 40 participating schools and (2) perceptions of acceptability, appropriateness, and feasibility of various in-person and remote enhanced implementation supports (including TA, monthly virtual learning sessions, and asynchronous access to online educational materials and resources) among key implementers in the 20 schools that had been randomized to receive these supports in addition to training.

Methods

The Rural K-12 project is a type 3 effectiveness-implementation hybrid trial testing Rural School Support Strategies (RS3), a bundle of supports for EBI implementation in rural schools.

RS3 could also be operationalized as a “blended strategy,” as it is a protocolized set of many discrete implementation strategies [31, 38]. Hereafter, we refer to general strategy categories (e.g., technical assistance) as “supports” while referring to the discrete activities within each support as “strategies,” where applicable. The results presented here reflect staff perceptions about supports and strategies during the first and second years of implementation (2019–2020 and 2020–2021).

Theoretical framework

The larger project was conceptualized using the Quality Implementation Framework (QIF), which details the types of strategies that are best used at various stages of the implementation process [43]. The development of the QIF was partially informed by Fixsen and colleagues’ description of implementation stages [24] as applied to PBIS [44]. The QIF details an exemplar implementation process and provides guidance to assist implementation, but it is not an evaluation framework [45]; thus, we applied Proctor and colleagues’ taxonomy of implementation outcomes as a guide for how to conceptualize the relevant features of the implementation supports and strategies [46].

Study protocol

All rural public K-12 schools in Idaho with at least 100 students and no prior PBIS training were given the option to participate in the trial through mailed invitation letters and follow-up phone calls to the schools. Once 40 schools were recruited (three additional schools were waitlisted), they were randomized into two groups of 20 schools each: (a) training or (b) training and RS3 (described below). Implementation supports are further detailed in Table 1. Both training and the types of supports included in RS3 have been previously identified as important for schools implementing PBIS [41]. All schools received guidance in selecting a school-level coach—a staff member with key operational responsibility for leading PBIS implementation—and building a school PBIS team (i.e., 5–8 school staff members) to promote implementation. All teams attended a 4-day in-person training in summer 2019 and a 3-day virtual PBIS training in summer 2020.

Table 1 Timeline and modality of implementation supports used

The Rural School Support Strategies (RS3)

The 20 schools randomized to the intervention condition (training + RS3) received supports throughout each year which included (1) in-person and remote TA on a proactive monthly basis from two experts on the project team (hereafter referred to as implementation support practitioners [ISPs] [35, 47]); (2) participation in monthly group-based virtual learning sessions with the ISPs, which included collaboration time with other school teams; (3) additional trainings to enhance coaching and leadership skills; and (4) access to a password-protected web portal with additional resources. The two full-time ISPs delivering the TA were both experienced K-12 educators with prior school leadership and coaching experience and had led PBIS implementation in Idaho schools. The monthly meetings between the ISP and each school were the primary delivery mode of TA. They were conducted on-site in September, October, and November 2019, then transitioned to virtual delivery (by teleconferencing) in December 2019 due to safety concerns with reaching remote rural areas in the winter (many schools required travel through snowy, mountainous regions). Originally, the design included conducting one onsite meeting at each school every spring and fall thereafter, but all meetings in 2020 and 2021 became virtual due to COVID-19 pandemic restrictions. Monthly meetings centered on being responsive to schools’ needs and included elements of coaching, facilitation, problem-solving, and guidance on data-based decision making. See [48] for a full description of the trial methodology, including the strategies used, as classified by the SISTER Taxonomy [33]. See Additional file 1 for the completed Standards for Reporting Implementation Studies (StaRI) checklist [49].

Measures

A concurrent triangulation mixed methods design was used to assess acceptability, appropriateness, and feasibility of the training and RS3 supports. Data were collected at many timepoints, including pre-implementation (tier 1 training surveys, summer 2019) and active implementation in the following order: the mid-year survey (December 2019), the year 1 interviews (spring 2020), ISP reflections (spring 2020), the tier 2 training surveys (summer 2020), and the year 2 interviews (spring 2021). Quantitative surveys provided information about attendees’ perceptions of the acceptability and appropriateness of the trainings, with additional write-in questions providing information pertaining to perceived feasibility (research question 1). For the training + RS3 schools, quantitative surveys were used to explore perceptions of the appropriateness of the TA and virtual learning sessions, with qualitative interview data allowing a more in-depth exploration of why each of the support components was or was not successful (research question 2). The qualitative and quantitative data from these various sources and different timepoints were given equal weighting, integrated during interpretation, and used in parallel to explore outcomes of acceptability, appropriateness, and feasibility of the RS3 supports [50]. The measures are further described below, and Table 2 details how specific items align with acceptability, appropriateness, and feasibility outcomes. Our interpretations for mapping the items onto constructs were informed by Proctor and colleagues’ descriptions of implementation outcomes [46] as well as Schultes and colleagues’ applications to the education context [51].

Table 2 Mapping of study measures onto implementation outcomes

Training evaluations

Members of each school’s PBIS team completed brief (17-item) evaluation surveys after attending the tier 1 training in summer 2019 and a second training in summer 2020, which reviewed previous content about schoolwide foundational practices and expanded to include advanced (tier 2) practices. Questions were Likert-type or write-in response and assessed perceived acceptability and appropriateness of the training, the feasibility of applying the content, and respondents’ intention/motivation to apply it. This survey was designed by the practitioners leading the trainings, as a primary aim of the survey was to aid them in improving the quality of the trainings. Items used in the survey are commonly found in professional development evaluations in the K-12 education domain [52].

Mid-year surveys

In December 2019, school PBIS coaches at training + RS3 schools completed an online survey assessing the appropriateness of training and supports they had experienced so far. This survey was designed by the research team.

Interviews

The school PBIS coach and principal from training + RS3 schools were each interviewed (separately) at the end of year 1 (April/May 2020). In year 2, only the school coach was interviewed (February/March 2021). Interviews lasted approximately 30 min and followed a semi-structured guide. Trained interviewers (HC, MM) conducted the interviews via Zoom; interviews were recorded and transcribed verbatim for analysis.

Implementation support practitioner reflections

The two ISPs provided written reflections using structured prompts in an open-response format. They addressed the benefits and challenges of in-person and virtual TA.

Data analysis

Quantitative

Demographic data for the schools, including rural locale and its subtypes [53], were obtained from the National Center for Education Statistics and summarized using descriptive statistics. Descriptive statistics were calculated for the summer training evaluations and mid-year surveys. Independent sample t-tests were used to compare year 1 and 2 training evaluations. Individual-level changes over time for training surveys were not evaluated, as the surveys were anonymous. Usage of the online resource portal was tracked by user (all PBIS team members at training + RS3 schools had individual login access) and date using Google Analytics and analyzed descriptively. Attendance at the trainings is not reported as a feasibility outcome because the research team set limits on how many people could attend each year. An adjusted critical P value of < 0.001 was used to account for multiple comparisons. Analyses were conducted using SPSS version 29 (IBM SPSS Statistics, Armonk, NY, USA).
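The item-level comparison described above can be sketched in a few lines of stdlib-only Python. This is an illustration under stated assumptions, not the trial's analysis code (which was run in SPSS): the ratings, the comparison count, and the `welch_t` helper are hypothetical, and the paper does not state how the 0.001 threshold was derived, though it is consistent with a Bonferroni-style correction of 0.05 across 50 comparisons.

```python
import math
from statistics import mean, variance

def welch_t(sample_a, sample_b):
    """Welch's independent-samples t statistic and degrees of freedom.

    Stdlib-only sketch; in practice a p-value would be obtained from the
    t distribution (e.g., via SPSS or scipy.stats) and compared against
    the adjusted critical value.
    """
    na, nb = len(sample_a), len(sample_b)
    va, vb = variance(sample_a), variance(sample_b)  # sample variances
    se2 = va / na + vb / nb                          # squared standard error
    t = (mean(sample_a) - mean(sample_b)) / math.sqrt(se2)
    # Welch-Satterthwaite approximation for degrees of freedom
    df = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, df

# Bonferroni-style correction: with a hypothetical 50 item-level
# comparisons, 0.05 / 50 yields the 0.001 critical value reported.
N_COMPARISONS = 50  # hypothetical count, for illustration only
adjusted_alpha = 0.05 / N_COMPARISONS

# Hypothetical year 1 vs. year 2 ratings for a single survey item
t_stat, dof = welch_t([5, 4, 5, 4, 5, 5], [4, 4, 3, 4, 4, 5])
```

An item would be flagged as significantly different across years only if its p-value fell below `adjusted_alpha`, which is a far stricter bar than the conventional 0.05.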

Qualitative

De-identified interview transcripts were coded and analyzed using Dedoose Version 8.3.45 (SocioCultural Research Consultants, LLC, 2016). Interview analyses were done in two cycles [50, 54]. Transcripts were first divided into excerpts by question; in the first cycle, open coding was done by a single coder (MM) to conceptually code response excerpts based on the question that was asked. Both coders (MM and HC) met several times to discuss the concept coding strategy and reached agreement on the coding style for 10% of transcripts before completing the rest of the coding independently. Thereafter, the two coders independently reviewed all excerpts and conducted a preliminary thematic analysis [55]. Coders met three times to discuss and modify emergent themes and finalize theme descriptions. Write-in responses on survey items were open-coded by a single coder (HC) using content analysis [56].

Results

Results are presented below by support component and data source/year, where appropriate. Regarding the interview data, interviews with school coaches and administrators at training + RS3 schools solicited perceptions about all RS3 components. During the second year of the project (2020–2021 school year), COVID-19 was continuing to impact schools. Year 2 interviews with training + RS3 school coaches were shortened to reduce burden and only asked about the most frequently utilized elements of RS3: TA from the ISPs and the virtual learning sessions. Demographic characteristics of schools are provided in Table 3. More exemplar quotes are presented in Additional file 2.

Table 3 Demographic characteristics of 40 participating schools, by condition

Trainings

Training evaluation survey data

Descriptive statistics from all school staff responses on the training evaluation surveys are listed in Table 4. There were no meaningful differences between staff from training and training + RS3 schools. In both years, ratings on items measuring acceptability and appropriateness were high (> 4/5). However, independent samples comparisons showed lower ratings for several items in year 2 compared to year 1. Perceived applicability of the content, the novelty of the content, the clarity of presentation of the content, the motivation and plan to use training concepts in their work, positive recommendation of the training, and overall training quality were all rated significantly lower in year 2 compared to year 1. The magnitudes of these changes were small, with 0.44 points as the largest decrease. The two items that improved significantly in year 2 were “I need more explanation” and “I need more support,” whose ratings decreased by 0.69 and 0.65 points, respectively (for these items, lower agreement indicates improvement).

Table 4 Perceptions of PBIS trainings, by year (all schools)

School teams attending the year 2 virtual training were also asked to write comments about the pros and cons of the online modality, in an open-response format. Commonly stated pros concerned flexibility/convenience and team collaboration time. Attendees avoided travel hassles and time away from family, which made attending the training feasible for more team members. Many valued the flexibility of being able to talk more with their teams and walk around the room without disturbing other groups. More-effective team collaboration time (increased efficiency, more conversations, fewer distractions/noise from other groups) was also cited as a benefit.

Reported drawbacks of virtual training were varied. Attendees missed being in the room with instructors, as well as other school teams, describing that the camaraderie and collaborative learning possible in person could not be replicated in the virtual format. The lack of personal connection came up regularly. Many noted that it was difficult to stay engaged online, whereas the physical proximity of the instructors at in-person training helped teams stay on task and get immediate help. The online format made asking questions more difficult, despite the multiple avenues provided (chat box, breakout sessions, open forums after lectures). Staff noted that having their teams not in the same room (some teams attended separately for COVID-19 physical distancing reasons) made teamwork harder. Teams that attended while in the same room (i.e., each team in a conference room at their school building) noted improved collaboration. Technical difficulties were infrequently noted.

Taken together, survey items and open responses showed that the training was highly acceptable and appropriate both years, but with slightly lower ratings on presenter communication, content applicability/novelty, motivation/plan to use the content, and overall rating of the training in year 2. Qualitative findings suggest that the decreased acceptability and appropriateness of the year 2 training likely stemmed in part from the shift from the in-person to the virtual training format. However, feasibility of attending the training increased.

Mid-year 1 coach surveys

Results of the mid-year (December 2019) training + RS3 school coach surveys are listed in Table 5. At least half of the coaches surveyed reported “very much” regarding the project kickoff meeting and the coaching institute being helpful for implementation. These ratings were higher than those for the virtual learning sessions but lower than those for the TA from the ISP.

Table 5 School coach perceptions of appropriateness, RS3 + training schools only (n = 20)

Year 1 administrator and coach interview themes

When coaches or administrators at training + RS3 schools were asked an open-ended interview question about support components that they found helpful aside from the TA and virtual learning sessions, they most often cited the training, which all schools received. Themes are described as follows.

Team collaboration time

During the trainings, PBIS teams were able to plan for the upcoming school year. One principal articulated the power of the training sessions:

The big thing, honestly, was the training… We got the time that we needed to really sit down and be intentional about, how are we going to roll this out, what are some of the oppositions we're going to face and what are we going to do about it?

Quality

Participants said that there was a lot of high-quality content at the training, but the amount of information made it difficult to retain. A principal conveyed this, saying “When we were in summer conference, there was so much information... Everybody in my team was taking notes like crazy, but there are just some things that we forgot.”

Technical assistance from the ISP

Mid-year 1 coach surveys

The highest-rated support component on the survey was in-person TA visits from the ISP (80% rated “very helpful”; Table 5). More than two thirds of the coaches surveyed reported “very much” regarding the TA helping them to identify resources, increase knowledge, improve decision-making, use data for decision-making, and resolve challenges.

Year 1 administrator and coach interview themes

Participants were overwhelmingly positive about interactions with their ISP, indicating high acceptability and appropriateness of the support. Themes are illustrated below.

Knowledge

Participants said the advice their ISP gave was always helpful, and the ISP’s expertise was valued. Having an expert who had first-hand experience working in schools and with PBIS implementation was viewed as a positive. One coach noted, “His insight and knowledge on PBIS really helped to pave the way and kind of give us our next steps when we felt like, where do we go from here?”

Presence

While some school teams used the help of the ISP a lot, and some less, all reported that having the ISP as a resource was helpful. A coach stated, “He was right there ready to help us out in any way that we needed… ask him any questions, he could answer them.” Participants also said that the knowledge that the ISP would attend (physically or virtually) their school PBIS team’s meetings spurred them to get their agendas and data in order, and be ready with questions. A principal stated, “Accountability is what makes stuff work when you know you have to…you’re going to have to tell him what you’re doing.”

Relationship

Starting with the year 1 summer training and continuing throughout the school year, each ISP was intentional about building relationships with coaches and teams at each school. This emerged in interviews as participants expressed genuine appreciation and respect for their ISP. These relationships were grounded in the approachable and open communication style of the ISP and in connecting over similar experiences. The connections helped build trust between the ISP and the rest of the school staff. Sharing personal connections helped to foster relationships even more quickly, as stated by a coach: “We hit it off with [the ISP] because he is from [here] originally, so he knows the dynamics of what we're working with.”

The position of the ISP as someone outside of the school gave their opinions additional weight. One coach noted, “Having the meetings with [the ISP] was super helpful [for] mentorship and guiding, and a third party looking above and seeing what he sees objectively.”

Communication style

Many coaches and administrators stated that they really valued the professional and positive way that their ISP interacted with their school PBIS team. The ISP was valued for their willingness to listen without judgment, providing advice only when needed. The PBIS teams led the meetings, with the ISP stepping in to guide and offer suggestions, rather than telling teams the “correct” way to do things. One principal explained, “He allowed us to really vet ideas without dominating the conversation… he was really supportive, but also approachable.” Participants noted that constructive comments were always delivered with positivity, modeling how the ISP was encouraging the team to interact with their colleagues and students. A school coach talked about communicating with the ISP, “I've appreciated him reaching out to me and saying, ‘You're doing a great job.’ I've just always felt supported by it." Another coach said, “Even when there are things that we need to improve on, it's always positive.” Participants appreciated that the ISP reinforced the things going well, rather than focusing only on areas that needed improvement.

Delivery method

Face-to-face interaction was preferred over virtual meetings or interacting over email or phone. Drawbacks of virtual meetings included technical issues, decreased engagement, and decreased ability to read body language. In-person visits allowed for better relationship-building. Many stated that they would bring the ISP on a building walk-through to meet teachers and observe classroom behavior, and to provide suggestions. One coach noted, “Our staff was much more… willing to stick ideas out there and bounce them off [him] when he was in the room versus when he was on the screen.” In-person visits also allowed staff the opportunity to ask the ISP sensitive questions privately. Most stated that virtual support was still appropriate (and better than not having any TA). Beginning with in-person meetings eased the later transition to virtual meetings: “He had established relationships with us enough before he went to the online, that [the transition] wasn't major. I do think that we got more [in person]. Face to face is more powerful.”

Year 2 coach interview themes

In year 2, TA from the ISP was still seen as highly acceptable and appropriate; two themes recurred, and a third theme emerged:

Presence

Coaches were still happy to have the ISPs available and emphasized that the ISPs always responded very quickly when they reached out for help. Even when coaches did not reach out, the fact that the ISPs proactively contacted teams regularly helped to keep PBIS going during an unprecedented year: “It's been a hard year, it's been hard to prioritize everything and they keep reminding me 'Okay, yes, PBIS is still a priority for us, we've got to keep going'. So that has helped a lot.”

Relationship

Coaches described the relationship built with the ISPs as invaluable. The ISPs’ genuine care for coaches’ wellbeing made coaches feel even more supported; one coach said, “I really feel like he cared about me as a person and not just about how PBIS was going at our school.” Responses regarding the ISPs’ involvement were overwhelmingly positive.

Flexibility

In year 2, the ISPs focused on supporting teams in whatever capacity they needed, and the support—without pressure to do more than teams could handle—was appreciated. ISPs helped teams set realistic goals and assuaged worries about regressing in some areas of implementation. One coach said, “They've kind of pulled back which I think is probably where they need to be this year. They let us know they're there if we need them.”

Implementation support practitioner reflections

Each ISP independently documented perceptions about the in-person and virtual formats for delivering TA. They expressed that in-person visits were highly acceptable and appropriate. Benefits included better relationship-building with school staff, ability to facilitate deeper conversations and better address specific challenges/sensitive issues, and increased familiarity with the physical school site and school climate. Visiting sites helped the ISPs develop a richer understanding of the strengths and needs of each school, be more involved with establishing processes, and prompt school teams to keep meeting. The benefits of virtual support included higher feasibility of attendance for the ISPs (more flexible scheduling, less time traveling). Virtual support was appropriate for quick updates once rapport had been established through the in-person visits and for screen-sharing documents, and it entailed no travel or health risks. However, technical challenges hampered productivity, and it was harder to have private conversations and to model processes virtually. Overall, the opinions of the ISPs on the acceptability, appropriateness, and feasibility of the in-person versus virtual TA corroborated the statements of the school coaches and administrators.

Virtual learning sessions

Mid-year 1 coach surveys

At least half of the coaches surveyed responded “very much” when asked whether the virtual learning sessions helped them identify resources, increase knowledge, improve decision-making, use data for decision-making, and resolve challenges (Table 5). These ratings indicated lower overall appropriateness for the virtual learning sessions compared to the TA from the ISPs.

Year 1 administrator and coach interview themes

Participants had mixed feelings about the acceptability of the virtual learning sessions, although responses were generally positive. Themes are described below.

Review

Many reported the virtual learning sessions as useful for reiterating concepts presented previously, since the large volume of new information in the multi-day training sessions could make it difficult to remember certain components.

Collaboration

Coaches expressed that it was helpful to troubleshoot implementation problems with coaches from other schools. One noted, “At first I kind of thought ‘this is just one more thing and isn't going to be helpful.’ But it's been nice to have an opportunity to talk to other schools, and to [compare] our progress [to] where they're at and see their challenges and help each other.”

Delivery method

Some stated that the digital interface of the virtual learning sessions was a drawback, either because teleconferencing made it harder to block out distractions or because it was not conducive to their learning preferences. One coach stated, “If you're there in person, you're held a lot more accountable than if you're online.” Technical issues, such as internet connectivity problems and challenges with Zoom, arose for some.

Year 2 coach interview themes

The virtual learning sessions in year 2 had a larger focus on stress management and self-care at the start and shifted toward PBIS later in the year. The collaboration element of the sessions was more acceptable and appropriate in year 2.

Review

Participants appreciated the self-care focus of the virtual learning sessions amid widespread stress and uncertainty, and staff valued the reminder to take care of themselves.

Collaboration

The value of relationships that teams had built with other schools in the project was highlighted in year 2: “I feel like I actually know [the other coaches] now. Because we're always in breakout sessions… and we do work, but it's nice to catch up.” Participants became better friends, and breakout rooms during the virtual learning sessions helped them solve problems and not feel as isolated. Another coach said, “I really like being able to talk to people at different schools to see if they're on the same page as us …it's just nice to see how everyone's doing …we're there for each other.”

Delivery method

While some staff noted the ease of participating such as, “I can listen on my phone,” others had consistent conflicts with other school events, and some would not participate at all because it was virtual. Technical barriers were less of an issue in year 2.

Online resource portal

Online resource portal usage data

During the 2020 calendar year, 9 users from 6 of the 20 training + RS3 schools logged in and accessed various resources within the portal 64 different times. As measured by access dates, 70% of the portal usage occurred prior to the COVID-19 school closures.

Year 1 administrator and coach interview themes

The web portal was cited as an appropriate resource repository when participants thought to use it. The single theme is described below.

Review

Participants used the portal to find video content from trainings, recorded webinars, and documents (e.g., blueprints, planning forms). A coach said, “I watched some of the video tutorials to help refresh my memory.” Few reported accessing the portal independently; rather, they did so after the ISP referred them to a specific resource.

Discussion

This study assessed perceptions of over 150 rural school staff members about supports for implementation of school-wide PBIS across several years. The findings shed new light on when and why certain implementation supports, including training and TA in in-person and virtual formats, are considered acceptable, appropriate, and feasible for implementers in rural schools.

Results from all schools showed high ratings of perceived acceptability and appropriateness of the content and quality of the in-person training in year 1. Interviews with school coaches and administrators reiterated that the training was valuable, and dedicated collaboration time for school teams was important. Combined results from the surveys and interviews illuminated that the trainings covered an extensive amount of material which was hard to digest in the time allotted, which has been noted about PBIS trainings previously [57]. For teams at training + RS3 schools, having other opportunities to review and re-learn this material during the virtual learning sessions throughout the year, and to review it during dedicated meetings with their ISP, was helpful.

In year 2, training was delivered virtually. School stakeholders still rated the training as high quality, but average responses were significantly less favorable in several dimensions (e.g., acceptability, appropriateness, intent to use the information learned). While it was not directly assessed through the surveys, it was clear from the interviews that the ongoing COVID-19 pandemic affected perceptions of the relative priority and feasibility of implementing PBIS. In general, readiness is an important factor for implementation of universal EBIs in schools [58], so implementation stage (due to the pandemic, or other factors) may have also affected participants’ perceptions of the training and intent to apply the concepts. When participants specifically reflected on the transition to the year 2 virtual format in an open-response item, most expressed that in-person training was more acceptable and engaging. The online format was missing the human element of connection with the instructors, the ease of asking questions, and the collaboration that comes from being in the room with other school teams. However, there were benefits of virtual trainings (namely, increased feasibility of attendance and effective team collaboration time), indicating that the virtual format is appropriate but perhaps more suitable for ongoing activities such as the virtual learning sessions.

Other published data from PBIS trainings during the pandemic corroborate that in-person trainings can be more acceptable, owing to fewer distractions arising from staff members’ duties in the school building as well as better access to the instructors. Data from the same survey found that team collaboration time was the most highly rated element of virtual trainings, and that appropriateness of trainings (including the clarity of the concepts) and feasibility (using the knowledge gained) were lower for schools implementing PBIS during the COVID-19 pandemic [57]. Additional benefits of virtual trainings cited in the literature include increased accessibility for rural communities as well as greater potential to tailor trainings to various needs within school teams (e.g., new team members who are catching up, or differentiated training content based on team role) [59].

When coaches at training + RS3 schools rated the acceptability of the RS3 supports, it was clear that the monthly meetings with the ISPs were the most valued, followed by the in-person trainings, the monthly virtual learning sessions, then the online resource portal. No strategies were rated as “not at all helpful,” so it could be argued that having more resources is better; however, it is important to consider the material and labor costs behind less-favorably rated components such as the online portal when considering how to deliver support. ISP meetings were seen as more appropriate for helping school teams with tasks such as data-based decision making and resolving problems. This corroborates other studies showing that TA can facilitate these aspects of PBIS implementation [60].

Our analysis of interview data revealed aspects of the RS3 components that increased perceived acceptability and appropriateness. As shown in prior work, the relationship between the support specialist and the school team was important for acceptability [34, 61–63]. In the early implementation phase, when teams and ISPs were just starting to build their relationship, the fact that the ISPs had previously worked in rural schools in Idaho helped create more immediate trust and rapport. In year 2, with the stress of COVID-19 ongoing, the relationship between coaches and ISPs was a primary reason school coaches felt it was possible to continue PBIS implementation.

The delivery modality of the TA was also important; in-person visits were essential early in the process to build rapport. Once relationships were better established, virtual meetings were more acceptable and appropriate. As noted in a recent systematic review of the mechanisms through which ISPs improve implementation outcomes, the characteristics and competencies of ISPs are crucial for success [35]; the personal characteristics, behaviors, and micro-communication skills of the two ISPs on our project appear to have been highly effective for building trusting relationships with school implementation leaders. Elements that appeared to accelerate relationship-building included the ISPs’ background working in schools similar to those implementing in the study, as well as their willingness to “meet schools where they were at,” or provide the help that was needed without judgment. Factors identified as important for trust building in the ISP-implementation team relationship in other recent work [35, 47] were also described by coaches and administrators: communication style (i.e., empathy-driven exchanges, authenticity), presence (i.e., predictable and frequent interactions), and knowledge/demonstrated expertise all increased the acceptability of the TA in our study. Additional research exploring the interpersonal elements of the support process seems well-warranted given how central these elements are to organizational stakeholders’ perceptions of acceptability and their engagement with implementation support interventions.

In year 2 when all TA was virtual, schools reported few logistical drawbacks to the virtual modality but still expressed that they missed the in-person connection. Other research has shown that delivering school mental health-focused virtual training and TA can be effective, acceptable, and, importantly, can provide more-equitable access for those not typically able to afford the time or costs of travel [64, 65]. ISPs reflected that the virtual format made it more feasible for them to attend school-level meetings.

The virtual learning sessions were well-regarded in both years as a tool to review material from the trainings. Some coaches disliked the virtual format, but many found it worthwhile, particularly the breakout sessions where coaches from different schools could problem-solve together. By year 2, school staff had formed closer relationships and were leaning on one another for information and emotional support. The rapport built between school coaches as the study went on appeared to increase the acceptability of the virtual learning sessions in the second year compared to the first year, showing that virtual community-building with staff from similar school environments is a helpful implementation strategy. This type of collaboration can benefit staff located in geographically remote areas (e.g., where there is a single elementary school in a district), where opportunities to collaborate with peers from other schools are scarce [66]. This finding contributes to the growing literature on the use of online communities to aid in professional development and implementation of EBIs [67, 68].

Strengths of our work include the longitudinal data collection, breadth of stakeholder opinions assessed, and use of mixed methods to explore opinions in depth. Limitations include the use of quantitative data collection measures that were specific to this study. Though psychometrically strong measures for examining acceptability, appropriateness, and feasibility exist [69, 70], our measures were also used to inform the ISPs on how to improve support components in real time and thus were tailored to those needs. COVID-19 introduced unexpected challenges, including that training and ongoing support were required to be virtual for all schools starting in March 2020; opinions of virtual/remote strategies may have been affected by the lack of other options. We acknowledge that the implementation strategies we report in this paper are underspecified [28] and that our assessments of acceptability, appropriateness, and feasibility were not parsed out into specific strategies as articulated in the SISTER taxonomy (e.g., coaching, tailored feedback). Opinions of stakeholders at schools where the leadership has chosen to adopt PBIS, such as those in this study, may be inherently different from those at schools that have not. While this paper does not explore the relationships between implementation supports and subsequent fidelity of implementation, those data will be reported in forthcoming work.

Conclusions

We found that in-person and virtual trainings with ample collaboration time for teams, and ongoing implementation support provided in-person or virtually by an experienced ISP, were highly acceptable and appropriate strategies for supporting PBIS implementation in rural schools. Other supports, including monthly virtual learning sessions and a web portal with resources, were acceptable, but not as highly rated. While school staff preferred in-person trainings and meetings with ISPs when possible, strategic use of virtual meetings and TA (e.g., in-person support at the onset of implementation to build relationships, shifting to virtual support later on) increases the feasibility of providing high-quality support to schools in remote settings without compromising too much on acceptability and appropriateness.

Availability of data and materials

The datasets generated and/or analyzed during the current study are not publicly available due to security provisions of the protocol approved by the institutional review board, but de-identified data may be available from the corresponding author on reasonable request.

Abbreviations

PBIS:

Positive Behavioral Interventions and Supports

EBIs:

Evidence-based interventions

RS3:

Rural School Support Strategies

QIF:

Quality Implementation Framework

ISP:

Implementation support practitioner

TA:

Technical assistance

References

  1. US Government. Every Student Succeeds Act. Public Law No. 114-95. Congr Rec. 2015;161.

  2. Hoffman DM. Reflecting on social emotional learning: a critical perspective on trends in the United States. Rev Educ Res. 2009;79(2):533–56.

  3. Lawson GM, McKenzie ME, Becker KD, Selby L, Hoover SA. The core components of evidence-based social emotional learning programs. Prev Sci. 2019;20(4):457–67.

  4. O’Reilly M, Svirydzenka N, Adams S, Dogra N. Review of mental health promotion interventions in schools. Soc Psychiatry Psychiatr Epidemiol. 2018;53(7):647–62.

  5. Evans SW, Weist MD. Implementing empirically supported treatments in the schools: what are we asking? Clin Child Fam Psychol Rev. 2004;7(4):263–7.

  6. Fagan AA, Bumbarger BK, Barth RP, Bradshaw CP, Cooper BR, Supplee LH, et al. Scaling up evidence-based interventions in US public systems to prevent behavioral health problems: challenges and opportunities. Prev Sci. 2019;20(8):1147–68.

  7. Owens JS, Lyon AR, Brandt NE, Warner CM, Nadeem E, Spiel C, et al. Implementation science in school mental health: key constructs in a developing research agenda. School Ment Health. 2014;6(2):99–111.

  8. Mihalic SF, Irwin K. Blueprints for Violence Prevention: from research to real-world settings—factors influencing the successful replication of model programs. Youth Viol Juv Justice. 2003;1(4):307–29.

  9. Steed EA, Pomerleau T, Muscott H, Rohde L. Program-wide Positive Behavioral Interventions and Supports in rural preschools. Rural Spec Educ Q. 2013;32(1):38–46.

  10. Sugai G, Horner RR. A promising approach for expanding and sustaining school-wide positive behavior support. School Psych Rev. 2006;35(2):245–59.

  11. Lewis TJ, Sugai G. Effective behavior support: Systems approach to proactive schoolwide management. Except Child. 1999;31(6):1–24.

  12. Sugai G, Horner RH. Discipline and behavioral support: preferred processes and practices. Eff School Pr. 1999;17(4):10–22.

  13. Sugai G, Horner RH. Defining and describing schoolwide positive behavior support. In: Sailor W, Dunlap G, Sugai G, Horner R, editors. Handbook of positive behavior support. Springer; 2009. p. 307–26.

  14. Bradshaw CP, Waasdorp TE, Leaf PJ. Effects of school-wide positive behavioral interventions and supports on child behavior problems. Pediatrics. 2012;130(5):e1136–45.

  15. Childs KE, Kincaid D, George HP, Gage NA. The relationship between school-wide implementation of positive behavior intervention and supports and student discipline outcomes. J Posit Behav Interv. 2016;18(2):89–99.

  16. Waasdorp TE, Bradshaw CP, Leaf PJ. The impact of schoolwide Positive Behavioral Interventions and Supports on bullying and peer rejection: a randomized controlled effectiveness trial. Arch Pediatr Adolesc Med. 2012;166(2):149–56.

  17. Charlton CT, Moulton S, Sabey CV, West R. A systematic review of the effects of schoolwide intervention programs on student and teacher perceptions of school climate. J Posit Behav Interv. 2021;23(3):185–200.

  18. Horner RH, Sugai G, Smolkowski K, Eber L, Nakasato J, Todd AW, et al. A randomized, wait-list controlled effectiveness trial assessing school-wide positive behavior support in elementary schools. J Posit Behav Interv. 2009;11(3):133–44.

  19. Chaparro EA, Kittelman A, McDaniel SC, Peshak George H, VanLone J, So B. Examining rural school implementation of positive behavioral supports across tiers. Rural Spec Educ Q. 2022;41(3):1–13.

  20. Blackstock J, Chae KB, Mauk GW, McDonald A. Achieving access to mental health care for school-aged children in rural communities. Rural Educ. 2018;39(1):12–25.

  21. Fontanella CA, Hiance-Steelesmith DL, Phillips GS, Bridge JA, Lester N, Sweeney HA, et al. Widening rural-urban disparities in youth suicides, United States, 1996–2010. JAMA Pediatr. 2015;169(5):466–73.

  22. Hirsch JK, Cukrowicz KC. Suicide in rural areas: an updated review of the literature. Rural Ment Health. 2014;38(2):65.

  23. Bierman K. The implementation of the Fast Track Program: an example of a large-scale prevention science efficacy trial. J Abnorm Child Psychol. 2002;30(1):1–17.

  24. Fixsen DL, Naoom SF, Blase KA, Friedman RM, Wallace F, Burns B, et al. Implementation research: a synthesis of the literature. Tampa: University of South Florida, Louis de la Parte Florida Mental Health Institute, The National Implementation Research Network (FMHI Publication #231); 2005.

  25. Horner R, Monzalve-Macaya M. A framework for building safe and effective school environments: Positive Behavioral Interventions and Supports (PBIS). Pedagogická orientace. 2018;28(4):663–85.

  26. Kincaid D, Horner R. Changing systems to scale up an evidence-based educational intervention. Evid Based Commun Assess Interv. 2017;11(3–4):99–113.

  27. Lyon AR, Bruns EJ. From evidence to impact: joining our best school mental health practices with our best implementation strategies. School Ment Health. 2019;11(1):106–14.

  28. Lewis CC, Klasnja P, Powell BJ, Lyon AR, Tuzzio L, Jones S, Walsh-Bailey C, Weiner B. From classification to causality: advancing understanding of mechanisms of change in implementation science. Front Public Health. 2018;6:136.

  29. Lewis CC, Boyd MR, Walsh-Bailey C, Lyon AR, Beidas R, Mittman B, et al. A systematic review of empirical studies examining mechanisms of implementation in health. Implement Sci. 2020;15(1):21.

  30. Powell BJ, McMillen JC, Proctor EK, Carpenter CR, Griffey RT, Bunger AC, et al. A compilation of strategies for implementing clinical innovations in health and mental health. Med Care Res Rev. 2012;69(2):123–57.

  31. Proctor EK, Powell BJ, McMillen JC. Implementation strategies: recommendations for specifying and reporting. Implement Sci. 2013;8(1):139.

  32. Powell BJ, Waltz TJ, Chinman MJ, Damschroder LJ, Smith JL, Matthieu MM, et al. A refined compilation of implementation strategies: results from the Expert Recommendations for Implementing Change (ERIC) project. Implement Sci. 2015;10(1):21.

  33. Cook CR, Lyon AR, Locke J, Waltz T, Powell BJ. Adapting a compilation of implementation strategies to advance school-based implementation research and practice. Prev Sci. 2019;20(6):914–35.

  34. Katz J, Wandersman A. Technical assistance to enhance prevention capacity: a research synthesis of the evidence base. Prev Sci. 2016;17:417–28.

  35. Albers B, Metz A, Burke K, Bührmann L, Bartley L, Driessen P, Varsi C. The mechanisms of implementation support-findings from a systematic integrative review. Res Soc Work Pract. 2021;23:10497315211042376.

  36. Beidas RS, Becker-Haimes EM, Adams DR, Skriner L, Stewart RE, Wolk CB, et al. Feasibility and acceptability of two incentive-based implementation strategies for mental health therapists implementing cognitive-behavioral therapy: a pilot study to inform a randomized controlled trial. Implement Sci. 2017;12(1):148.

  37. Duong MT, Cook CR, Lee K, Davis CJ, Vázquez-Colón CA, Lyon AR. User testing to drive the iterative development of a strategy to improve implementation of evidence-based practices in school mental health. Evid Based Pract Child Adolesc Ment Health. 2020;5(4):414–25.

  38. Merle JL, Thayer AJ, Larson MF, Pauling S, Cook CR, Rios JA, et al. Investigating strategies to increase general education teachers’ adherence to evidence-based social-emotional behavior practices: a meta-analysis of the single-case literature. J Sch Psychol. 2022;91:1–26.

  39. Powell BJ, Patel SV, Haley AD, Haines ER, Knocke KE, Chandler S, et al. Determinants of implementing evidence-based trauma-focused interventions for children and youth: a systematic review. Adm Policy Ment Health. 2020;47(5):705–19.

  40. Fagan TK, Hughes J. Rural school psychology: perspectives on lessons learned and future directions. School Psych Rev. 1985;14(4):444–51.

  41. Fox RA, Leif ES, Moore DW, Furlonger B, Anderson A, Sharma U. A systematic review of the facilitators and barriers to the sustained implementation of School-Wide Positive Behavioral Interventions and Supports. Educ Treat Children. 2021;45:105–26.

  42. Powell BJ, Haley AD, Patel SV, Amaya-Jackson L, Glienke B, Blythe M, et al. Improving the implementation and sustainment of evidence-based practices in community mental health organizations: a study protocol for a matched-pair cluster randomized pilot study of the Collaborative Organizational Approach to Selecting and Tailoring Implementation Strategies (COAST-IS). Implement Sci Comm. 2020;1(1):9.

  43. Meyers DC, Durlak JA, Wandersman A. The Quality Implementation Framework: A synthesis of critical steps in the implementation process. Am J Community Psychol. 2012;50:462–80.

  44. Horner RH, Kincaid D, Sugai G, Lewis T, Eber L, Barrett S, et al. Scaling up school-wide positive behavioral interventions and supports: experiences of seven states with documented success. J Posit Behav Interv. 2014;16(4):197–208.

  45. Nilsen P. Making sense of implementation theories, models and frameworks. Implement Sci. 2015;10:53.

  46. Proctor E, Silmere H, Raghavan R, Hovmand P, Aarons G, Bunger A, et al. Outcomes for implementation research: conceptual distinctions, measurement challenges, and research agenda. Adm Policy Ment Health. 2011;38(2):65–76.

  47. Metz A, Jensen T, Farley A, Boaz A, Bartley L, Villodas M. Building trusting relationships to support implementation: A proposed theoretical model. Front Health Serv. 2022;2:894599.

  48. Turner L, Calvert HG, Fleming CM, Lewis T, Siebert C, Anderson N, Castleton T, Havlicak A, McQuilkin M. Study protocol for a cluster-randomized trial of a bundle of implementation support strategies to improve the fidelity of implementation of schoolwide Positive Behavioral Interventions and Supports in rural schools. Contemp Clin Trials Commun. 2022;28:100949.

  49. Pinnock H, Barwick M, Carpenter C, Eldridge S, Grandes G, Griffiths CJ, Rycroft-Malone J, Meissner P, Murray E, Patel A, Sheikh A, Taylor SJC for the StaRI Group. Standards for Reporting Implementation Studies (StaRI) statement. BMJ. 2017;356:i6795.

  50. US Department of Health and Human Services. Qualitative methods in implementation science. National Institutes of Health. Bethesda: National Cancer Institute. 2018:1-31.

  51. Schultes MT. An introduction to implementation evaluation of school-based interventions. Eur J Dev Psychol. 2021;0(0):1–13.

  52. Guskey TR. Evaluating Professional Development. Thousand Oaks: Corwin Press; 2000.

  53. Common Core of Data America’s Public Schools. National Center for Education Statistics, Institute of Education Sciences, Washington, DC. 2018. https://nces.ed.gov/ccd/ccddata.asp Accessed May 22 2021.

  54. Sandelowski M. Whatever happened to qualitative description? Res Nurs Health. 2000;23(4):334–40.

  55. Saldana J, Omasta M. Qualitative research: Analyzing life. Thousand Oaks: SAGE Publications; 2017.

  56. Padgett DK. Qualitative methods in social work research. Thousand Oaks: SAGE Publications; 2016.

  57. Petersen AJ. Winter 2020-21 PBIS Training & Implementation Fidelity Updates: Summary of Results from Cohorts 15 and 16. Wilder Research. 2021. https://files.eric.ed.gov/fulltext/ED615643.pdf.

  58. Cook CR, Larson M, Zhang Y. Understanding readiness to implement as determinants of teacher adoption of evidence-based universal programs and practices. In: Evans SW, Owens JS, Bradshaw CP, Weist MD, editors. Handbook of School Mental Health: Innovations in Science and Practice. Cham: Springer International Publishing; 2023. p. 391–405.

  59. Nese RN, Meng P, Breiner S, Chaparro E, Algozzine R. Using stakeholder feedback to improve online professional development opportunities. J Res Technol Educ. 2020;52(2):148–62.

  60. Kincaid D, Childs K, Blase KA, Wallace F. Identifying barriers and facilitators in implementing Schoolwide Positive Behavior Support. J Posit Behav Interv. 2007;9(3):174–84.

  61. Chilenski SM, Perkins DF, Olson J, Hoffman L, Feinberg ME, Greenberg M, et al. The power of a collaborative relationship between technical assistance providers and community prevention teams: a correlational and longitudinal study. Eval Program Plann. 2016;54:19–29.

  62. Chilenski SM, Welsh J, Olson J, Hoffman L, Perkins DF, Feinberg ME. Examining the highs and lows of the collaborative relationship between technical assistance providers and prevention implementers. Prev Sci. 2018;19(2):250–9.

  63. Pas ET, Larson KE, Reinke WM, Herman KC, Bradshaw CP. Implementation and acceptability of an adapted classroom check-up coaching model to promote culturally responsive classroom management. Educ Treat Child. 2016;1:467–91.

  64. McDaniel SC, Bloomfield BS, Guyotte KW, Shannon TM, Byrd DH. Telecoaching to support schoolwide positive behavior interventions and supports in rural schools. J Educ Stud Placed Risk. 2021;26(3):236–52.

  65. Olson JR, Lucy M, Kellogg MA, Schmitz K, Berntson T, Stuber J, et al. What happens when training goes virtual? Adapting training and technical assistance for the school mental health workforce in response to COVID-19. School Ment Health. 2021;13(1):160–73.

  66. Cassidy L. Online communities of practice to support collaborative mental health practice in rural areas. Issues Ment Health Nurs. 2011;32(2):98–107.

  67. Dille KB, Røkenes FM. Teachers’ professional development in formal online communities: a scoping review. Teach Teach Educ. 2021;105:103431.

  68. McLoughlin C, Patel KD, O’Callaghan T, Reeves S. The use of virtual communities of practice to improve interprofessional collaboration and education: findings from an integrated review. J Interprof Care. 2018;32(2):136–42.

  69. Lewis CC, Fischer S, Weiner BJ, Stanick C, Kim M, Martinez RG. Outcomes for implementation science: an enhanced systematic review of instruments using evidence-based rating criteria. Implement Sci. 2015;10(1):155.

  70. Weiner BJ, Lewis CC, Stanick C, Powell BJ, Dorsey CN, Clary AS, et al. Psychometric assessment of three newly developed implementation outcome measures. Implement Sci. 2017;12(1):108.

Acknowledgements

We are immensely grateful to the implementation support practitioners, Mr. Tate Castleton and Mr. Nate Anderson. We also thank the school coaches, administrators, and members of the PBIS teams at the 40 schools participating in this trial.

Funding

This study is funded by the National Institute of Justice, #2017-CK-BX-0021.

Author information

Authors and Affiliations

Authors

Contributions

HC, TL, and LT conceptualized and designed the study. AH and HC managed survey distribution and AH coordinated attendance at all trainings. TL conducted trainings. HC downloaded and analyzed survey data. LT, HC, and MM developed the interview guides and conducted interviews. HC and MM conducted qualitative analyses. HC, MM, and LT contributed to the initial draft of the manuscript. All authors critically reviewed the manuscript and approved the submitted version.

Corresponding author

Correspondence to Hannah G. Calvert.

Ethics declarations

Ethics approval and consent to participate

This research was approved by the Institutional Review Board at Boise State University (protocol number 101-SB17-207).

Consent for publication

Participants individually consented before completing surveys and interviews.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1.

Standards for Reporting Implementation Studies: the StaRI checklist for completion.

Additional file 2.

Table A. Representative Quotes from Themes Within each Implementation Support, Years 1 and 2.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article


Cite this article

Calvert, H.G., McQuilkin, M., Havlicak, A. et al. Acceptability, appropriateness, and feasibility of Rural School Support Strategies for behavioral interventions: a mixed methods evaluation over two years of a hybrid type 3 implementation-effectiveness trial. Implement Sci Commun 4, 92 (2023). https://doi.org/10.1186/s43058-023-00478-4


Keywords