
Implementation strategies to promote measurement-based care in schools: evidence from mental health experts across the USA



Despite an established taxonomy of implementation strategies, minimal guidance exists for how to select and tailor strategies to specific practices and contexts. We employed a replicable method to obtain stakeholder perceptions of the most feasible and important implementation strategies to increase mental health providers’ use of measurement-based care (MBC) in schools. MBC is the routine use of patient-reported progress measures throughout treatment to inform patient-centered, data-driven treatment adjustments.


A national sample of 52 school mental health providers and researchers completed two rounds of modified Delphi surveys to rate the relevance, importance, and feasibility of 33 implementation strategies identified for school settings. Strategies were reduced and definitions refined using a multimethod approach. Final importance and feasibility ratings were plotted on “go-zone” graphs and compared across providers and researchers to identify top-rated strategies.


The initial 33 strategies were rated as “relevant” or “relevant with changes” to MBC in schools. Importance and feasibility ratings were high overall for both survey rounds; on a scale of 1 to 5, importance ratings (3.61–4.48) were higher than feasibility ratings (2.55–4.06) on average. Survey 1 responses resulted in a reduced, refined set of 21 strategies, and six were rated most important and feasible on Survey 2: (1) assess for readiness and identify barriers and facilitators; (2) identify and prepare champions; (3) develop a usable implementation plan; (4) offer a provider-informed menu of free, brief measures; (5) develop and provide access to training materials; and (6) make implementation easier by removing burdensome documentation tasks. Provider and researcher ratings were not significantly different, with a few exceptions: providers reported higher feasibility and importance of removing burdensome paperwork than researchers, providers reported higher feasibility of train-the-trainer approaches than researchers, and researchers reported higher importance of monitoring fidelity than providers.


The education sector is the most common setting for child and adolescent mental health service delivery in the USA. Effective MBC implementation in schools has the potential to elevate the quality of care received by many children, adolescents, and their families. This empirically derived, targeted list of six implementation strategies offers potential efficiencies for future testing of MBC implementation in schools.



Evidence-based practices (EBPs) continue to proliferate in child and adolescent mental health treatment, many of which are developed under controlled conditions in university clinics and healthcare settings [1]. However, intervention evidence is limited by the client populations and settings where the evidence was originally derived [2], often making it necessary to adapt the intervention to fit a particular setting [3, 4]. In addition, there are numerous barriers to successful EBP implementation in real-world mental health settings where children and their families are likely to receive care. Implementation barriers exist in both the outer setting (e.g., patient needs and resources) and the inner setting (e.g., organizational culture, leadership engagement), as well as in individual characteristics and implementation processes unique to each intervention, intervention level, population, and service setting [3].

Desired implementation outcomes are more likely when implementation strategies are selected for and tailored to (1) specific patient populations, (2) care delivery systems and practices, and (3) local barriers and facilitators, often referred to as “determinants of practice” [5, 6]. Implementation strategies are single- or multiple-component approaches aimed at increasing adoption, implementation, and sustainment of EBPs in routine care [7]. Despite an established taxonomy of 73 implementation strategies, minimal guidance exists for how to select, integrate, and tailor these strategies to specific services and contexts [8, 9]. Proposed methods include concept mapping, group model building, conjoint analysis, and intervention mapping [10]. Yet, each method has limitations, such as requiring advanced methodological consultation, complex modeling that may overwhelm stakeholders, and/or use of proprietary software [10].

There are few examples of how to select strategies prospectively based on implementation science research and stakeholder knowledge of contextual factors [5]. This study replicates one established systematic method (the use of modified Delphi surveys) to select implementation strategies for a given EBP (measurement-based care) in the most common mental health service delivery setting for children and adolescents (schools) [10]. Delphi surveys are a pragmatic approach [10] that can be used when implementation strategy lists are established and thus stakeholders can rate existing strategies, propose new ones, and recommend changes in strategy definitions or applications. Stakeholder ratings of importance and feasibility have been used in numerous studies to assess which strategies are most actionable and applicable for a given implementation initiative to maximize success [11,12,13,14,15,16]. The actual effectiveness of these strategies on implementation, service, and client outcomes is an empirical question to be evaluated once they are applied [7].

Measurement-based care in mental health service delivery

Measurement-based care (MBC) is the routine collection and use of client data throughout treatment, including initial screening and assessment, problem definition and analysis, finalizing treatment objectives and intervention tactics, and monitoring treatment progress collaboratively with the client to inform treatment adjustments [17]. MBC is a critical component of an evidence-based practice orientation to mental health treatment [18]. There is strong evidence supporting MBC in settings other than schools. For instance, systematic reviews show better and faster goal attainment and symptom reduction with MBC as compared to usual care; effect sizes range from 0.28 to 0.70 [19,20,21]. Larger effect sizes of 0.49 to 0.70 are attributable to MBC with feedback, particularly feedback provided to both the patient and providers, or when clinical support tools are provided [21, 22]. Recent Cochrane reviews underscore the importance of including studies where measures are used to adjust the treatment plan [23, 24], indicating that patient outcomes associated with MBC are likely a result of the real-time, client-centered, data-driven adjustments made to interventions provided.

Despite the promise of MBC to improve mental health service quality, use of MBC in practice is minimal. Fewer than 20% of providers report collecting progress measures at least monthly [25, 26]. Barriers to MBC implementation in behavioral health care have been well-documented at the individual patient, provider, organizational, and system levels [27].

School mental health treatment services

Schools are the most common setting for children to receive mental health treatment, particularly for families who face barriers to accessing care in traditional clinic- or hospital-based settings [28,29,30,31,32]. However, the extent to which school mental health treatment services are grounded in EBPs is largely unknown [33, 34]. EBPs implemented in schools have potentially broad reach [35, 36] and school-based EBP implementation allows for adaptation to local culture and contexts that is scalable across communities and states [37, 38].

Implementation considerations in schools

Selecting and tailoring implementation strategies to practice and context has been found to optimize implementation feasibility and, ultimately, effectiveness outcomes [39, 40]. Yet, results are mixed, suggesting that tailoring may need to occur continuously throughout implementation [41]. Schools are also a unique setting for mental health treatment services, so implementation strategies defined for other behavioral healthcare delivery settings are unlikely to fit perfectly for schools without attention to strategic selection and tailoring. Indeed, implementing new practices in educational settings requires careful attention to school organizational factors, such as principal leadership, education policies at state and federal levels, a heterogeneous mental health workforce, requirements and constraints related to professional development and ongoing coaching, and logistics as basic as the school calendar [42]. Other studies point to the importance of flexible treatment delivery and intentional family engagement efforts to facilitate EBP implementation and outcomes [43].

MBC implementation in schools

Barriers to MBC implementation in schools have some similarities with other more traditional behavioral health care settings, such as providers reporting limited time to administer measures. However, some barriers are more salient in the school context, such as difficulty reaching parents, limited access to measures, and lack of administrative or technical resources for scoring measures [44]. Although scientifically rigorous applications of MBC in schools are new, an individualized approach to monitoring student progress and outcomes has been emphasized and studied in schools for decades [45, 46]. There are some published demonstrations of standardized, patient-reported outcome measures being implemented in school mental health systems [47, 48], as well as examples of psychosocial progress monitoring in schools as part of high-quality, comprehensive school mental health systems [49]. Moreover, MBC is consistent with schools’ emphasis on Response to Intervention, which uses student progress data to prevent and remediate academic and behavioral difficulties [50], and with accountability requests for school-based providers to demonstrate outcomes [51]. Recent studies have highlighted case examples of an MBC approach in schools, from assessment tool selection to measurement processes and the role of feedback to the student and family [51, 52]. Yet a substantial gap remains in the literature regarding implementation strategies best suited to MBC implementation when child mental health treatment services are provided on school grounds instead of a more traditional clinic or hospital setting.

Current study

The current study identifies feasible and important implementation strategies to increase school mental health provider use of MBC. This work builds on an initial list of 70+ implementation strategies that have been codified for general use [9, 53], and a recent extension to identify top strategies relevant to and important for implementing evidence-based practices in school settings [13, 54]. We focused specifically on selecting strategies for MBC in schools using prior Delphi survey methods. We collected importance and feasibility ratings for implementation strategies as well as operational definitions and recommendations for practical application in schools [9, 13, 53, 54]. Our objective was to obtain stakeholder perceptions of the most feasible and important implementation strategies for MBC as rated by provider and researcher stakeholders with expertise in school mental health treatment.



Study participants (N = 52) were drawn from a national sample of school mental health stakeholders: (1) providers with experience delivering and/or supervising mental health interventions in schools (N = 31); and (2) researchers with experience partnering with schools or districts to implement EBPs (N = 21). Providers were sampled from the National School Mental Health Census and researchers were sampled from two established lists of researchers with relevant expertise (see procedures for details). All participants were US-based, located in one of 23 states (AZ, AR, CA, CO, CT, FL, GA, IL, IN, LA, MD, MA, MI, MN, NE, NH, NC, OH, OR, PA, TX, VA, and WA). Table 1 shows demographic, professional, and urbanicity characteristics of participants.

Table 1 Demographic and professional characteristics of stakeholder participants, N = 52

Providers identified as school psychologists (N = 6, 19%), school social workers (N = 5, 16%), or school counselors (N = 5, 16%). Other provider roles included school psychology supervisor (N = 2, 7%), director of related services/special education/student support (N = 2, 7%), counselor (community- or hospital-employed; N = 1, 3%), mental health agency administrator (N = 1, 3%), and other positions (N = 9, 29%). School providers were based in 18 states representing all regions of the USA; researchers were based in 14 states and had worked with school partners in 43 states, the District of Columbia, Guam, the US Virgin Islands, and other US territories. Most providers indicated they had current or past experience delivering (N = 30, 97%) and/or supervising (N = 20, 65%) mental health treatment services in schools. Demographic and professional characteristics and urbanicity of the N = 31 participating providers displayed in Table 1 were not significantly different from those of the N = 53 recruited providers who completed the prescreening survey, based on chi-square tests (Awad M, Connors E: Promoting measurement-based care in school mental health through practice-specific supervision, submitted). These details were not available for individuals who completed the School Mental Health Profile generally.

Researchers had experience conducting research about child and adolescent mental health, conducting research in partnership with schools/districts, training school-based personnel, and providing consultation or technical assistance to schools/districts. Most researchers had current or past experience training graduate students about working in or with schools (N = 20, 95%), providing mental healthcare in schools (N = 16, 77%), supervising direct mental healthcare in schools (N = 13, 62%), and serving as an educator (N = 11, 52%). Researchers represented various age groups, fields of training, and urbanicity across the USA. Although gender identity (56% female) and degree (100% PhD) appear similar to researchers in our datasets who were not invited to participate, we did not have detailed self-reported characteristics of non-participating researchers to conduct statistical comparisons. Results from study participants are likely generalizable to stakeholders of similar demographics, professional expertise, and geographic location. There was a 94% retention rate of participants for Survey 2 (N = 49; N = 30 providers and N = 19 researchers).


Systematic sampling procedures drawing on nationally representative databases of school-based providers and researchers were used to identify the study sample. Providers were selected through stratified random sampling from the National School Mental Health Census, a nationally representative survey of school and district mental health teams’ services and data usage. The inclusion criterion, confirmed by self-report on a prescreening survey, was holding a position as a school mental health provider or clinical supervisor with experience delivering or supervising school-based psychotherapy, in which MBC would be used (e.g., school social worker). Census data for individuals meeting this criterion were stratified based on rural-urbanicity continuum codes (metropolitan vs. non-metropolitan) and geographic representation. Prospective participants were randomly selected with replacement until a target sample of at least 30 school mental health providers was achieved. We monitored the sample for approximate US distributions of (1) metropolitan and non-metropolitan/rural urbanicity and (2) geographic location. Using this approach, we oversampled non-metropolitan/rural providers toward the end of recruitment to ensure adequate representation.
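The stratified draw-with-replacement procedure described above can be sketched in a few lines of code. This is an illustrative sketch only, not the study's actual sampling code: the roster, strata labels, and 20% non-metropolitan target share are all hypothetical, and the real procedure also monitored geographic region.

```python
import random

# Hypothetical census-style roster: 80 metropolitan and 20 non-metropolitan
# providers (proportions invented for illustration).
roster = (
    [{"id": i, "stratum": "metro"} for i in range(80)]
    + [{"id": i, "stratum": "non_metro"} for i in range(80, 100)]
)

def draw_stratified(roster, target_n, target_share, seed=0):
    """Draw invitees with replacement, steering toward target_share of the
    sample coming from the smaller (non-metropolitan) stratum."""
    rng = random.Random(seed)
    by_stratum = {"metro": [], "non_metro": []}
    for person in roster:
        by_stratum[person["stratum"]].append(person)
    sample = []
    while len(sample) < target_n:
        # Current share of non-metropolitan invitees in the running sample.
        share = sum(p["stratum"] == "non_metro" for p in sample) / max(len(sample), 1)
        # Oversample the underrepresented stratum when it falls below target.
        stratum = "non_metro" if share < target_share else "metro"
        sample.append(rng.choice(by_stratum[stratum]))
    return sample

sample = draw_stratified(roster, target_n=30, target_share=0.2)
print(len(sample), sum(p["stratum"] == "non_metro" for p in sample))  # 30 6
```

The stratum-selection rule is deterministic given the running shares, so a target share of 0.2 yields 6 non-metropolitan invitees in a sample of 30; only the identity of each invitee within a stratum is random.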

We recruited 211 school mental health providers; 53 (25%) completed a prescreening survey, of whom four were ineligible because they had never been a clinician or clinical supervisor (N = 3) or were community providers not working in a school (N = 1). Of the N = 49 eligible participants invited to complete Survey 1, a final sample of 31 providers participated. Eligible recruits who did not participate had nonworking emails (N = 24), did not respond to our recruitment request (N = 106), or declined (N = 28). Providers received up to three reminder emails over the course of 3 weeks to respond to the study invitation to consent and start Survey 1.

Researchers were selected using purposive sampling from two sources: (1) Implementation Research Institute fellows who applied to and were selected for implementation science training through a competitive process and were reviewed for school mental health expertise [55]; and (2) a list of 138 school mental health researchers maintained by the National Center for School Mental Health with active peer-reviewed publications and/or grants on topics pertaining to school mental health and wellbeing. This latter group of researchers was part of an invitation-only annual national meeting and was pre-reviewed for scholarship and impact on the field, adjusted for career stage, by a planning committee composed of national school mental health scholars. Inclusion criteria were (1) expertise with mental health program or practice development, effectiveness testing, and/or implementation research; (2) experience partnering directly with schools; and (3) Associate Professor or Professor rank at their institution, which resulted in N = 56 eligible researchers. Next, advanced expertise implementing mental health programs or practices in schools was coded on a 4-point scale (3 = “optimal,” 2 = “good,” 1 = “okay,” and 0 = “unable to assess”) by three senior school mental health researchers with extensive experience in evidence-based practice implementation in schools. Ratings were averaged for each researcher, and recruits were invited with replacement from the highest ratings downwards until a sample size of at least N = 20 was achieved. We recruited 29 research participants, which resulted in a response rate of 72% (N = 21); among non-participating recruits, one did not respond to recruitment emails and seven declined.

Measures: Delphi surveys

Participants completed two rounds of feedback using anonymous Delphi surveys. Each survey started with operational definitions of implementation strategies, MBC, school mental health providers, and three vignettes illustrating MBC use in schools (see Supplemental file 1). Vignettes were developed and revised for clarity and accuracy based on feedback from several co-authors and other collaborators. The vignettes focus on MBC clinical practice representing various school mental health professional roles, presenting concerns, student ages, and measures. Due to our focus on identifying implementation strategies for MBC as a clinical practice, the vignettes did not refer to any implementation supports, such as decision support by a measurement feedback system or other digital interface for scoring and viewing progress data. Although clinical decision support tools have been associated with more robust effects of MBC, they are not necessary [56], and using technology to aid measure completion and review may create disparities in MBC access [57]. Availability and feasibility of technology-assisted decision support tools is variable in public schools given the ongoing digital divide in education [58]. Therefore, to ensure MBC was presented in a manner that would not raise resource or equity issues, our vignettes focused on the core components of MBC only, without noting how measures are collected.

The Delphi technique is an established method using a series of surveys or questionnaires to obtain controlled, mixed methods input from a diverse set of expert stakeholders to gain reliable consensus on a health care quality topic [59, 60]. This method was used in the Expert Recommendations for Implementing Change (ERIC) project to identify a complete list of implementation strategies and definitions for selection and application to specific practices and settings [9, 53]. Another research team replicated and extended this research to select and tailor strategies for implementing EBPs in schools [13, 61, 62]. As the prior school study was not practice-specific, we included the 33 implementation strategies rated most important and feasible by the prior study to further refine a list of strategies for MBC in schools [61]. For each strategy, participants indicated whether it is relevant to MBC specifically (“yes,” “yes with changes,” or “no”). For strategies rated as relevant (“yes” or “yes with changes”), participants were then asked to provide (1) importance and feasibility ratings (1 = “not at all important/feasible” to 5 = “extremely important/feasible”) based on the definition provided, (2) possible synonyms or related activities to the strategy, and (3) suggestions about the definition or application of the strategy. To close the survey, participants were also asked to suggest additional implementation strategies not listed. The Round 2 survey included an updated list of strategies and definitions based on Round 1 results. Participants had 4 weeks to complete Round 1 and 2 surveys. Participants were compensated for their time and study procedures were approved by the Yale Institutional Review Board.

Data analyses

Descriptive statistics of quantitative feasibility and importance ratings were examined for normality. Independent samples t-tests were used to compare ratings between providers and researchers. Mean feasibility and importance ratings were plotted for each strategy on a “go-zone” plot to compare relative feasibility and importance by quadrants [63]. Go-zone plots provide a bivariate display of mean ratings and are often used in concept mapping. The origin represents the grand mean of both variables of interest (in this case, feasibility and importance) and the four resulting quadrants are used to interpret relative distance among items (in this case, strategies). The top right quadrant, Zone 1, is the “go-zone” where strategies of the highest feasibility and importance appear.
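The quadrant logic above is straightforward to express in code. The sketch below uses invented strategy names and ratings purely to illustrate how grand means define the plot's origin and how each strategy falls into one of the four zones; it is not the study's analysis code.

```python
# Hypothetical (mean_importance, mean_feasibility) ratings per strategy.
ratings = {
    "identify and prepare champions":      (4.4, 3.9),
    "visit other sites":                   (3.5, 2.9),
    "distribute educational materials":    (3.8, 4.1),
    "monitor fidelity to core components": (4.3, 3.1),
}

# The plot's origin is the grand mean of each axis across all strategies.
imp_grand = sum(i for i, _ in ratings.values()) / len(ratings)  # 4.0
fea_grand = sum(f for _, f in ratings.values()) / len(ratings)  # 3.5

def zone(importance, feasibility):
    """Assign a go-zone quadrant relative to the grand means."""
    if importance >= imp_grand and feasibility >= fea_grand:
        return 1  # "go-zone": high importance, high feasibility
    if importance < imp_grand and feasibility >= fea_grand:
        return 2  # high feasibility, low importance
    if importance < imp_grand and feasibility < fea_grand:
        return 3  # low on both
    return 4      # high importance, low feasibility

zones = {name: zone(i, f) for name, (i, f) in ratings.items()}
```

With these toy numbers, "identify and prepare champions" lands in Zone 1 and "monitor fidelity to core components" in Zone 4, mirroring the interpretation of the quadrants described above.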

A multimethod approach was used to reduce strategies and refine definitions between Survey 1 and Survey 2. First, a document was developed to display quantitative and qualitative Survey 1 results for each strategy. This included each Survey 1 strategy and definition, go-zone quadrant results (overall, as well as for providers and researchers), quantitative considerations (e.g., percentage of stakeholders who indicated the strategy was not relevant for MBC in school, significant differences between providers and researchers, any distribution normality concerns with ratings), qualitative synonyms, and qualitative definition change recommendations made by participants. Second, one rater (EC) reviewed each strategy using this document and established decision-making guidance vetted by study team members for each zone. She coded an initial decision (e.g., retain with revisions, collapse, or remove) with justification for each, documented any synonyms reported more than three times, and drafted definition changes that were (a) minimal language adjustments, (b) not substantial additions to definition length, and (c) consistent with the overall scope of the strategy. Then, another rater (CS) reviewed coded decisions and documentation, and all discrepancies were resolved through consensus conversations. Final decisions about collapsing strategies were made based on consultation with two implementation researchers.

We also examined additional strategies and associated definitions recommended by N = 10 providers and N = 7 researchers as well as substantive comments provided at the end of Survey 1 by N = 16 providers and N = 8 researchers that pertained to additional strategies. Using thematic analysis and consensus coding by both coders, these data resulted in four distinct strategies broadly related to incentives, policy change, workload/time, and measure selection which were added to Survey 2. We discovered that two strategies (“alter and provide individual- and system-level incentives” and “develop local policy that supports implementation”) already existed in the established list of strategies for EBPs in schools, so we added those strategies and definitions from the published literature [27]. Two strategies (“support workflow adjustments” and “offer a clinician-informed menu of free, brief measures”) were new, so we added those strategies and definitions based on stakeholder qualitative feedback.

To analyze Survey 2 results, descriptive statistics, independent samples t-tests and go-zone plots were used again, as was the multi-step process detailed above.


Survey 1 strategy ratings

In general, strategies were rated as “relevant” or “relevant with changes” by participants, and all 33 strategies in Survey 1 received importance and feasibility ratings. Eight strategies received the highest proportion of “not relevant” ratings (range = 25–38% of participants) for MBC in schools, as follows: (1) model and simulate change; (2) change/alter environment; (3) provide practice-specific feedback; (4) identify early adopters; (5) visit other sites; (6) obtain and use student and family feedback; (7) develop academic partnerships; and (8) build partnerships (i.e., coalitions) to support implementation. Since the majority of participants rated these as “relevant” or “relevant with changes,” the importance and feasibility ratings are included in our analysis.

Importance and feasibility ratings were high overall for both survey rounds, with importance ratings higher than feasibility ratings on average. On Survey 1, importance ratings ranged from 3.44 (“develop academic partnerships”) to 4.61 (“make implementation easier by removing burdensome documentation tasks”) and feasibility ratings ranged from 2.89 (“visit other sites”) to 4.10 (“distribute educational materials”). Survey 1 standard deviations varied from 0.68 to 1.18. See Table 2 for importance and feasibility results for the 33 initial implementation strategies. Figures 1 and 2 display these findings on go-zone plots, where the four quadrants or “zones” are divided by the grand mean scores of 4.01 for importance and 3.49 for feasibility. Zone 1 includes strategies rated above the grand mean for importance and feasibility (i.e., high feasibility/high importance), Zone 2 includes strategies rated above the grand mean for feasibility but not importance (i.e., high feasibility/low importance), Zone 3 includes strategies rated below the grand mean for feasibility and importance (i.e., low feasibility/low importance), and Zone 4 includes strategies rated above the grand mean for importance but below the feasibility grand mean (i.e., low feasibility/high importance).

Table 2 Results of 33 initial implementation strategies in Survey 1
Fig. 1
figure 1

Go-zone plot: Survey 1 importance and feasibility ratings (limited range to focus on origin)

Fig. 2
figure 2

Go-zone plot: Survey 1 importance and feasibility ratings (full range 1–5)

Survey 2 strategy ratings

Based on the multimethod approach described above, Survey 2 contained a reduced set of 21 strategies with updated definitions (see Fig. 3). From Survey 1 to Survey 2, 14 strategies were retained (with updates to strategy title and/or definition in most cases), seven strategies were collapsed into three, 12 were removed, and four were added. Feasibility and importance grand means were similar for Survey 2 (importance grand mean = 4.05; feasibility grand mean = 3.33). On Survey 2, importance ratings ranged from 3.61 (“use train the trainer strategies”) to 4.48 (“develop a usable implementation plan”) and feasibility ratings ranged from 2.55 (“support workflow adjustments”) to 4.06 (“offer a provider-informed menu of free, brief measures”). Survey 2 standard deviations varied from 0.56 to 1.22.

Fig. 3
figure 3

Go-zone plot: Survey 2 importance and feasibility ratings (limited range to focus on origin)

Survey 2 top-rated strategies

Among the 21 revised implementation strategies included in Survey 2 (see Table 4), six were rated as most important and most feasible (see Zone 1 strategies in Table 3, Fig. 3, and Fig. 4). These top-rated strategies include (1) assess for readiness and identify barriers and facilitators; (2) identify and prepare champions; (3) develop a usable implementation plan; (4) offer a provider-informed menu of free, brief measures; (5) develop and provide access to training materials; and (6) make implementation easier by removing burdensome documentation tasks.

Table 3 Results of 21 implementation strategies in Survey 2
Fig. 4
figure 4

Go-zone plot: Survey 2 importance and feasibility ratings (full range 1–5)

Several additional strategies were rated within 0.50 of the feasibility grand mean, yet above the mean cutoff for importance (see Table 3, Zone 4 strategies with asterisks). These include “conduct ongoing training,” “provide ongoing clinical consultation/coaching,” “monitor implementation progress and provide feedback,” “monitor fidelity to MBC core components,” and “promote adaptability”.

Stakeholder group comparisons

On Survey 1, provider and researcher ratings were not significantly different with three exceptions. First, as compared to researchers, providers reported that it is more feasible and important to make implementation easier by removing burdensome paperwork (feasibility provider M = 4.31 vs researcher M = 3.35; feasibility t(44) = −2.96, p = 0.01, d = 0.88; importance provider M = 4.85 vs researcher M = 4.30; importance t(44) = 2.90, p < 0.01, d = 0.86). Second, as compared to providers, researchers reported it is more important to monitor the implementation effort (provider M = 4.20 vs researcher M = 4.67; t(44) = −2.51, p = 0.02, d = −0.72). Third, train-the-trainer feasibility ratings were significantly higher among providers (M = 3.81) than researchers (M = 3.30; t(45) = 2.06, p < 0.05, d = 0.61). On Survey 2, provider and researcher ratings were not significantly different with one exception; providers reported it is more important to make implementation easier by removing burdensome paperwork (provider M = 4.50 vs researcher M = 3.94; t(44) = 2.04, p = 0.048, d = 0.62).
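The group comparisons above pair an independent-samples t statistic with Cohen's d. As a transparency aid, here is a minimal sketch of the pooled-variance version of that calculation; the rating values are fabricated for illustration and do not reproduce the study's data.

```python
import math

# Fabricated 1-5 ratings for two hypothetical stakeholder groups.
providers   = [5, 4, 5, 4, 5, 4, 4, 5]
researchers = [4, 3, 4, 3, 4, 3, 4, 3]

def t_and_d(a, b):
    """Pooled-variance independent-samples t, Cohen's d, and df."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)   # sample variance, group a
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)   # sample variance, group b
    sp2 = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)  # pooled variance
    t = (ma - mb) / math.sqrt(sp2 * (1 / na + 1 / nb))
    d = (ma - mb) / math.sqrt(sp2)                  # standardized mean difference
    return t, d, na + nb - 2                        # df = n1 + n2 - 2

t, d, df = t_and_d(providers, researchers)
```

Note that with this provider-minus-researcher ordering, a positive t and d indicate higher provider ratings; d divides the mean difference by the pooled standard deviation rather than the standard error, so it is unaffected by sample size.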


We applied an established, stakeholder-informed method to identify important and feasible implementation strategies for measurement-based care (MBC) used in school-based mental health treatment. MBC was selected as an under-implemented yet promising and scalable clinical practice in schools that can be added to any presenting concern or treatment plan to improve care quality for children and adolescents. We identified six top-rated implementation strategies for MBC based on ratings of importance and feasibility in schools. Those strategies were (1) assess for readiness and identify barriers and facilitators; (2) identify and prepare champions; (3) develop a usable implementation plan; (4) offer a provider-informed menu of free, brief measures; (5) develop and provide access to training materials; and (6) make implementation easier by removing burdensome documentation tasks.

These six strategies identified represent a natural chronology for organizing an implementation approach for clinical providers in schools. For example, several strategies could be put in place before an initial training or provision of training materials occurs (e.g., assess for readiness, develop an implementation plan) and others could follow. These strategies could also be provided as a “bundle” to support MBC implementation in schools.

Several additional strategies were rated as highly important and relatively feasible within 0.50 of the feasibility grand mean. In general, these strategies reflect those that promote ongoing implementation in clinical practice after initial planning and provider training, which is highly consistent with extant findings about the importance of post-training implementation support strategies [64,65,66]. As these strategies are near the “border” of feasibility and importance grand means, they warrant attention as potentially viable strategies, given the strictly numeric, bivariate cutoff between zones based on grand mean values.

Importance and feasibility ratings were not significantly different between providers and researchers, although future replication with a larger sample size is warranted. The few significant differences identified involved moderate to large effect sizes, with providers emphasizing the reduction of burdensome documentation and researchers emphasizing fidelity monitoring to support MBC in schools. These differences have face validity: providers have more experience than researchers with barriers related to documentation and other clinical workflow details, whereas researchers are more focused on ensuring the implementation is carried out as intended. These differences illustrate the importance of ensuring bidirectional communication, collaboration, and perspective sharing between these two stakeholder groups, and highlight the value of sampling varied stakeholder perspectives when examining implementation processes.

Also, by focusing specifically on MBC implementation in schools, the current results reveal a narrower and higher range of both importance and feasibility ratings for MBC implementation strategies in schools as compared to general EBP implementation (our importance range = 3.61–4.48 versus 2.62–4.59 in prior work, and our feasibility range = 2.55–4.06 versus 2.08–3.72 in prior work [54]). These differences suggest the value of tailoring implementation strategies to specific implementation settings and contexts, as was done in this study.


This study has several limitations. First, although this sample was nationally representative, it is relatively small, and thus importance and feasibility ratings may not hold for a larger sample. Degrees of freedom were further limited because feasibility and importance ratings were requested only when a participant responded that the strategy was relevant to MBC. Also, school providers were recruited from a national dataset of teams engaged in school mental health quality assessment and improvement efforts, and may therefore be a more select group of school mental health providers. Future studies should examine importance and feasibility ratings from a wider range of school mental health providers. A larger sample would also allow for better-powered analyses of school and provider characteristics (e.g., school size, provider characteristics in Table 1) as moderators of feasibility and importance ratings.

Also, we selected 33 implementation strategies already rated highly in a prior study of EBP implementation in schools, and thus we were unlikely to find mean importance or feasibility ratings in the low to moderate range. Although this may raise questions about potential ceiling effects, the grand means for each construct were not overly high (importance grand mean = 4.05; feasibility grand mean = 3.33), and we used the grand mean as the cut point for the sample (as is conventional for go-zone graphs) to interpret differences among ratings.
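The grand-mean cut points described above are simple to operationalize: a strategy lands in the “go zone” when its mean importance and mean feasibility both meet or exceed the respective grand means. The following Python sketch illustrates the quadrant classification; the strategy names and ratings are hypothetical, not the study data.

```python
def go_zone(ratings):
    """Classify strategies into go-zone quadrants using grand-mean cut points.

    ratings: dict mapping strategy name -> (importance, feasibility),
    each a mean rating on a 1-5 scale.
    Returns a dict mapping strategy name -> quadrant label; the "go zone"
    holds strategies at or above both grand means.
    """
    imp_mean = sum(i for i, f in ratings.values()) / len(ratings)
    fea_mean = sum(f for i, f in ratings.values()) / len(ratings)
    zones = {}
    for name, (imp, fea) in ratings.items():
        if imp >= imp_mean and fea >= fea_mean:
            zones[name] = "go-zone (high importance, high feasibility)"
        elif imp >= imp_mean:
            zones[name] = "high importance, low feasibility"
        elif fea >= fea_mean:
            zones[name] = "low importance, high feasibility"
        else:
            zones[name] = "low importance, low feasibility"
    return zones

# Hypothetical mean ratings for four strategies
ratings = {
    "identify champions": (4.5, 3.9),
    "monitor fidelity":   (4.4, 2.8),
    "remove paperwork":   (4.6, 4.0),
    "audit and feedback": (3.6, 3.1),
}
zones = go_zone(ratings)
```

Note that because the cutoff is the sample grand mean rather than an absolute criterion, strategies just below a cut point (the “border” cases discussed above) differ only trivially from those just inside the go zone.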

Finally, stakeholders’ qualitative feedback about the definition of each strategy was used to develop the final list that appears in Table 4, but recommendations about application of the strategies were not included. This is most pertinent to feasibility, and our team is currently examining these qualitative data to understand how we might optimize the feasibility of individual strategies that were rated highly important but less feasible (Awad M, Connors E: Promoting measurement-based care in school mental health through practice-specific supervision, submitted). Feasibility is a complex construct: many elements contribute to feasibility ratings for a given practice or strategy [67], and when perceptions of feasibility are assessed prospectively, the rater must make assumptions about, for example, what resource or training requirements are part of the strategy [7]. It is not uncommon for school stakeholders to rate implementation supports or best practices as more important than feasible, given their experience with resource constraints and structural barriers in schools [16, 68]. Therefore, future research should continue to examine how to operationalize, tailor, and evaluate strategies to promote feasibility in the school context, in order to support schools’ capacity to feasibly implement new initiatives with integrity and sustainability [33, 69].

Table 4 Final list of 21 implementation strategies and definitions for MBC in school mental health

Conclusion and future directions

Methods to select and tailor implementation strategies for a particular practice and setting have been somewhat elusive to date in implementation research and practice [5]. The methods used in this study can be applied to other evidence-based practices, settings, and contexts to solve implementation challenges. In addition, the effectiveness of implementation strategies selected for their potential importance and feasibility needs to be empirically examined. Identification of top-rated strategies for a particular intervention and context is foundational to future strategy testing with practicing providers in real-world care systems. Strategies selected using implementation science methods, such as the current survey methods with go-zone plots, should also be critically examined for the possibility of bundling or combining strategies (for parsimony and/or alignment), as well as for when to apply strategies across implementation stages.

Availability of data and materials

The datasets generated during and/or analyzed during the current study are not publicly available due to the pilot nature of the study and lack of patient outcome data but are available from the corresponding author on reasonable request.


  1. Vignettes refer to student “grade”, not age. In the United States, 3rd grade is in primary school (approximately 8 years old), 6th grade is middle school (approximately 11 years old), and 9th grade is the beginning of secondary school (approximately 14 years old).



Abbreviations

MBC: Measurement-based care
EBP: Evidence-based practice


References

  1. Hoagwood K, Burns BJ, Kiser L, Ringeisen H, Schoenwald SK. Evidence-based practice in child and adolescent mental health services. Psychiatr Serv. 2001;52(9):1179–89.

  2. Drake RE, Goldman HH, Leff HS, Lehman AF, Dixon L, Mueser KT, et al. Implementing evidence-based practices in routine mental health service settings. Psychiatr Serv. 2001;52(2):179–82.

  3. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009;4(1):50.

  4. Lyon AR, Ludwig K, Romano E, Koltracht J, Vander Stoep A, McCauley E. Using modular psychotherapy in school mental health: provider perspectives on intervention-setting fit. J Clin Child Adolesc Psychol. 2014;43(6):890–901.

  5. Powell BJ, Fernandez ME, Williams NJ, Aarons GA, Beidas RS, Lewis CC, et al. Enhancing the impact of implementation strategies in healthcare: a research agenda. Front Public Health. 2019;7:3.

  6. Proctor EK, Landsverk J, Aarons G, Chambers D, Glisson C, Mittman B. Implementation research in mental health services: an emerging science with conceptual, methodological, and training challenges. Administration Policy Mental Health Mental Health Serv Res. 2009;36(1):24–34.

  7. Proctor E, Silmere H, Raghavan R, Hovmand P, Aarons G, Bunger A, et al. Outcomes for implementation research: conceptual distinctions, measurement challenges, and research agenda. Administration Policy Mental Health Mental Health Serv Res. 2011;38(2):65–76.

  8. Waltz TJ, Powell BJ, Chinman MJ, Smith JL, Matthieu MM, Proctor EK, et al. Expert recommendations for implementing change (ERIC): protocol for a mixed methods study. Implement Sci. 2014;9(1):1–12.

  9. Powell BJ, Waltz TJ, Chinman MJ, Damschroder LJ, Smith JL, Matthieu MM, et al. A refined compilation of implementation strategies: results from the Expert Recommendations for Implementing Change (ERIC) project. Implement Sci. 2015;10(1):21.

  10. Powell BJ, Beidas RS, Lewis CC, Aarons GA, McMillen JC, Proctor EK, et al. Methods to improve the selection and tailoring of implementation strategies. J Behav Health Serv Res. 2017;44(2):177–94.

  11. Waltz TJ, Powell BJ, Matthieu MM, Damschroder LJ, Chinman MJ, Smith JL, et al. Use of concept mapping to characterize relationships among implementation strategies and assess their feasibility and importance: results from the Expert Recommendations for Implementing Change (ERIC) study. Implement Sci. 2015;10(1):1–8.

  12. Dopp AR, Parisi KE, Munson SA, Lyon AR. Integrating implementation and user-centred design strategies to enhance the impact of health services: protocol from a concept mapping study. Health Res Policy Syst. 2019;17(1):1–11.

  13. Cook CR, Lyon AR, Locke J, Waltz T, Powell BJ. Adapting a compilation of implementation strategies to advance school-based implementation research and practice. Prev Sci. 2019;20(6):914–35.

  14. Lewis CC, Powell BJ, Brewer SK, Nguyen AM, Schriger SH, Vejnoska SF, et al. Advancing mechanisms of implementation to accelerate sustainable evidence-based practice integration: protocol for generating a research agenda. BMJ Open. 2021;11(10):e053474.

  15. Ageberg E, Bunke S, Lucander K, Nilsen P, Donaldson A. Facilitators to support the implementation of injury prevention training in youth handball: a concept mapping approach. Scand J Med Sci Sports. 2019;29(2):275–85.

  16. Stormont M, Lewis TJ, Covington SS. Behavior support strategies in early childhood settings: teachers' importance and feasibility ratings. J Positive Behavior Interventions. 2005;7(3):131–9.

  17. Scott K, Lewis CC. Using measurement-based care to enhance any treatment. Cogn Behav Pract. 2015;22(1):49–59.

  18. Resnick SG, Oehlert ME, Hoff RA, Kearney LK. Measurement-based care and psychological assessment: using measurement to enhance psychological treatment. Psychol Serv. 2020;17(3):233–7.

  19. Krägeloh CU, Czuba KJ, Billington DR, Kersten P, Siegert RJ. Using feedback from patient-reported outcome measures in mental health services: a scoping study and typology. Psychiatr Serv. 2015;66(3):224–41.

  20. Lambert MJ, Whipple JL, Hawkins EJ, Vermeersch DA, Nielsen SL, Smart DW. Is it time for clinicians to routinely track patient outcome? A meta-analysis. Clin Psychol Sci Pract. 2003;10(3):288–301.

  21. Lambert MJ, Whipple JL, Kleinstäuber M. Collecting and delivering progress feedback: a meta-analysis of routine outcome monitoring. Psychotherapy. 2018;55(4):520.

  22. Shimokawa K, Lambert MJ, Smart DW. Enhancing treatment outcome of patients at risk of treatment failure: meta-analytic and mega-analytic review of a psychotherapy quality assurance system. J Consult Clin Psychol. 2010;78(3):298–311.

  23. Bergman H, Kornør H, Nikolakopoulou A, Hanssen-Bauer K, Soares-Weiser K, Tollefsen TK, Bjørndal A. Client feedback in psychological therapy for children and adolescents with mental health problems. Cochrane Database Syst Rev. 2018;8(8):CD011729.

  24. Kendrick T, El-Gohary M, Stuart B, Gilbody S, Churchill R, Aiken L, et al. Routine use of patient reported outcome measures (PROMs) for improving treatment of common mental health disorders in adults. Cochrane Database Syst Rev. 2016;7:CD011119.

  25. Bickman L, Rosof-Williams J, Salzer M, Summerfelt W, Noser K, Wilson S, et al. What information do clinicians value for monitoring adolescent client progress and outcomes? Prof Psychol Res Pract. 2000;31:70–4.

  26. Jensen-Doss A, Haimes EMB, Smith AM, Lyon AR, Lewis CC, Stanick CF, et al. Monitoring treatment progress and providing feedback is viewed favorably but rarely used in practice. Administration Policy Mental Health Mental Health Serv Res. 2018;45(1):48–61.

  27. Lewis CC, Boyd M, Puspitasari A, Navarro E, Howard J, Kassab H, et al. Implementing measurement-based care in behavioral health: a review. JAMA Psychiat. 2019;76(3):324–35.

  28. Bains RM, Diallo AF. Mental health services in school-based health centers: systematic review. J Sch Nurs. 2016;32(1):8–19.

  29. Farmer EM, Burns BJ, Phillips SD, Angold A, Costello EJ. Pathways into and through mental health services for children and adolescents. Psychiatr Serv. 2003;54(1):60–6.

  30. Duong MT, Bruns EJ, Lee K, Cox S, Coifman J, Mayworm A, et al. Rates of mental health service utilization by children and adolescents in schools and other common service settings: a systematic review and meta-analysis. Administration Policy Mental Health Mental Health Serv Res. 2021;48(3):420–39.

  31. Cummings JR, Ponce N, Mays VM. Comparing racial/ethnic differences in mental health service use among high-need subpopulations across clinical and school-based settings. J Adolesc Health. 2010;46(6):603–6.

  32. Wilk AS, Hu J-C, Wen H, Cummings JR. Recent trends in school-based mental health services among low-income and racial and ethnic minority adolescents. JAMA Pediatr. 2022.

  33. Langley AK, Nadeem E, Kataoka SH, Stein BD, Jaycox LH. Evidence-based mental health programs in schools: barriers and facilitators of successful implementation. School Mental Health. 2010;2(3):105–13.

  34. Barrett NM, Gill KJ, Pratt CW, Roberts MM. Psychiatric rehabilitation: Academic Press; 2013.

  35. Lindsey MA, Chambers K, Pohle C, Beall P, Lucksted A. Understanding the behavioral determinants of mental health service use by urban, under-resourced Black youth: adolescent and caregiver perspectives. J Child Fam Stud. 2013;22(1):107–21.

  36. Whitaker K, Nicodimos S, Pullmann MD, Duong MT, Bruns EJ, Wasse JK, et al. Predictors of disparities in access and retention in school-based mental health services. School Mental Health. 2018;10(2):111–21.

  37. Graczyk PA, Domitrovich CE, Zins JE. Facilitating the implementation of evidence-based prevention and mental health promotion efforts in schools. In: Weist MD, Evans SW, Lever NA, editors. Handbook of school mental health: advancing practice and research. Springer; 2003. p. 301–18.

  38. Tebes JK, Feinn R, Vanderploeg JJ, Chinman MJ, Shepard J, Brabham T, et al. Impact of a positive youth development program in urban after-school settings on the prevention of adolescent substance use. J Adolesc Health. 2007;41(3):239–47.

  39. Joosen MC, van Beurden KM, Terluin B, van Weeghel J, Brouwers EP, van der Klink JJ. Improving occupational physicians’ adherence to a practice guideline: feasibility and impact of a tailored implementation strategy. BMC Med Educ. 2015;15(1):1–12.

  40. Engel M, Bruns A, Hulscher M, Gaillard C, Sankatsing S, Teding van Berkhout F, et al. A tailored implementation strategy to reduce the duration of intravenous antibiotic treatment in community-acquired pneumonia: a controlled before-and-after study. Eur J Clin Microbiol Infect Dis. 2014;33(11):1897–908.

  41. Wensing M. The Tailored Implementation in Chronic Diseases (TICD) project: introduction and main findings. Implement Sci. 2017;12(1):1–4.

  42. Owens JS, Lyon AR, Brandt NE, Warner CM, Nadeem E, Spiel C, et al. Implementation science in school mental health: key constructs in a developing research agenda. School Mental Health. 2014;6(2):99–111.

  43. Weist MD, Youngstrom EA, Stephan S, Lever N, Fowler J, Taylor L, et al. Challenges and ideas from a research program on high-quality, evidence-based practice in school mental health. J Clin Child Adolesc Psychol. 2014;43(2):244–55.

  44. Connors EH, Arora P, Curtis L, Stephan SH. Evidence-based assessment in school mental health. Cogn Behav Pract. 2015;22:60–73.

  45. Ruble L, McGrew JH, Toland MD. Goal attainment scaling as an outcome measure in randomized controlled trials of psychosocial interventions in autism. J Autism Dev Disord. 2012;42(9):1974–83.

  46. Miller FG, Crovello NJ, Chafouleas SM. Progress monitoring the effects of daily report cards across elementary and secondary settings using Direct Behavior Rating: Single Item Scales. Assess Effect Intervent. 2017;43(1):34–47.

  47. Connors E, Lawson G, Wheatley-Rowe D, Hoover S. Exploration, preparation, and implementation of standardized assessment in a multi-agency school behavioral health network. Administration Policy Mental Health Mental Health Serv Res. 2021;48(3):464–81.

  48. Sander MA, Everts J, Johnson J. Using data to inform program design and implementation and make the case for school mental health. Adv School Mental Health Promot. 2011;4(4):13–21.

  49. Detterman R, Ventura J, Rosenthal L, Berrick K. Unconditional education: supporting schools to serve all students: Oxford University Press; 2019.

  50. Fletcher JM, Vaughn S. Response to intervention: preventing and remediating academic difficulties. Child Dev Perspect. 2009;3(1):30–7.

  51. Borntrager C, Lyon AR. Client progress monitoring and feedback in school-based mental health. Cogn Behav Pract. 2015;22(1):74–86.

  52. Bohnenkamp JH, Glascoe T, Gracey KA, Epstein RA, Benningfield MM. Implementing clinical outcomes assessment in everyday school mental health practice. Child Adolesc Psychiatric Clin. 2015;24(2):399–413.

  53. Waltz TJ, Powell BJ, Matthieu MM, Damschroder LJ, Chinman MJ, Smith JL, et al. Use of concept mapping to characterize relationships among implementation strategies and assess their feasibility and importance: results from the Expert Recommendations for Implementing Change (ERIC) study. Implement Sci. 2015;10(1):109.

  54. Lyon AR, Cook CR, Locke J, Davis C, Powell BJ, Waltz TJ. Importance and feasibility of an adapted set of implementation strategies in schools. J Sch Psychol. 2019;76:66–77.

  55. Baumann AA, Carothers BJ, Landsverk J, Kryzer E, Aarons GA, Brownson RC, et al. Evaluation of the implementation research institute: trainees’ publications and grant productivity. Administration Policy Mental Health Mental Health Serv Res. 2020;47(2):254–64.

  56. Barber J, Resnick SG. Collect, Share, Act: a transtheoretical clinical model for doing measurement-based care in mental health treatment. Psychol Serv. 2022.

  57. Sisodia RC, Rodriguez JA, Sequist TD. Digital disparities: lessons learned from a patient reported outcomes program during the COVID-19 pandemic. J Am Med Inform Assoc. 2021;28(10):2265–8.

  58. Hohlfeld TN, Ritzhaupt AD, Barron AE, Kemker K. Examining the digital divide in K-12 public schools: four-year trends for supporting ICT literacy in Florida. Comput Educ. 2008;51(4):1648–63.

  59. Boulkedid R, Abdoul H, Loustau M, Sibony O, Alberti C. Using and reporting the Delphi method for selecting healthcare quality indicators: a systematic review. PLoS One. 2011;6(6):e20476.

  60. Powell C. The Delphi technique: myths and realities. J Adv Nurs. 2003;41(4):376–82.

  61. Lyon AR, Pullmann MD, Whitaker K, Ludwig K, Wasse JK, McCauley E. A digital feedback system to support implementation of measurement-based care by school-based mental health clinicians. J Clin Child Adolesc Psychol. 2019;48(sup1):S168–S79.

  62. Lyon AR, Pullmann MD, Dorsey S, Martin P, Grigore AA, Becker EM, et al. Reliability, validity, and factor structure of the current assessment practice evaluation-revised (caper) in a national sample. J Behav Health Serv Res. 2019;46(1):43–63.

  63. Trochim W, Kane M. Concept mapping: an introduction to structured conceptualization in health care. Int J Qual Health Care. 2005;17(3):187–91.

  64. Herschell A, Kolko D, Baumann B, Davis AC. The role of therapist training in the implementation of psychosocial treatments: a review and critique with recommendations. Clin Psychol Rev. 2010;30(4):448–66.

  65. Valenstein-Mah H, Greer N, McKenzie L, Hansen L, Strom TQ, Wiltsey Stirman S, et al. Effectiveness of training methods for delivery of evidence-based psychotherapies: a systematic review. Implement Sci. 2020;15:1–17.

  66. Beidas RS, Edmunds JM, Marcus SC, Kendall PC. Training and consultation to promote implementation of an empirically supported treatment: a randomized trial. Psychiatr Serv. 2012;63(7):660–5.

  67. Richesson RL, Staes CJ, Douthit BJ, Thoureen T, Hatch DJ, Kawamoto K, et al. Measuring implementation feasibility of clinical decision support alerts for clinical practice recommendations. J Am Med Inform Assoc. 2020;27(4):514–21.

  68. Connors EH, Stephan SH, Lever N, Ereshefsky S, Mosby A, Bohnenkamp J. A national initiative to advance school mental health performance measurement in the US. Adv School Mental Health Promot. 2016;9(1):50–69.

  69. Forman SG, Olin SS, Hoagwood KE, Crowe M, Saka N. Evidence-based interventions in schools: developers’ views of implementation barriers and facilitators. School Mental Health. 2009;1(1):26–36.


Acknowledgements

We are grateful to Drs. Nancy Lever and John Landsverk for their consultation, support, and review of study design and materials. We also acknowledge the role of The SHAPE System, a national quality improvement system for school mental health in the USA, from which provider participants were recruited.


Funding

This publication was funded by the National Institute of Mental Health (K08MH116119). The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health.

Author information

Authors and Affiliations



Contributions

EHC developed the overarching scientific aims and design of the project. EHC led all aspects of the study under the close mentorship and guidance of JKT. ARL provided study design and methodological consultation specifically related to survey construction and go-zone plot analyses. ARL, SH, MW, and JKT supported the design and execution of participant recruitment. KG consented study participants. EHC, KG, and CS were involved in data collection, management, and analysis with consultation and input from all other co-authors. All authors contributed to the development, drafting, or review of the manuscript. All authors approved the final manuscript.

Corresponding author

Correspondence to Elizabeth H. Connors.

Ethics declarations

Ethics approval and consent to participate

All study procedures were reviewed and approved by the Yale University institutional review board. The study team conducted informed consent meetings via phone with prospective participants and received written/verbal consent consistent with IRB-approved procedures.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. The Creative Commons Public Domain Dedication waiver applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Connors, E.H., Lyon, A.R., Garcia, K. et al. Implementation strategies to promote measurement-based care in schools: evidence from mental health experts across the USA. Implement Sci Commun 3, 67 (2022).



Keywords

  • Implementation strategy selection
  • Measurement-based care
  • School mental health treatment