Coordination of sustainable financing for evidence-based youth mental health treatments: protocol for development and evaluation of the fiscal mapping process

Abstract

Background

Sustained delivery of evidence-based treatments (EBTs) is essential to addressing the public health and economic impacts of youth mental health problems, but is complicated by the limited and fragmented funding available to youth mental health service agencies (hereafter, “service agencies”). Strategic planning tools are needed that can guide these service agencies in their coordination of sustainable funding for EBTs. This protocol describes a mixed-methods research project designed to (1) develop and (2) evaluate our novel fiscal mapping process that guides strategic planning efforts to finance the sustainment of EBTs in youth mental health services.

Method

Participants will be 48 expert stakeholder participants, including representatives from ten service agencies and their partners from funding agencies (various public and private sources) and intermediary organizations (which provide guidance and support on the delivery of specific EBTs). Aim 1 is to develop the fiscal mapping process: a multi-step, structured tool that guides service agencies in selecting the optimal combination of strategies for financing their EBT sustainment efforts. We will adapt the fiscal mapping process from an established intervention mapping process and will incorporate an existing compilation of 23 financing strategies. We will then engage participants in a modified Delphi exercise to achieve consensus on the fiscal mapping process steps and gather information that can inform the selection of strategies. Aim 2 is to evaluate preliminary impacts of the fiscal mapping process on service agencies’ EBT sustainment capacities (i.e., structures and processes that support sustainment) and outcomes (e.g., intentions to sustain). The ten agencies will pilot test the fiscal mapping process. We will evaluate how the fiscal mapping process impacts EBT sustainment capacities and outcomes using a comparative case study approach, incorporating data from focus groups and document review. After pilot testing, the stakeholder participants will conceptualize the process and outcomes of fiscal mapping in a participatory modeling exercise to help inform future use and evaluation of the tool.

Discussion

This project will generate the fiscal mapping process, which will facilitate the coordination of an array of financing strategies to sustain EBTs in community youth mental health services. This tool will promote the sustainment of youth-focused EBTs.

Background

One in 5 children [1] and 1 in 2 adolescents [2] experience a mental health problem annually, leading to considerable distress and impairment with an associated economic burden of $247 billion [3]. Research has identified evidence-based treatments (EBTs) that show clinical and cost-effectiveness for youth mental health outcomes [4,5,6,7,8,9] and can be economically feasible to implement [10,11,12], offering an important but underused way to improve quality of care. Addressing the societal impact of youth mental health problems requires that US mental health service systems offer EBTs widely and consistently [3, 13, 14]. Although many youths with mental health problems receive some treatment [15], service providers often offer treatments of limited or unknown effectiveness [16,17,18,19]—especially to youth from marginalized racial and ethnic groups [20,21,22].

One way to address this research-practice gap is improved implementation—the adoption and integration of EBTs in clinical service settings [23]. EBT implementation requires considerable financial resources, and limited and fragmented funding is one of the most-cited barriers to successful implementation processes and outcomes [24,25,26,27,28]. Ongoing investments also are needed to promote sustainment [29,30,31], defined as continued use of an EBT with ongoing program and population benefits [32, 33]. Without sustained use, the public health impact of EBT implementation is limited [34], yet many youth-focused EBTs are difficult to sustain [35,36,37,38]. In response, our team is developing and evaluating a strategic planning tool for the financial sustainment of EBTs in youth mental health services: the fiscal mapping process.

Strategic planning to support EBT sustainment

The underlying premise of the fiscal mapping process is that the financial sustainment of EBT delivery requires youth mental health service agencies (hereafter, “service agencies”) to collaborate with their stakeholder partners [39,40,41] in order to navigate the complex, multi-level, and dynamic factors influencing sustainment [30, 38, 42, 43]. In the USA, service agencies include a variety of publicly and privately operated organizations (e.g., community mental health centers, hospitals, private organizations, children’s advocacy centers [44]). Service agencies’ stakeholder partners include (a) various third-party funding sources, including federal, state, and county agencies; public and commercial health insurance plans; and private foundations; and (b) intermediary organizations [45] that offer EBT implementation guidance and support to providers, for example, through expert training and supervision/consultation. Notably, third-party payors cover 87% of all US health care expenditures and, therefore, substantially influence service provider activities [46, 47].

Sustainable funding sources are not readily available for many activities that are essential to high-quality EBT delivery and commensurate reductions in youth mental health problems [43, 48,49,50,51]. Without funding, service agencies find it difficult to manage expenses for training and supervision/consultation, monitoring outcomes and EBT fidelity/adaptations, case management and care coordination, required resources and materials, and family- or group-based services [17,18,19, 24]. Support for direct service delivery traditionally comes from program budgets [38, 52, 53] and fee-for-service payments [54, 55], and these funds are often too limited to cover EBT delivery costs, let alone sustainment activities.

With the goal of promoting effective collaboration among US youth mental health service stakeholders, we grounded the fiscal mapping process in the Public Health Sustainability Framework [29, 56], which comprises eight core domains of sustainment capacities. The central domain in the framework is strategic planning (defined as processes that guide a program’s directions, goals, and strategies), which coordinates the other seven domains into an outcome-oriented plan. Although the fiscal mapping process focuses on the funding stability domain, our approach is informed by abundant evidence that funding stability relies on strategic planning capacities [29, 56,57,58]. Indeed, an agency may need to focus on building capacities in other domains—such as partnerships, political support, or communications—before funding stability is possible.

The importance of strategic planning to financial sustainment of EBTs is reinforced by theoretical work on the financing of public and non-profit private service organizations, showing that organizational success depends on the ability to identify and secure resources through diverse revenue streams (i.e., Resource Dependence Theory [25, 59, 60]) and that resource obtainment is influenced by relationships with the individuals and organizations that control those resources (i.e., Open Systems Theory [61, 62]). Numerous observational studies describe how service agencies often must engage in “creative financing” involving coordination of multiple funding sources to sustain EBTs [37, 48, 63, 64].

Tailored selection of financing strategies as a solution

Strategic planning requires a sufficient understanding of the options available to achieve a goal or solve a problem. Implementation strategies are methods or techniques used to enhance implementation and/or sustainment [65]; various efforts are underway to compile and describe these strategies [66,67,68]. In one effort, a national group of implementation and financing experts identified and defined 23 financing strategies [69] that can support EBT implementation and/or sustainment in behavioral health systems. Example strategies included increased fee-for-service reimbursement, contracts for EBTs, and cost-sharing. This comprehensive compilation of financing strategies offers a foundation for the fiscal mapping process.

A catalog of EBT financing strategies is helpful, but insufficient, when selecting the optimal combinations of strategies necessary to sustain a particular EBT. Increasingly, implementation science emphasizes “tailored selection” [70,71,72] whereby various strategies are considered, then matched to the goals, needs, and constraints of a given implementation effort. Evidence to date suggests that tailored strategies promote implementation and health outcomes better than non-tailored strategies [70, 72]. Methods of tailoring implementation strategies are in their infancy, but implementation experts [71, 73] recently identified Intervention Mapping [74, 75] as showing promise for pragmatically selecting implementation strategies.

Intervention mapping is a well-specified, multi-step method for developing interventions, or implementation strategies [71, 73, 76], based on theory, research evidence, and stakeholder perspectives. The use of intervention mapping to tailor implementation strategies has led to successful EBT implementation in both uncontrolled and controlled studies [77,78,79]. We are adapting this process as a multi-step, structured tool that guides youth mental health service agencies in strategic planning efforts to finance EBT sustainment—the fiscal mapping process. Table 1 outlines the proposed steps of the fiscal mapping process as derived from intervention mapping. Briefly, these steps involve (1) identifying resources needed for EBT implementation, (2) specifying funding objectives linked to those needs, (3) matching financing strategies to the funding objectives, (4) selecting and using the best-fit combination of financing strategy options to meet all objectives, and (5) monitoring financial viability over time. The goal of the fiscal mapping process is to help service agencies select the optimal combination of strategies for financing their EBT sustainment efforts (within their existing constraints).

Table 1 Proposed steps of the fiscal mapping process, as adapted from intervention mapping

Current project

This project will develop and evaluate the fiscal mapping process with key stakeholder input from youth mental health service agencies and their funding agency and EBT intermediary partners. The development process involves stages of feedback and revision aimed at gaining consensus on key fiscal mapping process steps. We will evaluate the preliminary impact of the fiscal mapping process through pilot-testing with ten youth mental health service agencies.

Participating agencies will pilot-test the fiscal mapping process with one of two widely disseminated EBTs for high-priority youth mental health problems: disruptive behavior problems and traumatic stress. Both of these clinical concerns have high prevalence rates (10–20%) [2] and result in severe personal, societal, and economic consequences well into adulthood if untreated [80,81,82]. The EBTs are parent-child interaction therapy (PCIT) [83] and trauma-focused cognitive-behavioral therapy (TF-CBT) [84]. PCIT is an EBT for youth ages 2–7 with disruptive behavior problems and their caregivers. It focuses on parent skill training in conjoint caregiver-child sessions, emphasizing positive interaction skills and effective discipline skills. TF-CBT is an EBT for youth ages 3–18 with traumatic stress symptoms. It is an exposure-based treatment that focuses on processing the traumatic experience and correcting problematic trauma-related beliefs, with sessions typically divided into youth, caregiver, and combined portions. There is extensive evidence for the clinical and cost-effectiveness of PCIT [85, 86] and TF-CBT [11, 87, 88]. By pilot-testing with two EBTs, we sought to promote the generalizability of the resulting fiscal mapping process.

This project is situated at the critical intersection of strategic planning for EBT sustainment, financing strategies, and youth mental health services. We will bring together knowledge from these three areas to develop and evaluate the fiscal mapping process.

Method

We followed the Standards for Reporting Implementation Studies [89] (StaRI; see Additional file 1) for describing our project. All procedures were reviewed by the RAND Corporation Institutional Review Board and determined to not constitute human subjects research (Protocol #2020-N0607); nevertheless, we will follow all ethical principles for the protection of human research participants to minimize any risk of harm.

Research design

Figure 1 summarizes our approach to developing (Aim 1) and evaluating (Aim 2) the fiscal mapping process. These aims have distinct designs, but will be completed concurrently over a 2-year period and inform each other throughout. Overall, we will use a mixed-methods [90] approach that examines the convergence between qualitative and quantitative data to provide an in-depth understanding of the fiscal mapping process.

Fig. 1 Overview of the research design for developing and evaluating the fiscal mapping process

Aim 1 is to develop the fiscal mapping process by adapting the intervention mapping process [74] and incorporating our compilation of financing strategies [69]. We will use a modified Delphi technique [91] to obtain formative stakeholder feedback. Delphi is a structured approach to group decision-making, and previous research has established its use for developing consensus about implementation strategies [66]. Sub-aims are to (1a) achieve consensus among our participants—through two web-based survey rounds followed by a round of live, virtual voting—on the key steps of the fiscal mapping process, while (1b) incorporating additional information into the financing strategy compilation to more fully inform strategy selection.

Aim 2 is to evaluate the preliminary impact of the fiscal mapping process. Our 2-year timeline is too short to observe sustainment trajectories, so we will instead focus on short-term factors related to EBT sustainment. Specifically, we will (2a) examine EBT sustainment capacities (e.g., for strategic planning) [29, 56] and outcomes (e.g., intentions to sustain) at the ten pilot-testing service agencies using a comparative case study approach [92, 93]. Each agency that pilot-tests the fiscal mapping process will be considered a case, and we will draw on multiple data sources (i.e., surveys, focus groups, document review, field notes) to compare and contrast experiences across agencies. Following pilot-testing, participants will contribute to a conceptual model of fiscal mapping’s process and outcomes through a participatory modeling exercise [94].

Project timeline

The project began in February 2021, focusing first on recruitment and developing the initial fiscal mapping process prototype. When we completed this protocol in October 2021, we had finished recruitment and were conducting initial training with participating agencies; pilot-testing and data collection will take place over the subsequent 12 months. We will iteratively analyze data and incorporate it into the fiscal mapping process throughout pilot-testing, with the goal of finalizing the tool by the end of the project period (January 2023). This timeline coincides with the COVID-19 pandemic, but all project activities were planned to be conducted virtually, which helped to minimize disruption.

Participant and site recruitment

We have recruited 48 expert stakeholder participants, representing key roles in US youth mental health services, and we will engage them in all phases of the project. Stakeholder involvement is critical to producing research evidence relevant to those who deliver and fund EBTs [95,96,97,98]. Our recruitment plan is grounded in comparative case study methods [92, 93], using rigorous sampling to maximize the representativeness of small samples when random sampling is not feasible or effective [99, 100]. The cases are the ten service agencies, each represented by service agency representatives and their EBT intermediary and funding agency partners. Experts recommend recruiting approximately ten cases for subtle between-case comparisons [93], and representing multiple perspectives from each case [92]. For our sample, each agency will contribute up to 3 participants per stakeholder group. The resulting sample will allow us to use a variety of research methods, including—but also well beyond—case study methods.

EBT intermediary representatives

To begin recruitment, members of the research team nominated EBT intermediary organization representatives with expertise in the high-fidelity implementation and sustainment of PCIT or TF-CBT. We met our goal of recruiting 12 intermediary representatives; of the 12 enrolled intermediaries, five had expertise in PCIT, four in TF-CBT, and three in both models.

Youth mental health service agencies (cases)

Using snowball sampling [99, 100], intermediary representatives nominated service agencies with whom they worked to implement PCIT or TF-CBT in the past 5 years. We invited those agencies to apply to join the project and enclosed a detailed information guide about the project with each invitation. Our nomination and application process collected detailed quantitative and qualitative data about each agency from three stakeholder groups, which is ideal for rigorous case selection [100, 101] and will guide later comparative case study analyses. We received 45 service agency nominations and an additional six referrals from the nominated agencies, for a total of 51 nominees.

We found that youth mental health service agencies benefitted from technical support before submitting an application. The principal investigator often met with agency representatives to engage them in the project and discuss key decisions, such as which service agency representatives should participate or which EBT would most benefit from the fiscal mapping process. We used purposive sampling [93, 99, 100] to prioritize cases for recruitment that provided a representative range of agencies, allowing for useful comparisons within and across our two EBT models of interest while providing adequate representation of service agency and funding agency participants. We recruited cases based on important characteristics of EBTs (e.g., use of PCIT vs. TF-CBT vs. both, use with racial/ethnic minority and low-income populations), agencies (e.g., type of agency, rural/urban service area, size), and funding contexts (e.g., state/region, service-funding agency partnerships). To ensure a clear focus on sustainment, agencies were required to have fully implemented the EBT of focus with at least one clinician.

In the application, agencies also contributed to our snowball sampling recruitment approach by nominating stakeholders involved in their EBT sustainment efforts to participate, including representatives from the service agency and from partner funding agencies. We then followed up with nominated individuals to verify their interest in participating (prior to finalizing an agency’s selection) and to gather demographic information.

Ultimately, 12 agencies submitted applications to join the project, of which ten were selected to pilot-test the fiscal mapping process. Four of the participating agencies chose to focus on PCIT for the pilot test and the other six on TF-CBT. The two agencies that applied but were not selected both had difficulty identifying service agency and/or funding agency representatives with the capacity to participate in the project (i.e., nominees from the application did not follow through with enrollment).

Youth mental health service agency representatives

Service agencies nominated personnel who had expertise and oversight regarding the financial aspects of EBT implementation and sustainment at the agency. We sought to recruit at least 18 service agency representatives; nominated representatives were typically willing to participate once their service agency had committed, and we ultimately enrolled 24 service agency representatives. Most were in an agency leadership role (e.g., CEO, Chief Financial Officer, Vice President), a clinical administration role (e.g., clinical director, program supervisor), and/or a financial administration role (e.g., grants administration, development officer).

Funding agency representatives

Service agencies also nominated representatives from funding agencies that supported their EBT of focus in the past 5 years. Although we sought to recruit 18 funding agency representatives, service agencies reported it was challenging to identify funders who were willing to participate in this study. For example, some funding agencies had policies that precluded staff participation in research. Therefore, we concluded recruitment after enrolling 12 funding agency representatives, as this was equivalent to the number of intermediary participants and (given the higher-than-expected service agency representative enrollment) achieved the overall recruitment goal of 48 participants. The funding agency representatives came from a diverse range of organizations including state and tribal agencies, private foundations, and managed care.

Pilot-testing activities

Pilot testing will provide service agency representatives with hands-on experience that can inform ongoing refinements of the fiscal mapping process. As a supplement to StaRI, here we follow the Template for Intervention Description and Replication [102] (TIDieR; see Additional file 2) when describing the fiscal mapping process and associated activities.

Fiscal mapping process tool

The research team created an initial prototype of the fiscal mapping process (version 1.0) for pilot-testing. The prototype format is an Excel workbook, and it is structured to clearly indicate what information should be entered to complete each step, but also flexible enough to accommodate agencies’ varied strategic planning goals and capture important contextual factors in each step. After specifying the focus of a given fiscal map (EBT, sites, etc.), the user completes the five fiscal mapping process steps: (1) resources needed, (2) funding objectives, (3) financing strategies, (4) fiscal map of EBT, and (5) monitoring plan (see Table 1). A resource tab accompanies each step with other materials useful for completing the step. For example, Step 1 resources include information about EBT time and cost models that help identify resource needs [103] and Step 3 resources summarize the aforementioned compilation of 23 financing strategies for behavioral health [69]. Each resource tab also includes a completed example of the associated step with a hypothetical service agency.
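As a purely schematic illustration of the workbook's structure (the actual tool is an Excel workbook, not software; the names below simply restate the steps from Table 1):

```python
# Hypothetical schematic of the fiscal mapping process workbook (version 1.0).
# Each step has a companion resource tab and a completed hypothetical example.
FISCAL_MAPPING_STEPS = {
    1: "Resources needed",      # e.g., informed by EBT time and cost models
    2: "Funding objectives",    # objectives linked to identified resource needs
    3: "Financing strategies",  # drawn from the compilation of 23 strategies
    4: "Fiscal map of EBT",     # best-fit combination of strategies selected
    5: "Monitoring plan",       # tracking financial viability over time
}

for number, step in FISCAL_MAPPING_STEPS.items():
    print(f"Step {number}: {step}")
```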

Initial training

We will provide a 3-h virtual training to the representatives from each pilot-testing service agency via Microsoft Teams. The agenda includes (a) introductions and project overview (30 min); (b) step-by-step instructions for using the fiscal mapping process, including ample hands-on discussion about completing the tool’s steps for the service agency (2 h); and (c) plans for coaching calls and data collection activities (30 min); regular breaks are included. We will promote engagement in the training through a practical, applied focus that allows agency representatives to leave training with an in-progress fiscal map and concrete next steps for using the tool. We will video-record each training session and give the service agency representatives access to the recording if desired. Two coaches (the principal investigator and project manager) will lead trainings for 5 agencies each; the other coach will attend to provide technical support and record detailed field notes. Both coaches have training in mental health service delivery (clinical psychologist and social worker, respectively) and EBT implementation.

Monthly coaching

To facilitate the use of the fiscal mapping process, each coach will provide monthly coaching sessions for 1 year with the service agencies for which they led training. Coaching sessions will be brief (~ 15 min per month) and focus on answering the service agency representatives’ practical questions about applying the fiscal mapping process. Prior to each coaching call, the coach will send a structured email inquiry asking representatives to specify (a) which fiscal mapping process steps they have worked on; (b) key areas they wish to prioritize for coaching, such as working toward completion of certain steps or deciding how to share conclusions with stakeholders; and (c) any desired modifications to the session format, like extending the session length or inviting stakeholders to join. The coach will also be available for as-needed consultation outside of the scheduled coaching calls; thus, rather than limiting coaching to 15 min, the use of this brief model provides a sustainable way to maintain monthly coach-agency contact for the duration of pilot-testing. Coaches will record field notes about the frequency, length, modality, and content of each coaching contact in a detailed logbook.

Plans to address adaptation and fidelity

Throughout the pilot-testing year, we will incorporate feedback from the 48 stakeholder participants into refinements of the fiscal mapping process. If there are major changes to the tool (Version 2.0, 3.0, etc.), then we will re-distribute it to participating agencies and provide additional guidance or training as needed. Thus, we will initially prioritize the adaptability of the fiscal mapping process while we incorporate stakeholder perspectives into the tool. Over time, we will develop a fidelity checklist of core fiscal mapping process steps that coaches can use and that can guide fidelity assessments in subsequent evaluations of the strategy.

Data collection activities and measures

We will collect a mix of quantitative and qualitative data from the expert stakeholder participants for all project aims (see Fig. 1). Data will be collected using secure web-based programs: SelectSurvey for surveys and Microsoft Teams or Zoom.gov video-conference for the focus groups, webinar, and training/coaching activities. We will not collect personally identifiable information; we will assign each participant a unique, anonymous identification number to identify their data. Table 2 provides a summary of each data collection activity, including the timeframe, measures used, participants involved, compensation amount, and relevant aims.

Table 2 Data collection activities for developing and evaluating the fiscal mapping process

Surveys

The modified Delphi [91] (Aim 1a) will begin with two rounds of feedback on the fiscal mapping process via online surveys administered 6 months apart. Each online survey will provide (a) a detailed description of each step of fiscal mapping; (b) a text box for comments, concerns, or proposed changes to each description; and (c) a text box to offer additional or alternative steps for the fiscal mapping process.

We will also incorporate feedback into the compilation of financing strategies [69] (Aim 1b) through two follow-up surveys (one in each of the first two Delphi rounds). In these follow-up surveys, service agency representatives will supply additional information about their agencies to contextualize the feedback. In the first survey, the expert participants will review the compilation and provide (a) quantitative ratings of each strategy’s relevance to youth mental health services, (b) qualitative feedback on each strategy, and (c) suggestions for additional financing strategies. Service agency representatives will provide ratings, using validated scales, of the agency’s implementation climate (Implementation Climate Scale [105]) and financial status for EBT implementation (Agency Financial Status Scales [104]). In the second survey, participants will provide ratings of each strategy’s availability in their funding environment, level of suitability for funding different implementation activities, feasibility, and effectiveness. Service agency representatives will also rate each strategy’s contribution to their funding for EBT sustainment (percentage of total funding over the last 3 years).

Each survey (Delphi + follow-up) is expected to take approximately 30 min. Participants will receive a $30 electronic gift card for each completed survey.

Focus groups

About 3 months after each survey, we will conduct a virtual focus group with each service agency. A given focus group will include one service agency’s representatives; the funding agency representative(s) nominated by the service agency; and an intermediary with expertise in the EBT of focus for pilot-testing (ideally, but not necessarily, the intermediary who nominated the agency). During the focus group, participants will discuss the service agency’s experience with pilot-testing the fiscal mapping process and how using the tool has impacted EBT sustainment capacities (from the Public Health Sustainability Framework [29]; especially financial stability and strategic planning) and outcomes. The groups will also discuss key characteristics of the EBT, agency, and funding context that influence the fiscal mapping process. Each agency’s focus group will be led by the coach who does not conduct that agency’s coaching sessions (to avoid demand effects). Focus groups will be supported by a research assistant who will take detailed notes, and will be audio-recorded for later analysis.

Each focus group is expected to take approximately 1 h, and participants will receive a $50 electronic gift card as compensation. Afterwards, participants will complete a brief web-based survey rating (a) the agency’s capacity for sustaining the chosen EBT using the Program Sustainability Assessment Tool, a measure of Public Health Sustainability Framework domains [56]; (b) extent of EBT sustainment using the three-item Provider REport of Sustainment Scale [106]; and for service agency representatives only (c) intentions to sustain the EBT over the next year. The focus group audio-recordings will be transcribed, with any identifying information removed, and destroyed once the analysis is complete.

Document review

To provide additional insights into the use of the fiscal mapping process, we will also collect and review relevant documents, such as agencies’ draft or final fiscal mapping process tools or information obtained from EBT intermediary and funding agency partners that informed completion of the tool. This method can provide useful insights into complex systems-level processes when interpreted alongside other qualitative and quantitative data [107]. We will identify relevant documents during the focus group discussions and coordinate with service agencies so they can share as much as they are comfortable sharing (establishing data use agreements and secure file transfers as needed).

Webinar: consensus voting and participatory modeling

At the end of pilot-testing, we will invite all 48 participants to participate in a 2-h webinar. Two data collection activities will be completed during the webinar: consensus voting for the final Delphi round (Aim 1a) and a participatory modeling exercise (Aim 2b). The two fiscal mapping process coaches will serve as facilitators.

The final Delphi round will be a live voting and consensus process. The facilitators will present each step of the process for voting, with associated comments and alternative specifications (if applicable). We will use the US Senate benchmark for a supermajority to end debate (≥ 60%) [108] for indicating consensus, as in a prior Delphi for implementation strategies [66]. We will attempt to identify consensus on a step using approval votes (i.e., for all acceptable options) before moving on to “run-off” voting, as this is the most efficient and “sincere” (i.e., strategy-proof) form of voting [109]. If consensus is not reached after run-off voting, the original description of the step will be retained. Throughout voting, participants can make comments in the chat or virtually “raise their hand” to make verbal comments for 1 min at a time. We will keep a record of the webinar polls used to count votes.
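Purely as an illustration of the consensus rule (the protocol itself uses live webinar polls, not software; the function name and ballot format are hypothetical), approval votes with a ≥ 60% supermajority threshold could be tallied as follows:

```python
from collections import Counter

CONSENSUS_THRESHOLD = 0.60  # US Senate supermajority benchmark used in the Delphi

def approval_consensus(ballots, threshold=CONSENSUS_THRESHOLD):
    """Each ballot is the set of options a participant finds acceptable.

    Returns the option(s) approved by at least `threshold` of voters;
    an empty list means no consensus, which would trigger run-off voting.
    """
    n_voters = len(ballots)
    approvals = Counter(option for ballot in ballots for option in set(ballot))
    return [opt for opt, count in approvals.items() if count / n_voters >= threshold]

# Example: five participants vote on three alternative wordings of a step
ballots = [{"A", "B"}, {"A"}, {"A", "C"}, {"B"}, {"A"}]
print(approval_consensus(ballots))  # → ['A'] (approved by 4/5 = 80% of voters)
```

If no option clears the threshold, the group would proceed to run-off voting as described above.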

In the second portion of the webinar, participants will complete a participatory modeling exercise in which they conceptualize the process and outcomes of fiscal mapping. Participatory modeling is a technique from systems science that guides a group of stakeholders through the creation of a conceptual model of systems structures [94]. The facilitators will guide participants’ identification of actors, activities, outcomes, and contextual factors involved in each step of the process and solicit ideas for how to best evaluate changes in these factors. We will use the whiteboard function to illustrate the participants’ conceptual model in real-time as the discussion proceeds. To help make the discussion more engaging, we will solicit feedback through diverse channels including webinar polls, word clouds, chat box (including an anonymous option), and annotation on the whiteboard.

We will video-record the entire webinar to allow for a detailed record of the activities; the recording will be destroyed once the analysis is complete. Attendees will each receive a $100 electronic gift card.

Field notes

As noted previously, coaches will log detailed field notes during training and coaching activities. In addition to being useful for the coaching process, these notes can be analyzed later for research purposes. The content of field notes will be most relevant for capturing service agency feedback on the fiscal mapping process (Aim 1b) and offering another source of insights into agencies’ experiences with the process and its outcomes (Aim 2a).

Analysis plan

Our analytic approach is grounded in mixed methods, which is standard practice for implementation research [90]. Mixed methods involve combining quantitative data (Delphi votes, standardized scales) and qualitative data (e.g., focus group notes and transcripts, open-ended survey responses, document review, field notes) to gain higher-level insights that would not be possible through the use of either approach in isolation.

Initial data processing

We will calculate descriptive statistics for quantitative measures. For qualitative data, we will use rapid content analysis [110, 111] to distill major themes from a given data source. Rapid content analysis is ideal for synthesizing actionable conclusions from qualitative data to inform implementation activities, and it can be applied to a variety of written data sources (including documents and logs [107]). Qualitative themes will be critical for interpretation, given that our small sample precludes complex quantitative analyses. We will also calculate internal consistency reliability for each scale and compare quantitative and qualitative results as a validity check.
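
As a concrete illustration of the internal consistency check mentioned above, the sketch below computes Cronbach’s alpha, the conventional coefficient for this purpose. Note that this is an assumption for illustration: the protocol does not commit to a specific reliability statistic, and the data shown are invented.

```python
# Minimal sketch of an internal consistency check using Cronbach's alpha.
# The choice of coefficient and the example data are illustrative assumptions.
import numpy as np


def cronbach_alpha(items):
    """items: 2-D array-like, rows = respondents, columns = scale items."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                         # number of items in the scale
    item_vars = items.var(axis=0, ddof=1)      # sample variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of respondents' total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)


# Example: 5 respondents rating a 3-item scale (invented data)
scores = [[4, 5, 4], [2, 3, 2], [5, 5, 4], [3, 3, 3], [1, 2, 1]]
print(round(cronbach_alpha(scores), 2))  # 0.98, indicating high internal consistency
```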

Aim 1: development

We will organize the quantitative and qualitative survey data (from Aims 1a and 1b) into response matrices, which will guide team discussions about how to incorporate stakeholder feedback into the fiscal mapping process. The matrices represent a mixed-methods convergence function [90], where cells will summarize the overlap between qualitative and quantitative feedback across different dimensions (e.g., EBT models, stakeholder types) to help identify key priorities. For example, we might make refinements to the prototype by adding, removing, or refining the steps; we might also incorporate additional resources, including summaries of survey ratings on the financing strategy compilation. Ultimately, we will produce a well-specified fiscal mapping process with consensus on the key steps involved [91, 108].

Aim 2: evaluation

Our evaluation will primarily rely on the comparative case study approach [92, 93], synthesizing all available quantitative and qualitative data for in-depth insights into each case (i.e., youth mental health service agency that pilot-tested the fiscal mapping process). This approach involves creating descriptive summaries of the role of the fiscal mapping process in EBT sustainment capacities and outcomes at each agency, clearly identifying the contributions of different qualitative and quantitative measures to the conclusions drawn. We will then compare and contrast the ten pilot-testing agencies based on the key characteristics in the sampling plan. At various points in the analysis, a given pair of agencies may be grouped together or contrasted, depending on the characteristic being considered. We will also consider differences in perspective among the three stakeholder participant groups (service agency, funding agency, intermediary). Statistical power is limited, but we will examine whether quantitative data follow expected contrasts and patterns over time, such as more effective use of fiscal mapping at agencies with higher and/or increasing strategic planning capacities. We will lean heavily on qualitative data to ensure accurate interpretation and maximize depth of understanding.

To complement our comparative case studies, we will analyze the participatory modeling exercise results to create an overarching conceptual model of the fiscal mapping process that can guide future evaluation. Following the webinar, the project team will review the exercise results and create a system dynamics diagram [94] representing the conceptual model that the participants generated. The system dynamics diagram will specify actors, activities, outcomes, and contextual factors for each step of the fiscal mapping process, providing a visual representation of the complex interactions and feedback loops involved in EBT financing decisions. For specified outcomes of each step, we will also note key indicators for evaluating success. Finally, we will use the conceptual model to expand the Public Health Sustainability Framework [29]—which describes key capacity domains but is silent on how to evaluate their impact—so that the framework can guide prospective evaluations of the fiscal mapping process and other approaches targeting EBT sustainment.

Discussion

This study will generate a novel fiscal mapping process, an innovative tool that will help service agencies identify and coordinate financing strategies for sustaining youth EBTs. Our research process and outputs will integrate existing knowledge from strategic planning for EBT sustainment, financing strategies, and youth mental health services in a stakeholder-friendly format. By rigorously developing and evaluating a strategic planning tool for EBT sustainment strategies [70,71,72], this project has great potential to improve sustainment outcomes. This is a complex undertaking, but our mixed-methods approach will integrate qualitative and quantitative data (i.e., surveys, focus groups, document review, field notes, Delphi method, and systems science) into an in-depth, comprehensive understanding of the fiscal mapping process.

This work is a unique effort to consider the important role of financing systems within efforts to support EBT implementation and sustainment. Consideration of financing systems introduces many challenges to the use of implementation strategies, including the need to coordinate strategic planning efforts among service delivery and financing agencies. Agencies will navigate how to maximize their prospects for EBT sustainment through a balance of (a) cultivating a diverse range of reliable funding sources while (b) keeping the entire process feasible to manage. These efforts will almost certainly involve additional sustainment capacities from the Public Health Sustainability Framework [29], such as agencies’ communications with stakeholders or partnerships in their communities. In fact, service agencies may need to advocate for funders to offer new financing strategies before they can realistically cover EBT sustainment costs. Service agencies may find it useful to present their fiscal map to stakeholders when communicating about gaps in current funding and priorities for future support.

We anticipate that various audiences will be interested in the broader implications of the knowledge we generate about financing strategies, and the methodological advances that we bring to this area of study. Potential audiences include state and federal behavioral health administrators, policymakers, youth mental health treatment organizations, and researchers in fields like implementation, health policy, and public finance. In addition to sharing new understanding, we also view dissemination efforts as an opportunity to collaboratively generate further knowledge with additional stakeholders. We are particularly interested in understanding when and how the fiscal mapping process should be introduced to youth mental health service agencies. For example, there may be advantages and disadvantages to introducing the tool earlier versus later in the EBT implementation process, or in having the tool introduced by EBT intermediaries (e.g., trainers), a neutral third party, or funding agencies. To date, implementation research has paid little attention to how implementation efforts should integrate strategies that target both practice-specific capacities (e.g., knowledge, skills) and capacities that support EBPs generally, such as the fiscal mapping process. We expect that conversations with stakeholders will be the ideal first step in exploring such decisions. Moreover, as we continue to develop and evaluate the fiscal mapping process, we will seek funding and opportunities to incorporate feedback from youth and family stakeholders as well; although they are not envisioned as users of the fiscal mapping process, we believe youth and families should have a voice in setting the broader strategic priorities that agencies pursue.

Beyond the implications for EBT financing, this project will provide an innovative advance in implementation research methods by expanding the Public Health Sustainability Framework [29] for use in the evaluation of EBT sustainment strategies (see Aim 2b). This expansion is an important step toward evaluating the effects of implementation strategies on long-term sustainment and health outcomes in future work. Few implementation research frameworks currently focus on sustainment [30, 112], and even fewer were designed for evaluation purposes [113, 114]. Even more broadly, our research on financing strategies may produce useful insights into other strategies operating in the outer setting [26, 48, 115], such as policies mandating EBPs [17]. Better understanding of implementation strategies that can support sustainment capacities and address systems-level issues (like financing) promises to improve the implementation, sustainment, and ultimate public health impact of youth mental health EBTs.

In keeping with best practices in policy dissemination [116, 117], we will maximize the impact of our dissemination efforts through strategies such as framing the presentation of results in ways that highlight their relevance to various stakeholder audiences, or sharing our results with intermediary organizations (e.g., mental health advocacy organizations, National Association of State Mental Health Program Directors) that have trusted relationships with administrators and policymakers. We also plan to make the fiscal mapping process tool available for public use, should our findings suggest that youth mental health service systems would benefit.

We recognize that most results derived from this pilot study will be preliminary and exploratory. This is especially so because our 2-year timeframe is not adequate for examining long-term effects on sustainment. If our evaluation outcomes are promising, we anticipate following up with a large-scale, randomized implementation-effectiveness trial [118] to rigorously test the impact of the fiscal mapping process—and its mechanisms—on EBT sustainment and fidelity outcomes, while monitoring clinical outcomes (i.e., mental health symptoms). It will also be important to test the generalizability of the fiscal mapping process with multiple EBTs and with agencies not involved in its development, as the results of this project will be limited to sustaining PCIT and TF-CBT in youth mental health services. We anticipate further testing with youth mental health EBTs would be the next step, but we may need to expand the fiscal mapping process into other service sectors (e.g., schools, child welfare, primary care) and populations (e.g., parents, prevention with at-risk populations) to impact youth mental health at a population level. Of course, additional development and evaluation work will be necessary to confirm whether the fiscal mapping process is beneficial in different contexts.

Conclusions

In sum, this project will develop the fiscal mapping process and evaluate its promise for promoting the financial sustainment of EBTs within youth mental health service agencies. The goal throughout will remain to help direct resources where they are most needed to support effective practices and promote health—particularly among our society’s most vulnerable and under-resourced communities.

Availability of data and materials

Data sharing is not applicable to this article as no datasets were generated or analyzed during the current study. In the future, we will share quantitative data collected using standardized scales through the National Institute of Mental Health Data Archive (https://nda.nih.gov/), a collaborative informatics system created by the National Institutes of Health to provide a national resource to support and accelerate research in mental health. Future publications will also detail other materials that may be shared, such as the fiscal mapping process tool and supportive documentation, as deemed appropriate based on our findings.

Abbreviations

EBT:

Evidence-based treatment

PCIT:

Parent-child interaction therapy

StaRI:

Standards for Reporting Implementation Studies

TF-CBT:

Trauma-focused cognitive-behavioral therapy

TIDieR:

Template for Intervention Description and Replication

References

1. Costello E, Angold A. Developmental epidemiology. In: Cichetti D, editor. Developmental Psychology, Vol. 1: theory and method. 3rd ed. New York, NY: Wiley; 2016. p. 94–128.

2. Merikangas KR, He JP, Burstein M, et al. Lifetime prevalence of mental disorders in U.S. adolescents: results from the national comorbidity survey replication-adolescent supplement (NCS-A). J Am Acad Child Adolesc Psychiatry. 2010. https://doi.org/10.1016/j.jaac.2010.05.017.

3. O’Connell M, Boat T, Warner K. Preventing mental, emotional, and behavioral disorders among young people: progress and possibilities. Washington, DC: National Academies Press; 2009.

4. Beecham J. Annual research review: child and adolescent mental health interventions: a review of progress in economic studies across different disorders. J Child Psychol Psychiatry Allied Discip. 2014. https://doi.org/10.1111/jcpp.12216.

5. Kieling C, Baker-Henningham H, Belfer M, et al. Child and adolescent mental health worldwide: evidence for action. Lancet. 2011. https://doi.org/10.1016/S0140-6736(11)60827-1.

6. Masters R, Anwar E, Collins B, Cookson R, Capewell S. Return on investment of public health interventions: a systematic review. J Epidemiol Community Health. 2017. https://doi.org/10.1136/jech-2016-208141.

7. McDaid D, Park A-L, Knapp M, et al. Making the case for investing in child and adolescent mental health: how can economics help? Int J Ment Health Promot. 2010. https://doi.org/10.1080/14623730.2010.9721824.

8. Chorpita BF, Daleiden EL, Ebesutani C, et al. Evidence-based treatments for children and adolescents: an updated review of indicators of efficacy and effectiveness. Clin Psychol Sci Pract. 2011. https://doi.org/10.1111/j.1468-2850.2011.01247.x.

9. Weisz JR, Kazdin A. Evidence-based psychotherapies for children and adolescents. 3rd ed. New York, NY: Guilford Press; 2017.

10. Dopp AR, Coen AS, Smith AB, et al. Economic impact of the statewide implementation of an evidence-based treatment: multisystemic therapy in New Mexico. Behav Ther. 2018. https://doi.org/10.1016/j.beth.2017.12.003.

11. Dopp AR, Hanson RF, Saunders BE, et al. Community-based implementation of trauma-focused interventions for youth: economic impact of the learning collaborative model. Psychol Serv. 2017. https://doi.org/10.1037/ser0000131.

12. Okamura KH, Benjamin Wolk CL, Kang-Yi CD, et al. The price per prospective consumer of providing therapist training and consultation in seven evidence-based treatments within a large public behavioral health system: an example cost-analysis metric. Front Public Heal. 2016. https://doi.org/10.3389/fpubh.2017.00356.

13. Kazak AE, Hoagwood K, Weisz JR, et al. A meta-systems approach to evidence-based practice for children and adolescents. Am Psychol. 2010. https://doi.org/10.1037/a0017784.

14. Rapp CA, Bond GR, Becker DR, et al. The role of state mental health authorities in promoting improved client outcomes through evidence-based practice. Community Ment Health J. 2005. https://doi.org/10.1007/s10597-005-5008-8.

15. National Center for Health Statistics. Health, United States, 2018 with chartbook on long-term trends in health. Hyattsville, MD; 2019.

16. American Psychological Association Task Force on Evidence-Based Practice for Children and Adolescents. Disseminating evidence-based practice for children and adolescents: a systems approach to enhancing care. Washington, DC: American Psychological Association; 2008.

17. Bruns EJ, Kerns SEU, Pullmann MD, et al. Research, data, and evidence-based treatment use in state behavioral health systems, 2001–2012. Psychiatr Serv. 2016. https://doi.org/10.1176/appi.ps.201500014.

18. McHugh RK, Barlow DH. The dissemination and implementation of evidence-based psychological treatments: a review of current efforts. Am Psychol. 2010. https://doi.org/10.1037/a0018121.

19. Alegría M, Green JG, McLaughlin KA, et al. Disparities in child and adolescent mental health and mental health services in the U.S. New York: William T. Grant Foundation; 2015.

20. Alegria M, Vallas M, Pumariega AJ. Racial and ethnic disparities in pediatric mental health. Child Adolesc Psychiatr Clin N Am. 2010. https://doi.org/10.1016/j.chc.2010.07.001.

21. Marrast L, Himmelstein DU, Woolhandler S. Racial and ethnic disparities in mental health care for children and young adults: a national study. Int J Health Serv. 2016. https://doi.org/10.1177/0020731416662736.

22. Agency for Healthcare Research and Quality. 2016 national healthcare quality and disparities report. Rockville, MD; 2017.

23. Bauer MS, Damschroder L, Hagedorn H, et al. An introduction to implementation science for the non-specialist. BMC Psychol. 2015. https://doi.org/10.1186/S40359-015-0089-9.

24. Raghavan R, Bright CL, Shadoin AL. Toward a policy ecology of implementation of evidence-based practices in public mental health settings. Implement Sci. 2008. https://doi.org/10.1186/1748-5908-3-26.

25. Jaramillo ET, Willging CE, Green AE, et al. “Creative financing”: funding evidence-based interventions in human service systems. J Behav Health Ser R. 2019. https://doi.org/10.1007/s11414-018-9644-5.

26. Beidas RS, Marcus S, Wolk CB, et al. A prospective examination of clinician and supervisor turnover within the context of implementation of evidence-based practices in a publicly-funded mental health system. Adm Policy Ment Heal Ment Heal Serv Res. 2016. https://doi.org/10.1007/s10488-015-0673-6.

27. Crome E, Shaw J, Baillie A. Costs and returns on training investment for empirically supported psychological interventions. Aust Health Rev. 2017. https://doi.org/10.1071/AH15129.

28. Lang JM, Connell CM. Measuring costs to community-based agencies for implementation of an evidence-based practice. J Behav Heal Ser R. 2017. https://doi.org/10.1007/s11414-016-9541-8.

29. Schell SF, Luke DA, Schooley MW, et al. Public health program capacity for sustainability: a new framework. Implement Sci. 2013. https://doi.org/10.1186/1748-5908-8-15.

30. Shelton RC, Cooper BR, Stirman SW. The sustainability of evidence-based interventions and practices in public health and health care. Annu Rev Public Health. 2018. https://doi.org/10.1146/annurev-publhealth-040617-014731.

31. Stirman SW, Kimberly J, Cook N, et al. The sustainability of new programs and innovations: a review of the empirical literature and recommendations for future research. Implement Sci. 2012. https://doi.org/10.1186/1748-5908-7-17.

32. Scheirer MA, Dearing JW. An agenda for research on the sustainability of public health programs. Am J Public Health. 2011. https://doi.org/10.2105/AJPH.2011.300193.

33. Urquhart R, Kendell C, Cornelissen E, et al. Defining sustainability in practice: views from implementing real-world innovations in health care. BMC Health Serv Res. 2020. https://doi.org/10.1186/s12913-020-4933-0.

34. Aarons GA, Green AE, Willging CE, et al. Mixed-method study of a conceptual model of evidence-based intervention sustainment across multiple public-sector service settings. Implement Sci. 2014. https://doi.org/10.1186/s13012-014-0183-z.

35. Jensen-Doss A, Hawley KM, Lopez M, et al. Using evidence-based treatments: the experiences of youth providers working under a mandate. Prof Psychol Res Pract. 2009. https://doi.org/10.1037/a0014690.

36. Massatti RR, Sweeney HA, Panzano PC, et al. The de-adoption of innovative mental health practices (IMHP): why organizations choose not to sustain an IMHP. Adm Policy Ment Heal Ment Heal Serv Res. 2008. https://doi.org/10.1007/s10488-007-0141-z.

37. Rodriguez A, Lau AS, Wright B, et al. Mixed-method analysis of program leader perspectives on the sustainment of multiple child evidence-based practices in a system-driven implementation. Implement Sci. 2018. https://doi.org/10.1186/s13012-018-0737-6.

38. Stewart RE, Adams DR, Mandell DS, et al. The perfect storm: collision of the business of mental health and the implementation of evidence-based practices. Psychiatr Serv. 2016. https://doi.org/10.1176/appi.ps.201500392.

39. Bond GR, Drake RE, McHugo GJ, et al. Long-term sustainability of evidence-based practices in community mental health agencies. Adm Policy Ment Heal Ment Heal Serv Res. 2014. https://doi.org/10.1007/s10488-012-0461-5.

40. Roundfield KD, Lang JM. Costs to community mental health agencies to sustain an evidence-based practice. Psychiatr Serv. 2017. https://doi.org/10.1176/appi.ps.201600193.

41. Hoagwood KE, Olin SS, Horwitz S, et al. Scaling up evidence-based practices for children and families in New York state: toward evidence-based policies on implementation for state mental health systems. J Clin Child Adolesc Psychol. 2014. https://doi.org/10.1080/15374416.2013.869749.

42. Chambers DA, Glasgow RE, Stange KC. The dynamic sustainability framework: addressing the paradox of sustainment amid ongoing change. Implement Sci. 2013. https://doi.org/10.1186/1748-5908-8-117.

43. Willging CE, Green AE, Gunderson L, et al. From a “perfect storm” to “smooth sailing”: policymaker perspectives on implementation and sustainment of an evidence-based practice in two states. Child Maltreat. 2015. https://doi.org/10.1177/1077559514547384.

44. National Children’s Alliance. How the CAC model works. http://www.nationalchildrensalliance.org/cac-model. 2014.

45. Franks RP, Bory CT. Who supports the successful implementation and sustainability of evidence-based practices? Defining and understanding the roles of intermediary and purveyor organizations. In: McCoy KP, Diana A, editors. The science, and art, of program dissemination: strategies, successes, and challenges. New directions for child and adolescent development. San Francisco, CA: Jossey-Boss; 2015. p. 41–56. https://doi.org/10.1002/cad.20112.

46. Cleverley WO, Cleverley JO. Essentials of health care finance. 8th ed. Burlington, MA: Jones & Bartlett Learning; 2018.

47. Folland S, Goodman AC, Stano M. The economics of health and health care. 8th ed. New York, NY: Routledge; 2017. https://doi.org/10.2190/EN1T-F9A1-LV0P-BLLR.

48. Scudder AT, Taber-Thomas SM, Schaffner K, et al. A mixed-methods study of system-level sustainability of evidence-based practices in 12 large-scale implementation initiatives. Heal Res Policy Syst. 2017. https://doi.org/10.1186/s12961-017-0230-8.

49. Funderburk B, Chaffin M, Bard E, et al. Comparing client outcomes for two evidence-based treatment consultation strategies. J Clin Child Adolesc Psychol. 2015. https://doi.org/10.1080/15374416.2014.910790.

50. Goense PB, Assink M, Stams GJ, et al. Making ‘what works’ work: a meta-analytic study of the effect of treatment integrity on outcomes of evidence-based interventions for juveniles with antisocial behavior. Aggress Violent Behav. 2016. https://doi.org/10.1016/j.avb.2016.08.003.

51. Schoenwald SK, Garland AF, Chapman JE, et al. Toward the effective and efficient measurement of implementation fidelity. Adm Policy Ment Heal Ment Heal Serv Res. 2011. https://doi.org/10.1007/s10488-010-0321-0.

52. Knapp M, Funk M, Curran C, et al. Economic barriers to better mental health practice and policy. Health Policy Plan. 2006. https://doi.org/10.1093/heapol/czl003.

53. Schoenwald SK, Chapman JE, Kelleher K, et al. A survey of the infrastructure for children’s mental health services: implications for the implementation of empirically supported treatments (ESTs). Adm Policy Ment Heal Ment Heal Serv Res. 2008. https://doi.org/10.1007/s10488-007-0147-6.

54. Miller HD. From volume to value: better ways to pay for health care. Health Aff. 2009. https://doi.org/10.1377/hlthaff.28.5.1418.

55. O’Malley AS, Collins A, Contreary K, et al. Barriers to and facilitators of evidence-based decision making at the point of care: implications for delivery systems, payers, and policy makers. MDM Policy Pract. 2016. https://doi.org/10.1177/2381468316660375.

56. Center for Public Health System Science. Program Sustainability Assessment Tool. https://www.sustaintool.org/. 2012.

57. Calhoun A, Mainor A, Moreland-Russell S, et al. Using the program sustainability assessment tool to assess and plan for sustainability. Prev Chronic Dis. 2014. https://doi.org/10.5888/pcd11.130185.

58. Hawe P, Noort M, King L, Jordens C. Multiplying health gains: the critical role of capacity-building within health promotion programs. Health Policy. 1997. https://doi.org/10.1016/S0168-8510(96)00847-0.

59. Froelich KA. Diversification of revenue strategies: evolving resource dependence in nonprofit organizations. Nonprofit Volunt Sec Q. 1999. https://doi.org/10.1177/0899764099283002.

60. Grønbjerg KA. Understanding nonprofit funding: managing revenues in social services and community development organizations. Jossey-Bass; 1993.

61. Hillman AJ, Withers MC, Collins BJ. Resource dependence theory: a review. J Manage. 2009. https://doi.org/10.1177/0149206309343469.

62. Katz D, Kahn RL. Social psychology of organizations. New York: Wiley; 1966.

63. Weber K, Waeger D. Organizations as polities: an open systems perspective. Acad Manag Ann. 2017. https://doi.org/10.5465/annals.2015.0152.

64. Mundey P, Slemaker A, Dopp A, et al. Sustaining treatment for youth with problematic sexual behavior: stakeholder perspectives following implementation. J Behav Health Serv Res. 2021. https://doi.org/10.1007/s11414-020-09726-0.

65. Proctor EK, Powell BJ, McMillen JC. Implementation strategies: recommendations for specifying and reporting. Implement Sci. 2013. https://doi.org/10.1186/1748-5908-8-139.

66. Powell BJ, Waltz TJ, Chinman MJ, et al. A refined compilation of implementation strategies: results from the expert recommendations for implementing change (ERIC) project. Implement Sci. 2015. https://doi.org/10.1186/s13012-015-0209-1.

67. Leeman J, Birken SA, Powell BJ, et al. Beyond “implementation strategies”: classifying the full range of strategies used in implementation science and practice. Implement Sci. 2017. https://doi.org/10.1186/s13012-017-0657-x.

68. Waltz TJ, Powell BJ, Matthieu MM, et al. Use of concept mapping to characterize relationships among implementation strategies and assess their feasibility and importance: results from the expert recommendations for implementing change (ERIC) study. Implement Sci. 2015. https://doi.org/10.1186/s13012-015-0295-0.

69. Dopp AR, Narcisse M-R, Mundey P, et al. A scoping review of strategies for financing the implementation of evidence-based practices in behavioral health systems: state of the literature and future directions. Implementation Research and Practice. 2020. https://doi.org/10.1177/2633489520939980.

70. Baker R, Camosso-Stefinovic J, Gillies C, et al. Tailored interventions to overcome identified barriers to change: effects on professional practice and health care outcomes. Cochrane Database Syst Rev. 2010. https://doi.org/10.1002/14651858.CD005470.pub2.

71. Powell BJ, Beidas RS, Lewis CC, et al. Methods to improve the selection and tailoring of implementation strategies. J Behav Heal Ser R. 2017. https://doi.org/10.1007/s11414-015-9475-6.

72. Wensing M, Boschan M, Grol R. The Knowledge-to-Action Cycle: Selecting KT interventions: selecting, tailoring, and implementing knowledge translation interventions. In: Knowledge translation in health care: moving from evidence to practice; 2009. p. 94–113. https://doi.org/10.1002/9781444311747.ch3.

73. Fernandez ME, ten Hoor GA, van Lieshout S, et al. Implementation mapping: using intervention mapping to develop implementation strategies. Front Public Heal. 2019. https://doi.org/10.3389/fpubh.2019.00158.

74. Bartholomew LK, Parcel GS, Kok G. Intervention mapping: a process for developing theory- and evidence-based health education programs. Health Educ Behav. 1998. https://doi.org/10.1186/1471-2458-9-216.

75. Bartholomew KL, Markham CM, Ruiter RAC, Fernandez ME, Kok G, Parcel GS. Planning health promotion programs: an intervention mapping approach. 4th ed. San Francisco, CA: Jossey-Boss; 2016.

76. Grol R, Bosch M, Wensing M. Development and selection of strategies for improving patient care. In: Grol R, Wensing M, Eccles M, Davis D, editors. Improving patient care: the implementation of change in health care, Vol. 2; 2013. p. 165–184.

77. Highfield L, Valerio MA, Fernandez ME, Eldridge-Bartholomew LK. Development of an implementation intervention using intervention mapping to increase mammography among low income women. Front Public Heal. 2018. https://doi.org/10.3389/fpubh.2018.00300.

78. Peskin MF, Hernandez BF, Gabay EK, et al. Using intervention mapping for program design and production of iCHAMPSS: an online decision support system to increase adoption, implementation, and maintenance of evidence-based sexual health programs. Front Public Heal. 2017. https://doi.org/10.3389/fpubh.2017.00203.

79. Zwerver F, Schellart AJM, Knol DL, et al. An implementation strategy to improve the guideline adherence of insurance physicians: an experiment in a controlled setting. Implement Sci. 2011. https://doi.org/10.1186/1748-5908-6-131.

80. Fang X, Brown DS, Florence C, et al. The economic burden of child maltreatment in the United States and implications for prevention. Child Abus Negl. 2012. https://doi.org/10.1016/j.chiabu.2011.10.006.

81. Krug E, Dahlberg L, Mercy J, et al. World health report on violence and health. Geneva, Switzerland: World Health Organization; 2002.

82. McCollister KE, French MT, Fang H. The cost of crime to society: new crime-specific estimates for policy and program evaluation. Drug Alcohol Depend. 2010. https://doi.org/10.1016/j.drugalcdep.2009.12.002.

83. Kaminski JW, Claussen AH. Evidence base update for psychosocial treatments for disruptive behaviors in children. J Clin Child Adolesc Psychol. 2017. https://doi.org/10.1080/15374416.2017.1310044.

84. Dorsey S, McLaughlin KA, Kerns SEU, et al. Evidence base update for psychosocial treatments for children and adolescents exposed to traumatic events. J Clin Child Adolesc Psychol. 2017. https://doi.org/10.1080/15374416.2016.1220309.

85. Thomas R, Abell B, Webb HJ, et al. Parent-child interaction therapy: a meta-analysis. Pediatrics. 2017. https://doi.org/10.1542/peds.2017-0352.

86. Goldfine ME, Wagner SM, Branstetter SA, et al. Parent-child interaction therapy: an examination of cost-effectiveness. J Early Intensive Behav Interv. 2008. https://doi.org/10.1037/h0100414.

87. Pollio E, McLean M, Behl LE, et al. Trauma-focused cognitive behavioral therapy. In: Reece RM, Hanson RF, Sargent J, editors. Treatment of child abuse: common ground for mental health, medical, and legal practitioners. 2nd ed. Baltimore, MD: Johns Hopkins University Press; 2014. p. 31–8.

88. Aas E, Iverson T, Holt T, et al. Cost-effectiveness analysis of trauma-focused cognitive behavioral therapy: a randomized control trial among Norwegian youth. J Clin Child Adolesc Psychol. 2019. https://doi.org/10.1080/15374416.2018.1463535.

  89. 89.

    Pinnock H, Barwick M, Carpenter C, Eldridge S, Grandes G, Griffiths CJ, et al. Standards for reporting implementation studies (StaRI) statement. BMJ. 2017. https://doi.org/10.1136/bmj.i6795.

  90. 90.

    Palinkas LA, Aarons GA, Horwitz S, et al. Mixed method designs in implementation research. Adm Policy Ment Heal Ment Heal Serv Res. 2011. https://doi.org/10.1007/s10488-010-0314-z.

  91. 91.

    Hasson F, Keeney S. Enhancing rigour in the Delphi technique research. Technol Forecast Soc Change. 2011. https://doi.org/10.1016/j.techfore.2011.04.005.

  92. 92.

    Bartlett L, Vavrus F. Comparative case studies: an innovative approach. Nord J Comp Int Educ. 2017. https://doi.org/10.7577/njcie.1929.

  93. 93.

    Yin RK. Case study research design and methods. 4th ed. Thousand Oaks, CA: Sage Publications; 2009.

    Google Scholar 

  94. 94.

    Pahl-Wostl C. Participative and stakeholder-based policy design, evaluation and modeling processes. Integr Assess. 2002; doi:1389–5176/02/0301–003.

  95. 95.

    Brownson RC, Jacobs JA, Tabak RG, et al. Designing for dissemination among public health researchers: findings from a national survey in the United States. Am J Public Health. 2013. https://doi.org/10.2105/AJPH.2012.301165.

  96. 96.

    Drahota A, Meza RD, Brikho B, et al. Community-academic partnerships: a systematic review of the state of the literature and recommendations for future research. Milbank Q. 2016. https://doi.org/10.1111/1468-0009.12184.

  97. 97.

    Lyon A, Koerner K. User-centered design for psychosocial intervention development and implementation. Clin Psychol Sci Pract. 2016. https://doi.org/10.1111/cpsp.12154.

  98. 98.

    Owen N, Goode A, Sugiyama T, et al. Designing for dissemination in chronic disease prevention and management. In: Brownson RC, Colditz GA, Proctor EK, editors. Dissemination and implementation research in health: translating science to practice. 2nd ed. New York, NY: Oxford University Press: 2018. p. 107–120. doi:https://doi.org/10.1093/oso/9780190683214.003.0007

  99. 99.

    Kemper EA, Stringfield S, Teddlie C. Mixed methods sampling strategies in social science research. In: Tashakkori A, Teddlie C, editors. Handbook of mixed methods in social and behavioral research. Thousand Oaks, CA: Sage Publications; 2003. p. 273–96.

    Google Scholar 

  100. 100.

    Patton MQ. Qualitative research and evaluation methods. 3rd ed. Thousand Oaks, CA: Sage Publications; 2002.

    Google Scholar 

  101. 101.

    Sharp JL, Mobley C, Hammond C, et al. A mixed methods sampling methodology for a multisite case study. J Mix Methods Res. 2012. https://doi.org/10.1177/1558689811417133.

  102. 102.

    Hoffmann TC, Glasziou PP, Boutron I, et al. Better reporting of interventions: template for intervention description and replication (TIDieR) checklist and guide. BMJ. 2014. https://doi.org/10.1136/bmj.g1687.

  103. 103.

    North Carolina Child Treatment Program. Implementation support: clinical service delivery time models. https://www.ncchildtreatmentprogram.org/implementation-support/

  104. 104.

    Maxwell CA, Ehrhart MG, Williams NJ, et al. The organizational financial context of publicly-funded mental health clinics: development and preliminary psychometric evaluation of the agency financial status scales. Admin Pol Ment Health. 2021. https://doi.org/10.1007/s10488-021-01128-4.

  105. 105.

    Ehrhart MG, Aarons GA, Farahnak LR. Assessing the organizational context for EBP implementation: the development and validity testing of the implementation climate scale (ICS). Implement Sci. 2014. https://doi.org/10.1186/s13012-014-0157-1.

  106. 106.

    Moullin JC, Sklar M, Ehrhart MG, et al. Provider REport of sustainment scale (PRESS): development and validation of a brief measure of inner context sustainment. Implement Sci. 2021. https://doi.org/10.1186/s13012-021-01152-w.

  107. 107.

    Bowen GA. Document analysis as a qualitative research method. Qual Res J. 2009. https://doi.org/10.3316/QRJ0902027.

  108. 108.

    Oleszek WJ. Super-majority votes in the senate. Washington, DC; 2010.

    Google Scholar 

  109. 109.

    Brams SJ, Fishburn PC. Approval voting. New York, NY: Springer-Verlag; 2007. https://doi.org/10.1007/978-0-387-49896-6.

    Book  Google Scholar 

  110. 110.

    Hamilton AB, Finley EP. Qualitative methods in implementation research: an introduction. Psychiatry Res. 2019. https://doi.org/10.1016/j.psychres.2019.112516.

  111. 111.

    Taylor B, Henshall C, Kenyon S, et al. Can rapid approaches to qualitative analysis deliver timely, valid findings to clinical leaders? A mixed methods study comparing rapid and thematic analysis. BMJ Open. 2018. https://doi.org/10.1136/bmjopen-2017-019993.

  112. 112.

    Tabak RG, Chambers DA, Hook M, Brownson RC. The conceptual basis for dissemination and implementation research: lessons from existing models and frameworks. In: Brownson RC, Colditz GA, Proctor EK, editors. Dissemination and implementation research in health: translating science to practice. 2nd ed. New York: Oxford University Press; 2018. p. 73–88.

    Google Scholar 

  113. 113.

    Nilsen P. Making sense of implementation theories, models and frameworks. Implement Sci. 2015. https://doi.org/10.1186/s13012-015-0242-0.

  114. 114.

    Shelton RC, Chambers DA, Glasgow RE. An extension of RE-AIM to enhance sustainability: addressing dynamic context and promoting health equity over time. Front Public Health. 2020. https://doi.org/10.3389/fpubh.2020.00134.

  115. 115.

    Kim JJ, Brookman-Frazee L, Gellatly R, et al. Predictors of burnout among community therapists in the sustainment phase of a system-driven implementation of multiple evidence-based practices in children’s mental health. Prof Psychol Res Pract. 2018. https://doi.org/10.1037/pro0000182.

  116. 116.

    Purtle J, Lê-Scherban F, Nelson KL, et al. State mental health agency officials’ preferences for and sources of behavioral health research. Psychol Serv. 2019. https://doi.org/10.1037/ser0000364.

  117. 117.

    Purtle J, Nelson KL, Bruns EJ, Hoagwood KE. Dissemination strategies to accelerate the policy impact of children’s mental health services research. Psychiatr Serv. 2020. 32517640. https://doi.org/10.1176/appi.ps.201900527.

  118. 118.

    Landes SJ, McBain SA, Curran GM. An introduction to effectiveness-implementation hybrid designs. Psychiatry Res. 2019. https://doi.org/10.1016/j.psychres.2019.112513.


Acknowledgements

We are deeply thankful to the service agencies and stakeholder participants partnering with us to develop and evaluate the fiscal mapping process, whose contributions make this work both possible and worthwhile. We would also like to acknowledge Natalie Richards for providing project administration (including formatting this manuscript and references), Maddison North for her assistance with coordinating collaboration across institutions, and Kristen Meadows and Monique Martineau for helping develop materials for the fiscal mapping process.

Funding

This project was supported by an award from the US National Institute of Mental Health (R21MH122889; Dopp, PI). Through June 2021, A.D. was also an investigator with the Implementation Research Institute (IRI) at the George Warren Brown School of Social Work, Washington University in St. Louis, through an award from the National Institute of Mental Health (5R25MH08091607) and the Department of Veterans Affairs, Health Services Research & Development Service, Quality Enhancement Research Initiative (QUERI). B.P. was supported by the National Institute of Mental Health through K01MH113806 (Powell, PI).

Author information

Contributions

A.D. conceptualized the fiscal mapping process and the pilot study described in this protocol, wrote the first draft of the manuscript, and incorporated feedback and revisions. All other authors (M.G., J.S., J.R., S.S., B.F., A.J., B.P., D.L., D.M., D.E., M.B., D.H.) provided input into project conceptualization, reviewed drafts of this manuscript, and contributed additional conceptualization and writing to the final manuscript. The authors reviewed and approved the submitted version of the manuscript.

Corresponding author

Correspondence to Alex R. Dopp.

Ethics declarations

Ethics approval and consent to participate

All procedures were reviewed by the RAND Corporation Institutional Review Board and determined to not constitute human subjects research (Protocol #2020-N0607). Nevertheless, we will follow all ethical principles for the protection of human research participants (e.g., from the Belmont Report) to minimize any risk of harm during their participation, including the collection of informed consent at the start of the project and as part of each data collection activity.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1: StaRI checklist for Fiscal MappingR0.

Additional file 2: TIDieR checklist for Fiscal MappingR0.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article


Cite this article

Dopp, A.R., Gilbert, M., Silovsky, J. et al. Coordination of sustainable financing for evidence-based youth mental health treatments: protocol for development and evaluation of the fiscal mapping process. Implement Sci Commun 3, 1 (2022). https://doi.org/10.1186/s43058-021-00234-6


Keywords

  • Youth mental health services
  • Evidence-based treatment
  • Financing strategies
  • Sustainment
  • Strategic planning
  • Tailored implementation strategies