Decision-making about all aspects of research design, including analysis, entails judgment about “fit.” Researchers need not identify a single analytic approach and attempt to force its strict application, regardless of fit. Indeed, the flexible, study-specific combination of design elements is a hallmark of applied qualitative research practice [9]. Relevant considerations for fit include the inquiry’s purpose and the nature of the subject matter; the diversity of intended audiences for findings; the criteria used to judge the quality and practical value of the results; and the research context (including characteristics of the setting, participants, and investigators). Other important considerations relate to constraints of available resources (e.g., funding, time, and staff) and access to relevant participants [3]. We contend that in the applied IS setting, finding an appropriate fit often involves borrowing procedures from different approaches to create a pragmatic, hybrid approach. A pragmatic approach also addresses the IS-specific tensions outlined above, i.e., the need to conduct research that is time-bounded, engages with theories/frameworks/models, supports application in practice, and speaks to a diversity of colleagues. To promote fit and internal coherence in light of IS-specific requirements, we offer the considerations above and additional guiding questions for selecting analytic procedures to create a pragmatic approach, as summarized in Fig. 1.
Key questions include the following:
1. What is the appropriate balance of inductive and deductive analytic procedures given the research goals?
A deductive process emphasizes themes and explanations derived from previously established concepts, pre-existing theories, or the relevant literature [9]. For example, an analysis that leans heavily on a deductive process might use the core components of the Exploration, Preparation, Implementation, Sustainment (EPIS) framework [15] to inform the coding structure and analysis. This process would support efforts to bound the investigation’s scope or expand an existing framework or model [16]. On the other hand, rather than trying to fit data with pre-existing concepts or theory, an inductive process generates interpretation and understanding that is primarily grounded in and driven by the data [9].
A balance of deductive and inductive processes might use an IS framework as a starting point for the deductive portion and then emphasize inductive processes to garner additional insight into topics not anticipated by the team or framework. For example, a selected IS framework may not attend sufficiently to the ways in which implementation context drives inequities [17]; if the dataset includes valuable information on this topic, incorporating inductive processes would allow a fuller exploration of such patterns.
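To make the deductive/inductive balance concrete, the minimal sketch below (our illustration, not a procedure from the cited studies) shows a codebook that starts from the four EPIS phases as deductive parent codes and records new, data-driven codes as inductive; all code labels and excerpt identifiers are hypothetical.

```python
from collections import defaultdict

# Deductive starting point: parent codes drawn from the four EPIS phases.
codebook = {phase: "deductive" for phase in
            ("exploration", "preparation", "implementation", "sustainment")}

coded_segments = defaultdict(list)  # code -> excerpt identifiers

def apply_code(code, excerpt_id):
    """Tag an excerpt; any code not in the a priori structure is logged as inductive."""
    codebook.setdefault(code, "inductive")
    coded_segments[code].append(excerpt_id)

# A segment indexed against the framework...
apply_code("preparation", "interview_03_seg_12")
# ...and one capturing an unanticipated, equity-related pattern in the data.
apply_code("context_driven_inequities", "interview_07_seg_04")

print(codebook)  # shows which codes came from the framework and which from the data
```

A team leaning more deductive would keep most coding within the a priori codes; a team leaning more inductive would expect data-driven entries to dominate.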
2. To what extent will the analysis emphasize the perspectives of participants vs. researchers?
An important decision relates to where the research team wishes to ground the analysis on the continuum between insider (emic) and outsider (etic) perspectives. The appropriate balance of insider/outsider orientation will reflect the overall research design and questions. Specific decisions about how to execute the desired balance through the analysis include, for example, the types of codes used and the value placed on participant reflections. As described below in section 2, value is often placed on incorporating participants’ feedback on the developing analysis, sometimes called “member checks” or “member reflections” [8].
An insider (emic) orientation represents findings in the ways that participants experience them, and insider knowledge is valued and privileged [9]. As an example, MacFarlane and colleagues used Normalization Process Theory and participatory approaches to identify appropriate implementation strategies to support the integration of evidence-based cross-cultural communication in European primary care settings. The participatory nature of the project offered the opportunity to gain “insider” insight rather than imposing and prioritizing the academic researchers’ “outsider” perspective. The insider (emic) orientation was operationalized in the analytic approach by using stakeholder co-analysis, which engages a wider set of stakeholders in the iterative processes of thematically analyzing the data [18]. By contrast, an outsider (etic) orientation represents the setting and participants in terms that the researcher or external audiences bring to the study and emphasizes the outsider’s perspective [9]. For instance, Van deGriend and colleagues conducted an analysis of influences on scaling-up group prenatal care. They used outsider (etic) codes that drew on researchers’ concepts and the literature to complement the insider (emic) codes that reflected participants’ concepts and views [19]. Balancing insider and outsider orientations is useful for pragmatic, qualitative IS studies because it increases the potential for the study to highlight practice- and community-based expertise, build the literature, and ultimately support the integration of evidence into practice.
3. How can the analytic plan be designed to yield the outputs and products needed to support the integration of evidence into research and practice?
The research team can maximize efficiency and impact by intentionally connecting the analytic plan and the kind of products needed to meet scientific and practice goals (e.g., journal articles versus policy briefs). The ultimate use of the research outputs can also impact decisions around the breadth versus depth of the analysis. For example, in a recent implementation evaluation of community-clinical partnerships delivering EBIs in underserved communities, members of this author team (SR and RL) analyzed data to explore how partnership networks impacted implementation outcomes. At the same time, given the broader goal of supporting the establishment of health policies to support partnered EBI delivery, the team was also charged (by the state Department of Public Health) with capturing stories that would resonate with legislators regarding the need for broad, sustained investments [20]. We created a unique code to identify these stories during analysis and to incorporate them easily into products for health department leaders. Given the practice-focused orientation, qualitative IS studies often support products for practitioners, e.g., “playbooks” to guide the process of implementing an intervention or novel care process [1].
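As an illustration of how such a cross-cutting tag can feed practice-facing products, the brief sketch below (hypothetical; segment texts and code names are invented) filters coded excerpts carrying a “policy story” code so they can be pulled directly into a brief for health department leaders.

```python
# Hypothetical coded excerpts; each carries analytic codes plus an optional
# cross-cutting "policy_story" tag reserved for practice-facing products.
segments = [
    {"id": "int_02_s07", "codes": {"partnership_network", "policy_story"},
     "text": "A community health worker described how the partnership kept..."},
    {"id": "int_05_s03", "codes": {"implementation_outcomes"},
     "text": "The clinic supervisor explained scheduling constraints..."},
]

# Pull every excerpt tagged for the policy brief, regardless of its other analytic codes.
policy_stories = [s for s in segments if "policy_story" in s["codes"]]

for story in policy_stories:
    print(story["id"], "->", story["text"])
```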
4. How can analysis resources be used strategically in time-sensitive projects or where there is limited staff or resource availability?
IS research is often conducted by teams, and strategic analytic decisions can promote rigor while capitalizing on the potential for teamwork to speed up analysis. Deterding and Waters’ strategy of flexible coding, for example, offers such benefits [21]. Through an initial, framework-driven analytic step, large chunks of text can be quickly indexed deductively into predefined categories, such as the five Consolidated Framework for Implementation Research domains of inner setting, outer setting, characteristics of individuals, intervention attributes, and processes [22]. This is a more straightforward coding task appropriate for research assistants who have been trained in qualitative research and understand the IS framework. Then, during the second analytic coding step, more in-depth coding by research team members with more experience can ensure a deeper exploration of existing and new themes. This two-step process can also enable team members to lead different parts of an IS project with different goals, purposes, or audiences. Other innovations in team-based analyses are becoming increasingly common in IS, such as rapid ethnographic approaches [23].
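The two-step logic of flexible coding can be sketched as follows; this is our minimal illustration under stated assumptions (not Deterding and Waters’ own tooling), with hypothetical segment text and analytic codes.

```python
# Step 1: broad deductive indexing into CFIR domains (a task suited to trained
# research assistants). Step 2: finer-grained analytic coding of the indexed
# chunks by more experienced analysts.

CFIR_DOMAINS = {"inner setting", "outer setting", "characteristics of individuals",
                "intervention attributes", "processes"}

def index_segment(index, domain, segment_text):
    """Step 1: file a large chunk of text under one of the predefined domains."""
    if domain not in CFIR_DOMAINS:
        raise ValueError(f"Unknown domain: {domain}")
    index.setdefault(domain, []).append({"text": segment_text, "analytic_codes": []})

def add_analytic_code(index, domain, position, code):
    """Step 2: layer an in-depth analytic code onto a previously indexed chunk."""
    index[domain][position]["analytic_codes"].append(code)

index = {}
index_segment(index, "inner setting", "Leadership described competing priorities...")
add_analytic_code(index, "inner setting", 0, "leadership_buy_in")  # hypothetical code
```

Because the indexed chunks are organized by domain, different team members can take the second step for different domains, goals, or audiences.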
Building blocks for pragmatic analysis: examples from pattern-based analytic approaches
Below, we offer illustrative examples of established analytic approaches, highlighting their utility for IS and the procedures that a pragmatic approach might usefully borrow and combine. These examples are not exhaustive; instead, they represent selected, pattern-based analytic approaches commonly used in IS. We aim to offer helpful anchor points with the breadth and flexibility to apply to a wide range of IS projects [24] while also reflecting and speaking to a diversity of home disciplines, including sociology, applied policy, and psychology.
Grounded theory
Grounded theory is one of the most recognizable and influential approaches to qualitative analysis, although many variations have emerged since its introduction. Sociologists developed the approach, and the history and underlying philosophy are richly detailed elsewhere [25, 26]. The central goal of this approach is to generate a theoretical explanation grounded in close inspection of the data and without a preconceived starting point. In many instances, the emphasis of grounded theory on a purely inductive orientation may be at odds with the focus in IS on the use of existing theories and frameworks, as highlighted by the QUALRIS group [4]. Additionally, a “full” grounded theory study, aligned with all its methodological assumptions and prescriptions (e.g., for sampling), is very demanding and time-consuming and may not be appropriate when timely turnaround in the service of research or practice change is required. For these reasons, a full grounded theory approach is rarely seen in the IS literature. Instead, IS researchers who use this approach are likely to use a modified version, sometimes described as “grounded theory lite” [6].
Core features and procedures characteristic of grounded theory that can be incorporated into a pragmatic approach include inductive coding techniques [27]. Open, inductive coding allows the researcher to “open up the inquiry” by examining the data to see what concepts fit best, without a preconceived explanation or framework [28,29,30]. Concepts and categories derived from open coding prompt the researcher to consider aspects of the research topic that were overlooked or unanticipated [31]. The intermediate stages of coding in grounded theory, referred to as axial or focused coding, build on the open coding to generate a more refined set of key categories and to identify relationships between these categories [32]. Another useful procedure from grounded theory is the constant comparison method, in which data are collected, categorized, and compared to previously collected data. This iterative process prompts continuous engagement with the analysis, reshaping and redefining ideas, and is useful for most qualitative studies [25, 29, 33]. Grounded theory also allows for community expertise and broader outsider perspectives to complement one another for a more comprehensive understanding of practices [34].
An illustration of the utility of grounded theory procedures comes from a study that explored how implementing organizations can influence local context to support the scale-up of mental health interventions in middle-income countries [35]. Using a multiple case study design, the study team used an analytic approach based on grounded theory to analyze data from 159 semi-structured interviews across five case sites. They used line-by-line open coding, constant comparison, and exploration of connections between themes in the process of developing an overarching theoretical framework. To increase rigor, they employed triangulation across data sources and types, as well as member reflections. Their team-based plan included multiple coders who negotiated conflicts and refined the thematic framework jointly. The output of the analysis was a model of processes by which entrepreneurial organizations could marshal and create resources to support the delivery of mental health interventions in limited-resource settings. By taking a divergent perspective (grounded in social entrepreneurship, in this case), the study output provided a basis for further inquiry into the design and scale-up of mental health interventions in middle-income countries.
Framework analysis
Framework analysis comes from the policy sphere and tends to have a practical orientation; this applied nature typically includes a more structured and deductive approach. The history, philosophical assumptions, and core processes are richly described by Ritchie and Spencer [36]. Framework analysis entails several features common to many qualitative analytic approaches, including defining concepts, creating typologies, and identifying patterns and relationships, but does so in a more predefined and structured way [37, 38]. For example, the research team can create codes based on a framework selected in advance and can also include open-ended inquiry to capture additional insights. This analytic approach is well-suited to multi-disciplinary teams whose members have varying levels of experience with qualitative research [37]. It may require fewer staff resources and less time than some other approaches.
The framework analysis process includes five key steps. Step 1 is familiarization: Team members immerse themselves in the data, e.g., by reading transcripts, taking notes, and listening to audio recordings. Step 2 is identifying a coding framework: The research team develops a coding scheme, typically using an iterative process primarily driven by deductive coding (e.g., based on a selected IS framework). Step 3 is indexing: The team applies the coding structure to the entire data set. Step 4 is charting: The team rearranges the coded data and compares patterns between and within cases. Step 5 is mapping and interpretation: The team examines the range and nature of relationships across and between codes [36, 39, 40]. The team can use tables and diagrams to systematically synthesize and display the data based on predetermined concepts, frameworks, or areas of interest. While more structured than other approaches, framework analysis still presents a flexible design that combines well with other analytic approaches to achieve study objectives [37]. The case example given in section 3 offers a detailed application of a modified framework analytic approach.
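The charting step, in particular, lends itself to a simple tabular illustration. The sketch below is our hypothetical example (site names, codes, and cell summaries are invented) of rearranging coded data into a case-by-code matrix for within- and cross-case comparison.

```python
import pandas as pd

# Condensed summaries of coded excerpts, one row per case-code combination.
coded_excerpts = pd.DataFrame([
    {"case": "Site A", "code": "leadership support", "summary": "Strong champion"},
    {"case": "Site A", "code": "staffing",           "summary": "High turnover"},
    {"case": "Site B", "code": "leadership support", "summary": "Limited engagement"},
    {"case": "Site B", "code": "staffing",           "summary": "Stable team"},
])

# Chart: one row per case, one column per code; cells hold condensed summaries
# that the team can then compare within and across cases (steps 4 and 5).
chart = coded_excerpts.pivot(index="case", columns="code", values="summary")
print(chart)
```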
Interpretive phenomenological analysis (IPA)
Broadly, the purpose of a phenomenological inquiry is to understand the experiences and perceptions of individuals related to an occurrence of interest [41, 42]. For example, a phenomenological inquiry might focus on implementers’ experiences with remote training to support implementing a new EBI, aiming to explore their views, how those changed over time, and why implementers reacted the way they did. Drawing on this tradition, IPA focuses specifically on particular individuals (or cases), seeking to understand both their experiences and the sense they make of those experiences. With roots in psychology, this approach prioritizes the perspective of the participant, who is understood to be part of a broader system of interest; additional details about the philosophical underpinnings are available elsewhere [41]. Research questions are open and broad, taking an inductive, exploratory perspective. Samples are typically small and somewhat homogeneous, as the emphasis is placed on an in-depth exploration of a small set of cases to identify patterns of interest [43]. Despite the smaller sample size, the deep, detailed analysis requires thoughtful and time-intensive engagement with the data. The resulting outputs can be useful for developing theories that attend to a particular EBI or IS-related process or for refining existing frameworks and models [44].
A useful example comes from a study that sought to understand resistance to using evidence-based guidelines from the perspective of physicians focused on providing clinical care [45]. The analysis drew on data collected from interviews of 11 physicians selected for their expertise and diversity across a set of sociodemographic characteristics. In the first phase of the analysis, the team analyzed the full-length interviews and identified key themes and the relationships between them. Particular attention was paid to implicit and explicit meanings, repeated ideas or phrases, and metaphor choices. Two authors conducted the analyses separately and then compared them to reach a consensus. In the second phase of the analysis, the team considered the group of 11 interviews as a set. Using an inductive perspective, the team identified superordinate (or high-level) themes that addressed the full dataset. The final phase of the analysis was to identify a single superordinate theme that would serve as the core description of clinical practice. The team engaged other colleagues from diverse backgrounds to support reflection and refinement of the analysis. The analysis yielded a theoretical model that focused on a core concept (clinical practice as engagement), broken out into five constituent parts addressing how clinicians experience their practice, separate from following external guidelines.