- Open Access
Pragmatic approaches to analyzing qualitative data for implementation science: an introduction
Implementation Science Communications volume 2, Article number: 70 (2021)
Qualitative methods are critical for implementation science as they generate opportunities to examine complexity and include a diversity of perspectives. However, it can be a challenge to identify the approach that will provide the best fit for achieving a given set of practice-driven research needs. After all, implementation scientists must find a balance between speed and rigor, reliance on existing frameworks and new discoveries, and inclusion of insider and outsider perspectives. This paper offers guidance on taking a pragmatic approach to analysis, which entails strategically combining and borrowing from established qualitative approaches to meet a study’s needs, typically with guidance from an existing framework and with explicit research and practice change goals.
Section 1 offers a series of practical questions to guide the development of a pragmatic analytic approach. These include examining the balance of inductive and deductive procedures, the extent to which insider or outsider perspectives are privileged, study requirements related to data and products that support scientific advancement and practice change, and strategic resource allocation. This is followed by an introduction to three approaches commonly considered for implementation science projects: grounded theory, framework analysis, and interpretive phenomenological analysis, highlighting core analytic procedures that may be borrowed for a pragmatic approach. Section 2 addresses opportunities to ensure and communicate rigor of pragmatic analytic approaches. Section 3 provides an illustrative example from the team’s work, highlighting how a pragmatic analytic approach was designed and executed and the diversity of research and practice products generated.
As qualitative inquiry gains prominence in implementation science, it is critical to take advantage of qualitative methods’ diversity and flexibility. This paper furthers the conversation regarding how to strategically mix and match components of established qualitative approaches to meet the analytic needs of implementation science projects, thereby supporting high-impact research and improved opportunities to create practice change.
Implementation science (IS) is truly pragmatic at its core, answering questions about how existing evidence can be best translated into practice to accelerate impact on population health and health equity. Qualitative methods are critical to support this endeavor as they enable the examination of the dynamic context and systems into which evidence-based interventions (EBIs) are integrated — addressing the “hows and whys” of implementation. Numerous IS frameworks highlight the complexity of the systems in which implementation efforts occur and the uncertainty regarding how various determinants interact to produce multi-level outcomes. With that lens, it is unsurprising that diverse qualitative methodologies are receiving increasing attention in IS as they allow for an in-depth understanding of complex processes and interactions [1, 3, 4]. Given the wide variety of possible analytic approaches and techniques, an important question is which analytic approach best fits a given set of practice-driven research needs. Thoughtful design is needed to align research questions and objectives, the nature of the subject matter, the overall approach, the methods (specific tools and techniques used to achieve research goals, including data collection procedures), and the analytic strategies (including procedures used for exploring and interpreting data) [5, 6]. Achieving this kind of alignment is often described as “fit,” “methodological integrity,” or “internal coherence” [3, 7, 8]. Tailoring research designs to the unique constellation of these considerations in a given study may also require creative adaptation or innovation of analytic procedures. Yet, for IS researchers newer to qualitative approaches, a lack of understanding of the range of relevant options may limit their ability to effectively connect qualitative approaches and research goals.
For IS studies, several factors further complicate the selection of analytic approaches. First, there is a tension between the speed with which IS must move to be relevant and the need to conduct rigorous research. Second, though qualitative research is often associated with attempts to generate new theories, qualitative IS studies’ goals may also include elaborating conceptual definitions, creating classifications or typologies, and examining mechanisms and associations. Given the wealth of existing IS frameworks and models, covering determinants, processes, and outcomes, IS studies often focus on extending or applying existing frameworks. Third, as an applied field, IS work usually entails integrating different kinds of “insider” and “outsider” expertise to support implementation or practice change. Fourth, diverse traditions have contributed to the new field of IS, including agriculture, operations research, public health, medicine, anthropology, sociology, and more. The diversity of disciplines among IS researchers can bring a wealth of complementary perspectives but may also pose challenges in communicating about research processes.
Pragmatic approaches to qualitative analysis are likely valuable for IS researchers yet have not received enough attention in the IS literature to support researchers in using them confidently. By pragmatic approaches, we mean strategic combining and borrowing from established qualitative approaches to meet the needs of a given IS study, often with guidance from an IS framework and with clear research and practice change goals. Pragmatic approaches are not new, but they receive less attention in qualitative research overall and are not always clearly explicated in the literature. Part of the challenge in using pragmatic approaches is the lack of guidance on how to mix and match components of established approaches in a coherent, credible manner.
Our motivation in offering this guidance reflects our experiences as researchers, collaborators, and teachers connecting qualitative methods and IS research questions. The author team includes two behavioral scientists who conduct stakeholder-engaged implementation science and regularly utilize qualitative approaches (SR and RL). The team also includes a sociologist and a social psychologist who were trained in qualitative methods and have rich expertise with health services and implementation research (AR and EA). Through conducting qualitative IS studies and supporting students and colleagues new to qualitative approaches, we noticed a regularly occurring set of concerns and queries. Many questions seem to stem from a sense that there is a singular, “right” way to conduct qualitative projects. Such concerns are often amplified by fear that deviation from rigid adherence to established sets of procedures may jeopardize the (perceived or actual) rigor of the work. While the appeal of recipe-like means of ensuring rigor is understandable, fixation on compliance with “established” approaches overlooks the fact that versions of recognizable, named approaches (e.g., grounded theory) often use different procedures. As Braun and Clarke suggest, this “hallowed quest” for a singular, ideal approach leads many researchers astray and risks limiting appropriate and necessary adaptations and innovations in methods. IS researchers seeking to broaden the range of approaches they can apply should take comfort that there is “no single right way to do qualitative data analysis […]. Much depends on the purpose of the research, and it is important that the proposed method of analysis is carefully considered in planning the research, and is integrated from the start with other parts of the research, rather than being an afterthought.”
At the same time, given the wealth of traditions represented in the IS community, it can be difficult for researchers to effectively ensure and convey the quality and rigor of their work. This paper aims to serve as a resource for IS researchers seeking innovative and accessible approaches to qualitative research. We present suggestions for developing and communicating approaches to analysis that are the right “fit” for complex IS research projects and demonstrate rigor and quality.
Accordingly, section 1 offers guidance on identifying an analytic approach that aligns with study goals and allows for practical constraints. We describe three approaches commonly considered for IS projects: grounded theory, framework analysis, and interpretive phenomenological analysis, highlighting core elements that researchers can borrow to create a tailored, pragmatic approach. Section 2 addresses opportunities to ensure and communicate the rigor of pragmatic analytic approaches. Section 3 provides an illustrative example from the team’s work, describing the design and execution of a pragmatic analytic approach and the diversity of research and practice products generated.
Section 1: ensuring fit between research goals, practical constraints, and analytic approaches
Decision-making about all aspects of research design, including analysis, entails judgment about “fit.” Researchers need not identify a single analytic approach and attempt to force its strict application, regardless of fit. Indeed, the flexible, study-specific combination of design elements is a hallmark of applied qualitative research practice. Relevant considerations for fit include the inquiry’s purpose and nature of the subject matter; the diversity of intended audiences for findings; the criteria used to judge the quality and practical value of the results; and the research context (including characteristics of the setting, participants, and investigators). Other important considerations relate to constraints of available resources (e.g., funding, time, and staff) and access to relevant participants. We contend that in the applied IS setting, finding an appropriate fit often includes borrowing procedures from different approaches to create a pragmatic, hybrid approach. A pragmatic approach also addresses the IS-specific tensions outlined above, i.e., a need to conduct research that is time-bounded, engages with theories/frameworks/models, supports application in practice, and speaks to a diversity of colleagues. To promote goals of achieving fit and internal coherence in light of IS-specific requirements, we offer the considerations above and additional guiding questions for selecting analytic procedures to create a pragmatic approach, as summarized in Fig. 1.
Key questions include the following:
What is the appropriate balance of inductive and deductive analytic procedures given the research goals?
A deductive process emphasizes themes and explanations derived from previously established concepts, pre-existing theories, or the relevant literature. For example, an analysis that leans heavily on a deductive process might use the core components of the Exploration, Preparation, Implementation, Sustainment (EPIS) framework to inform the coding structure and analysis. This process would support efforts to bound the investigation’s scope or expand an existing framework or model. On the other hand, rather than trying to fit data with pre-existing concepts or theory, an inductive process generates interpretation and understanding that is primarily grounded in and driven by the data.
A balance of deductive and inductive processes might use an IS framework as a starting point for the deductive portion and then emphasize inductive processes to garner additional insight into topics not anticipated by the team or framework. For example, a selected IS framework may not attend sufficiently to the ways in which implementation context drives inequities. If the dataset includes valuable information on this topic, including inductive processes would allow a fuller exploration of such patterns.
To what extent will the analysis emphasize the perspectives of participants vs. researchers?
An important decision relates to where the research team wishes to ground the analysis on the continuum between insider (emic) and outsider (etic) perspectives. The appropriate balance of insider/outsider orientation will reflect the overall research design and questions. Specific decisions about how to execute the desired balance through the analysis include, for example, the types of codes used or the value placed on participant reflections. As described below in section 2, value is often placed on incorporating participants’ feedback on the developing analysis, sometimes called “member checks” or “member reflections.”
An insider (emic) orientation represents findings in the ways that participants experience them, and insider knowledge is valued and privileged. As an example, MacFarlane and colleagues used Normalization Process Theory and participatory approaches to identify appropriate implementation strategies to support the integration of evidence-based cross-cultural communication in European primary care settings. The participatory nature of the project offered the opportunity to gain “insider” insight rather than imposing and prioritizing the academic researchers’ “outsider” perspective. The insider (emic) orientation was operationalized in the analytic approach by using stakeholder co-analysis, which engages a wider set of stakeholders in the iterative processes of thematically analyzing the data. By contrast, an outsider (etic) orientation represents the setting and participants in terms that the researcher or external audiences bring to the study and emphasizes the outsider’s perspective. For instance, Van deGriend and colleagues conducted an analysis of influences on scaling-up group prenatal care. They used outsider (etic) codes that drew on researchers’ concepts and the literature to complement the insider (emic) codes that reflected participants’ concepts and views. Balancing insider and outsider orientations is useful for pragmatic, qualitative IS studies because it increases the potential for the study to highlight practice- and community-based expertise, build the literature, and ultimately support the integration of evidence into practice.
How can the analytic plan be designed to yield the outputs and products needed to support the integration of evidence into research and practice?
The research team can maximize efficiency and impact by intentionally connecting the analytic plan and the kind of products needed to meet scientific and practice goals (e.g., journal articles versus policy briefs). The ultimate use of the research outputs can also impact decisions around the breadth versus depth of the analysis. For example, in a recent implementation evaluation for community-clinical partnerships delivering EBIs in underserved communities, members of this author team (SR and RL) analyzed data to explore how partnership networks impacted implementation outcomes. At the same time, given the broader goal of supporting the establishment of health policies to support partnered EBI delivery, the team was also charged (by the state Department of Public Health) with capturing stories that would resonate with legislators regarding the need for broad, sustained investments. We created a unique code to identify these stories during analysis and easily incorporate them into products for health department leaders. Given the practice-focused orientation, qualitative IS studies often support products for practitioners, e.g., “playbooks” to guide the process of implementing an intervention or novel care process.
How can analytic resources be used strategically in time-sensitive projects or when staff and resources are limited?
IS research is often conducted by teams, and strategic analytic decisions can promote rigor while capitalizing on the potential for teamwork to speed up analysis. Deterding and Waters’ strategy of flexible coding, for example, offers such benefits. Through an initial, framework-driven analytic step, large chunks of text can be quickly indexed deductively into predefined categories, such as the five Consolidated Framework for Implementation Research domains of intervention characteristics, outer setting, inner setting, characteristics of individuals, and process. This is a more straightforward coding task appropriate for research assistants who have been trained in qualitative research and understand the IS framework. Then, during the second analytic coding step, more in-depth coding by research team members with more experience can ensure a deeper exploration of existing and new themes. This two-step process can also enable team members to lead different parts of an IS project with different goals, purposes, or audiences. Other innovations in team-based analyses are becoming increasingly common in IS, such as rapid ethnographic approaches.
Building blocks for pragmatic analysis: examples from pattern-based analytic approaches
We offer illustrative examples of established analytic approaches in the following, highlighting their utility for IS and procedures that a pragmatic approach might usefully borrow and combine. These examples are not exhaustive; instead, they represent selected, pattern-based analytic approaches commonly used in IS. We aim to offer helpful anchor points that encompass the breadth and flexibility to apply to a wide range of IS projects while also reflecting and speaking to a diversity of home disciplines, including sociology, applied policy, and psychology.
Grounded theory is one of the most recognizable and influential approaches to qualitative analysis, although many variations have emerged since its introduction. Sociologists developed the approach, and the history and underlying philosophy are richly detailed elsewhere [25, 26]. The central goal of this approach is to generate a theoretical explanation grounded in close inspection of the data and without a preconceived starting point. In many instances, the emphasis of grounded theory on a purely inductive orientation may be at odds with the focus in IS on the use of existing theories and frameworks, as highlighted by the QUALRIS group. Additionally, a “full” grounded theory study, aligned with all its methodological assumptions and prescriptions (e.g., for sampling), is very demanding and time-consuming and may not be appropriate when timely turnaround in the service of research or practice change is required. For these reasons, a full grounded theory approach is rarely seen in the IS literature. Instead, IS researchers who use this approach are likely to use a modified version, sometimes described as “grounded theory lite.”
Core features and procedures characteristic of grounded theory that can be incorporated into a pragmatic approach include inductive coding techniques. Open, inductive coding allows the researcher to “open up the inquiry” by examining the data to see what concepts best fit the data, without a preconceived explanation or framework [28,29,30]. Concepts and categories derived from open coding prompt the researcher to consider aspects of the research topic that were overlooked or unanticipated. The intermediate stages of coding in grounded theory, referred to as axial or focused coding, build on the open coding and generate a more refined set of key categories and identify relationships between these categories. Another useful procedure from grounded theory is the constant comparison method, in which data are collected, categorized, and compared to previously collected data. This continuing, iterative process prompts continuous engagement with the analysis process and reshapes and redefines ideas, which is useful for most qualitative studies [25, 29, 33]. Grounded theory also allows for community expertise and broader outsider perspectives to complement one another for a more comprehensive understanding of practices.
An illustration of the utility of grounded theory procedures comes from a study that explored how implementing organizations can influence local context to support the scale-up of mental health interventions in middle-income countries. Using a multiple case study design, the study team used an analytic approach based on grounded theory to analyze data from 159 semi-structured interviews across five case sites. They utilized line-by-line open coding, constant comparison, and exploration of connections between themes in the process of developing an overarching theoretical framework. To increase rigor, they employed triangulation by data source and type and member reflections. Their team-based plan included multiple coders who negotiated conflicts and refined the thematic framework jointly. The output of the analysis was a model of processes by which entrepreneurial organizations could marshal and create resources to support the delivery of mental health interventions in limited-resource settings. By taking a divergent perspective (grounded in social entrepreneurship, in this case), the study output provided a basis for further inquiry into the design and scale-up of mental health interventions in middle-income countries.
Framework analysis comes from the policy sphere and tends to have a practical orientation; this applied nature typically includes a more structured and deductive approach. The history, philosophical assumptions, and core processes are richly described by Ritchie and Spencer. Framework analysis entails several features common to many qualitative analytic approaches, including defining concepts, creating typologies, and identifying patterns and relationships, but does so in a more predefined and structured way [37, 38]. For example, the research team can create codes based on a framework selected in advance and can also include open-ended inquiry to capture additional insights. This analytic approach is well-suited to multi-disciplinary teams whose members have varying levels of experience with qualitative research. It may require fewer staff resources and less time than some other approaches.
The framework analysis process includes five key steps. Step 1 is familiarization: Team members immerse themselves in the data, e.g., reading, taking notes, and listening to audio. Step 2 is identifying a coding framework: The research team develops a coding scheme, typically using an iterative process primarily driven by deductive coding (e.g., based on the IS framework). Step 3 is indexing: The team applies the coding structure to the entire data set. Step 4 is charting: The team rearranges the coded data and compares patterns between and within cases. Step 5 is mapping and interpretation: The team looks at the range and nature of relationships across and between codes [36, 39, 40]. The team can use tables and diagrams to systematically synthesize and display the data based on predetermined concepts, frameworks, or areas of interest. While more structured than other approaches, framework analysis still presents a flexible design that combines well with other analytic approaches to achieve study objectives. The case example given in section 3 offers a detailed application of a modified framework analytic approach.
Interpretive phenomenological analysis (IPA)
Broadly, the purpose of a phenomenological inquiry is to understand the experiences and perceptions of individuals related to an occurrence of interest [41, 42]. For example, a phenomenological inquiry might focus on implementers’ experiences with remote training to support implementing a new EBI, aiming to explore their views, how those changed over time, and why implementers reacted the way they did. Drawing on this tradition, IPA focuses specifically on particular individuals (or cases), understanding both the experience of individuals and the sense they are making of those experiences. With roots in psychology, this approach prioritizes the perspective of the participant, who is understood to be part of a broader system of interest; additional details about the philosophical underpinnings are available elsewhere. Research questions are open and broad, taking an inductive, exploratory perspective. Samples are typically small and somewhat homogeneous as the emphasis is placed on an in-depth exploration of a small set of cases to identify patterns of interest. Despite the smaller sample size, the deep, detailed analysis requires thoughtful and time-intensive engagement with the data. The resulting outputs can be useful to develop theories that attend to a particular EBI or IS-related process or to refine existing frameworks and models.
A useful example comes from a study that sought to understand resistance to using evidence-based guidelines from the perspective of physicians focused on providing clinical care. The analysis drew on data collected from interviews of 11 physicians selected for their expertise and diversity across a set of sociodemographic characteristics. In the first phase of the analysis, the team analyzed the full-length interviews and identified key themes and the relationships between them. Particular attention was paid to implicit and explicit meanings, repeated ideas or phrases, and metaphor choices. Two authors conducted the analyses separately and then compared them to reach a consensus. In the second phase of the analysis, the team considered the group of 11 interviews as a set. Using an inductive perspective, the team identified superordinate (or high-level) themes that addressed the full dataset. The final phase of the analysis was to identify a single superordinate theme that would serve as the core description of clinical practice. The team engaged other colleagues from diverse backgrounds to support reflection and refinement of the analysis. The analysis yielded a theoretical model that focused on a core concept (clinical practice as engagement), broken out into five constituent parts addressing how clinicians experience their practice, separate from following external guidelines.
Section 2: ensuring and communicating rigor of a pragmatic analysis
Building on the discussion of pragmatic combination of approaches for a given study, we turn now to the question of ensuring and communicating rigor so that consumers of the scientific products will feel confident assessing, interpreting, and engaging with the findings. This is of particular importance for IS given that the field tends to emphasize quantitative methods and there may be perceptions that qualitative research (and particularly research that must be completed more quickly) is less rigorous. To address those field-specific concerns and ensure pragmatic approaches are understood and valued, IS researchers must ensure and communicate the rigor of their approach. Given journal constraints, authors may consider using supplementary files to offer rich details to describe the study context and details of coding and analysis procedures (see, for example, Aveling et al.). We build on the work of Mays and Pope, Tracy, and others [48,49,50,51,52] to offer a shortlist of considerations for IS researchers to ensure pragmatic analysis is conducted with rigor and its quality and credibility are communicated (Table 1). We also recommend these articles as valuable resources for further reading.
Reporting checklists can help researchers ensure the details of the pragmatic analytic approach are communicated effectively, and inclusion of such a checklist is often required by journals for manuscript submission. Popular choices include the Standards for Reporting Qualitative Research (SRQR) and Consolidated Criteria for Reporting Qualitative Research (COREQ) checklists. These were developed based on reviews of other checklists and are intended to capture a breadth of information to increase transparency, rather than being driven by a philosophical underpinning regarding how to design rigorous qualitative research [53, 54]. For that reason, researchers should use these checklists with a critical lens as they do not alone demonstrate rigor. Instead, they can be thought of as a flexible guide and support, without focusing solely on technical components at the expense of the broader qualitative expertise that drives the research effort.
Section 3: case example of a modified framework analysis approach
To illustrate the ideas presented above, we offer a recent example of work conducted by two authors (AR and SR) and colleagues. The broad motivation for the study was to increase the use of EBIs in community-based organizations (CBOs) and faith-based organizations (FBOs) working with underserved communities. Our past work and the literature highlighted challenges in matching practitioner capacity (i.e., knowledge, motivation, skills, and resources) with the skillset required to use EBIs successfully [57, 58]. The study utilized a participatory implementation science perspective, which offered a unique opportunity to integrate insider and outsider perspectives and increase the likelihood that solutions developed would reflect the realities of practice. The work was conducted in partnership with a Community Advisory Board and attempted to balance research and action [59, 60].
The qualitative portion of the project had two primary goals. The research goal was to identify improvements to the design and delivery of capacity-building interventions for CBOs and FBOs working with underserved populations. The practice-related goal was to identify local training needs and refine an existing EBI capacity-building curriculum. We drew on the EPIS Framework to support our exploration of multi-level factors that drive EBI implementation in social service settings. We conducted four focus group discussions with intended capacity-building recipients (n = 27) and key informant interviews with community leaders (n = 15). Given (1) the applied nature of the research and practice goals, (2) our reliance on an existing IS framework, (3) limited staff resources, and (4) a need to analyze data rapidly to support intervention refinement, we chose a modified framework analysis approach. Modifications included incorporating aspects of grounded theory, including open coding, to increase the emphasis on inductive perspectives. The team also modified the charting procedures, replacing tabular summaries with narrative summaries of coded data.
Analysis was conducted by three doctoral-level researchers with complementary training (IS, sociology, and nursing). We started by familiarizing ourselves with the data — the three researchers read a subset of the transcripts, with purposeful overlap in reading assignments to facilitate discussion. Then, we created the coding framework and indexed the data. We went back and forth between indexing and charting, starting with deductive codes based on the EPIS framework, and then using a more inductive open coding strategy to identify emergent codes that fell outside the EPIS framework, e.g., the importance of investing in resources that remain in the community. The new coding framework, with both inductive and deductive codes, was applied to all interview transcripts. Each transcript was independently coded by two of the three investigators, followed by coding comparison to address discrepancies. We used NVivo 12 software, which enabled the exploration and reorganization of data to examine patterns within specific codes and across the data set. We utilized narrative summaries to organize our findings. Finally, we revisited the relevant data to identify broad themes of interest. This step was collaborative and iterative, with each team member taking the lead on a subset of codes and themes that aligned with their expertise, and the interpretations were shared with the other research investigators and discussed. This “divide-and-conquer” tactic was similar to the Deterding and Waters example of flexible coding. We used triangulation to explore perceptions by different groups of participants (e.g., leaders vs. program implementers and individuals representing CBOs vs. FBOs). This type of triangulation is sometimes referred to as “triangulation of data” and stands in contrast to triangulation between different methods.
Our analytic plan was informed by the participatory design of the larger project. At multiple points in the analytic process, we presented interpretations to the advisory board and then refined interpretations and subsequent steps of the analysis accordingly. This was critical because our use of an IS framework likely imposed an outsider’s perspective on the use of EBIs in practice and we wanted to ensure the interpretations reflected insider perspectives on the realities of practice. The incorporation of practice-based expertise in our analytic process also reflected the participatory nature of the research project. We note that advisory board members did not wish to analyze the data in-depth and instead preferred this manner of engagement.
To meet our research goals, we produced scientific publications that expanded the literature on capacity-building strategies to promote evidence-based prevention in CBOs and FBOs addressing health equity. The modified framework analysis approach allowed us to build on and extend the EPIS framework by combining framework-driven deductive coding with open, inductive coding. As an example, the EPIS framework highlights relationships between patient/client characteristics (within the "outer context" domain) and EBI fit (within the "innovation" domain). We added an emergent code to capture the wide range of resources CBO- and FBO-based practitioners needed to improve the fit between available EBIs and community needs, including attention to the limitations of available EBIs in addressing the multi-level barriers to good health experienced by underserved communities. Participants highlighted the importance of solutions to these gaps coming not from external resources (such as those highlighted within the "bridging factors" domain of the framework) but from resources built and maintained within the community. Per the journal's requirements, we completed the SRQR checklist to document how we ensured a rigorous analysis.
To achieve our practice goals, we drew on the rich dataset to refine the capacity-building intervention, from recruitment through the training components and ongoing supports. For example, we were able to create more compelling arguments for organizational leaders to send staff to the training and support the use of EBIs in their organizations, use language during trainings that better resonated with trainees, and include local examples of barriers and facilitators to EBI use. We also revised programmatic offerings to include co-teaching by community members and created shorter, implementation-focused training opportunities. The balance of framework-driven, deductive processes and open, inductive processes allowed us to capture patterns in anticipated and unanticipated content areas. It also allowed us to develop research briefs with high-level summaries that could be useful to other practitioners considering how best to invest limited professional development resources.
We encourage IS researchers to explore the diversity and flexibility of qualitative analytic approaches and combine them pragmatically to best meet their needs. We recognize that some approaches to analysis are tied to particular methodological orientations while others are not, but a pragmatic approach offers the opportunity to combine analytic strategies and procedures. To do this successfully, the research team must ensure fit, preserve quality and rigor, and provide transparent explanations connecting the analytic approach and findings so that others can assess and build on the research. We believe pragmatic approaches offer an important opportunity to make strategic analytic decisions, such as identifying an appropriate balance of insider and outsider perspectives, to extend current IS frameworks and models. Given the urgency to increase the utilization and utility of EBIs in practice settings, we see a natural fit with the pragmatist prompt to judge our research efforts based on whether the knowledge obtained serves our purposes. In that spirit, the use of pragmatic approaches can support high-quality, efficient, practice-focused research, broadening the scope and ultimate impact of IS research.
Hamilton AB, Finley EP. Qualitative methods in implementation research: an introduction. Psychiatry Res. 2019;280:112516. https://doi.org/10.1016/j.psychres.2019.112516.
Tabak RG, Chambers D, Hook M, Brownson RC. The conceptual basis for dissemination and implementation research: lessons from existing models and frameworks. In: Brownson RC, Colditz GA, Proctor EK, editors. Dissemination and implementation research in health: translating science to practice. New York: Oxford University Press; 2018. p. 73–88.
Patton MQ. Qualitative research & evaluation methods: integrating theory and practice. Sage; 2014.
QualRIS (Qualitative Research in Implementation Science). Qualitative methods in implementation science. Division of Cancer Control and Population Sciences, National Cancer Institute; 2019.
Creswell JW, Poth CN. Qualitative inquiry and research design: choosing among five approaches. Sage; 2016.
Braun V, Clarke V. Successful qualitative research: a practical guide for beginners. Sage; 2013.
Levitt HM, Motulsky SL, Wertz FJ, Morrow SL, Ponterotto JG. Recommendations for designing and reviewing qualitative research in psychology: promoting methodological integrity. Qual Psychol. 2017;4(1):2–22. https://doi.org/10.1037/qup0000082.
Tracy SJ. Qualitative quality: eight “big-tent” criteria for excellent qualitative research. Qual Inq. 2010;16(10):837–51. https://doi.org/10.1177/1077800410383121.
Green J, Thorogood N. Qualitative methods for health research. 4th ed. Thousand Oaks: SAGE; 2018.
Nilsen P. Making sense of implementation theories, models and frameworks. Implement Sci. 2015;10(1):53. https://doi.org/10.1186/s13012-015-0242-0.
Aveling E-L, Zegeye DT, Silverman M. Obstacles to implementation of an intervention to improve surgical services in an Ethiopian hospital: a qualitative study of an international health partnership project. BMC Health Serv Res. 2016;16(1):393. https://doi.org/10.1186/s12913-016-1639-4.
Dearing JW, Kee KF, Peng T. Historical roots of dissemination and implementation science. In: Brownson RC, Colditz GA, Proctor E, editors. Dissemination and implementation research in health: translating science to practice. 2nd ed. New York: Oxford University Press; 2018. p. 47–61.
Braun V, Clarke V. Can I use TA? Should I use TA? Should I not use TA? Comparing reflexive thematic analysis and other pattern-based qualitative analytic approaches. Couns Psychother Res. 2021;21(1):37–47. https://doi.org/10.1002/capr.12360.
Punch KF, Oancea A. Introduction to research methods in education. Sage; 2014.
Aarons GA, Hurlburt M, Horwitz SM. Advancing a conceptual model of evidence-based practice implementation in public service sectors. Adm Policy Ment Health. 2011;38(1):4–23. https://doi.org/10.1007/s10488-010-0327-7.
Miles MB, Huberman AM. Qualitative data analysis: an expanded sourcebook. Sage; 1994.
Baumann AA, Cabassa LJ. Reframing implementation science to address inequities in healthcare delivery. BMC Health Serv Res. 2020;20(1):190. https://doi.org/10.1186/s12913-020-4975-3.
MacFarlane A, O’Donnell C, Mair F, O’Reilly-de Brún M, de Brún T, Spiegel W, et al. REsearch into implementation STrategies to support patients of different ORigins and language background in a variety of European primary care settings (RESTORE): study protocol. Implement Sci. 2012;7(1):111. https://doi.org/10.1186/1748-5908-7-111.
Van De Griend KM, Billings DL, Frongillo EA, Messias DKH, Crockett AH, Covington-Kolb S. Core strategies, social processes, and contextual influences of early phases of implementation and statewide scale-up of group prenatal care in South Carolina. Eval Program Plann. 2020;79:101760. https://doi.org/10.1016/j.evalprogplan.2019.101760.
Ramanadhan S, Daly J, Lee RM, Kruse G, Deutsch C. Network-based delivery and sustainment of evidence-based prevention in community-clinical partnerships addressing health equity: a qualitative exploration. Front Public Health. 2020;8:213. https://doi.org/10.3389/fpubh.2020.00213.
Deterding NM, Waters MC. Flexible coding of in-depth interviews: a twenty-first-century approach. Sociol Methods Res. 2018. https://doi.org/10.1177/0049124118799377.
Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009;4:50. https://doi.org/10.1186/1748-5908-4-50.
Palinkas LA, Zatzick D. Rapid assessment procedure informed clinical ethnography (RAPICE) in pragmatic clinical trials of mental health services implementation: methods and applied case study. Adm Policy Ment Health. 2019;46(2):255–70. https://doi.org/10.1007/s10488-018-0909-3.
Pistrang N, Barker C. Varieties of qualitative research: a pragmatic approach to selecting methods. 2012.
Glaser B, Strauss A. The discovery of grounded theory: strategies for qualitative research. Chicago: Aldine Publishing Company; 1967.
Birks M, Mills J. Grounded theory: a practical guide. Sage; 2015.
Varpio L, Martimianakis MA, Mylopoulos M. Qualitative research methodologies: embracing methodological borrowing, shifting and importing. Res Med Educ. 2015:245.
Strauss AL. Qualitative analysis for social scientists. Cambridge: Cambridge University Press; 1987. https://doi.org/10.1017/CBO9780511557842.
Creswell JW. Qualitative inquiry and research design: choosing among five approaches. 3rd ed. Thousand Oaks: Sage Publications; 2012.
Burkholder GJ, Cox KA, Crawford LM, Hitchcock JH. Research design and methods: an applied guide for the scholar-practitioner. Sage; 2019.
Schutt RK. Investigating the social world: the process and practice of research. Thousand Oaks: Sage Publications; 2018.
Chun Tie Y, Birks M, Francis K. Grounded theory research: a design framework for novice researchers. SAGE Open Med. 2019;7:2050312118822927.
Curry L, Nunez-Smith M. Mixed methods in health sciences research: a practical primer. Sage; 2014.
Hoare KJ, Buetow S, Mills J, Francis K. Using an emic and etic ethnographic technique in a grounded theory study of information use by practice nurses in New Zealand. J Res Nurs. 2013;18(8):720–31. https://doi.org/10.1177/1744987111434190.
Kidd SA, Madan A, Rallabandi S, Cole DC, Muskat E, Raja S, et al. A multiple case study of mental health interventions in middle income countries: considering the science of delivery. PLoS One. 2016;11(3):e0152083. https://doi.org/10.1371/journal.pone.0152083.
Ritchie J, Spencer L. Qualitative data analysis for applied policy research. In: Huberman AM, Miles MB, editors. The qualitative researcher’s companion. Thousand Oaks: SAGE; 2002. p. 305–30.
Gale NK, Heath G, Cameron E, Rashid S, Redwood S. Using the framework method for the analysis of qualitative data in multi-disciplinary health research. BMC Med Res Methodol. 2013;13(1):117. https://doi.org/10.1186/1471-2288-13-117.
Mays N, Pope C. Assessing quality in qualitative research. BMJ. 2000;320(7226):50–2. https://doi.org/10.1136/bmj.320.7226.50.
Bonello M, Meehan B. Transparency and coherence in a doctoral study case analysis: reflecting on the use of NVivo within a “framework” approach. Qual Rep. 2019;24(3):483–99.
Eatough V, Smith JA. Interpretative phenomenological analysis. In: Willig C, Stainton-Rogers W, editors. The Sage handbook of qualitative research in psychology. Thousand Oaks: SAGE; 2008. p. 193–211.
McWilliam CL, Kothari A, Ward-Griffin C, Forbes D, Leipert B, Collaboration SWCCACHC. Evolving the theory and praxis of knowledge translation through social interaction: a social phenomenological study. Implement Sci. 2009;4(1):26. https://doi.org/10.1186/1748-5908-4-26.
Smith JA, Shinebourne P. Interpretative phenomenological analysis. American Psychological Association; 2012.
Kislov R, Pope C, Martin GP, Wilson PM. Harnessing the power of theorising in implementation science. Implement Sci. 2019;14(1):103. https://doi.org/10.1186/s13012-019-0957-4.
Saraga M, Boudreau D, Fuks A. Engagement and practical wisdom in clinical practice: a phenomenological study. Med Health Care Philos. 2019;22(1):41–52. https://doi.org/10.1007/s11019-018-9838-x.
Lincoln YS, Guba EG. Naturalistic inquiry. Beverly Hills: Sage; 1985.
Aveling E-L, Stone J, Sundt T, Wright C, Gino F, Singer S. Factors influencing team behaviors in surgery: a qualitative study to inform teamwork interventions. Ann Thorac Surg. 2018;106(1):115–20. https://doi.org/10.1016/j.athoracsur.2017.12.045.
Waring J, Jones L. Maintaining the link between methodology and method in ethnographic health research. BMJ Qual Saf. 2016;25(7):556–7. https://doi.org/10.1136/bmjqs-2016-005325.
Ritchie J, Lewis J, Nicholls CM, Ormston R. Qualitative research practice: a guide for social science students and researchers. Sage; 2013.
Patton MQ. Enhancing the quality and credibility of qualitative analysis. Health Serv Res. 1999;34(5 Pt 2):1189–208.
Barry CA, Britten N, Barber N, Bradley C, Stevenson F. Using reflexivity to optimize teamwork in qualitative research. Qual Health Res. 1999;9(1):26–44. https://doi.org/10.1177/104973299129121677.
Booth A, Carroll C, Ilott I, Low LL, Cooper K. Desperately seeking dissonance: identifying the disconfirming case in qualitative evidence synthesis. Qual Health Res. 2013;23(1):126–41. https://doi.org/10.1177/1049732312466295.
Tong A, Sainsbury P, Craig J. Consolidated criteria for reporting qualitative research (COREQ): a 32-item checklist for interviews and focus groups. Int J Qual Health Care. 2007;19(6):349–57. https://doi.org/10.1093/intqhc/mzm042.
O’Brien BC, Harris IB, Beckman TJ, Reed DA, Cook DA. Standards for reporting qualitative research: a synthesis of recommendations. Acad Med. 2014;89(9):1245–51. https://doi.org/10.1097/ACM.0000000000000388.
Barbour RS. Checklists for improving rigour in qualitative research: a case of the tail wagging the dog? BMJ. 2001;322(7294):1115–7. https://doi.org/10.1136/bmj.322.7294.1115.
Ramanadhan S, Galbraith-Gyan K, Revette A, Foti A, James CR, Martinez-Dominguez VL, et al. Key considerations for designing capacity-building interventions to support evidence-based programming in underserved communities: a qualitative exploration. Transl Behav Med. 2021;11(2):452–61. https://doi.org/10.1093/tbm/ibz177.
Ramanadhan S, Aronstein D, Martinez-Dominguez VL, Xuan Z, Viswanath K. Designing capacity-building supports to promote evidence-based programs in community-based organizations working with underserved populations. Prog Community Health Partnersh. 2020;14(2):149–60. https://doi.org/10.1353/cpr.2020.0027.
Leeman J, Calancie L, Hartman MA, Escoffery CT, Herrmann AK, Tague LE, et al. What strategies are used to build practitioners’ capacity to implement community-based interventions and are they effective? A systematic review. Implement Sci. 2015;10(1):80. https://doi.org/10.1186/s13012-015-0272-7.
Ramanadhan S, Davis MM, Armstrong RA, Baquero B, Ko LK, Leng JC, et al. Participatory implementation science to increase the impact of evidence-based cancer prevention and control. Cancer Causes Control. 2018;29(3):363–9. https://doi.org/10.1007/s10552-018-1008-1.
Minkler M, Salvatore AL, Chang C. Participatory approaches for study design and analysis in dissemination and implementation research. In: Brownson RC, Colditz GA, Proctor EK, editors. Dissemination and implementation research in health: translating science to practice. 2nd ed. New York: Oxford University Press; 2018. p. 175–90.
QSR International Pty Ltd. NVivo qualitative data analysis software; Version 12. Melbourne, Australia. 2018.
Flick U. Triangulation in qualitative research. In: Flick U, von Kardorff E, Steinke I, editors. A companion to qualitative research. SAGE; 2004. p. 178–83.
Rorty RM. Philosophy and social hope. Penguin UK; 1999.
We wish to thank Priscilla Gazarian, RN, PhD, for insightful feedback on a draft of the manuscript.
This work was conducted with support from the National Cancer Institute (P50 CA244433) and from Harvard Catalyst/National Center for Advancing Translational Sciences (UL1 TR002541). The content is solely the responsibility of the authors and does not necessarily represent the official views of Harvard Catalyst, Harvard University, or the National Institutes of Health.
The authors declare that they have no competing interests.
Ramanadhan, S., Revette, A.C., Lee, R.M. et al. Pragmatic approaches to analyzing qualitative data for implementation science: an introduction. Implement Sci Commun 2, 70 (2021). https://doi.org/10.1186/s43058-021-00174-1