
Qualitative Data Analysis: What is it, Methods + Examples

Explore qualitative data analysis with diverse methods and real-world examples. Uncover the nuances of human experiences with this guide.

In a world rich with information and narrative, understanding the deeper layers of human experiences requires a unique vision that goes beyond numbers and figures. This is where the power of qualitative data analysis comes to light.

In this blog, we’ll learn about qualitative data analysis, explore its methods, and provide real-life examples showcasing its power in uncovering insights.

What is Qualitative Data Analysis?

Qualitative data analysis is a systematic process of examining non-numerical data to extract meaning, patterns, and insights.

In contrast to quantitative analysis, which focuses on numbers and statistical metrics, qualitative analysis focuses on non-numerical aspects of data, such as text, images, audio, and video. It seeks to understand human experiences, perceptions, and behaviors by examining the data’s richness.

Companies frequently conduct this analysis on customer feedback. You can collect qualitative data from reviews, complaints, chat messages, interactions with support centers, customer interviews, case notes, or even social media comments. This kind of data holds the key to understanding customer sentiments and preferences in a way that goes beyond mere numbers.

Importance of Qualitative Data Analysis

Qualitative data analysis plays a crucial role in your research and decision-making process across various disciplines. Let’s explore some key reasons that underline the significance of this analysis:

In-Depth Understanding

It enables you to explore complex and nuanced aspects of a phenomenon, delving into the ‘how’ and ‘why’ questions. This method provides you with a deeper understanding of human behavior, experiences, and contexts that quantitative approaches might not capture fully.

Contextual Insight

You can use this analysis to give context to numerical data. It will help you understand the circumstances and conditions that influence participants’ thoughts, feelings, and actions. This contextual insight becomes essential for generating comprehensive explanations.

Theory Development

You can generate or refine hypotheses via qualitative data analysis. As you analyze the data attentively, you can form hypotheses, concepts, and frameworks that will drive your future research and contribute to theoretical advances.

Participant Perspectives

When performing qualitative research, you can highlight participant voices and opinions. This approach is especially useful for understanding marginalized or underrepresented people, as it allows them to communicate their experiences and points of view.

Exploratory Research

The analysis is frequently used at the exploratory stage of your project. It assists you in identifying important variables, developing research questions, and designing quantitative studies that will follow.

Types of Qualitative Data

When conducting qualitative research, you can use several qualitative data collection methods, and you will encounter many kinds of qualitative data that can provide unique insights into your study topic. These data types add new perspectives and angles to your understanding and analysis.

Interviews and Focus Groups

Interviews and focus groups will be among your key methods for gathering qualitative data. Interviews are one-on-one talks in which participants can freely share their thoughts, experiences, and opinions.

Focus groups, on the other hand, are discussions in which members interact with one another, resulting in dynamic exchanges of ideas. Both methods provide rich qualitative data and direct access to participant perspectives.

Observations and Field Notes

Observations and field notes are another useful sort of qualitative data. You can immerse yourself in the research environment through direct observation, carefully documenting behaviors, interactions, and contextual factors.

These observations will be recorded in your field notes, providing a complete picture of the environment and the behaviors you’re researching. This data type is especially important for understanding behaviors in their natural setting.

Textual and Visual Data

Textual and visual data include a wide range of resources that can be qualitatively analyzed. Documents, written narratives, and transcripts from various sources, such as interviews or speeches, are examples of textual data.

Photographs, films, and even artwork provide a visual layer to your research. These forms of data allow you to investigate what is spoken and the underlying emotions, details, and symbols expressed by language or pictures.

When to Choose Qualitative Data Analysis over Quantitative Data Analysis

As you begin your research journey, understanding why the analysis of qualitative data is important will guide your approach to understanding complex phenomena. Analyzing qualitative data provides insights that complement quantitative methodologies, giving you a broader understanding of your study topic.

It is critical to know when to use qualitative analysis over quantitative procedures. You can prefer qualitative data analysis when:

  • Complexity Reigns: When your research questions involve deep human experiences, motivations, or emotions, qualitative research excels at revealing these complexities.
  • Exploration is Key: Qualitative analysis is ideal for exploratory research. It will assist you in understanding a new or poorly understood topic before formulating quantitative hypotheses.
  • Context Matters: If you want to understand how context affects behaviors or results, qualitative data analysis provides the depth needed to grasp these relationships.
  • Unanticipated Findings: When your study provides surprising new viewpoints or ideas, qualitative analysis helps you to delve deeply into these emerging themes.
  • Subjective Interpretation is Vital: When it comes to understanding people’s subjective experiences and interpretations, qualitative data analysis is the way to go.

You can make informed decisions regarding the right approach for your research objectives if you understand the importance of qualitative analysis and recognize the situations where it shines.

Qualitative Data Analysis Methods and Examples

Exploring various qualitative data analysis methods will provide you with a wide collection for making sense of your research findings. Once the data has been collected, you can choose from several analysis methods based on your research objectives and the data type you’ve collected.

There are five main methods for analyzing qualitative data. Each method takes a distinct approach to identifying patterns, themes, and insights within your qualitative data. They are:

Method 1: Content Analysis

Content analysis is a methodical technique for analyzing textual or visual data in a structured manner. In this method, you will categorize qualitative data by splitting it into manageable units and manually assigning codes to those units.

As you go, you’ll notice recurring codes and patterns that will allow you to draw conclusions about the content. This method is very beneficial for detecting common ideas, concepts, or themes in your data without losing the context.

Steps to Do Content Analysis

Follow these steps when conducting content analysis:

  • Collect and Immerse: Begin by collecting the necessary textual or visual data. Immerse yourself in this data to fully understand its content, context, and complexities.
  • Assign Codes and Categories: Assign codes to relevant data sections that systematically represent major ideas or themes. Arrange comparable codes into groups that cover the major themes.
  • Analyze and Interpret: Develop a structured framework from the categories and codes. Then, evaluate the data in the context of your research question, investigate relationships between categories, discover patterns, and draw meaning from these connections.

Benefits & Challenges

There are various advantages to using content analysis:

  • Structured Approach: It offers a systematic approach to dealing with large data sets and ensures consistency throughout the research.
  • Objective Insights: This method promotes objectivity, which helps to reduce potential biases in your study.
  • Pattern Discovery: Content analysis can help uncover hidden trends, themes, and patterns that are not always obvious.
  • Versatility: You can apply content analysis to various data formats, including text, internet content, images, etc.

However, keep in mind the challenges that arise:

  • Subjectivity: Even with the best attempts, a certain bias may remain in coding and interpretation.
  • Complexity: Analyzing huge data sets requires time and great attention to detail.
  • Contextual Nuances: Content analysis may not capture all of the contextual richness that qualitative data analysis highlights.

Example of Content Analysis

Suppose you’re conducting market research and looking at customer feedback on a product. As you collect relevant data and analyze feedback, you’ll see repeating codes like “price,” “quality,” “customer service,” and “features.” These codes are organized into categories such as “positive reviews,” “negative reviews,” and “suggestions for improvement.”

According to your findings, themes such as “price” and “customer service” stand out and show that pricing and customer service greatly impact customer satisfaction. This example highlights the power of content analysis for obtaining significant insights from large textual data collections.

Method 2: Thematic Analysis

Thematic analysis is a well-structured procedure for identifying and analyzing recurring themes in your data. As you become more engaged in the data, you’ll generate codes or short labels representing key concepts. These codes are then organized into themes, providing a consistent framework for organizing and comprehending the substance of the data.

The analysis allows you to organize complex narratives and perspectives into meaningful categories, which will allow you to identify connections and patterns that may not be visible at first.

Steps to Do Thematic Analysis

Follow these steps when conducting a thematic analysis:

  • Code and Group: Start by immersing yourself in the data, assigning initial codes to notable segments. Group comparable codes together to construct candidate themes.
  • Review and Refine: Check the candidate themes against the coded data, merging, splitting, or discarding them until each theme is coherent and distinct.
  • Analyze and Report: Analyze the data within each theme to derive relevant insights. Organize the topics into a consistent structure and explain your findings, along with data extracts that represent each theme.
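As a rough sketch of the code-and-group step, the snippet below uses hypothetical job-satisfaction segments and codes to show how coded segments can be grouped under candidate themes and reported alongside their supporting extracts:

```python
# Segment -> initial code, as a researcher might record them during coding.
# All segments, codes, and theme names here are illustrative assumptions.
coded_segments = {
    "I never see my kids on weekdays": "work-life balance",
    "There is no path to promotion here": "career growth",
    "My teammates make hard days easier": "colleague relationships",
    "Flexible hours changed everything for me": "work-life balance",
}

# Group comparable codes under candidate themes.
themes = {
    "Factors influencing job satisfaction": {"work-life balance", "career growth"},
    "Workplace relationships": {"colleague relationships"},
}

# Collect the data extracts that support each theme for the write-up.
theme_extracts = {
    theme: [seg for seg, code in coded_segments.items() if code in codes]
    for theme, codes in themes.items()
}

for theme, extracts in theme_extracts.items():
    print(f"{theme}: {len(extracts)} extract(s)")
```

Reporting each theme with its extracts, as in the final loop, mirrors the "Analyze and Report" step: every claim about a theme stays anchored to the segments that support it.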

Thematic analysis has various benefits:

  • Structured Exploration: It is a method for identifying patterns and themes in complex qualitative data.
  • Comprehensive knowledge: Thematic analysis promotes an in-depth understanding of the complications and meanings of the data.
  • Application Flexibility: This method can be customized to various research situations and data kinds.

However, challenges may arise, such as:

  • Interpretive Nature: Thematic analysis relies heavily on interpretation, so it is critical to manage researcher bias.
  • Time-consuming: The process can be time-consuming, especially with large data sets.
  • Subjectivity: The selection of codes and themes can be subjective.

Example of Thematic Analysis

Assume you’re conducting a thematic analysis on job satisfaction interviews. Following your immersion in the data, you assign initial codes such as “work-life balance,” “career growth,” and “colleague relationships.” As you organize these codes, you’ll notice themes develop, such as “Factors Influencing Job Satisfaction” and “Impact on Work Engagement.”

Further investigation reveals the tales and experiences included within these themes and provides insights into how various elements influence job satisfaction. This example demonstrates how thematic analysis can reveal meaningful patterns and insights in qualitative data.

Method 3: Narrative Analysis

Narrative analysis focuses on the stories that people share. You’ll examine the narratives in your data, looking at how stories are constructed and the meanings they convey. This method is excellent for learning how people make sense of their experiences through storytelling.

Steps to Do Narrative Analysis

The following steps are involved in narrative analysis:

  • Gather and Analyze: Start by collecting narratives, such as first-person tales, interviews, or written accounts. Analyze the stories, focusing on the plot, feelings, and characters.
  • Find Themes: Look for recurring themes or patterns in various narratives. Think about the similarities and differences between these topics and personal experiences.
  • Interpret and Extract Insights: Contextualize the narratives within their larger context. Accept the subjective nature of each narrative and analyze the narrator’s voice and style. Extract insights from the tales by diving into the emotions, motivations, and implications communicated by the stories.

There are various advantages to narrative analysis:

  • Deep Exploration: It lets you look deeply into people’s personal experiences and perspectives.
  • Human-Centered: This method prioritizes the human perspective, allowing individuals to express themselves.

However, difficulties may arise, such as:

  • Interpretive Complexity: Analyzing narratives requires dealing with the complexities of meaning and interpretation.
  • Time-consuming: Because of the richness and complexities of tales, working with them can be time-consuming.

Example of Narrative Analysis

Assume you’re conducting narrative analysis on refugee interviews. As you read the stories, you’ll notice common themes of resilience, loss, and hope. The narratives provide insight into the obstacles that refugees face, their strengths, and the dreams that guide them.

The analysis can provide a deeper insight into the refugees’ experiences and the broader social context they navigate by examining the narratives’ emotional subtleties and underlying meanings. This example highlights how narrative analysis can reveal important insights into human stories.

Method 4: Grounded Theory Analysis

Grounded theory analysis is an iterative and systematic approach that allows you to create theories directly from data without being limited by pre-existing hypotheses. With an open mind, you collect data and generate early codes and labels that capture essential ideas or concepts within the data.

As you progress, you refine these codes and increasingly connect them, eventually developing a theory based on the data. Grounded theory analysis is a dynamic process for developing new insights and hypotheses based on details in your data.

Steps to Do Grounded Theory Analysis

Grounded theory analysis requires the following steps:

  • Initial Coding: First, immerse yourself in the data, producing initial codes that represent major concepts or patterns.
  • Categorize and Connect: Using axial coding, organize the initial codes into categories, establishing relationships and connections between them.
  • Build the Theory: Focus on creating a core category that connects the codes and themes. Regularly refine the theory by comparing and integrating new data, ensuring that it evolves organically from the data.
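The coding ladder above can be sketched as a data flow. The open codes, axial categories, and core category below are hypothetical; real grounded theory is iterative, with constant comparison of each new datum against the evolving categories:

```python
# A minimal sketch of the grounded-theory coding ladder, using assumed
# workplace-collaboration codes. It only shows data flowing from open codes
# through axial categories to a core category.
open_codes = [
    "communication barriers", "team dynamics", "leadership roles",
    "meeting overload", "unclear ownership",
]

# Axial coding: relate open codes under broader categories (assumed groupings).
axial = {
    "communication": ["communication barriers", "meeting overload"],
    "structure": ["team dynamics", "leadership roles", "unclear ownership"],
}

# Selective coding: one core category that integrates the axial categories.
core_category = "integrated communication strategies"

def compare(new_code: str) -> str:
    """Constant comparison: check a new code against existing categories."""
    for category, codes in axial.items():
        if new_code in codes:
            return f"fits existing category: {category}"
    return "new concept: revisit and refine the categories"

print(compare("team dynamics"))
print(compare("remote-work friction"))
```

The `compare` helper is the key design point: when incoming data no longer forces changes to the categories, the theory has reached saturation and can be written up.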

Grounded theory analysis has various benefits:

  • Theory Generation: It provides a one-of-a-kind opportunity to generate hypotheses straight from data and promotes new insights.
  • In-depth Understanding: The analysis allows you to deeply analyze the data and reveal complex relationships and patterns.
  • Flexible Process: This method is customizable and ongoing, which allows you to enhance your research as you collect additional data.

However, challenges might arise with:

  • Time and Resources: Because grounded theory analysis is a continuous process, it requires a large commitment of time and resources.
  • Theoretical Development: Creating a grounded theory requires a thorough understanding of qualitative analysis methods and theoretical concepts.
  • Interpretive Complexity: Interpreting and incorporating a newly developed theory into existing literature can be intellectually demanding.

Example of Grounded Theory Analysis

Assume you’re performing a grounded theory analysis on workplace collaboration interviews. As you open-code the data, you will discover concepts such as “communication barriers,” “team dynamics,” and “leadership roles.” Axial coding reveals links between these concepts, emphasizing the significance of efficient communication in developing collaboration.

Through selective coding, you create the core category “Integrated Communication Strategies,” which unifies the emerging themes.

This theory-driven category serves as the framework for understanding how numerous aspects contribute to effective team collaboration. This example shows how grounded theory analysis allows you to generate a theory directly from the inherent nature of the data.

Method 5: Discourse Analysis

Discourse analysis focuses on language and communication. You’ll look at how language produces meaning and how it reflects power relations, identities, and cultural influences. This strategy examines what is said and how it is said; the words, phrasing, and larger context of communication.

The analysis is particularly valuable when investigating power dynamics, identities, and cultural influences encoded in language. By evaluating the language used in your data, you can identify underlying assumptions, cultural standards, and how individuals negotiate meaning through communication.

Steps to Do Discourse Analysis

Conducting discourse analysis entails the following steps:

  • Select Discourse: For analysis, choose language-based data such as texts, speeches, or media content.
  • Analyze Language: Immerse yourself in the conversation, examining language choices, metaphors, and underlying assumptions.
  • Discover Patterns: Recognize the discourse’s recurring themes, ideologies, and power dynamics. To fully understand the effects of these patterns, put them in their larger context.
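One small, countable slice of this process can be sketched in Python: tallying conflict-framing vocabulary across texts. The word list and snippets below are assumptions for illustration; genuine discourse analysis is far more contextual than keyword counting.

```python
# A minimal sketch of spotting a recurring framing pattern across texts.
# CONFLICT_FRAMES and the articles are hypothetical examples.
import re
from collections import Counter

CONFLICT_FRAMES = {"battle", "clash", "war", "fight", "attack"}

articles = [
    "The two parties clash again as the budget battle intensifies.",
    "Lawmakers fight over the bill in a war of words.",
    "Committee members met to discuss the proposal.",
]

def frame_hits(text: str) -> list[str]:
    """Return the conflict-frame words found in the text."""
    words = re.findall(r"[a-z]+", text.lower())
    return [w for w in words if w in CONFLICT_FRAMES]

hits = Counter(w for a in articles for w in frame_hits(a))
framed = sum(1 for a in articles if frame_hits(a))
print(f"{framed}/{len(articles)} articles use conflict framing: {dict(hits)}")
```

A count like this only flags where framing may be at work; interpreting what the framing does, and for whom, remains the analyst’s job.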

There are various advantages of using discourse analysis:

  • Understanding Language: It provides an extensive understanding of how language builds meaning and influences perceptions.
  • Uncovering Power Dynamics: The analysis reveals how power dynamics appear via language.
  • Cultural Insights: This method identifies cultural norms, beliefs, and ideologies stored in communication.

However, the following challenges may arise:

  • Complexity of Interpretation: Language analysis involves navigating multiple levels of nuance and interpretation.
  • Subjectivity: Interpretation can be subjective, so controlling researcher bias is important.
  • Time-Intensive: Discourse analysis can take a lot of time because it requires careful linguistic study.

Example of Discourse Analysis

Consider doing discourse analysis on media coverage of a political event. You notice repeating linguistic patterns in news articles that depict the event as a conflict between opposing parties. Through deconstruction, you can expose how this framing supports particular ideologies and power relations.

You can illustrate how language choices influence public perceptions and contribute to building the narrative around the event by analyzing the speech within the broader political and social context. This example shows how discourse analysis can reveal hidden power dynamics and cultural influences on communication.

How to Do Qualitative Data Analysis with the QuestionPro Research Suite

QuestionPro is a popular survey and research platform that offers tools for collecting and analyzing qualitative and quantitative data. Follow these general steps for conducting qualitative data analysis using the QuestionPro Research Suite:

  • Collect Qualitative Data: Set up your survey to capture qualitative responses. It might involve open-ended questions, text boxes, or comment sections where participants can provide detailed responses.
  • Export Qualitative Responses: Export the responses once you’ve collected qualitative data through your survey. QuestionPro typically allows you to export survey data in various formats, such as Excel or CSV.
  • Prepare Data for Analysis: Review the exported data and clean it if necessary. Remove irrelevant or duplicate entries to ensure your data is ready for analysis.
  • Code and Categorize Responses: Segment and label data, letting new patterns emerge naturally, then develop categories through axial coding to structure the analysis.
  • Identify Themes: Analyze the coded responses to identify recurring themes, patterns, and insights. Look for similarities and differences in participants’ responses.
  • Generate Reports and Visualizations: Utilize the reporting features of QuestionPro to create visualizations, charts, and graphs that help communicate the themes and findings from your qualitative research.
  • Interpret and Draw Conclusions: Interpret the themes and patterns you’ve identified in the qualitative data. Consider how these findings answer your research questions or provide insights into your study topic.
  • Integrate with Quantitative Data (if applicable): If you’re also conducting quantitative research using QuestionPro, consider integrating your qualitative findings with quantitative results to provide a more comprehensive understanding.
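Steps 2 through 5 above can be sketched with the standard library alone. The column names and codes below are assumptions; match them to the headers in your actual export file.

```python
# A minimal sketch of parsing an exported CSV of open-ended responses,
# cleaning out empty rows, and tallying hand-assigned codes.
import csv
import io
from collections import Counter

# Stand-in for a file exported from the survey platform.
exported = io.StringIO(
    "respondent,response,code\n"
    "1,The checkout flow is confusing,usability\n"
    "2,Love the new dashboard,design\n"
    "3,,\n"  # empty row to be cleaned out
    "4,Checkout keeps timing out,usability\n"
)

# Prepare: drop rows with no response text.
rows = [r for r in csv.DictReader(exported) if r["response"].strip()]

# Identify themes: tally the code assigned to each remaining response.
theme_counts = Counter(r["code"] for r in rows)
print(theme_counts.most_common())  # → [('usability', 2), ('design', 1)]
```

With a real export you would replace `io.StringIO(...)` with `open("export.csv")`, and the tallied counts feed directly into the reporting and visualization step.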

Qualitative data analysis is vital in uncovering various human experiences, views, and stories. If you’re ready to transform your research journey and apply the power of qualitative analysis, now is the moment to do it. Book a demo with QuestionPro today and begin your journey of exploration.



Grad Coach

Qualitative Data Analysis Methods 101:

The “big 6” methods + examples.

By: Kerryn Warren (PhD) | Reviewed By: Eunice Rautenbach (D.Tech) | May 2020 (Updated April 2023)

Qualitative data analysis methods. Wow, that’s a mouthful. 

If you’re new to the world of research, qualitative data analysis can look rather intimidating. So much bulky terminology and so many abstract, fluffy concepts. It certainly can be a minefield!

Don’t worry – in this post, we’ll unpack the most popular analysis methods, one at a time, so that you can approach your analysis with confidence and competence – whether that’s for a dissertation, thesis or really any kind of research project.


What (exactly) is qualitative data analysis?

To understand qualitative data analysis, we need to first understand qualitative data – so let’s step back and ask the question, “what exactly is qualitative data?”.

Qualitative data refers to pretty much any data that’s “not numbers”. In other words, it’s not the stuff you measure using a fixed scale or complex equipment, nor do you analyse it using complex statistics or mathematics.

So, if it’s not numbers, what is it?

Words, you guessed? Well… sometimes, yes. Qualitative data can, and often does, take the form of interview transcripts, documents and open-ended survey responses – but it can also involve the interpretation of images and videos. In other words, qualitative isn’t just limited to text-based data.

So, how’s that different from quantitative data, you ask?

Simply put, qualitative research focuses on words, descriptions, concepts or ideas – while quantitative research focuses on numbers and statistics. Qualitative research investigates the “softer side” of things to explore and describe, while quantitative research focuses on the “hard numbers”, to measure differences between variables and the relationships between them. If you’re keen to learn more about the differences between qual and quant, we’ve got a detailed post over here.


So, qualitative analysis is easier than quantitative, right?

Not quite. In many ways, qualitative data can be challenging and time-consuming to analyse and interpret. At the end of your data collection phase (which itself takes a lot of time), you’ll likely have many pages of text-based data or hours upon hours of audio to work through. You might also have subtle nuances of interactions or discussions that have danced around in your mind, or that you scribbled down in messy field notes. All of this needs to work its way into your analysis.

Making sense of all of this is no small task and you shouldn’t underestimate it. Long story short – qualitative analysis can be a lot of work! Of course, quantitative analysis is no piece of cake either, but it’s important to recognise that qualitative analysis still requires a significant investment in terms of time and effort.


In this post, we’ll explore qualitative data analysis by looking at some of the most common analysis methods we encounter. We’re not going to cover every possible qualitative method and we’re not going to go into heavy detail – we’re just going to give you the big picture. That said, we will of course include links to loads of extra resources so that you can learn more about whichever analysis method interests you.

Without further delay, let’s get into it.

The “Big 6” Qualitative Analysis Methods 

There are many different types of qualitative data analysis, all of which serve different purposes and have unique strengths and weaknesses. We’ll start by outlining the analysis methods and then we’ll dive into the details for each.

The 6 most popular methods (or at least the ones we see at Grad Coach) are:

  • Content analysis
  • Narrative analysis
  • Discourse analysis
  • Thematic analysis
  • Grounded theory (GT)
  • Interpretive phenomenological analysis (IPA)

Let’s take a look at each of them…

QDA Method #1: Qualitative Content Analysis

Content analysis is possibly the most common and straightforward QDA method. At the simplest level, content analysis is used to evaluate patterns within a piece of content (for example, words, phrases or images) or across multiple pieces of content or sources of communication. For example, a collection of newspaper articles or political speeches.

With content analysis, you could, for instance, identify the frequency with which an idea is shared or spoken about – like the number of times a Kardashian is mentioned on Twitter. Or you could identify patterns of deeper underlying interpretations – for instance, by identifying phrases or words in tourist pamphlets that highlight India as an ancient country.

Because content analysis can be used in such a wide variety of ways, it’s important to go into your analysis with a very specific question and goal, or you’ll get lost in the fog. With content analysis, you’ll group large amounts of text into codes, summarise these into categories, and possibly even tabulate the data to calculate the frequency of certain concepts or variables. Because of this, content analysis provides a small splash of quantitative thinking within a qualitative method.

Naturally, while content analysis is widely useful, it’s not without its drawbacks. One of the main issues with content analysis is that it can be very time-consuming, as it requires lots of reading and re-reading of the texts. Also, because of its multidimensional focus on both qualitative and quantitative aspects, it is sometimes accused of losing important nuances in communication.

Content analysis also tends to concentrate on a very specific timeline and doesn’t take into account what happened before or after that timeline. This isn’t necessarily a bad thing though – just something to be aware of. So, keep these factors in mind if you’re considering content analysis. Every analysis method has its limitations, so don’t be put off by these – just be aware of them! If you’re interested in learning more about content analysis, the video below provides a good starting point.

QDA Method #2: Narrative Analysis 

As the name suggests, narrative analysis is all about listening to people telling stories and analysing what that means. Since stories serve a functional purpose of helping us make sense of the world, we can gain insights into the ways that people deal with and make sense of reality by analysing their stories and the ways they’re told.

You could, for example, use narrative analysis to explore whether how something is being said is important. For instance, the narrative of a prisoner trying to justify their crime could provide insight into their view of the world and the justice system. Similarly, analysing the ways entrepreneurs talk about the struggles in their careers or cancer patients telling stories of hope could provide powerful insights into their mindsets and perspectives. Simply put, narrative analysis is about paying attention to the stories that people tell – and more importantly, the way they tell them.

Of course, the narrative approach has its weaknesses, too. Sample sizes are generally quite small due to the time-consuming process of capturing narratives. Because of this, along with the multitude of social and lifestyle factors which can influence a subject, narrative analysis can be quite difficult to reproduce in subsequent research. This means that it’s difficult to test the findings of some of this research.

Similarly, researcher bias can have a strong influence on the results here, so you need to be particularly careful about the potential biases you can bring into your analysis when using this method. Nevertheless, narrative analysis is still a very useful qualitative analysis method – just keep these limitations in mind and be careful not to draw broad conclusions. If you’re keen to learn more about narrative analysis, the video below provides a great introduction to this qualitative analysis method.

QDA Method #3: Discourse Analysis 

Discourse is simply a fancy word for written or spoken language or debate. So, discourse analysis is all about analysing language within its social context – in other words, analysing language, such as a conversation or a speech, within the culture and society it takes place in. For example, you could analyse how a janitor speaks to a CEO, or how politicians speak about terrorism.

To truly understand these conversations or speeches, the culture and history of those involved in the communication are important factors to consider. For example, a janitor might speak more casually with a CEO in a company that emphasises equality among workers. Similarly, a politician might speak more about terrorism if there was a recent terrorist incident in the country.

So, as you can see, by using discourse analysis, you can identify how culture, history or power dynamics (to name a few) have an effect on the way concepts are spoken about. So, if your research aims and objectives involve understanding culture or power dynamics, discourse analysis can be a powerful method.

Because there are many social influences in terms of how we speak to each other, the potential use of discourse analysis is vast. Of course, this also means it’s important to have a very specific research question (or questions) in mind when analysing your data and looking for patterns and themes, or you might land up going down a winding rabbit hole.

Discourse analysis can also be very time-consuming, as you need to sample the data to the point of saturation – in other words, until no new information and insights emerge. But this is, of course, part of what makes discourse analysis such a powerful technique. So, keep these factors in mind when considering this QDA method. Again, if you’re keen to learn more, the video below presents a good starting point.
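To make the idea of sampling to saturation a little more concrete, here’s a minimal Python sketch. The transcripts and codes below are entirely invented for illustration: each transcript is represented by the set of codes a researcher assigned to it, and sampling stops once a new transcript contributes no new codes.

```python
# Hypothetical codes assigned to each successive transcript
coded_transcripts = [
    {"us_vs_them", "national_security"},
    {"national_security", "blame_shifting"},
    {"blame_shifting", "us_vs_them"},  # contributes nothing new
]

seen_codes = set()
saturated_at = None
for i, codes in enumerate(coded_transcripts, start=1):
    new_codes = codes - seen_codes  # codes we haven't seen before
    seen_codes |= codes
    if not new_codes:  # no new insights emerged from this transcript
        saturated_at = i
        break

print(f"Saturation reached at transcript {saturated_at}")
```

In practice, researchers usually want several consecutive samples with no new codes before claiming saturation; stopping at the first “empty” transcript, as this sketch does, is a simplification.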

QDA Method #4: Thematic Analysis

Thematic analysis looks at patterns of meaning in a data set – for example, a set of interviews or focus group transcripts. But what exactly does that… mean? Well, a thematic analysis takes bodies of data (which are often quite large) and groups them according to similarities – in other words, themes. These themes help us make sense of the content and derive meaning from it.

Let’s take a look at an example.

With thematic analysis, you could analyse 100 online reviews of a popular sushi restaurant to find out what patrons think about the place. By reviewing the data, you would then identify the themes that crop up repeatedly within the data – for example, “fresh ingredients” or “friendly wait staff”.
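If you wanted to tally how often candidate themes crop up across those reviews, the counting step can be sketched in a few lines of Python. The reviews and theme keywords here are invented for illustration – real thematic analysis relies on careful human coding, not simple keyword matching.

```python
from collections import Counter

# Hypothetical snippets standing in for the 100 sushi restaurant reviews
reviews = [
    "The fish was so fresh and the wait staff were friendly",
    "Friendly service, but the rice was bland",
    "Fresh ingredients every time, lovely staff",
]

# Candidate themes, each with keywords a researcher might code for
themes = {
    "fresh ingredients": ["fresh"],
    "friendly wait staff": ["friendly", "staff", "service"],
}

counts = Counter()
for review in reviews:
    text = review.lower()
    for theme, keywords in themes.items():
        if any(kw in text for kw in keywords):
            counts[theme] += 1  # this review supports the theme

print(counts.most_common())
```

The output ranks themes by how many reviews mention them, which mirrors the “identify what crops up repeatedly” step described above.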

So, as you can see, thematic analysis can be pretty useful for finding out about people’s experiences, views, and opinions. Therefore, if your research aims and objectives involve understanding people’s experiences or views of something, thematic analysis can be a great choice.

Since thematic analysis is a bit of an exploratory process, it’s not unusual for your research questions to develop, or even change, as you progress through the analysis. While this is somewhat natural in exploratory research, it can also be seen as a disadvantage as it means that data needs to be re-reviewed each time a research question is adjusted. In other words, thematic analysis can be quite time-consuming – but for a good reason. So, keep this in mind if you choose to use thematic analysis for your project and budget extra time for unexpected adjustments.

Thematic analysis takes bodies of data and groups them according to similarities (themes), which help us make sense of the content.

QDA Method #5: Grounded Theory (GT)

Grounded theory is a powerful qualitative analysis method where the intention is to create a new theory (or theories) using the data at hand, through a series of “tests” and “revisions”. Strictly speaking, GT is more a research design type than an analysis method, but we’ve included it here as it’s often referred to as a method.

What’s most important with grounded theory is that you go into the analysis with an open mind and let the data speak for itself – rather than dragging existing hypotheses or theories into your analysis. In other words, your analysis must develop from the ground up (hence the name). 

Let’s look at an example of GT in action.

Assume you’re interested in developing a theory about what factors influence graduate students to watch a YouTube video about qualitative analysis. Using grounded theory, you’d start with this general overarching question about the given population (i.e., graduate students). First, you’d approach a small sample – for example, five graduate students in a department at a university. Ideally, this sample would be reasonably representative of the broader population. You’d interview these students to identify what factors lead them to watch the video.

After analysing the interview data, a general pattern could emerge. For example, you might notice that graduate students are more likely to watch a video about qualitative methods if they are just starting on their dissertation journey, or if they have an upcoming test about research methods.

From here, you’ll look for another small sample – for example, five more graduate students in a different department – and see whether this pattern holds true for them. If not, you’ll look for commonalities and adapt your theory accordingly. As this process continues, the theory would develop. As we mentioned earlier, what’s important with grounded theory is that the theory develops from the data – not from some preconceived idea.
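As a rough illustration of this compare-and-adapt loop (not a faithful implementation of grounded theory’s constant comparison, and with invented data), you could think of the working theory as the set of factors that survive comparison with each new sample:

```python
# Hypothetical factors reported by each successive small sample of students
samples = [
    {"starting dissertation", "upcoming test", "curiosity"},
    {"starting dissertation", "upcoming test"},
    {"upcoming test", "starting dissertation", "peer recommendation"},
]

theory = None
for factors in samples:
    if theory is None:
        theory = set(factors)  # initial working theory from the first sample
    else:
        theory &= factors      # keep only factors that hold in this sample too

print(sorted(theory))
```

Real grounded theory revises and enriches categories rather than just intersecting them, but the key point survives the simplification: the theory is whatever the accumulated data supports, not a preconceived hypothesis.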

So, what are the drawbacks of grounded theory? Well, some argue that there’s a tricky circularity to grounded theory. For it to work, in principle, you should know as little as possible regarding the research question and population, so that you reduce the bias in your interpretation. However, in many circumstances, it’s also thought to be unwise to approach a research question without knowledge of the current literature . In other words, it’s a bit of a “chicken or the egg” situation.

Regardless, grounded theory remains a popular (and powerful) option. Naturally, it’s a very useful method when you’re researching a topic that is completely new or has very little existing research about it, as it allows you to start from scratch and work your way from the ground up.

Grounded theory is used to create a new theory (or theories) by using the data at hand, as opposed to existing theories and frameworks.

QDA Method #6: Interpretive Phenomenological Analysis (IPA)

Interpretive. Phenomenological. Analysis. IPA . Try saying that three times fast…

Let’s just stick with IPA, okay?

IPA is designed to help you understand the personal experiences of a subject (for example, a person or group of people) concerning a major life event, an experience or a situation. This event or experience is the “phenomenon” that makes up the “P” in IPA. Such phenomena may range from relatively common events – such as motherhood, or being involved in a car accident – to those which are extremely rare – for example, someone’s personal experience in a refugee camp. So, IPA is a great choice if your research involves analysing people’s personal experiences of something that happened to them.

It’s important to remember that IPA is subject-centred. In other words, it’s focused on the experiencer. This means that, while you’ll likely use a coding system to identify commonalities, it’s important not to lose the depth of experience or meaning by trying to reduce everything to codes. Also, keep in mind that since your sample size will generally be very small with IPA, you often won’t be able to draw broad conclusions about the generalisability of your findings. But that’s okay as long as it aligns with your research aims and objectives.

Another thing to be aware of with IPA is personal bias . While researcher bias can creep into all forms of research, self-awareness is critically important with IPA, as it can have a major impact on the results. For example, a researcher who was a victim of a crime himself could insert his own feelings of frustration and anger into the way he interprets the experience of someone who was kidnapped. So, if you’re going to undertake IPA, you need to be very self-aware or you could muddy the analysis.

IPA can help you understand the personal experiences of a person or group concerning a major life event, an experience or a situation.

How to choose the right analysis method

In light of all of the qualitative analysis methods we’ve covered so far, you’re probably asking yourself the question, “How do I choose the right one?”

Much like all the other methodological decisions you’ll need to make, selecting the right qualitative analysis method largely depends on your research aims, objectives and questions. In other words, the best tool for the job depends on what you’re trying to build. For example:

  • Perhaps your research aims to analyse the use of words and what they reveal about the intention of the storyteller and the cultural context of the time.
  • Perhaps your research aims to develop an understanding of the unique personal experiences of people that have experienced a certain event, or
  • Perhaps your research aims to develop insight regarding the influence of a certain culture on its members.

As you can probably see, each of these research aims is distinctly different, and therefore a different analysis method would be suitable for each one. For example, narrative analysis would likely be a good option for the first aim, while grounded theory wouldn’t be as relevant.

It’s also important to remember that each method has its own set of strengths, weaknesses and general limitations. No single analysis method is perfect. So, depending on the nature of your research, it may make sense to adopt more than one method (this is called triangulation). Keep in mind though that this will of course be quite time-consuming.

As we’ve seen, all of the qualitative analysis methods we’ve discussed make use of coding and theme-generating techniques, but the intent and approach of each analysis method differ quite substantially. So, it’s very important to come into your research with a clear intention before you decide which analysis method (or methods) to use.

Start by reviewing your research aims, objectives and research questions to assess what exactly you’re trying to find out – then select a qualitative analysis method that fits. Never pick a method just because you like it or have experience using it – your analysis method (or methods) must align with your broader research aims and objectives.

No single analysis method is perfect, so it can often make sense to adopt more than one  method (this is called triangulation).

Let’s recap on QDA methods…

In this post, we looked at six popular qualitative data analysis methods:

  • First, we looked at content analysis, a straightforward method that blends a little bit of quant into a primarily qualitative analysis.
  • Then we looked at narrative analysis, which is about analysing how stories are told.
  • Next up was discourse analysis – which is about analysing conversations and interactions.
  • Then we moved on to thematic analysis – which is about identifying themes and patterns.
  • From there, we dug into grounded theory – which is about starting from scratch with a specific question and using the data alone to build a theory in response to that question.
  • And finally, we looked at IPA – which is about understanding people’s unique experiences of a phenomenon.

Of course, these aren’t the only options when it comes to qualitative data analysis, but they’re a great starting point if you’re dipping your toes into qualitative research for the first time.

If you’re still feeling a bit confused, consider our private coaching service , where we hold your hand through the research process to help you develop your best work.





Research Method


Qualitative Research – Methods, Analysis Types and Guide

Qualitative Research

Qualitative research is a type of research methodology that focuses on exploring and understanding people’s beliefs, attitudes, behaviors, and experiences through the collection and analysis of non-numerical data. It seeks to answer research questions through the examination of subjective data, such as interviews, focus groups, observations, and textual analysis.

Qualitative research aims to uncover the meaning and significance of social phenomena, and it typically involves a more flexible and iterative approach to data collection and analysis compared to quantitative research. Qualitative research is often used in fields such as sociology, anthropology, psychology, and education.

Qualitative Research Methods


Qualitative Research Methods are as follows:

One-to-One Interview

This method involves conducting an interview with a single participant to gain a detailed understanding of their experiences, attitudes, and beliefs. One-to-one interviews can be conducted in-person, over the phone, or through video conferencing. The interviewer typically uses open-ended questions to encourage the participant to share their thoughts and feelings. One-to-one interviews are useful for gaining detailed insights into individual experiences.

Focus Groups

This method involves bringing together a group of people to discuss a specific topic in a structured setting. The focus group is led by a moderator who guides the discussion and encourages participants to share their thoughts and opinions. Focus groups are useful for generating ideas and insights, exploring social norms and attitudes, and understanding group dynamics.

Ethnographic Studies

This method involves immersing oneself in a culture or community to gain a deep understanding of its norms, beliefs, and practices. Ethnographic studies typically involve long-term fieldwork and observation, as well as interviews and document analysis. Ethnographic studies are useful for understanding the cultural context of social phenomena and for gaining a holistic understanding of complex social processes.

Text Analysis

This method involves analyzing written or spoken language to identify patterns and themes. Text analysis can be quantitative or qualitative. Qualitative text analysis involves close reading and interpretation of texts to identify recurring themes, concepts, and patterns. Text analysis is useful for understanding media messages, public discourse, and cultural trends.

Case Studies

This method involves an in-depth examination of a single person, group, or event to gain an understanding of complex phenomena. Case studies typically involve a combination of data collection methods, such as interviews, observations, and document analysis, to provide a comprehensive understanding of the case. Case studies are useful for exploring unique or rare cases, and for generating hypotheses for further research.

Process of Observation

This method involves systematically observing and recording behaviors and interactions in natural settings. The observer may take notes, use audio or video recordings, or use other methods to document what they see. Process of observation is useful for understanding social interactions, cultural practices, and the context in which behaviors occur.

Record Keeping

This method involves keeping detailed records of observations, interviews, and other data collected during the research process. Record keeping is essential for ensuring the accuracy and reliability of the data, and for providing a basis for analysis and interpretation.

Surveys

This method involves collecting data from a large sample of participants through a structured questionnaire. Surveys can be conducted in person, over the phone, through mail, or online. Surveys are useful for collecting data on attitudes, beliefs, and behaviors, and for identifying patterns and trends in a population.

Qualitative data analysis is a process of turning unstructured data into meaningful insights. It involves extracting and organizing information from sources like interviews, focus groups, and surveys. The goal is to understand people’s attitudes, behaviors, and motivations.

Qualitative Research Analysis Methods

Qualitative Research analysis methods involve a systematic approach to interpreting and making sense of the data collected in qualitative research. Here are some common qualitative data analysis methods:

Thematic Analysis

This method involves identifying patterns or themes in the data that are relevant to the research question. The researcher reviews the data, identifies keywords or phrases, and groups them into categories or themes. Thematic analysis is useful for identifying patterns across multiple data sources and for generating new insights into the research topic.

Content Analysis

This method involves analyzing the content of written or spoken language to identify key themes or concepts. Content analysis can be quantitative or qualitative. Qualitative content analysis involves close reading and interpretation of texts to identify recurring themes, concepts, and patterns. Content analysis is useful for identifying patterns in media messages, public discourse, and cultural trends.
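The quantitative side of content analysis often starts with simple frequency counts. Here is a minimal sketch over an invented snippet of transcript text; a real study would use a proper codebook and stop-word list rather than the crude length filter shown here.

```python
import re
from collections import Counter

# Hypothetical snippet of transcript text
text = "Safety matters. We talk about safety training, and safety culture, every day."

# Tokenise and drop very short tokens (a crude stand-in for stop-word removal)
tokens = re.findall(r"[a-z]+", text.lower())
freq = Counter(t for t in tokens if len(t) > 3)

print(freq.most_common(3))
```

The dominant term (“safety”, in this toy example) would then be interpreted qualitatively: how is it framed, by whom, and in what context?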

Discourse Analysis

This method involves analyzing language to understand how it constructs meaning and shapes social interactions. Discourse analysis can involve a variety of methods, such as conversation analysis, critical discourse analysis, and narrative analysis. Discourse analysis is useful for understanding how language shapes social interactions, cultural norms, and power relationships.

Grounded Theory Analysis

This method involves developing a theory or explanation based on the data collected. Grounded theory analysis starts with the data and uses an iterative process of coding and analysis to identify patterns and themes in the data. The theory or explanation that emerges is grounded in the data, rather than preconceived hypotheses. Grounded theory analysis is useful for understanding complex social phenomena and for generating new theoretical insights.

Narrative Analysis

This method involves analyzing the stories or narratives that participants share to gain insights into their experiences, attitudes, and beliefs. Narrative analysis can involve a variety of methods, such as structural analysis, thematic analysis, and discourse analysis. Narrative analysis is useful for understanding how individuals construct their identities, make sense of their experiences, and communicate their values and beliefs.

Phenomenological Analysis

This method involves analyzing how individuals make sense of their experiences and the meanings they attach to them. Phenomenological analysis typically involves in-depth interviews with participants to explore their experiences in detail. Phenomenological analysis is useful for understanding subjective experiences and for developing a rich understanding of human consciousness.

Comparative Analysis

This method involves comparing and contrasting data across different cases or groups to identify similarities and differences. Comparative analysis can be used to identify patterns or themes that are common across multiple cases, as well as to identify unique or distinctive features of individual cases. Comparative analysis is useful for understanding how social phenomena vary across different contexts and groups.
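As a toy sketch of the comparison step (the groups and theme tallies below are invented), you can ask which themes two groups share and which are distinctive to one of them:

```python
from collections import Counter

# Hypothetical theme tallies from interviews with two groups
group_a = Counter({"access to care": 5, "cost": 2})
group_b = Counter({"access to care": 1, "cost": 6, "stigma": 3})

shared = group_a.keys() & group_b.keys()          # themes common to both groups
distinct_to_b = group_b.keys() - group_a.keys()   # themes unique to group B

print(sorted(shared), sorted(distinct_to_b))
```

The interpretive work – explaining why “stigma” appears only in group B, say – is where the actual comparative analysis happens; the set arithmetic merely organises the evidence.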

Applications of Qualitative Research

Qualitative research has many applications across different fields and industries. Here are some examples of how qualitative research is used:

  • Market Research: Qualitative research is often used in market research to understand consumer attitudes, behaviors, and preferences. Researchers conduct focus groups and one-on-one interviews with consumers to gather insights into their experiences and perceptions of products and services.
  • Health Care: Qualitative research is used in health care to explore patient experiences and perspectives on health and illness. Researchers conduct in-depth interviews with patients and their families to gather information on their experiences with different health care providers and treatments.
  • Education: Qualitative research is used in education to understand student experiences and to develop effective teaching strategies. Researchers conduct classroom observations and interviews with students and teachers to gather insights into classroom dynamics and instructional practices.
  • Social Work: Qualitative research is used in social work to explore social problems and to develop interventions to address them. Researchers conduct in-depth interviews with individuals and families to understand their experiences with poverty, discrimination, and other social problems.
  • Anthropology: Qualitative research is used in anthropology to understand different cultures and societies. Researchers conduct ethnographic studies and observe and interview members of different cultural groups to gain insights into their beliefs, practices, and social structures.
  • Psychology: Qualitative research is used in psychology to understand human behavior and mental processes. Researchers conduct in-depth interviews with individuals to explore their thoughts, feelings, and experiences.
  • Public Policy: Qualitative research is used in public policy to explore public attitudes and to inform policy decisions. Researchers conduct focus groups and one-on-one interviews with members of the public to gather insights into their perspectives on different policy issues.

How to Conduct Qualitative Research

Here are some general steps for conducting qualitative research:

  • Identify your research question: Qualitative research starts with a research question or set of questions that you want to explore. This question should be focused and specific, but also broad enough to allow for exploration and discovery.
  • Select your research design: There are different types of qualitative research designs, including ethnography, case study, grounded theory, and phenomenology. You should select a design that aligns with your research question and that will allow you to gather the data you need to answer your research question.
  • Recruit participants: Once you have your research question and design, you need to recruit participants. The number of participants you need will depend on your research design and the scope of your research. You can recruit participants through advertisements, social media, or through personal networks.
  • Collect data: There are different methods for collecting qualitative data, including interviews, focus groups, observation, and document analysis. You should select the method or methods that align with your research design and that will allow you to gather the data you need to answer your research question.
  • Analyze data: Once you have collected your data, you need to analyze it. This involves reviewing your data, identifying patterns and themes, and developing codes to organize your data. You can use different software programs to help you analyze your data, or you can do it manually.
  • Interpret data: Once you have analyzed your data, you need to interpret it. This involves making sense of the patterns and themes you have identified, and developing insights and conclusions that answer your research question. You should be guided by your research question and use your data to support your conclusions.
  • Communicate results: Once you have interpreted your data, you need to communicate your results. This can be done through academic papers, presentations, or reports. You should be clear and concise in your communication, and use examples and quotes from your data to support your findings.
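The analysis step above (developing codes to organize your data) can be sketched in code. The excerpts and codebook here are hypothetical, and in practice codes are developed iteratively from the data rather than fixed up front; this sketch only shows the mechanical pass of attaching excerpts to codes so quotes can be retrieved later.

```python
# Hypothetical interview excerpts and a simple keyword-based codebook
excerpts = [
    "I felt supported by my supervisor throughout",
    "The workload left me exhausted most weeks",
    "My supervisor's feedback kept me motivated",
]

codebook = {
    "support": ["supported", "supervisor", "feedback"],
    "burnout": ["exhausted", "workload"],
}

coded = {code: [] for code in codebook}
for excerpt in excerpts:
    text = excerpt.lower()
    for code, keywords in codebook.items():
        if any(kw in text for kw in keywords):
            coded[code].append(excerpt)  # keep the quote for reporting

print({code: len(items) for code, items in coded.items()})
```

Keeping the full excerpt under each code, rather than just a count, is what lets you later support your findings with direct quotes, as the communication step recommends.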

Examples of Qualitative Research

Here are some real-time examples of qualitative research:

  • Customer Feedback: A company may conduct qualitative research to understand the feedback and experiences of its customers. This may involve conducting focus groups or one-on-one interviews with customers to gather insights into their attitudes, behaviors, and preferences.
  • Healthcare: A healthcare provider may conduct qualitative research to explore patient experiences and perspectives on health and illness. This may involve conducting in-depth interviews with patients and their families to gather information on their experiences with different health care providers and treatments.
  • Education: An educational institution may conduct qualitative research to understand student experiences and to develop effective teaching strategies. This may involve conducting classroom observations and interviews with students and teachers to gather insights into classroom dynamics and instructional practices.
  • Social Work: A social worker may conduct qualitative research to explore social problems and to develop interventions to address them. This may involve conducting in-depth interviews with individuals and families to understand their experiences with poverty, discrimination, and other social problems.
  • Anthropology: An anthropologist may conduct qualitative research to understand different cultures and societies. This may involve conducting ethnographic studies and observing and interviewing members of different cultural groups to gain insights into their beliefs, practices, and social structures.
  • Psychology: A psychologist may conduct qualitative research to understand human behavior and mental processes. This may involve conducting in-depth interviews with individuals to explore their thoughts, feelings, and experiences.
  • Public Policy: A government agency or non-profit organization may conduct qualitative research to explore public attitudes and to inform policy decisions. This may involve conducting focus groups and one-on-one interviews with members of the public to gather insights into their perspectives on different policy issues.

Purpose of Qualitative Research

The purpose of qualitative research is to explore and understand the subjective experiences, behaviors, and perspectives of individuals or groups in a particular context. Unlike quantitative research, which focuses on numerical data and statistical analysis, qualitative research aims to provide in-depth, descriptive information that can help researchers develop insights and theories about complex social phenomena.

Qualitative research can serve multiple purposes, including:

  • Exploring new or emerging phenomena: Qualitative research can be useful for exploring new or emerging phenomena, such as new technologies or social trends. This type of research can help researchers develop a deeper understanding of these phenomena and identify potential areas for further study.
  • Understanding complex social phenomena: Qualitative research can be useful for exploring complex social phenomena, such as cultural beliefs, social norms, or political processes. This type of research can help researchers develop a more nuanced understanding of these phenomena and identify factors that may influence them.
  • Generating new theories or hypotheses: Qualitative research can be useful for generating new theories or hypotheses about social phenomena. By gathering rich, detailed data about individuals’ experiences and perspectives, researchers can develop insights that may challenge existing theories or lead to new lines of inquiry.
  • Providing context for quantitative data: Qualitative research can be useful for providing context for quantitative data. By gathering qualitative data alongside quantitative data, researchers can develop a more complete understanding of complex social phenomena and identify potential explanations for quantitative findings.

When to use Qualitative Research

Here are some situations where qualitative research may be appropriate:

  • Exploring a new area: If little is known about a particular topic, qualitative research can help to identify key issues, generate hypotheses, and develop new theories.
  • Understanding complex phenomena: Qualitative research can be used to investigate complex social, cultural, or organizational phenomena that are difficult to measure quantitatively.
  • Investigating subjective experiences: Qualitative research is particularly useful for investigating the subjective experiences of individuals or groups, such as their attitudes, beliefs, values, or emotions.
  • Conducting formative research: Qualitative research can be used in the early stages of a research project to develop research questions, identify potential research participants, and refine research methods.
  • Evaluating interventions or programs: Qualitative research can be used to evaluate the effectiveness of interventions or programs by collecting data on participants’ experiences, attitudes, and behaviors.

Characteristics of Qualitative Research

Qualitative research is characterized by several key features, including:

  • Focus on subjective experience: Qualitative research is concerned with understanding the subjective experiences, beliefs, and perspectives of individuals or groups in a particular context. Researchers aim to explore the meanings that people attach to their experiences and to understand the social and cultural factors that shape these meanings.
  • Use of open-ended questions: Qualitative research relies on open-ended questions that allow participants to provide detailed, in-depth responses. Researchers seek to elicit rich, descriptive data that can provide insights into participants’ experiences and perspectives.
  • Sampling-based on purpose and diversity: Qualitative research often involves purposive sampling, in which participants are selected based on specific criteria related to the research question. Researchers may also seek to include participants with diverse experiences and perspectives to capture a range of viewpoints.
  • Data collection through multiple methods: Qualitative research typically involves the use of multiple data collection methods, such as in-depth interviews, focus groups, and observation. This allows researchers to gather rich, detailed data from multiple sources, which can provide a more complete picture of participants’ experiences and perspectives.
  • Inductive data analysis: Qualitative research relies on inductive data analysis, in which researchers develop theories and insights based on the data rather than testing pre-existing hypotheses. Researchers use coding and thematic analysis to identify patterns and themes in the data and to develop theories and explanations based on these patterns.
  • Emphasis on researcher reflexivity: Qualitative research recognizes the importance of the researcher’s role in shaping the research process and outcomes. Researchers are encouraged to reflect on their own biases and assumptions and to be transparent about their role in the research process.

Advantages of Qualitative Research

Qualitative research offers several advantages over other research methods, including:

  • Depth and detail: Qualitative research allows researchers to gather rich, detailed data that provides a deeper understanding of complex social phenomena. Through in-depth interviews, focus groups, and observation, researchers can gather detailed information about participants’ experiences and perspectives that may be missed by other research methods.
  • Flexibility: Qualitative research is a flexible approach that allows researchers to adapt their methods to the research question and context. Researchers can adjust their research methods in real time to gather more information or explore unexpected findings.
  • Contextual understanding: Qualitative research is well-suited to exploring the social and cultural context in which individuals or groups are situated. Researchers can gather information about cultural norms, social structures, and historical events that may influence participants' experiences and perspectives.
  • Participant perspective: Qualitative research prioritizes the perspective of participants, allowing researchers to explore subjective experiences and understand the meanings that participants attach to their experiences.
  • Theory development: Qualitative research can contribute to the development of new theories and insights about complex social phenomena. By gathering rich, detailed data and using inductive data analysis, researchers can develop new theories and explanations that may challenge existing understandings.
  • Validity: Qualitative research can offer high validity by using multiple data collection methods, purposive and diverse sampling, and researcher reflexivity. This can help ensure that findings are credible and trustworthy.

Limitations of Qualitative Research

Qualitative research also has some limitations, including:

  • Subjectivity: Qualitative research relies on the subjective interpretation of researchers, which can introduce bias into the research process. The researcher's perspective, beliefs, and experiences can influence the way data is collected, analyzed, and interpreted.
  • Limited generalizability: Qualitative research typically involves small, purposive samples that may not be representative of larger populations. This limits the generalizability of findings to other contexts or populations.
  • Time-consuming: Qualitative research can be a time-consuming process, requiring significant resources for data collection, analysis, and interpretation.
  • Resource-intensive: Qualitative research may require more resources than other research methods, including specialized training for researchers, specialized software for data analysis, and transcription services.
  • Limited reliability: Qualitative research may be less reliable than quantitative research, as it relies on the subjective interpretation of researchers. This can make it difficult to replicate findings or compare results across different studies.
  • Ethics and confidentiality: Qualitative research involves collecting sensitive information from participants, which raises ethical concerns about confidentiality and informed consent. Researchers must take care to protect the privacy and confidentiality of participants and obtain informed consent.


About the author


Muhammad Hassan

Researcher, Academic Writer, Web developer



What Is Qualitative Research? | Methods & Examples

Published on 4 April 2022 by Pritha Bhandari. Revised on 30 January 2023.

Qualitative research involves collecting and analysing non-numerical data (e.g., text, video, or audio) to understand concepts, opinions, or experiences. It can be used to gather in-depth insights into a problem or generate new ideas for research.

Qualitative research is the opposite of quantitative research , which involves collecting and analysing numerical data for statistical analysis.

Qualitative research is commonly used in the humanities and social sciences, in subjects such as anthropology, sociology, education, health sciences, and history. Some example qualitative research questions:

  • How does social media shape body image in teenagers?
  • How do children and adults interpret healthy eating in the UK?
  • What factors influence employee retention in a large organisation?
  • How is anxiety experienced around the world?
  • How can teachers integrate social issues into science curriculums?

Table of contents

  • Approaches to qualitative research
  • Qualitative research methods
  • Qualitative data analysis
  • Advantages of qualitative research
  • Disadvantages of qualitative research
  • Frequently asked questions about qualitative research

Qualitative research is used to understand how people experience the world. While there are many approaches to qualitative research, they tend to be flexible and focus on retaining rich meaning when interpreting data.

Common approaches include grounded theory, ethnography, action research, phenomenological research, and narrative research. They share some similarities, but emphasise different aims and perspectives.


Each of the research approaches involve using one or more data collection methods . These are some of the most common qualitative methods:

  • Observations: recording what you have seen, heard, or encountered in detailed field notes.
  • Interviews:  personally asking people questions in one-on-one conversations.
  • Focus groups: asking questions and generating discussion among a group of people.
  • Surveys: distributing questionnaires with open-ended questions.
  • Secondary research: collecting existing data in the form of texts, images, audio or video recordings, etc.
For example, to research a company's culture you might combine several of these methods:

  • You take field notes with observations and reflect on your own experiences of the company culture.
  • You distribute open-ended surveys to employees across all the company’s offices by email to find out if the culture varies across locations.
  • You conduct in-depth interviews with employees in your office to learn about their experiences and perspectives in greater detail.

Qualitative researchers often consider themselves ‘instruments’ in research because all observations, interpretations and analyses are filtered through their own personal lens.

For this reason, when writing up your methodology for qualitative research, it’s important to reflect on your approach and to thoroughly explain the choices you made in collecting and analysing the data.

Qualitative data can take the form of texts, photos, videos and audio. For example, you might be working with interview transcripts, survey responses, fieldnotes, or recordings from natural settings.

Most types of qualitative data analysis share the same five steps:

  • Prepare and organise your data. This may mean transcribing interviews or typing up fieldnotes.
  • Review and explore your data. Examine the data for patterns or repeated ideas that emerge.
  • Develop a data coding system. Based on your initial ideas, establish a set of codes that you can apply to categorise your data.
  • Assign codes to the data. For example, in qualitative survey analysis, this may mean going through each participant’s responses and tagging them with codes in a spreadsheet. As you go through your data, you can create new codes to add to your system if necessary.
  • Identify recurring themes. Link codes together into cohesive, overarching themes.
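Steps 3–5 above can be sketched in code. In this hedged example the coding system is a simple keyword lookup, which is far cruder than real qualitative coding but shows the mechanics; all codes, keywords, and responses are invented:

```python
# Step 3: develop a coding system (code -> trigger keywords)
code_system = {
    "bug": ["crash", "froze", "error"],
    "performance": ["slow", "lag"],
    "support_quality": ["support", "helpful"],
}

# Step 4: assign codes to each survey response
def assign_codes(text):
    lowered = text.lower()
    return [code for code, kws in code_system.items() if any(k in lowered for k in kws)]

responses = [
    "The app crashes whenever I upload a photo.",
    "Support replied within minutes, really helpful.",
    "Uploading is slow and the app froze twice.",
]
coded = {r: assign_codes(r) for r in responses}

# Step 5: link codes together into cohesive, overarching themes
themes = {"reliability": ["bug", "performance"], "service": ["support_quality"]}
```

In practice you would refine the code system as you read the data, adding new codes when responses don't fit the existing ones.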

There are several specific approaches to analysing qualitative data. Although these methods share similar processes, they emphasise different concepts.

Qualitative research often tries to preserve the voice and perspective of participants and can be adjusted as new research questions arise. Qualitative research is good for:

  • Flexibility

The data collection and analysis process can be adapted as new ideas or patterns emerge. They are not rigidly decided beforehand.

  • Natural settings

Data collection occurs in real-world contexts or in naturalistic ways.

  • Meaningful insights

Detailed descriptions of people’s experiences, feelings and perceptions can be used in designing, testing or improving systems or products.

  • Generation of new ideas

Open-ended responses mean that researchers can uncover novel problems or opportunities that they wouldn’t have thought of otherwise.

Researchers must consider practical and theoretical limitations in analysing and interpreting their data. Qualitative research suffers from:

  • Unreliability

The real-world setting often makes qualitative research unreliable because of uncontrolled factors that affect the data.

  • Subjectivity

Due to the researcher’s primary role in analysing and interpreting data, qualitative research cannot be replicated. The researcher decides what is important and what is irrelevant in data analysis, so interpretations of the same data can vary greatly.

  • Limited generalisability

Small samples are often used to gather detailed data about specific contexts. Despite rigorous analysis procedures, it is difficult to draw generalisable conclusions because the data may be biased and unrepresentative of the wider population.

  • Labour-intensive

Although software can be used to manage and record large amounts of text, data analysis often has to be checked or performed manually.

Quantitative research deals with numbers and statistics, while qualitative research deals with words and meanings.

Quantitative methods allow you to test a hypothesis by systematically collecting and analysing data, while qualitative methods allow you to explore ideas and experiences in depth.

There are five common approaches to qualitative research:

  • Grounded theory involves collecting data in order to develop new theories.
  • Ethnography involves immersing yourself in a group or organisation to understand its culture.
  • Narrative research involves interpreting stories to understand how people make sense of their experiences and perceptions.
  • Phenomenological research involves investigating phenomena through people’s lived experiences.
  • Action research links theory and practice in several cycles to drive innovative changes.

Data collection is the systematic process by which observations or measurements are gathered in research. It is used in many different contexts by academics, governments, businesses, and other organisations.

There are various approaches to qualitative data analysis, but they all share five steps in common:

  • Prepare and organise your data.
  • Review and explore your data.
  • Develop a data coding system.
  • Assign codes to the data.
  • Identify recurring themes.

The specifics of each step depend on the focus of the analysis. Some common approaches include textual analysis, thematic analysis, and discourse analysis.

Cite this Scribbr article

If you want to cite this source, you can copy and paste the citation or click the ‘Cite this Scribbr article’ button to automatically add the citation to our free Reference Generator.

Bhandari, P. (2023, January 30). What Is Qualitative Research? | Methods & Examples. Scribbr. Retrieved 21 May 2024, from https://www.scribbr.co.uk/research-methods/introduction-to-qualitative-research/


Pritha Bhandari


Qualitative Data Analysis: Step-by-Step Guide (Manual vs. Automatic)

When we conduct qualitative research, need to explain changes in metrics, or want to understand people's opinions, we turn to qualitative data. Qualitative data is typically generated through:

  • Interview transcripts
  • Surveys with open-ended questions
  • Contact center transcripts
  • Texts and documents
  • Audio and video recordings
  • Observational notes

Compared to quantitative data, which captures structured information, qualitative data is unstructured and has more depth. It can answer our questions, help us formulate hypotheses, and build understanding.

It's important to understand the differences between quantitative and qualitative data. But unfortunately, analyzing qualitative data is difficult. While tools like Excel, Tableau, and Power BI crunch and visualize quantitative data with ease, only a limited number of mainstream tools exist for analyzing qualitative data. The majority of qualitative data analysis still happens manually.

That said, there are two new trends that are changing this. First, there are advances in natural language processing (NLP) which is focused on understanding human language. Second, there is an explosion of user-friendly software designed for both researchers and businesses. Both help automate the qualitative data analysis process.

In this post we want to teach you how to conduct a successful qualitative data analysis. There are two primary approaches: manual and automatic. We'll guide you through the steps of a manual analysis, and show where software solutions powered by NLP can automate the process.

More businesses are switching to fully-automated analysis of qualitative customer data because it is cheaper, faster, and just as accurate. Primarily, businesses purchase subscriptions to feedback analytics platforms so that they can understand customer pain points and sentiment.


We’ll take you through 5 steps to conduct a successful qualitative data analysis. Within each step we will highlight the key difference between the manual, and automated approach of qualitative researchers. Here's an overview of the steps:

The 5 steps to doing qualitative data analysis

  • Gathering and collecting your qualitative data
  • Organizing and connecting your qualitative data
  • Coding your qualitative data
  • Analyzing the qualitative data for insights
  • Reporting on the insights derived from your analysis

What is Qualitative Data Analysis?

Qualitative data analysis is a process of gathering, structuring and interpreting qualitative data to understand what it represents.

Qualitative data is non-numerical and unstructured. Qualitative data generally refers to text, such as open-ended responses to survey questions or user interviews, but also includes audio, photos and video.

Businesses often perform qualitative data analysis on customer feedback. And within this context, qualitative data generally refers to verbatim text data collected from sources such as reviews, complaints, chat messages, support centre interactions, customer interviews, case notes or social media comments.

How is qualitative data analysis different from quantitative data analysis?

Understanding the differences between quantitative & qualitative data is important. When it comes to analyzing data, Qualitative Data Analysis serves a very different role to Quantitative Data Analysis. But what sets them apart?

Qualitative Data Analysis dives into the stories hidden in non-numerical data such as interviews, open-ended survey answers, or notes from observations. It uncovers the ‘whys’ and ‘hows’ giving a deep understanding of people’s experiences and emotions.

Quantitative Data Analysis on the other hand deals with numerical data, using statistics to measure differences, identify preferred options, and pinpoint root causes of issues.  It steps back to address questions like "how many" or "what percentage" to offer broad insights we can apply to larger groups.

In short, Qualitative Data Analysis is like a microscope, helping us understand specific detail. Quantitative Data Analysis is like a telescope, giving us a broader perspective. Both are important, working together to decode data for different objectives.

Qualitative Data Analysis methods

Once all the data has been captured, there are a variety of analysis techniques available and the choice is determined by your specific research objectives and the kind of data you’ve gathered.  Common qualitative data analysis methods include:

Content Analysis

This is a popular approach to qualitative data analysis. Other qualitative analysis techniques, including thematic analysis, can fit within its broad scope. Content analysis is used to identify the patterns that emerge from text by grouping content into words, concepts, and themes. It is also useful for quantifying the relationships between the grouped content. The Columbia School of Public Health has a detailed breakdown of content analysis.

Narrative Analysis

Narrative analysis focuses on the stories people tell and the language they use to make sense of them.  It is particularly useful in qualitative research methods where customer stories are used to get a deep understanding of customers’ perspectives on a specific issue. A narrative analysis might enable us to summarize the outcomes of a focused case study.

Discourse Analysis

Discourse analysis is used to get a thorough understanding of the political, cultural and power dynamics that exist in specific situations.  The focus of discourse analysis here is on the way people express themselves in different social contexts. Discourse analysis is commonly used by brand strategists who hope to understand why a group of people feel the way they do about a brand or product.

Thematic Analysis

Thematic analysis is used to deduce the meaning behind the words people use. This is accomplished by discovering repeating themes in text. These meaningful themes reveal key insights into data and can be quantified, particularly when paired with sentiment analysis . Often, the outcome of thematic analysis is a code frame that captures themes in terms of codes, also called categories. So the process of thematic analysis is also referred to as “coding”. A common use-case for thematic analysis in companies is analysis of customer feedback.
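Paired with a sentiment lexicon, theme counts become quantifiable. A toy sketch of this pairing follows; both the theme keywords and the sentiment word lists are invented, and a real project would use a proper sentiment model rather than a handful of words:

```python
feedback = [
    "Checkout was fast and easy.",
    "Checkout kept failing, very frustrating.",
    "Love the new dashboard design.",
]

# Toy code frame and sentiment lexicons (illustrative only)
themes = {"checkout": ["checkout"], "design": ["dashboard", "design"]}
positive = {"fast", "easy", "love"}
negative = {"failing", "frustrating", "broken"}

def sentiment(text):
    # Crude score: positive word count minus negative word count
    words = {w.strip(".,!").lower() for w in text.split()}
    return len(words & positive) - len(words & negative)

# Attach a sentiment score to every piece of feedback that mentions a theme
theme_scores = {theme: [] for theme in themes}
for text in feedback:
    lowered = text.lower()
    for theme, keywords in themes.items():
        if any(k in lowered for k in keywords):
            theme_scores[theme].append(sentiment(text))
```

The resulting per-theme score lists can then be averaged or charted, which is exactly the kind of quantification the paragraph above describes.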

Grounded Theory

Grounded theory is a useful approach when little is known about a subject. It starts by formulating a theory around a single data case, which means the theory is “grounded” in actual data rather than being entirely speculative. Additional cases can then be examined to see whether they are relevant and can add to the original grounded theory.


Challenges of Qualitative Data Analysis

While Qualitative Data Analysis offers rich insights, it comes with its challenges. Each unique QDA method has its unique hurdles. Let’s take a look at the challenges researchers and analysts might face, depending on the chosen method.

  • Time and Effort (Narrative Analysis): Narrative analysis, which focuses on personal stories, demands patience. Sifting through lengthy narratives to find meaningful insights is time-consuming and requires dedicated effort.
  • Being Objective (Grounded Theory): Grounded theory, building theories from data, faces the challenges of personal biases. Staying objective while interpreting data is crucial, ensuring conclusions are rooted in the data itself.
  • Complexity (Thematic Analysis): Thematic analysis involves identifying themes within data, a process that can be intricate. Categorizing and understanding themes can be complex, especially when each piece of data varies in context and structure. Thematic Analysis software can simplify this process.
  • Generalizing Findings (Narrative Analysis): Narrative analysis, dealing with individual stories, makes drawing broad conclusions challenging. Extending findings from a single narrative to a broader context requires careful consideration.
  • Managing Data (Thematic Analysis): Thematic analysis involves organizing and managing vast amounts of unstructured data, like interview transcripts. Managing this can be a hefty task, requiring effective data management strategies.
  • Skill Level (Grounded Theory): Grounded theory demands specific skills to build theories from the ground up. Finding or training analysts with these skills poses a challenge, requiring investment in building expertise.

Benefits of qualitative data analysis

Qualitative Data Analysis (QDA) is like a versatile toolkit, offering a tailored approach to understanding your data. The benefits it offers are as diverse as the methods. Let’s explore why choosing the right method matters.

  • Tailored Methods for Specific Needs: QDA isn't one-size-fits-all. Depending on your research objectives and the type of data at hand, different methods offer unique benefits. If you want emotive customer stories, narrative analysis paints a strong picture. When you want to explain a score, thematic analysis reveals insightful patterns.
  • Flexibility with Thematic Analysis: Thematic analysis is like a chameleon in the QDA toolkit. It adapts well to different types of data and research objectives, making it a top choice for almost any qualitative analysis.
  • Deeper Understanding, Better Products: QDA helps you dive into people's thoughts and feelings. This deep understanding helps you build products and services that truly match what people want, ensuring satisfied customers.
  • Finding the Unexpected: Qualitative data often reveals surprises that we miss in quantitative data. QDA offers new ideas and perspectives, surfacing insights we might otherwise miss.
  • Building Effective Strategies: Insights from QDA are like strategic guides. They help businesses craft plans that match people's desires.
  • Creating Genuine Connections: Understanding people's experiences lets businesses connect on a real level. This genuine connection helps build trust and loyalty, which is priceless for any business.

How to do Qualitative Data Analysis: 5 steps

Now we are going to show how you can do your own qualitative data analysis. We will guide you through this process step by step. As mentioned earlier, you will learn how to do qualitative data analysis manually , and also automatically using modern qualitative data and thematic analysis software.

To get the best value from the analysis and research process, it's important to be super clear about the nature and scope of the question being researched. This will help you select the data collection channels that are most likely to help you answer your question.

Depending on if you are a business looking to understand customer sentiment, or an academic surveying a school, your approach to qualitative data analysis will be unique.

Once you’re clear, there’s a sequence to follow. And, though there are differences in the manual and automatic approaches, the process steps are mostly the same.

The use case for our step-by-step guide is a company looking to collect customer feedback data and analyze it in order to improve customer experience. By analyzing the feedback, the company derives insights about its business and its customers. You can follow these same steps regardless of the nature of your research. Let's get started.

Step 1: Gather your qualitative data and conduct research

The first step of qualitative research is to do data collection. Put simply, data collection is gathering all of your data for analysis. A common situation is when qualitative data is spread across various sources.

Classic methods of gathering qualitative data

Most companies use traditional methods for gathering qualitative data: conducting interviews with research participants, running surveys, and running focus groups. This data is typically stored in documents, CRMs, databases and knowledge bases. It’s important to examine which data is available and needs to be included in your research project, based on its scope.

Using your existing qualitative feedback

As it becomes easier for customers to engage across a range of different channels, companies are gathering increasingly large amounts of both solicited and unsolicited qualitative feedback.

Most organizations have now invested in Voice of Customer programs , support ticketing systems, chatbot and support conversations, emails and even customer Slack chats.

These new channels provide companies with new ways of getting feedback, and also allow the collection of unstructured feedback data at scale.

The great thing about this data is that it contains a wealth of valuable insights and that it's already there! When you have a new question about user behavior or your customers, you don't need to create a new research study or set up a focus group. You can find most answers in the data you already have.

Typically, this data is stored in third-party solutions or a central database, but there are ways to export it or connect to a feedback analysis solution through integrations or an API.

Utilize untapped qualitative data channels

There are many online qualitative data sources you may not have considered. For example, you can find useful qualitative data in social media channels like Twitter or Facebook. Online forums, review sites, and online communities such as Discourse or Reddit also contain valuable data about your customers, or research questions.

If you are considering performing a qualitative benchmark analysis against competitors - the internet is your best friend. Gathering feedback in competitor reviews on sites like Trustpilot, G2, Capterra, Better Business Bureau or on app stores is a great way to perform a competitor benchmark analysis.

Customer feedback analysis software often has integrations into social media and review sites, or you could use a solution like DataMiner to scrape the reviews.

G2.com reviews of the product Airtable. You could pull reviews from G2 for your analysis.

Step 2: Connect & organize all your qualitative data

Now you have all this qualitative data, but there’s a problem: the data is unstructured. Before feedback can be analyzed and assigned any value, it needs to be organized in a single place. Why is this important? Consistency!

If all data is easily accessible in one place and analyzed in a consistent manner, you will have an easier time summarizing and making decisions based on this data.

The manual approach to organizing your data

The classic method of structuring qualitative data is to plot all the raw data you’ve gathered into a spreadsheet.

Typically, research and support teams would share large Excel sheets and different business units would make sense of the qualitative feedback data on their own. Each team collects and organizes the data in a way that best suits them, which means the feedback tends to be kept in separate silos.

An alternative and more robust solution is to store feedback in a central database, like Snowflake or Amazon Redshift.

Keep in mind that when you organize your data in this way, you are often preparing it to be imported into another software. If you go the route of a database, you would need to use an API to push the feedback into a third-party software.
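As a sketch of this consolidation step, the snippet below normalizes records from three hypothetical channels (a survey export, CRM support tickets, G2 reviews; all field names are invented for illustration) into one consistent schema:

```python
# Hypothetical raw records from three channels, each with its own field names.
survey_rows = [{"answer": "Love the new dashboard", "submitted": "2024-01-10"}]
crm_tickets = [{"body": "Checkout keeps timing out", "opened_on": "2024-01-12"}]
g2_reviews = [{"review_text": "Support was slow to reply", "date": "2024-01-15"}]

def normalize(text, source, when):
    """Map any channel's record onto one consistent schema."""
    return {"text": text.strip(), "source": source, "date": when}

feedback = (
    [normalize(r["answer"], "survey", r["submitted"]) for r in survey_rows]
    + [normalize(t["body"], "support", t["opened_on"]) for t in crm_tickets]
    + [normalize(r["review_text"], "g2", r["date"]) for r in g2_reviews]
)

for item in feedback:
    print(item["source"], "|", item["text"])
```

With every record sharing one schema, the later coding and analysis steps can treat all channels identically.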

Computer-assisted qualitative data analysis software (CAQDAS)

Traditionally within the manual analysis approach (but not always), qualitative data is imported into CAQDAS software for coding.

In the early 2000s, CAQDAS software was popularised by developers such as ATLAS.ti, NVivo and MAXQDA and eagerly adopted by researchers to assist with the organizing and coding of data.  

The benefits of using computer-assisted qualitative data analysis software:

  • Assists in organizing your data
  • Opens you up to exploring different interpretations of your data analysis
  • Makes your dataset easier to share and enables group collaboration (allowing for secondary analysis)

However, you still need to code the data, uncover the themes, and do the analysis yourself. Therefore it is still a manual approach.

The user interface of CAQDAS software 'NVivo'

Organizing your qualitative data in a feedback repository

Another solution for organizing your qualitative data is to upload it into a feedback repository, where it can be unified with your other data and made easily searchable and taggable. A number of software solutions act as a central repository for qualitative research data. Here are a couple of solutions you could investigate:

  • Dovetail: Dovetail is a research repository with a focus on video and audio transcriptions. You can tag your transcriptions within the platform for theme analysis. You can also upload your other qualitative data such as research reports, survey responses, support conversations, and customer interviews. Dovetail acts as a single, searchable repository. And makes it easier to collaborate with other people around your qualitative research.
  • EnjoyHQ: EnjoyHQ is another research repository with similar functionality to Dovetail. It boasts a more sophisticated search engine, but it has a higher starting subscription cost.

Organizing your qualitative data in a feedback analytics platform

If you have a lot of qualitative customer or employee feedback, from the likes of customer surveys or employee surveys, you will benefit from a feedback analytics platform. A feedback analytics platform is software that automates both sentiment analysis and thematic analysis. Companies use the integrations offered by these platforms to tap directly into their qualitative data sources (review sites, social media, survey responses, etc.). The collected data is then organized and analyzed consistently within the platform.

If you have data prepared in a spreadsheet, it can also be imported into feedback analytics platforms.

Once all this rich data has been organized within the feedback analytics platform, it is ready to be coded and themed, within the same platform. Thematic is a feedback analytics platform that offers one of the largest libraries of integrations with qualitative data sources.

Some of qualitative data integrations offered by Thematic

Step 3: Coding your qualitative data

Your feedback data is now organized in one place: within your spreadsheet, CAQDAS, feedback repository, or feedback analytics platform. The next step is to code your feedback data so that you can extract meaningful insights from it.

Coding is the process of labelling and organizing your data in such a way that you can then identify themes in the data, and the relationships between these themes.

To simplify the coding process, you will take small samples of your customer feedback data, come up with a set of codes, or categories capturing themes, and label each piece of feedback, systematically, for patterns and meaning. Then you will take a larger sample of data, revising and refining the codes for greater accuracy and consistency as you go.

If you choose to use a feedback analytics platform, much of this process will be automated and accomplished for you.

The terms to describe different categories of meaning (‘theme’, ‘code’, ‘tag’, ‘category’ etc) can be confusing as they are often used interchangeably.  For clarity, this article will use the term ‘code’.

To code means to identify key words or phrases and assign them to a category of meaning. “I really hate the customer service of this computer software company” would be coded as “poor customer service”.
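A minimal illustration of deductive coding with keyword matching follows; the code frame and keyword lists here are invented for the example, not a recommended taxonomy:

```python
# Hypothetical code frame: each code maps to keywords that signal it.
CODE_FRAME = {
    "poor customer service": ["customer service", "support", "rude"],
    "pricing": ["price", "expensive", "cost"],
    "ease of use": ["easy to use", "intuitive", "confusing"],
}

def assign_codes(feedback: str) -> list[str]:
    """Return every code whose keywords appear in the feedback text."""
    text = feedback.lower()
    return [code for code, keywords in CODE_FRAME.items()
            if any(k in text for k in keywords)]

print(assign_codes("I really hate the customer service of this company"))
# → ['poor customer service']
```

In practice you would refine both the codes and the keywords as you work through larger samples, exactly as the manual process below describes.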

How to manually code your qualitative data

  1. Decide whether you will use deductive or inductive coding. Deductive coding is when you create a list of predefined codes and then assign them to the qualitative data. Inductive coding is the opposite: codes arise directly from the data, and you label them as you go. Weigh up the pros and cons of each coding method and select the most appropriate.
  2. Read through the feedback data to get a broad sense of what it reveals. Then start assigning your first set of codes to statements and sections of text.
  3. Keep repeating step 2, adding new codes and revising the code descriptions as often as necessary. Once everything has been coded, go through it all again to be sure there are no inconsistencies and nothing has been overlooked.
  4. Create a code frame to group your codes. The code frame is the organizational structure of all your codes. There are two commonly used types of coding frames: flat and hierarchical. A hierarchical code frame will make it easier to derive insights from your analysis.
  5. Based on the number of times a particular code occurs, you can now see the common themes in your feedback data. This is insightful! If ‘bad customer service’ is a common code, it’s time to take action.
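Once feedback is coded, counting code occurrences is straightforward. A sketch with made-up coded data:

```python
from collections import Counter

# Hypothetical coded dataset: each piece of feedback carries its assigned codes.
coded_feedback = [
    {"text": "Support never replied", "codes": ["bad customer service"]},
    {"text": "Agent was rude", "codes": ["bad customer service"]},
    {"text": "Love the reports", "codes": ["reporting"]},
    {"text": "Too pricey", "codes": ["pricing"]},
]

counts = Counter(code for item in coded_feedback for code in item["codes"])
for code, n in counts.most_common():
    print(f"{code}: {n}")
```

The most frequent codes at the top of this tally are your candidate themes.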

We have a detailed guide dedicated to manually coding your qualitative data.

Example of a hierarchical coding frame in qualitative data analysis

Using software to speed up manual coding of qualitative data

An Excel spreadsheet is still a popular method for coding. But various software solutions can help speed up this process. Here are some examples.

  • CAQDAS / NVivo - CAQDAS software has built-in functionality that allows you to code text within their software. You may find the interface the software offers easier for managing codes than a spreadsheet.
  • Dovetail/EnjoyHQ - You can tag transcripts and other textual data within these solutions. As they are also repositories you may find it simpler to keep the coding in one platform.
  • IBM SPSS - SPSS is a statistical analysis software that may make coding easier than in a spreadsheet.
  • Ascribe - Ascribe’s ‘Coder’ is a coding management system. Its user interface will make it easier for you to manage your codes.

Automating the qualitative coding process using thematic analysis software

In solutions which speed up the manual coding process, you still have to come up with valid codes and often apply codes manually to pieces of feedback. But there are also solutions that automate both the discovery and the application of codes.

Advances in machine learning have now made it possible to read, code and structure qualitative data automatically. This type of automated coding is offered by thematic analysis software.

Automation makes it far simpler and faster to code the feedback and group it into themes. By incorporating natural language processing (NLP), the software looks across sentences and phrases to identify common themes and meaningful statements. Some automated solutions detect repeating patterns and assign codes to them; others have you train the AI by providing examples. You could say that the AI learns the meaning of the feedback on its own.
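As a toy stand-in for this kind of repeated-pattern detection (real thematic analysis software uses far more sophisticated NLP), the sketch below surfaces repeated two-word phrases as candidate themes:

```python
from collections import Counter
import re

STOPWORDS = {"the", "is", "a", "to", "and", "of", "my", "was", "it"}

def candidate_themes(feedback, min_count=2):
    """Surface repeated two-word phrases as candidate themes."""
    bigrams = Counter()
    for text in feedback:
        words = [w for w in re.findall(r"[a-z']+", text.lower())
                 if w not in STOPWORDS]
        bigrams.update(zip(words, words[1:]))
    return [" ".join(b) for b, n in bigrams.most_common() if n >= min_count]

feedback = [
    "The customer service was slow",
    "Slow customer service again",
    "Customer service never picks up",
]
print(candidate_themes(feedback))
# → ['customer service']
```

A human reviewer would still rename, merge, or discard the surfaced candidates; that review step is exactly the "manage themes" editing described later.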

Thematic automates the coding of qualitative feedback regardless of source. There’s no need to set up themes or categories in advance. Simply upload your data and wait a few minutes. You can also manually edit the codes to further refine their accuracy. Experiments conducted indicate that Thematic’s automated coding is just as accurate as manual coding.

Paired with sentiment analysis and advanced text analytics - these automated solutions become powerful for deriving quality business or research insights.

You could also build your own, if you have the resources!

The key benefits of using an automated coding solution

Automated analysis can often be set up fast and there’s the potential to uncover things that would never have been revealed if you had given the software a prescribed list of themes to look for.

Because the model applies a consistent rule to the data, it captures phrases or statements that a human eye might have missed.

Complete and consistent analysis of customer feedback enables more meaningful findings. Leading us into step 4.

Step 4: Analyze your data: Find meaningful insights

Now we are going to analyze our data to find insights. This is where we start to answer our research questions. Keep in mind that step 4 and step 5 (tell the story) overlap somewhat, because creating visualizations is part of both the analysis process and the reporting.

The task of uncovering insights is to scour through the codes that emerge from the data and draw meaningful correlations from them. It is also about making sure each insight is distinct and has enough data to support it.

Part of the analysis is to establish how much each code relates to different demographics and customer profiles, and identify whether there’s any relationship between these data points.

Manually create sub-codes to improve the quality of insights

If your code frame only has one level, you may find that your codes are too broad to be able to extract meaningful insights. This is where it is valuable to create sub-codes to your primary codes. This process is sometimes referred to as meta coding.

Note: If you take an inductive coding approach, you can create sub-codes as you are reading through your feedback data and coding it.

While time-consuming, this exercise will improve the quality of your analysis. Here is an example of what sub-codes could look like.

Example of sub-codes

You need to read your qualitative data carefully to create quality sub-codes. But as you can see, the depth of analysis is greatly improved. By calculating the frequency of these sub-codes, you can get insight into which customer service problems to address first.
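Sub-code frequencies are easy to tally once codes are stored as "parent > sub-code" labels; the data below is hypothetical:

```python
from collections import Counter

# Hypothetical feedback coded with "parent > sub-code" labels.
coded = [
    "customer service > slow response",
    "customer service > slow response",
    "customer service > rude agent",
    "pricing > too expensive",
]

by_parent = {}
for label in coded:
    parent, sub = [part.strip() for part in label.split(">")]
    by_parent.setdefault(parent, Counter())[sub] += 1

for parent, subs in by_parent.items():
    for sub, n in subs.most_common():
        print(f"{parent} / {sub}: {n}")
```

Here the tally immediately shows that slow responses, not rudeness, dominate the customer-service complaints.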

Correlate the frequency of codes to customer segments

Many businesses use customer segmentation, and you may have your own respondent segments that you can apply to your qualitative analysis. Segmentation is the practice of dividing customers or research respondents into subgroups.

Segments can be based on:

  • Demographics
  • Any other data type that you care to segment by

It is particularly useful to see the occurrence of codes within your segments. If one of your customer segments is considered unimportant to your business, but they are the cause of nearly all customer service complaints, it may be in your best interest to focus attention elsewhere. This is a useful insight!
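A sketch of cross-tabulating code occurrence by segment, using invented data:

```python
from collections import Counter

# Hypothetical coded feedback with a customer segment attached to each item.
feedback = [
    {"segment": "enterprise", "codes": ["bad customer service"]},
    {"segment": "free tier", "codes": ["bad customer service"]},
    {"segment": "free tier", "codes": ["bad customer service", "pricing"]},
    {"segment": "enterprise", "codes": ["reporting"]},
]

by_segment = {}
for item in feedback:
    counter = by_segment.setdefault(item["segment"], Counter())
    counter.update(item["codes"])

# Which segment drives the customer-service complaints?
for segment, counts in by_segment.items():
    print(segment, counts["bad customer service"])
```

With real data you would compare these counts against each segment's business value before deciding where to act.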

Manually visualizing coded qualitative data

There are formulas you can use to visualize key insights in your data. The formulas we suggest are especially valuable if you are measuring a score alongside your feedback.

If you are collecting a metric alongside your qualitative data, impact is a key visualization. Impact answers the question: “What’s the impact of a code on my overall score?”. Using Net Promoter Score (NPS) as an example, first you need to:

  • Calculate the overall NPS (A)
  • Calculate the NPS of the subset of responses that do not contain that code (B)
  • Subtract B from A

Then you can use this simple formula to calculate code impact on NPS.

Visualizing qualitative data: Calculating the impact of a code on your score

You can then visualize this data using a bar chart.
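The impact calculation above can be sketched in a few lines; the responses here are invented illustrative data:

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(s >= 9 for s in scores)
    detractors = sum(s <= 6 for s in scores)
    return 100 * (promoters - detractors) / len(scores)

# Hypothetical responses: an NPS score plus the codes assigned to the comment.
responses = [
    {"score": 10, "codes": ["reporting"]},
    {"score": 9,  "codes": []},
    {"score": 3,  "codes": ["bad customer service"]},
    {"score": 2,  "codes": ["bad customer service", "pricing"]},
    {"score": 8,  "codes": ["pricing"]},
]

def impact_on_nps(responses, code):
    overall = nps([r["score"] for r in responses])            # A
    without = nps([r["score"] for r in responses
                   if code not in r["codes"]])                # B
    return overall - without                                  # A - B

print(round(impact_on_nps(responses, "bad customer service"), 1))
# → -66.7
```

A large negative impact, as here, means removing that theme's responses would raise the score, i.e. the theme is dragging NPS down.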

You can download our CX toolkit - it includes a template to recreate this.

Trends over time

This analysis can help you answer questions like: “Which codes are linked to decreases or increases in my score over time?”

We need to compare two sequences of numbers: NPS over time and code frequency over time. Using Excel, calculate the correlation between the two sequences. The correlation can be either positive (the more often the code appears, the higher the NPS; see the picture below) or negative (the more often the code appears, the lower the NPS).
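The same correlation can be computed outside Excel. A sketch with hypothetical monthly series:

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation between two equal-length number sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical monthly series: NPS and how often one code appeared that month.
nps_by_month = [32, 30, 24, 20, 15]
code_frequency = [5, 8, 14, 18, 25]   # e.g. "slow response" mentions

r = pearson(code_frequency, nps_by_month)
print(round(r, 2))  # strongly negative: more mentions, lower NPS
```

A correlation near -1, as in this made-up series, flags the code as one linked to decreases in the score.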

Now you need to plot code frequency against the absolute value of code correlation with NPS. Here is the formula:

Analyzing qualitative data: Calculate which codes are linked to increases or decreases in my score

The visualization could look like this:

Visualizing qualitative data trends over time

These are two examples, but there are more. For a third manual formula, and to learn why word clouds are not an insightful form of analysis, read our visualizations article.

Using a text analytics solution to automate analysis

Automated text analytics solutions enable codes and sub-codes to be pulled out of the data automatically. This makes it far faster and easier to identify what’s driving negative or positive results. And to pick up emerging trends and find all manner of rich insights in the data.

Another benefit of AI-driven text analytics software is its built-in capability for sentiment analysis, which provides the emotive context behind your feedback and other qualitative text data.
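As a toy illustration of sentiment scoring (production tools use trained models, not hand-picked word lists like these):

```python
# Hypothetical sentiment lexicons for the example only.
POSITIVE = {"love", "great", "easy", "helpful"}
NEGATIVE = {"hate", "slow", "rude", "expensive"}

def sentiment(text: str) -> str:
    """Label text by counting positive vs negative lexicon hits."""
    words = set(text.lower().replace(",", " ").split())
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

print(sentiment("Love the product but support is slow"))
# → neutral
```

The mixed example shows why lexicon counting is only a rough proxy: one positive and one negative hit cancel out, whereas a trained model could weigh the clauses.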

Thematic provides text analytics that goes further by allowing users to apply their expertise on business context to edit or augment the AI-generated outputs.

Since the move away from manual research is generally about reducing the human element, adding human input to the technology might sound counter-intuitive. However, this is mostly to make sure important business nuances in the feedback aren’t missed during coding. The result is a higher accuracy of analysis. This is sometimes referred to as augmented intelligence.

Codes displayed by volume within Thematic. You can 'manage themes' to introduce human input.

Step 5: Report on your data: Tell the story

The last step of analyzing your qualitative data is to report on it, to tell the story. At this point, the codes are fully developed and the focus is on communicating the narrative to the audience.

A coherent outline of the qualitative research, the findings and the insights is vital for stakeholders to discuss and debate before they can devise a meaningful course of action.

Creating graphs and reporting in PowerPoint

Typically, qualitative researchers take the tried and tested approach of distilling their report into a series of charts, tables and other visuals which are woven into a narrative for presentation in PowerPoint.

Using visualization software for reporting

With data transformation and APIs, the analyzed data can be shared with data visualization software such as Power BI, Tableau, Google Data Studio, or Looker. Power BI and Tableau are among the most preferred options.

Visualizing your insights inside a feedback analytics platform

Feedback analytics platforms, like Thematic, incorporate visualization tools that intuitively turn key data and insights into graphs. This removes the time-consuming work of constructing charts to visually identify patterns, and creates more time to focus on building a compelling narrative that highlights the insights, in bite-size chunks, for executive teams to review.

Using a feedback analytics platform with visualization tools means you don’t have to use a separate product for visualizations. You can export graphs into PowerPoint straight from the platform.

Two examples of qualitative data visualizations within Thematic

Conclusion - Manual or Automated?

There are those who remain deeply invested in the manual approach - because it’s familiar, because they’re reluctant to spend money and time learning new software, or because they’ve been burned by the overpromises of AI.  

For projects that involve small datasets, manual analysis makes sense: for example, if the objective is simply to quantify the answer to a question like “Do customers prefer X to Y?”. If the findings are being extracted from a small set of focus groups and interviews, sometimes it’s easier to just read them.

However, as new generations come into the workplace, it’s technology-driven solutions that feel more comfortable and practical. And the merits are undeniable.  Especially if the objective is to go deeper and understand the ‘why’ behind customers’ preference for X or Y. And even more especially if time and money are considerations.

The ability to collect a free flow of qualitative feedback data at the same time as the metric means AI can cost-effectively scan, crunch, score and analyze a ton of feedback from one system in one go. And time-intensive processes like focus groups, or coding, that used to take weeks, can now be completed in a matter of hours or days.

But aside from the ever-present business case to speed things up and keep costs down, there are also powerful research imperatives for automated analysis of qualitative data: namely, accuracy and consistency.

Finding insights hidden in feedback requires consistency, especially in coding.  Not to mention catching all the ‘unknown unknowns’ that can skew research findings and steering clear of cognitive bias.

Some say that without manual data analysis researchers won’t get an accurate “feel” for the insights. However, the larger the data sets are, the harder it is to sort through and organize feedback that has been pulled from different places. And the more difficult it is to stay on course, the greater the risk of drawing incorrect or incomplete conclusions.

Though the process steps for qualitative data analysis have remained pretty much unchanged since psychologist Paul Felix Lazarsfeld paved the path a hundred years ago, the impact digital technology has had on types of qualitative feedback data and the approach to the analysis are profound.  

If you want to try an automated feedback analysis solution on your own qualitative data, you can get started with Thematic.


Community & Marketing

Tyler manages our community of CX, insights & analytics professionals. Tyler's goal is to help unite insights professionals around common challenges.



Research Methods (Quantitative, Qualitative, and More): Qualitative Research

  • Quantitative Research
  • Qualitative Research
  • Data Science Methods (Machine Learning, AI, Big Data)
  • Text Mining and Computational Text Analysis
  • Evidence Synthesis/Systematic Reviews
  • Get Data, Get Help!

About Qualitative Data

Qualitative data are data representing information and concepts that are not represented by numbers. They are often gathered from interviews and focus groups, personal diaries and lab notebooks, maps, photographs, and other printed materials or observations. Qualitative data are distinguished from quantitative data, which focus primarily on data that can be represented with numbers.

Qualitative data can be analyzed in multiple ways. One common method is data coding, which refers to the process of transforming the raw collected data into a set of meaningful categories that describe essential concepts of the data. Qualitative data and methods may be used more frequently in humanities or social science research and may be collected in descriptive studies.

(From the Data Glossary, National Center for Data Services, National Library of Medicine)

Methods Texts

Below are some methods texts recommended by qualitative workshop leaders from the UC Berkeley Library and the D-Lab: 


Workshops and Training

  • Managing qualitative data 101 Tips on managing qualitative materials from your qualitative research librarian.
  • D-Lab workshops Free online workshops on quant and qualitative skills, including coding and using qualitative analysis software.
  • Institute for the Study of Societal Issues (ISSI) Training Ethnographic methods workshop from a campus institute.
  • Qualitative Methods classes Filter to upcoming semesters and look for qualitative methods classes; the Graduate School of Education and School of Public Health offer extensive methods training.

Qualitative Data Analysis Software

Unfortunately, Berkeley does not yet have a sitewide license for any qualitative analysis software.

If you are a student, you can find affordable student licenses with a web search.

If you are a faculty member, instructor, lecturer, or visiting scholar without grant funding, the software can be quite expensive.

You can find reviews of many qualitative software packages at this University of Surrey link:

  • Choosing an Appropriate CAQDAS package.

You can also check out the websites of several major options below: 

  • Taguette Taguette has fewer features than other qualitative analysis software, but is free and open-source.
  • Atlas.ti Atlas.ti is a major qualitative analysis software, and has affordable licenses for students.
  • MaxQDA MaxQDA is a major qualitative analysis software, with affordable student licenses. The D-Lab often teaches workshops on this software.
  • NVIVO NVIVO is an established QDA software, with affordable student licenses.
  • Dedoose Dedoose supports qualitative and mixed-methods research, using an online interface. Students pay $11 per month.

Resources for Qualitative Data Management

  • Managing and Sharing Qualitative Data 101 This page from Berkeley's research data management website offers several things to consider.
  • Tutorials on Ethnographic Data Management This curriculum includes eight presentations and accompanying exercises for you to think through your qualitative data project, or to coach others to do the same.
  • Support Your Data: Evaluation Rubric Download the evaluation rubric on this page to assess where you are with qualitative data management, and consider areas to explore next.
  • The Qualitative Data Repository (QDR) QDR is one of the top US-based repositories focused on the challenges of managing, storing, and sharing qualitative research materials.
  • Research Data @ Berkeley Email Research Data for a consultation about how to set up your qualitative data management plan; they can help you locate other resources on campus.

Mixed Methods Research

Interpretations related to mixed (sometimes called merged) methods vary; be wary of jargon!  Gery Ryan, of the Kaiser Permanente School of Medicine, gives these definitions, while arguing that we should be thinking of the purposes of the research rather than the methodological labels:

Mixed methods research: “Combines elements of qualitative and quantitative research approaches (e.g., use of qualitative and quantitative viewpoints, data collection, analysis, inference techniques) for the broad purposes of breadth and depth of understanding and corroboration.”

Multimethod research: “Either solely combine multiple qualitative approaches or solely combine multiple quantitative approaches.”

Data triangulation: “Uses multiple sources of data or multiple approaches to analyzing data to enhance the credibility of a research study.”

(From " Mixed Methods Research Designs and Data Triangulation " by Gery Ryan, Kaiser Permanente School of Medicine)

  • Last Updated: Apr 25, 2024 11:09 AM
  • URL: https://guides.lib.berkeley.edu/researchmethods

Qualitative Data Analysis and the Use of Theory

  • Carol Grbich, Flinders University
  • https://doi.org/10.1093/acrefore/9780190264093.013.554
  • Published online: 23 May 2019

The role of theory in qualitative data analysis is continually shifting and offers researchers many choices. The dynamic and inclusive nature of qualitative research has encouraged the entry of a number of interested disciplines into the field. These discipline groups have introduced new theoretical practices that have influenced and diversified methodological approaches. To add to these, broader shifts in chronological theoretical orientations in qualitative research can be seen in the four waves of paradigmatic change: the first wave showed a developing concern with the limitations of researcher objectivity and empirical observation of evidence-based data, leading to the second wave with its focus on realities mutually constructed by researcher and researched, participant subjectivity, and the remedying of societal inequalities and mal-distributed power. The third wave was prompted by the advent of Postmodernism and Post-structuralism with their emphasis on chaos, complexity, intertextuality, and multiple realities; and most recently the fourth wave brought a focus on visual images, performance, both an active researcher and an interactive audience, and the crossing of the theoretical divide between social science and classical physics. The methods and methodological changes which have evolved from these paradigm shifts can be seen to have followed a similar pattern of change. The researcher now has multiple paradigms, co-methodologies, diverse methods, and a variety of theoretical choices to consider. This continuum of change has shifted the field of qualitative research dramatically from limited choices to multiple options, requiring clarification of researcher decisions and transparency of process. However, there still remains the difficult question of the role that theory will now play in such a high level of complex design and critical researcher reflexivity.

  • qualitative research
  • data analysis
  • methodologies

Theory and Qualitative Data Analysis

Researchers new to qualitative research, and particularly those coming from the quantitative tradition, have often expressed frustration at the need for what appears to be an additional and perhaps unnecessary process: the theoretical interpretation of their carefully designed, collected, and analyzed data. The justifications for this process have tended to fall into one of two areas: the need to lift data to a broader interpretation beyond the Monty Pythonesque “this is my theory and it’s my very own,” and the illumination of findings from another perspective, by placing the data in its relevant discipline field for comparison with previous theoretical interpretations, while possibly adding something original to the field.

“Theory” is broadly seen as a set of assumptions or propositions, developed from observation or investigation of perceived realities, that attempts to provide an explanation of relationships or phenomena. The framing of data via theoretical imposition can occur at different levels. At the lowest level, various concepts such as “role,” “power,” “socialization,” “evaluation,” or “learning styles” refer to limited aspects of social organization and are usually applied to a specific group of people.

At a more complex level, theories of the Middle Range, identified by Robert Merton to link theory and practice, are used to build theory from empirical data. These tend to be discipline specific and incorporate concepts plus variables such as “gender,” “race,” or “class.” Concepts and variables are then combined into meaningful statements, which can be applied to more diverse social groups. For example, in education an investigation of student performance could emphasize such concepts as “safety,” “zero bullying,” “communication,” and “tolerance,” with variables such as “race” and “gender” to lead to a statement that good microsystems and a focus on individual needs are necessary for optimal student performance.

The third and most complex level uses the established or grand theories such as those of Sigmund Freud’s stages of children’s development, Jean Piaget’s theory of cognitive development, or Urie Bronfenbrenner’s ecological systems, which have been widely accepted as meaningful across a number of disciplines and provide abstract explanations of the uniformity of aspects of social organization, social behavior, and social change.

The trend in qualitative research regarding the application of chosen levels of theory has been generally either toward theory direction/verification or theory generation, although the two are often intertwined. In the first, a relevant existing theory is chosen early and acts as a point of critical comparison for the data to be collected. This approach requires the researcher to think theoretically as s/he designs the study, collects data, and collates it into analytical groupings. The danger of theory direction is that an over focus on a chosen theoretical orientation may limit what the researcher can access or “see” in the data, but on the upside, this approach can also enable the generation of new theoretical aspects, as it is rare that findings will fall precisely within the implications of existing statements. Theory generation is a much looser approach and involves either one or a range of relevant levels of theory being identified at any point in the research process, and from which, in conjunction with data findings, some new combination or distillation can enhance interpretation.

The question of whether a well-designed study should negate the need for theoretical interpretation has been minimally debated. Mehdi and Mansor (2010) identified three trends in the literature on this topic: that theory in qualitative research relates to integrated methodology and epistemology; that theory is a separate and additional element to any methodological underpinnings; and that theory has no solid relationship with qualitative research. No clear agreement on any of these is evident. Overall, there appears to be general acceptance that the process of using theory, whether etically (imposed) or emically (integrated), enhances outcomes and moves research away from being a-theoretical or unilluminated by other ideas. However, regarding praxis, a closer look at the issue of the use of theory and data may be in order. Theoretical interpretation, as currently practiced, has limits. To begin with, the playing field is not level. In the grounded theory tradition, Glaser and Strauss (1967) were initially clear that, in order to prevent undue influence on design and interpretation, the researcher should avoid reviewing the literature on a topic until after some data collection and analysis had been undertaken. The presumption that most researchers would already be well versed in theory/ies and would have a broad spectrum to draw on in order to facilitate the constant comparative process from which data-based concepts could be generated was found to be incorrect. Glaser (1978) suggested this lack could be improved at the conceptual level via personal and professional reflexivity.

This issue became even more of a problem with the advent of practice-led disciplines such as education and health into the field of qualitative research. These groups had not been widely exposed to the theories of the traditional social sciences such as sociology, psychology, and philosophy, although in education they would have been familiar with John Dewey's concept of "pragmatism" linking learning with hands-on activity, and were more used to developing and using models of practice for comparison with current realities. By the mid-20th century, education was more established in research and had moved toward the use of middle range theories and the late 20th-century grand theorists: Michel Foucault, with his emphasis on power and knowledge control, and Jürgen Habermas, with his focus on pragmatism, communication, and knowledge management.

In addition to addictive identification with particular levels of theory and discipline-preferred theories and methods, activity across qualitative research seems to fall between two extremes. At one end it involves separate processes of data collection and analysis before searching for a theoretical framework within which to discuss the findings—often choosing a framework that has gained traction in a specific discipline. This “best/most acceptable fit” approach often adds little to the relevant field beyond repetition and appears somewhat forced. At the other extreme there are those who weave methods, methodologies, data, and theory throughout the whole research process, actively critiquing and modifying it as they go, usually with the outcome of creating some new direction for both theory and practice. The majority of qualitative research practice, however, tends to fall somewhere between these two.

The final aspect of framing data lies in the impact of researchers themselves, and the early-21st-century emphasis is on exposing relevant personal frames, particularly those of culture, gender, socioeconomic class, life experiences such as education, work, and socialization, and the researcher's own values and beliefs. The twin purposes of this exposure are to create researcher awareness and encourage accountability for their impact on the data, as well as allowing the reader to assess the value of research outcomes in terms of potential researcher bias or prejudice. This critical reflexivity is supposed to be undertaken at all stages of the research, but it is not always clear that it has occurred.

Paradigms: From Interactionism to Performativity

It appears that there are potentially five sources of theory: that which is generally available and can be sourced from different disciplines; that which is embedded in the chosen paradigm/s; that which underpins particular methodologies; that which the researcher brings; and that which the researched incorporate within their stories. Of these, the paradigm/s chosen are probably the most influential in terms of researcher position and design. The variety of the sets of assumptions, beliefs, and researcher practices that comprise the theoretical paradigms, perspectives, or broad world views available to researchers, and within which they are expected to locate their individual position and their research approach, has shifted dramatically since the 1930s. The changes have been distinct and identifiable, with their roots located in the societal shifts prompted by political, social, and economic change.

The First Wave

The Positivist paradigm dominated research, largely unquestioned, prior to the early 20th century. It emphasized the distancing of the researcher from his/her subjects; researcher objectivity; a focus on objective, cause–effect, evidence-based data derived from empirical observation of external realities; experimental quantitative methods involving testing hypotheses; and the provision of finite answers and unassailable future predictions. From the 1930s, concerns about the limitations of findings and the veracity of research outcomes, together with improved communication and exposure to the worldviews of other cultures, led to the advent of the realist/post-positivist paradigm. Post-positivism, or critical realism, recognized that certainty in proving the truth of a hypothesis was unachievable and that outcomes were probably limited to falsification (Popper, 1963), that true objectivity was unattainable and that the researcher was most likely to impact on or to contaminate data, that both qualitative and quantitative approaches were valuable, and that methodological pluralism was desirable.

The Second Wave

Alongside the worldwide political shifts toward “people power” in the 1960s and 1970s, two other paradigms emerged. The first, the Interpretivist/Constructivist, focused on the social situations in which we as humans develop and how our construction of knowledge occurs through interactions with others in these contexts. This paradigm also emphasized the gaining of an understanding of the subjective views or experiences of the participants being researched, and recognized the impact of the researcher on researcher–researched mutually constructed realities. Here, theory generation is the preferred outcome to explain the what, how, and why of the findings. This usually involves the development of a conceptual model, forged from both the data gained and from the application/integration of relevant theory, to provide explanations for and interpretations of findings, together with a new perspective for the field/discipline.

The second paradigm, termed the Critical/Emancipatory, focused on locating, critiquing, and changing inequalities in society. The identification of the location of systemic power discrepancies or systematic power misuse in situations involving gender, sexuality, class, and race is expected to be followed by moves to right any oppression discovered. Here, the use of theory has been focused more on predetermined concept application for "fit." This is because the very strong notions of problematic societal structures and of power inappropriately wielded have been the dominant underpinnings.

In both the Interpretive and Critical paradigms, researcher position shifted from the elevated and distant position of positivism, to one of becoming equal with those being researched, and the notion of researcher framing emerged to cover this shift and help us—the readers—to “see” (and judge) the researcher and her/his processes of data management more clearly.

The Third Wave

In the 1980s, the next wave of paradigmatic options—postmodernism and poststructuralism—emerged. Postmodernism, with its overarching cultural implications, and poststructuralism, with its focus on language, severely challenged the construction, limitations, and claims to veracity of all knowledge, and in particular the use of theory derived from siloed disciplines and confined research methods. Regardless of whether the postmodern/poststructural label is attached to grounded theory, ethnography, phenomenology, action, or evaluative designs, one general aspect that prevails is a focus on language. Language has become viewed as dubious, with notions of "slippage"—the multiple meanings of individual words—and "différance"—the simultaneous difference and deferral of textual meaning (Derrida, 1970, 1972)—adding complexity. Double coding, irony, and juxtaposition are encouraged to further identify meaning, and to uncover aspects of social organization and behavior that have been previously marginalized or made invisible by existing discourses and discursive practices. Texts are seen as complex constructions, and intertextuality is favored, resulting in multiply constructed texts. The world is viewed as chaotic and unknowable; individuals are no longer seen as two dimensional—they are viewed as multifaceted with multiple realities. Complex "truths" are perceived as limited by time and context, requiring multiple data sets and many voices to illuminate them, and small-scale focused local research is seen as desirable. The role of researcher also changed: the politics of position and self-reflexivity dominate, and the researcher needs to clearly expose past influences and formerly hidden aspects of his/her life. S/he inhabits the position of an offstage or decentered facilitator, presenting data for the reader to judge.

Theory is used mainly at the conceptual level, with no particular approach being privileged. The researcher has become a "bricoleur" (Levi-Strauss, 1962) or handyman, using whatever methods or theories are within reach to adapt, craft, and meld technological skills with mythical intellectual reflection in order to create unique perspectives on the topic. Transitional interpretations dominate, awaiting further challenges and deconstruction by the next researcher in the field.

The need for multifaceted data sets in the 1990s led inevitably to a search for other research structures, and mixed and multiple methods have become topical. In crossing the divide between qualitative and quantitative approaches, the former initially developed its own sub-paradigms: pragmatist (complementary communication and shared meanings) and transformative/emancipatory (inequalities in race, class, gender, and disability, to be righted). An increasing focus on multiple methods led to the advent of dialectics (multiple paradigm use) and critical realism (the acceptance of divergent results) (Shannon-Baker, 2016). The dilemmas of theory use raised by these changes include whether to segregate data sets and try to explain disparate outcomes in terms of diversity using different theories; whether to integrate them through a homogeneous "smoothing" process—one theory fits all—in order to promote a singular interpretation; or whether to let the strongest paradigm—in terms of data—dominate the theoretical findings.

The Fourth Wave

During the early 21st century, as the third wave was becoming firmly established, the Performative paradigm emerged. The incorporation of fine art–based courses into universities has challenged the prescribed rules of the doctoral thesis, initially resulting in a debate—with echoes of Glaser and Strauss—as to whether theory, if used initially, is too directive, thereby potentially contaminating the performance; whether theory application should be an outcome to enhance performances; or even whether academic guidelines regarding theory use need to be changed to accommodate these disciplines (Bolt, 2004; Freeman, 2010; Riley & Hunter, 2009). Performativity is seen in terms of "effect," a notion derived from John Austin's (1962) assertion that words and speech utterances do not just act as descriptors of content; they have social force and impact on reality. Following this, a productive work is seen as capable of transforming reality (Bolt, 2016). The issue most heard here is the problem of how to judge this form of research when traditional guidelines of dependability, transferability, and trustworthiness appear to be irrelevant. Barbara Bolt suggests that drawing on Austin's (1962) terms "locutionary" (semantic meaning), "illocutionary" (force), and "perlocutionary" (effect achieved on receivers), together with the mapping of these effects in material, effective, and discursive domains, may be useful, despite the fact that mapping transformation may be difficult to track in the short term.

During the second decade of the 21st century, however, discussions relating to the use of theory have increased dramatically in academic performative research, and a variety of theoreticians are now cited apart from John Austin. These include Maurice Merleau-Ponty (1945) on the spatiality of lived events; Jacques Derrida (1982) on iterability, simultaneous sameness, and difference; Gilles Deleuze and Félix Guattari (1987) on rituals of material objects and transformative potential; Jean-François Lyotard (1988) on plurality of micro narratives, "affect," and its silent disruption of discourse; and Bruno Latour (2005) with regard to actor network theory—where theory is used to engage with rather than to explain the world in a reflective political manner.

In performative doctoral theses, qualitative theory and methods are being creatively challenged. For example, from the discipline of theater and performance, Lee Miller and Joanne/Bob Whalley (2010) disrupt the notion of usual spaces for sincere events by taking their six-hour-long performance Partly Cloudy, Chance of Rain, involving a public reaffirmation of their marriage vows, out of the usual habitats to a service station on a highway. The performance involves a choir, a band, a pianist, 20 performers dressed as brides and grooms, photographers, a TV crew, an Anglican priest, plus 50 guests. The theories applied to this event include an exploration of Marc Augé's (1992) conception of the "non-place"; Mikhail Bakhtin's (1992) concepts of "dialogism" (many voices) together with "heteroglossia" (juxtaposition of many voices in a dialogue); and Ludwig Wittgenstein's (1953) discussion of the "duck rabbit"—once the rabbit is seen (participatory experience), the duck (audience) is always infected by its presence. This couple further challenged the guidelines of traditional doctoral theses by successfully negotiating two doctoral awards for a joint piece of research.

A more formal example of a doctoral thesis (Reik, 2014 ) using traditional qualitative approaches has examined at school level the clash of paradigms of performative creative styles of teaching with the neoliberalist focus on testing, curriculum standardization, and student outcomes.

Leah Mercer (2012), an academic in performative studies, used the performative paradigm in her doctoral thesis to challenge and breach not only the methodological but also the theoretical silos of the quantitative–qualitative divide. The Physics Project is an original work using live performances of personal storytelling with video and web streaming to depict the memories, preoccupations, and the formative relationship of two women, an Australian and an American, living in contemporary mediatized society. Using scientific theory, Mercer explores personal identity by reframing the principles of contemporary physics (quantum mechanics and the uncertainty principle) as aesthetic principles (uncertainty and light), with the physics of space (self), time (memory), light (inspiration), and complementarity (the reconciliation of opposites) to illuminate these experiences.

The performative paradigm has also shifted the focus on the reader, developed in postmodernism, to a broader group—an active audience. Multi-methods have been expanded to include symbolic imagery, in particular visual images, as well as sound and live action. The researcher's role here is often that of performer within a cultural frame, creating and investigating multiple realities and providing the link between the text/script and the audience/public. Theory is either minimized to the level of concepts or used to break through the silos of different disciplines to integrate and reconcile aspects from long-lasting theoretical divides.

In these chronological lines of paradigm shifts, changes in researcher position and changes in the application of theory can clearly be seen. The researcher has moved out of the shadows and into the mainstream; her/his role has shifted from an authoritarian collector and presenter of finite “truths” to a creator and often performer of multiple and disparate data images for the audience to respond to. Theory options have shifted from direction and generation within existing perspectives to creative amalgamations of concepts from disciplines previously rarely combined.

Methodologies: From Anthropology to Fine Arts

It would be a simple matter if all the researcher had to contend with were siting him/herself in a particular paradigm/s. Unfortunately, not only have paradigms shifted in terms of researcher position and theoretical usage, but so also have methodological choices and research design. One of the most popular methodologies, ethnography, with its roots in classical anthropology and its fieldwork-based observations of action and interaction in cultural contexts, can illustrate the process of methodological change following paradigm shift. If a researcher indicates that he/she has undertaken an ethnographic study, the reader will be most likely to query "which form?": classical, critical, auto, visual, ethno drama, cyber/net, or performative? The following examples from this methodology should indicate how paradigm shifts have resulted in increasing complexity of design, methods, and interpretive options.

In classical ethnography the greatest borrowing is from traditional anthropology in terms of process and tools, and this can be seen with the inclusion of initial time spent in the setting to learn the language of the culture and to generally "bathe" oneself in the environment, often with minimal data collection. This process is supposed to help increase researcher understanding of the culture and minimize the problem of "othering" (treating as a different species/alien). Then a fairly lengthy amount of time is usually spent in the cultural setting, either as an observer or as a participant observer, to collect as much data as is relevant to answer the research question. This is followed by a return to post-check whether the findings previously gathered have stood the test of time. The analytical toolkit can involve domain analysis, freelists, pilesorts, triads and taxonomies, frame and social network analysis, and event analysis. Truncated mini-ethnographies became more common as time became an issue, but these can still involve years of managing descriptive data, often collected by several participating researchers, as seen in Douglas, Rasmussen, and Flanagan's (1977) study of the culture of a nudist beach. Shorter versions undertaken by one researcher, for example Sohn (2015), have explored strategies of teacher and student learning in a science classroom. Theoretical interpretation can be by conceptual application for testing, such as Margaret Mead's (1931) testing of the concept of "adolescence"—derived from American culture—in Samoan culture, or, more generally, by concept generation. The latter can be seen in David Rosenhan's (1973) investigation of the experience of a group of researcher pseudo-patients admitted to hospitals for the mentally ill in the United States. The main concepts generated were labeling, powerlessness, and depersonalization.

De-colonial ethnography recognizes the "othering" frames of colonial and postcolonial research and takes a position that past colonial supremacy over Third World countries persists in political, economic, educational, and social constructions. Decolonizing requires a critical examination of language, attitudes, and research methods. Kakali Bhattacharya (2016) has exposed the micro-discourses of the continuing manifestation of colonial power in a parallel narrative written by a South Asian woman and a white American male. Concepts of colonialism and patriarchy, displayed through the discourses exposed, provide a theoretical critique.

Within critical ethnography, with its focus on power location and alleviation of oppression, Dale Spender (1980) used structured and timed observations of the styles, quality, and quantity of interaction between staff and students in a range of English classrooms. The theory-directive methodological frames of feminism and gender inequality were applied to identify and expose the lesser time and lesser quality of interaction that teachers had with female students in comparison with that assigned to male students. Widespread distribution of these results alerted education authorities and led to change, in some environments, toward introducing single-sex classrooms for certain topics. This was seen as progress toward alleviating oppressive behaviors. This approach has produced many excellent educational studies, including Paul Willis (1977) on the preparation of working-class kids for working-class jobs; Michelle Fine (1991) on African American and Latino students who dropped out of a New York high school; Angela Valenzuela (1999) on emigrant and other under-achievers in American schools; Lisa Patel (2013) on inclusion and exclusion of immigrants into education; and Jean Anyon (1981) on social stratification of identical curriculum knowledge in different classrooms.

A less concept-driven and more descriptive approach to critical ethnography came with Phil Carspecken's hermeneutic approach (1996), which triggered a move toward data-generated theoretical concepts that could then be used to challenge mainstream theoretical positions.

Post-critical ethnography emphasizes power and ideology and the social practices that contribute to oppression, in particular objectivity, positionality, representation and reflexivity, and critical insufficiency or "antipower." Responsibility is shifted to researchers for the worlds they create and critique when they interpret their research contexts (Noblit, Flores, & Murillo, 2004).

Autoethnography emerged from the postmodern paradigm, with its search for different "truths" and different relationships with readers, and prompted an emphasis on personal experience and documentation of the self in a particular cultural context (Ellis, 2004). In order to achieve this, the researcher has to inhabit the dual positions of being the focus of activities, feelings, and emotions experienced in the setting while at the same time being positioned distantly—observing and recording the behaviors of the self in that culture. Well-developed skills of critical reflexivity are required. The rejection of the power-laden discourses/grand theories of the past and the emphasis on transitional explanations has resulted in minimal theorizing and an emphasis on data display, the reader, and the reader's response. Open presentations of data can be seen in the form of narrative storytelling, or re-presentations in the form of fiction, dramatic performances, and poetry. Carolyn Ellis (2004) has argued that "story is theory and theory is story" and our "making sense of stories" involves contributing to a broader understanding of human existence. Application/generation of concepts may also occur, and the term "Critical Autoethnography" has been used (Hughes & Pennington, 2017), particularly where experiences of race, class, or gender inequality are being examined. Jennifer Potter (2015) used the concept "whiteness of silence" to introduce a critical race element into her autoethnographic account of black–white racial hatred experiences within a university class on African American communication in which she was a student.

Visual ethnography uses a variety of tools, including photography, sketches, movies, social media, the Web and virtual reality, body art, clothing, painting, and sculpture, to demonstrate and track culture. This approach has been available for some time, both as a methodology in its own right and as a method of data collection. An example of this approach, which mixes classical and visual ethnography, is Philippe Bourgois and Jeff Schonberg's 12-year study of two dozen homeless heroin injectors and crack smokers living under a freeway overpass in San Francisco (2009). Their data comprised extensive black and white photos, dialogue, taped conversations, and fieldwork observation notes. The themes of violence, race relations, family trauma, power relations, and suffering were theoretically interpreted through reworked notions of "power" that incorporated Pierre Bourdieu's (1977, 1999) concepts of "symbolic violence"—linking observed practices to social domination—and "habitus"—an individual's personal disposition comprising unique feelings and actions grounded in biography and history; Karl Marx's "lumpen" from "lumpenproletariat" (1848), the residual class—the vagrants and beggars together with criminal elements—that lies beneath the labor force; and Michel Foucault's "biopower" (1978, 2008)—the techniques of subjugation used by the state on the population—and "governmentality" (1991)—where individuals are disciplined through institutions and the "knowledge–power" nexus. The ideas of these three theorists were used to create and weave a theory of "lumpen abuse" to interpret the lives of the participants.

Ethno Drama involves transforming the results from an ethnographic study into a performance to be shared, for example the educational experiences of children and youth (Gabriel & Lester, 2013). The performance medium can vary from a film (Woo, 2008), to an article presented in dramatic form (Carter, 2014), or, more usually, a play script to be staged for an audience in a theater (Ethno Theater). One of the main purposes is to provide a hearing space for voices that have been marginalized or previously silenced. These voices and their contexts can be presented by research participants, actors, or the research team, and are often directed at professionals from the field. Audience-based meetings to devise recommendations for further action may follow a performance. Because of the focus on inequality, critical theory has been the major theoretical orientation for this approach. The structure of the presentation invites audiences to identify situations of oppression, in the hope that this will inform them sufficiently to enable modification of their own practices or to be part of the development of recommendations for future change.

Lesnick and Humphrie (2018) explored the views of identity of LGBTQ+ youth between 14 and 24 years of age via interviews and online questionnaires, the transcriptions of which were woven into a script performed by actors presenting stories not congruent with their own racial/gender scripts in order to challenge audience expectations and labels. The research group encouraged the schools where they performed to structure discussion groups to follow the school-located performances. The scripts and discussions revealed, and were lightly interpreted through, concepts of homelessness, racism, and "oppression Olympics"—the way oppressed people sometimes view one another in competition rather than in solidarity. These issues were found to be relevant to both school and online communities. Support for these young people was discovered to be mostly from virtual sources, being provided by dialogues within Facebook groups.

Cyber/net/virtual ethnographies involve the study of online communities within particular cultures. Problems that have emerged from the practice of this approach include: discovery of the researcher lurking without permission on sites; gaining prior permission, which often disturbs the threads of interaction; gaining permission post–data collection but having many furious people decline participation; the "facelessness" of individuals, who may have uncheckable multiple personas; and trying to make sense of very disparate data in incomplete and non-chronological order. There has been acceptance that online and offline situations can influence each other. Dibbell (1993) demonstrated that online sexual violence toward another user's avatar in a text-based "living room" reduced the violated person to tears as she posted pleas for the violator to be removed from the site. Theoretical interpretation at the conceptual level is common; Michel Foucault's concept of heterotopia (1967, 1984) was used to explain such spatio-temporal prisons as online rooms. Heterotopic spaces are seen as having the capacity to reflect and distort real and imagined experiences.

Poststructural ethnography tracks the instability of concepts both culturally and linguistically. This can be demonstrated in the deconstruction of language in education (Lather, 2001), particularly the contradictions and paradoxes of sexism, gender, and racism both in texts and in the classroom. These discourses are implicated in relations of power that are dynamic and within which resistance can be observed. Poststructuralism accepts that texts are multiple, as are the personas of those who created them, and that talk such as that which occurs in a classroom can be linked with knowledge control. Walter Humes (2000) discovered that the educational management discourses of "community," "leadership," and "participation" could be disguised by such terms as "learning communities" and "transformational leadership." He analyzed the results with a conceptual framework derived from management theory and policy studies and linked the findings with political power.

Performative ethnography, from the post-postmodern paradigm, integrates the performances of art and theater with the focus on culture of ethnography (Denzin, 2003). A collaborative performance ethnography (van Katwyk & Seko, 2017) used a poem re-presenting themes from a previous research study on youth self-harming to form the basis of the creation of a performative dance piece. This process enabled the researcher participants to explore less dominant ways of knowing through co-learning and through the discovery of self-vulnerability. The research was driven by a social justice–derived concern that Foucault's notion of "sovereignty" was being implemented through a web of relations that commodified and limited knowledge, and sanctioned the exploitation of individuals and communities.

This exploration of the diversity in ethnographic methods, methodologies, and interpretive strategies would be repeated in a similar trek through the interpretive, critical, postmodern, and post-postmodern approaches currently available for undertaking the various versions of grounded theory, phenomenology, feminist research, evaluation, action, or performative research.

Implications of Changes for the Researcher

The onus is now less on finding the "right" (or most familiar in a field) research approaches and following them meticulously, and much more on researchers making their own individual decisions as to which aspects of which methodologies, methods, and theoretical explanations will best answer their research question. Ideally this should not be constrained by the state of the discipline they are part of; it should be equally easy for a fine arts researcher to carry out a classical ethnography with a detailed theoretical interpretation derived from a grand theorist/s as it would be for a researcher in law to undertake a performative study with the minimum of conceptual insights and the maximum of visual and theoretical performances. Unfortunately, the reality is that trends within disciplines dictate publication access, thereby reinforcing the prevailing boundaries of knowledge.

However, the current diversity of choice has indeed shifted the field of qualitative research dramatically away from the position it was in several decades ago. The moves toward visual and performative displays may challenge certain disciplines but these approaches have now become well entrenched in others, and in qualitative research publishing. The creativity of the performative paradigm in daring to scale the siloed and well-protected boundaries of science in order to combine theoretical physics with the theories of social science, and to re-present data in a variety of newer ways from fiction to poetry to researcher performances, is exciting.

Given that theoretical as well as methodological and methods’ domains are now wide open to researchers to pick and choose from, two important aspects—justification and transparency of process—have become essential elements in the process of convincing the reader.

Justification incorporates the why of decision-making. Why was the research question chosen? Why was the particular paradigm, or paradigms, chosen best for the question? Why were the methodology and methods chosen most appropriate for both the paradigm/s and research question/s? And why were the concepts used the most appropriate and illuminating for the study?

Transparency of process not only requires that the researcher clarifies who they are in the field with relation to the research question and the participants chosen, but demands an assessment of what impact their background and personal and professional frames have had on research decisions at all stages from topic choice to theoretical analysis. Problems faced in the research process and how they were managed or overcome also requires exposition as does the chronology of decisions made and changed at all points of the research process.

Now to the issue of theory and the question of “where to?” This brief walk through the paradigmatic, methodological, and theoretical changes has demonstrated a significant move from the use of confined paradigms with limited methodological options to the availability of multiple paradigms, co-methodologies, and methods of many shades for the researcher to select among. Regarding theory use, there has been a clear move away from grand and middle-range theories toward the application of individual concepts drawn from a variety of established and minor theoreticians and disciplines, which can be amalgamated into transitory explanations. The examples of theoretical interpretation presented in this article, in my view, considerably extend, frame, and often shed new light on the themes that have been drawn out via analytical processes. Well-argued theory at any level is a great enhancer, lifting data to heights of illumination and comparison, but it could equally be argued that in the presence of critical researcher reflexivity, complex, layered, longitudinal, and well-justified design, meticulous analysis, and monitored audience response, it may no longer be essential.

Bibliography

  • Eco, U. (1979). The role of the reader . Bloomington: Indiana University Press.
  • Gadamer, H. (1989). Truth and method ( J. Weinheimer & D. Marshall , Trans.). NY: Crossroad.
  • Grbich, C. (2004). New approaches in social Research . London, U.K.: SAGE.
  • Grbich, C. (2013). Qualitative data analysis: An introduction . London, U.K.: SAGE.
  • Lincoln, Y. , & Denzin, N. (2000). Paradigmatic controversies: Contradictions and emerging confluences. In N. Denzin and Y. Lincoln , Handbook of qualitative research (2nd ed.). Thousand Oaks, CA: SAGE.
  • Lyotard, J. (1983). Answering the question: What is post modernism? In E. Hassan & S. Hassan (Eds.), Innovation and renovation . Madison: University of Wisconsin Press.
  • Pink, S. (2012). Advances in visual methodology . Thousand Oaks, CA: SAGE.
  • Riley, S. , & Hunter, L. (Eds.). (2009). Mapping landscapes for performance as research . London: Palgrave Macmillan.
  • Tinkler, P. (2011). Using photography in social and historic research. New Delhi, India: SAGE.
  • Vis, F., & Thelwall, M. (2012). Researching social media. New Delhi, India: SAGE.
  • Von Hantelmann, D. (2010). How to do things with art . Zurich, Switzerland: JRP Ringier.
  • Anyon, J. (1981). Social class and school knowledge . Curriculum Inquiry , 11 (1), 3–42.
  • Augé, M. (1995). Non-places: An introduction to anthropology of supermodernity . London, U.K.: Verso.
  • Austin, J. (1962). How to do things with words [The William James Lectures, 1955]. Oxford, U.K.: Oxford University Press.
  • Bakhtin, M. (1992). The dialogic imagination: Four essays . Austin: University of Texas Press.
  • Battacharya, K. , & Gillen, N. (2016). Power, race and higher education: A cross-cultural parallel narrative . Rotterdam, The Netherlands: Sense.
  • Bolt, B. (2004). Art beyond representation: The performative power of the image . London, U.K.: I.B Tauris.
  • Bolt, B. (2016). Artistic research: A performative paradigm? Parse. #3 Repetitions and Reneges . Gothenburg, Sweden: University of Gothenburg.
  • Bourdieu, P. (1977). Outline of a theory of practice . Cambridge, U.K.: Cambridge University Press.
  • Bourdieu, P. (1999) The Weight of the world: Social suffering in contemporary society . Cambridge, U.K.: Polity Press.
  • Bourgois, P. , & Schonberg, J. (2009). Righteous dopefiend: Homelessness, addiction, and poverty in urban America . Berkeley: University of California Press.
  • Bronfenbrenner, U. (1979). The ecology of human development: Experiments by nature and design . Cambridge, MA: Harvard University Press.
  • Carspecken, P. (1996). Critical ethnography in educational research . New York: Routledge.
  • Carter, M. (2014). The teacher monologues: Exploring the identities and experiences of artist-teachers . Rotterdam, The Netherlands: Sense.
  • Deleuze, G. , & Guattari, F. (1987). A thousand Plateaus: Capitalism and schizophrenia ( B. Massumi , Trans.). Minneapolis: University of Minnesota Press.
  • Denzin, N. (2003). Performance ethnography : Critical pedagogy and the politics of culture . Thousand Oaks, CA: SAGE.
  • Derrida, J. (1970). Structure, sign and play in the discourse of the human sciences. Lecture presented at Johns Hopkins University, October 21, 1966.
  • Derrida, J. (1972). Margins of philosophy: Plato to footnotes ( A. Bass , Trans.). Sussex, U.K.: Brighton and Harvester Press.
  • Derrida, J. (1982). Sending: On representation . Social Research , 49 (2), 294–326.
  • Dewey, J. (1998). The essential Dewey ( L. Hickman & T. Alexander , Eds.). (2 vols.) Bloomington: Indiana University Press.
  • Dibbell, J. (1993). A rape in cyberspace from chapter 1 of My Tiny Life . Published in The Village Voice .
  • Douglas, J. , Rasmussen, P. , & Fanagan, C. (1977). The nude beach . Thousand Oaks, CA: SAGE.
  • Ellis, C. (2004). The ethnographic I: A methodological novel about autoethnography . Walnut Creek, CA: AltaMira.
  • Fine, M. (1991). Framing dropouts: Notes on the politics of an urban high school . New York, NY: SUNY Press.
  • Foucault, M. (1967, 1984). Des espaces autres (J. Miscowiec, Trans.) [Of other spaces: Utopias and heterotopias]. Architecture/Mouvement/Continuité, October.
  • Foucault, M. (1978). Security, territory, population. Lectures at the Collège de France 1977–1978 ( A. Davidson , Ed.). New York, NY: Palgrave Macmillan.
  • Foucault, M. (1991). Studies in governmentality: Two lectures by and an interview with Michel Foucault ( G. Burchell , C. Gordon , & and P. Miller , Eds.). Chicago: University of Chicago Press.
  • Foucault, M. (2008). The birth of biopolitics . Lectures at the Collège de France 1978–1979 ( M. Senellart , Ed, Trans.; G. Burchell , Trans.). New York, NY: Palgrave Macmillan. (First published as Naissance de la Biopolitique 2004).
  • Freeman, J. (2010). Blood, sweat and theory: Research through practice in performance . Faringdon, U.K.: Libri.
  • Freud, S. (1953–1974). The standard edition of the complete psychological works of Sigmund Freud ( J. Strachey , Trans. and Ed.) (24 Vols). London, U.K.: Hogarth and the Institute of Psycho-Analysis.
  • Gabriel, R. , & Lester, J. (2013). Performances of research: Critical issues in K–12 education . New York, NY: Peter Lang.
  • Glaser, B. (1978). Theoretical sensitivity: Advances in the theory of grounded theory . Mill Valley, CA: Sociology Press.
  • Glaser, B. , & Strauss, A. (1967). The discovery of grounded theory: Strategies for qualitative research . London: Aldine.
  • Hughes, S. , & Pennington, J. (2017). Autoethnography: Process, product and possibility for critical social research . Thousand Oaks, CA: SAGE.
  • Humes, W. (2000). The discourses of educational management . Journal of Educational Enquiry , 1 (1).
  • Lather, P. (2001). Postmodernism, post-structuralism and post(critical) ethnography: Of ruins, aporias and angels. In P. Atkinson , A. Coffey , S. Delamont , J. Lofland , & L. Lofland (Eds.), Handbook of ethnography . Thousand Oaks, CA: SAGE.
  • Latour, B. (2005). Reassembling the social: An introduction to actor–network theory . Oxford, U.K.: Oxford University Press.
  • Lesnick, E. , & Humphrie, J. (2018). Amplifying youth voices through ethnodrama research . Washington, DC: National Association of Independent Schools.
  • Levi-Strauss, C. (1962). The savage mind (La pensée sauvage) (G. Weidenfeld & Nicolson Ltd., Trans.). Chicago, IL: University of Chicago Press.
  • Lyotard, J. (1988). Peregrinations: Law, form, event . The Wellek Library lectures at the University of California, Irvine. New York, NY.: Columbia University Press.
  • Marx, K. , & Engels, F. (1848). The Communist Manifesto ( S. Moore , Trans.).
  • Mead, M. (1931). The life of a Samoan girl: ALL TRUE. The record of actual adventures that have happened to ten women of today . New York, NY.: Brewer, Warren and Putnam.
  • Mehdi, T. , & Mansor, A. (2010). A General perspective on the role of theory in qualitative research . Journal of International Social Research , 3 (11).
  • Mercer, L. (2012). The Physics Project. In L. Mercer , J. Robson , & D. Fenton (Eds.), Live research: Methods of practice led inquiry in research . Nerang: Queensland, Ladyfinger.
  • Merleau-Ponty, M. (1945). Phénoménologie de la Perception . Paris, France: Editions Gallimard.
  • Merleau-Ponty, M. (2005). Phenomenology of perception ( D. Landes , Trans.). London, U.K.: Routledge.
  • Merton, R. (1968). Social theory and social structure . New York, NY: Free Press.
  • Miller, L. , & Whalley, J. (2010). Case Study 11: Partly cloudy, chance of rain. In J. Freeman (Ed.), Blood, sweat and theory: Research through practice in performance . Faringdon, U.K.: Libri.
  • Noblit, G. , Flores, S. , & Nurillo, E. (2004). Postcritical ethnography: Reinscribing critique . New York: Hampton Press.
  • Patel, L. (2013). Youth held at the border: Immigration, education and the politics of inclusion . New York, NY: Columbia University Teacher’s College Press.
  • Piaget, J. (1936). Origins of intelligence in the child . London, U.K.: Routledge and Kegan Paul.
  • Popper, K. (1963). Science: Conjectures and refutations. Online lecture transcript .
  • Potter, J. (2015). The whiteness of silence: A critical autoethnographic tale of a strategic rhetoric . Qualitative Report , 20 (9), 7.
  • Reik, R. (2014). Arts education in a culture of performativity: A case study of what is valued in one Queensland school community . (Doctoral dissertation), Griffith University, Brisbane.
  • Rosenhan, D. (1973). On being sane in insane places . Science , 17 (January), 250–258.
  • Shannon-Baker, P. (2016). Making paradigms meaningful in mixed methods research . Journal of Mixed Methods Research , 10 (4).
  • Sohn, L. (2015). Ethnographic case study of a high school science classroom: Strategies in STEM education . (Doctoral dissertation). Texas A&M University, Corpus Christi, Texas.
  • Spender, D. (1980). Talking in class. In D. Spender & E. Sarah (Eds.), Learning to lose: Sexism and education. London, U.K.: Women’s Press.
  • Valenzuela, A. (1999). Subtractive schooling: U.S.–Mexican youth and the politics of caring . Albany: SUNY.
  • van Katwyk, T. , & Seko, Y. (2017). Knowing through improvisational dance: A collaborative autoethnography . Forum: Qualitative Social Research , 18 (2), 1.
  • Willis, P. (1977). Learning to labour . New York, NY: Columbia University Press.
  • Wittgenstein, L. (1953). Philosophical investigations ( G. Anscombe , Trans.). Oxford, U.K.: Blackwell.
  • Woo, Y. (2008). Engaging new audiences: Translating research into popular media. Educational Researcher , 37 (6), 321–329.


Printed from Oxford Research Encyclopedias, Education. Under the terms of the licence agreement, an individual user may print out a single article for personal use (for details see Privacy Policy and Legal Notice).

date: 24 May 2024


Research-Methodology

Qualitative Data Analysis

Qualitative data refers to non-numeric information such as interview transcripts, notes, video and audio recordings, images and text documents. Qualitative data analysis can be divided into the following five categories:

1. Content analysis. This refers to the process of categorizing verbal or behavioural data in order to classify, summarize and tabulate it.

2. Narrative analysis. This method involves reformulating the stories presented by respondents, taking into account the context of each case and the different experiences of each respondent. In other words, narrative analysis is the researcher’s revision of primary qualitative data.

3. Discourse analysis. A method for analysing naturally occurring talk and all types of written text.

4. Framework analysis. This is a more advanced method that consists of several stages, such as familiarization, identifying a thematic framework, coding, charting, mapping and interpretation.

5. Grounded theory. This method of qualitative data analysis starts with the analysis of a single case to formulate a theory. Additional cases are then examined to see whether they contribute to the theory.

Qualitative data analysis can be conducted through the following three steps:

Step 1: Developing and Applying Codes. Coding can be explained as the categorization of data. A ‘code’ is a word or short phrase that represents a theme or an idea. All codes need to be assigned meaningful titles. A wide range of non-quantifiable elements such as events, behaviours, activities and meanings can be coded.

There are three types of coding:

  • Open coding . The initial organization of raw data to try to make sense of it.
  • Axial coding . Interconnecting and linking the categories of codes.
  • Selective coding . Formulating the story through connecting the categories.
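The three coding stages above can be sketched in a few lines of Python. This is a minimal illustration only: the interview excerpts, code labels and categories are all invented, and in practice software such as NVivo handles this bookkeeping.

```python
from collections import defaultdict

# Open coding: assign short code labels to raw excerpts
# (the excerpts and labels below are hypothetical examples).
open_codes = [
    ("I never know who to ask when something breaks", "uncertainty"),
    ("My manager checks in every morning", "supervision"),
    ("We figure most things out as a team", "peer_support"),
    ("Nobody tells you the unwritten rules", "uncertainty"),
]

# Axial coding: interconnect related codes into broader categories.
categories = {
    "information_gaps": {"uncertainty"},
    "workplace_relationships": {"supervision", "peer_support"},
}

# Gather excerpts under each category; selective coding would then
# formulate the overall "story" by connecting these categories.
coded = defaultdict(list)
for excerpt, code in open_codes:
    for category, codes in categories.items():
        if code in codes:
            coded[category].append(excerpt)

for category, excerpts in coded.items():
    print(f"{category}: {len(excerpts)} excerpt(s)")
```

The underlying idea — mapping raw segments to codes, and codes to categories — is the same whether it is done with folders, filing cabinets or software.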

Coding can be done manually or with qualitative data analysis software such as NVivo, ATLAS.ti 6.0, HyperRESEARCH 2.8, MAXQDA and others.

For manual coding you can use folders, filing cabinets, wallets and so on to gather together materials that are examples of similar themes or analytic ideas. Manual coding is rightly considered labour-intensive, time-consuming and outdated.

In computer-based coding, on the other hand, physical files and cabinets are replaced with computer-based directories and files. When choosing software for qualitative data analysis you need to consider a wide range of factors, such as the type and amount of data you need to analyse, the time required to master the software, and cost.

Moreover, it is important to get confirmation from your dissertation supervisor before applying any specific qualitative data analysis software.

The following table contains examples of research titles, elements to be coded and identification of relevant codes:

[Table: Qualitative data coding]

Step 2: Identifying themes, patterns and relationships. Unlike quantitative methods, qualitative data analysis has no universally applicable techniques for generating findings. The analytical and critical thinking skills of the researcher play a significant role in data analysis in qualitative studies. Accordingly, no qualitative study can be repeated to generate exactly the same results.

Nevertheless, there is a set of techniques that you can use to identify common themes, patterns and relationships within responses of sample group members in relation to codes that have been specified in the previous stage.

Specifically, the most popular and effective methods of qualitative data interpretation include the following:

  • Word and phrase repetitions – scanning primary data for the words and phrases most commonly used by respondents, as well as words and phrases used with unusual emotion;
  • Primary and secondary data comparisons – comparing the findings of interviews, focus groups, observation or any other qualitative data collection method with the findings of the literature review, and discussing the differences between them;
  • Search for missing information – discussing which aspects of the issue were not mentioned by respondents even though you expected them to be;
  • Metaphors and analogues – comparing primary research findings to phenomena from a different area and discussing similarities and differences.
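The first of these techniques — scanning for word and phrase repetitions — is easy to approximate in code. A rough sketch follows; the responses and the stop-word list are invented for illustration, and a real study would use full transcripts and a proper stop-word resource.

```python
import re
from collections import Counter

# Hypothetical interview responses; in practice these would be
# full transcripts from interviews or focus groups.
responses = [
    "The waiting times were frustrating, really frustrating.",
    "Staff were kind, but the waiting room was crowded.",
    "I kept waiting with no explanation at all.",
]

# Tokenize, drop common function words, and count repetitions.
stopwords = {"the", "were", "was", "but", "with", "no", "at", "all", "i"}
words = re.findall(r"[a-z']+", " ".join(responses).lower())
counts = Counter(w for w in words if w not in stopwords)

# The most repeated content words point at candidate themes.
print(counts.most_common(3))
```

Here “waiting” and “frustrating” surface as the most repeated terms, which a researcher would then examine in context rather than take at face value.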

Step 3: Summarizing the data. At this last stage you need to link your research findings to the hypotheses or the research aim and objectives. When writing the data analysis chapter, you can use noteworthy quotations from the transcripts to highlight major themes within the findings and possible contradictions.

It is important to note that the process of qualitative data analysis described above is general and different types of qualitative studies may require slightly different methods of data analysis.

My e-book, The Ultimate Guide to Writing a Dissertation in Business Studies: a step by step approach, contains a detailed yet simple explanation of qualitative data analysis methods. The e-book explains all stages of the research process, from the selection of the research area through to writing the personal reflection. Important elements of dissertations such as research philosophy, research approach, research design, methods of data collection and data analysis are explained in simple words. John Dudovskiy


Qualitative Data Analysis Methodologies and Methods

Qualitative data analysis involves interpreting non-numerical data to identify patterns, themes, and insights. There are several methodologies and methods used in qualitative data analysis.


In this article, we will explore qualitative data analysis techniques in great detail, with each method providing a different perspective on how to interpret qualitative data.

Table of Content

  • Types of Qualitative Data Analysis Methodologies
  • 1. Content Analysis
  • 2. Thematic Analysis
  • 3. Narrative Analysis
  • 4. Discourse Analysis
  • 5. Grounded Theory Analysis
  • 6. Text Analysis
  • 7. Ethnographic Analysis
  • Advantages and Disadvantages of Different Qualitative Data Analysis Methodologies
  • Best Practices for Qualitative Data Analysis
  • Qualitative Data Analysis Methods – FAQs

Let’s examine each of these methodologies in turn:

1. Content Analysis

Content analysis involves systematically examining textual content, or other types of communication, to identify patterns, themes, and meanings within it. It provides a structured technique for inspecting large volumes of data to uncover insights or trends. Researchers categorize and code the content based on predetermined criteria or emergent themes, allowing for both quantitative and qualitative interpretation of the data. Content analysis is often an iterative process, with researchers revisiting and refining the coding scheme, collecting additional data, or conducting further analysis as needed to deepen understanding or address new research questions.

There are three fundamental approaches to content analysis:

  • Conventional Content Analysis: In conventional content analysis, researchers approach the data without preconceived categories or theoretical frameworks. Instead, they allow categories and themes to emerge naturally from the data through an iterative process of coding and analysis. This approach is exploratory and flexible, allowing for the discovery of new insights and patterns within the content.
  • Directed Content Analysis: Directed content analysis involves analyzing the data on the basis of existing theories or concepts. Researchers start with predefined categories or themes derived from theoretical frameworks or previous research findings. The analysis focuses on confirming, refining, or extending existing theories rather than discovering new ones. Directed content analysis is particularly useful when researchers aim to test hypotheses or explore specific concepts in the data.
  • Summative Content Analysis: Summative content analysis focuses on quantifying the presence or frequency of specific content within the data. Researchers develop predetermined categories or coding schemes based on predefined criteria, and then systematically code the data according to those categories. The emphasis is on counting occurrences of predefined attributes or topics to provide a numerical summary of the content. Summative content analysis is frequently used to track changes over time, compare different sources of content, or assess the prevalence of specific topics within a dataset.
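Summative content analysis in particular lends itself to a simple computational sketch. The coding scheme and reviews below are hypothetical; in a real study each category and its indicator keywords would be justified against the research question.

```python
import re
from collections import Counter

# Predefined coding scheme: category -> indicator keywords
# (all categories and keywords here are invented examples).
coding_scheme = {
    "food_quality": {"fresh", "tasty", "bland", "stale"},
    "service": {"waiter", "staff", "rude", "friendly"},
    "price": {"expensive", "cheap", "overpriced", "value"},
}

# A small hypothetical corpus of restaurant reviews.
reviews = [
    "The food was fresh and tasty but the staff were rude.",
    "Friendly staff, though the menu felt overpriced.",
    "Tasty dishes and good value for money.",
]

# Systematically count keyword occurrences per category,
# producing a numerical summary of the content.
tokens = re.findall(r"[a-z]+", " ".join(reviews).lower())
tallies = Counter()
for category, keywords in coding_scheme.items():
    tallies[category] = sum(1 for t in tokens if t in keywords)

print(tallies)
```

The tallies give the kind of frequency summary summative analysis aims at, which can then be compared across sources or time periods.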

When to Use Content Analysis?

  • Exploratory Research: Content analysis is suitable for exploratory research where the goal is to uncover new insights, identify emerging trends, or understand the breadth of communication on a particular topic.
  • Comparative Analysis: It is useful for comparative analysis, allowing researchers to compare communication across different sources, time periods, or cultural contexts.
  • Historical Analysis: Content analysis can be applied in historical research, enabling researchers to analyze historical documents, media content, or archival materials to understand communication patterns over time.
  • Policy Analysis: It is valuable for policy analysis, helping researchers examine the portrayal of issues in media or public discourse and informing policy-making processes.
  • Market Research: Content analysis is commonly used in market research to analyze advertising materials, social media content, and customer reviews, providing insights into consumer perceptions and preferences.

2. Thematic Analysis

Thematic analysis is a method for identifying, analyzing, and reporting patterns or themes within qualitative data. It involves systematically coding and categorizing data to identify common issues, patterns, or ideas that emerge from the dataset. Researchers engage in a process of inductive reasoning to generate themes that capture the essence of the data, allowing for interpretation and exploration of underlying meanings.

Thematic analysis is appropriate when researchers seek to identify, analyze, and report patterns or themes within qualitative data. It is especially useful for exploratory studies where the aim is to uncover new insights or understand the breadth of experiences and perspectives related to a particular phenomenon.

Thematic analysis offers a flexible yet systematic approach for identifying and analyzing patterns within qualitative data, making it a valuable method for exploring complex phenomena and producing insights that inform theory, practice, and policy.

When to Use Thematic Analysis?

  • Psychology: Thematic analysis is used to explore psychological phenomena, such as coping mechanisms in response to stress, attitudes towards mental health, or experiences of trauma.
  • Education: Researchers apply thematic analysis to understand student perceptions of learning environments, teaching methods, or educational interventions.
  • Healthcare: Thematic analysis helps examine patient experiences with healthcare services, attitudes towards treatment options, or barriers to accessing healthcare.
  • Market Research: Thematic analysis is applied to analyze customer feedback, identify product preferences, or understand brand perceptions in market research studies.
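One recurring step in thematic analysis — checking how widely candidate themes are supported across interviews — can be sketched in code. Everything here is invented for illustration (interview codes, themes, and the 50% threshold, which is an arbitrary cut-off; retaining a theme is ultimately an analytic judgement, not an automatic one).

```python
# Codes previously assigned to excerpts in each interview
# (all labels are hypothetical).
coded_excerpts = {
    "interview_1": ["time_pressure", "peer_support", "time_pressure"],
    "interview_2": ["time_pressure", "unclear_goals"],
    "interview_3": ["peer_support", "time_pressure", "unclear_goals"],
}

# Candidate themes group related codes; this grouping is the
# researcher's interpretive work, not something code decides.
themes = {
    "workload": ["time_pressure"],
    "collaboration": ["peer_support"],
    "direction": ["unclear_goals"],
}

# Count how many interviews support each theme; the 50% threshold
# below is purely illustrative.
n_interviews = len(coded_excerpts)
theme_support = {}
for theme, codes in themes.items():
    present_in = sum(
        1 for assigned in coded_excerpts.values()
        if any(code in assigned for code in codes)
    )
    theme_support[theme] = present_in
    status = "retained" if present_in / n_interviews >= 0.5 else "dropped"
    print(f"{theme}: {present_in}/{n_interviews} interviews ({status})")
```

Such a tally helps flag themes that rest on only one or two voices, prompting the researcher to revisit the coding rather than over-generalize.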

3. Narrative Analysis

Narrative analysis involves examining and interpreting the stories or narratives that people use to make sense of their experiences. It focuses on the structure, content, and meaning of narratives to understand how individuals construct and communicate their identities, values, and beliefs through storytelling.

When to Use Narrative Analysis?

It is widely used across numerous disciplines, including sociology, psychology, anthropology, literary studies, and communication studies. Some applications of narrative analysis in qualitative data analysis are:

  • Understanding Identity Construction: Narrative analysis can be used to explore how people construct their identities through the stories they tell about themselves. Researchers can examine the themes, plot structures, and language used in narratives to uncover how individuals perceive themselves and their place in the world.
  • Exploring Life Experiences: Researchers frequently use narrative analysis to investigate the lived experiences of individuals or groups. By examining the narratives shared by participants, researchers can gain insights into the challenges, triumphs, and significant events that shape people’s lives.
  • Examining Cultural Meanings and Practices: Narrative analysis can provide valuable insights into cultural meanings and practices. By studying the stories shared within a particular cultural context, researchers can uncover shared values, beliefs, and norms that influence behavior and social interactions.
  • Exploring Trauma and Healing: Narrative analysis is commonly used in research on trauma and recovery processes. By studying the narratives of trauma survivors, researchers can explore how individuals make sense of their experiences, cope with adversity, and embark on journeys of recovery and resilience.
  • Analyzing Media and Popular Culture: Narrative analysis can also be applied to media texts, such as films, television shows, and literature. Researchers can examine the narratives constructed within these texts to understand how they reflect and shape cultural beliefs, ideologies, and norms.

Narrative analysis offers a powerful technique for exploring the structure, content, and meaning of the stories people tell, providing insights into their lived experiences, identities, and perspectives. However, researchers must navigate the interpretive subjectivity, time-intensive nature, and ethical concerns associated with analyzing narratives in qualitative research.

4. Discourse Analysis

Discourse analysis examines the ways in which language is used to construct meaning, shape social interactions, and reproduce power relations within society. It focuses on analyzing spoken or written texts, as well as the wider social and cultural contexts in which communication occurs. Researchers explore how language reflects and shapes social norms, ideologies, and power dynamics.

Discourse analysis is employed when researchers seek to investigate social interactions, power dynamics, and identity construction through language. It is applied to study how language shapes social relations, constructs identities, and reflects cultural norms and values.

When to Use Discourse Analysis?

  • Linguistics and Language Studies: Discourse analysis is foundational to linguistics and language studies, where it is used to examine language use, communication patterns, and discourse structures. Linguists conduct discourse analysis to investigate how language shapes social interactions, constructs identities, and reflects cultural norms, helping to uncover the underlying meanings, ideologies, and power dynamics embedded in language.
  • Media and Communication: Discourse analysis is applied in media and communication studies to examine media representations, discursive practices, and ideological frameworks. Researchers analyze media texts, news coverage, and political speeches, exploring how language constructs and disseminates social meanings and values. Discourse analysis informs media literacy efforts, media criticism, and media policy debates.
  • Political Science: Discourse analysis is used in political science to examine political rhetoric, public discourse, and policymaking processes. Researchers analyze political speeches, party manifestos, and policy documents, examining how language constructs political identities, legitimizes authority, and shapes public opinion. Discourse analysis informs political communication strategies, political campaigning, and policy advocacy.

Grounded theory analysis is an inductive studies approach used to broaden theories or causes based on empirical data. It includes systematically studying qualitative information to perceive ideas, categories, and relationships that emerge from the statistics itself, rather than testing preconceived hypotheses. Researchers have interaction in a procedure of constant assessment and theoretical sampling to refine and increase theoretical insights.

Grounded theory analysis is employed when researchers seek to uncover patterns, relationships, and processes that emerge from the data itself, without imposing preconceived hypotheses or theoretical assumptions.

When to use Grounded Theory Analysis?

Grounded theory analysis is applied across various disciplines and research contexts, including:

  • Social Sciences Research : Grounded theory analysis is widely used in sociology, anthropology, psychology, and related disciplines to explore social phenomena such as group dynamics, social interactions, cultural practices, and societal structures.
  • Healthcare Research : In healthcare, grounded theory can be applied to understand patient experiences, provider-patient interactions, healthcare delivery processes, and the impact of healthcare policies on individuals and communities.
  • Organizational Studies : Researchers use grounded theory to examine organizational behavior, leadership, workplace culture, and employee dynamics. It helps in understanding how organizations function and how they can be improved.
  • Educational Research : In education, grounded theory analysis can be used to explore teaching and learning processes, student experiences, educational policies, and the effectiveness of educational interventions.

Text analysis involves examining written or verbal communication to extract meaningful insights or patterns. It encompasses various techniques, including sentiment analysis, topic modeling, and keyword extraction. For instance, in a study of customer reviews of a restaurant, text analysis might be used to identify recurring topics such as food quality, service experience, and atmosphere. Key components and techniques involved in text analysis:

  • Sentiment Analysis : This technique involves determining the sentiment expressed in a piece of text, whether it is positive, negative, or neutral. Sentiment analysis algorithms use natural language processing (NLP) to analyze the words, phrases, and context within the text to infer the overall sentiment. For instance, in customer reviews of a restaurant, sentiment analysis could be used to gauge customer satisfaction levels based on the emotions expressed in the reviews.
  • Topic Modeling : Topic modeling is a statistical technique used to identify the underlying topics or themes present within a collection of documents or text data. It involves uncovering the latent patterns of co-occurring words or terms that represent distinct topics. Techniques like Latent Dirichlet Allocation (LDA) and Latent Semantic Analysis (LSA) are commonly used for topic modeling. In the context of restaurant reviews, topic modeling could help identify common themes such as food quality, service experience, and cleanliness across a large corpus of reviews.
  • Keyword Extraction : Keyword extraction involves identifying and extracting the most relevant words or phrases from a piece of text that capture its essence or main topics. This technique helps summarize the key content or subjects discussed in the text. For instance, in restaurant reviews, keyword extraction could identify frequently mentioned terms like "delicious food," "friendly staff," or "long wait times," providing a quick summary of customer sentiments and concerns.
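As a minimal illustration of two of these techniques, the sketch below implements a toy lexicon-based sentiment classifier and a frequency-based keyword extractor in plain Python. The lexicons, stopword list, and sample reviews are invented for illustration; production systems would use trained models or curated linguistic resources rather than hand-written word lists.

```python
import re
from collections import Counter

# Toy lexicons and stopwords -- illustrative only, not a real resource.
POSITIVE = {"delicious", "friendly", "great", "fresh", "excellent"}
NEGATIVE = {"slow", "cold", "rude", "long", "disappointing"}
STOPWORDS = {"the", "was", "and", "a", "but", "very", "it", "to"}

def tokenize(text):
    """Lowercase a review and split it into word tokens."""
    return re.findall(r"[a-z']+", text.lower())

def sentiment(text):
    """Classify text as positive, negative, or neutral from lexicon hits."""
    tokens = tokenize(text)
    score = sum(t in POSITIVE for t in tokens) - sum(t in NEGATIVE for t in tokens)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

def keywords(texts, k=3):
    """Extract the k most frequent non-stopword terms across all texts."""
    counts = Counter(t for text in texts
                     for t in tokenize(text) if t not in STOPWORDS)
    return [word for word, _ in counts.most_common(k)]

reviews = [
    "The food was delicious and the staff friendly.",
    "Friendly staff, but the wait was long and the food cold.",
]
print(sentiment(reviews[0]))  # positive
print(sentiment(reviews[1]))  # negative
print(keywords(reviews))
```

Topic modeling proper (e.g., LDA) requires a statistical library and a much larger corpus, but the same tokenize-and-count foundation underlies those methods as well.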

When to use Text Analysis?

Text analysis has numerous applications across various domains, including:

  • Business and Marketing : Analyzing customer feedback, sentiment analysis of social media posts, brand monitoring, and market trend analysis.
  • Healthcare : Extracting clinical information from medical notes, analyzing patient feedback, and detecting adverse drug reactions from text data.
  • Social Sciences : Studying public discourse, political communication, opinion mining, and discourse analysis in social media.
  • Academic Research : Conducting literature reviews, analyzing research articles, and identifying emerging research topics and trends.
  • Customer Experience : Understanding customer sentiments, identifying product or service problems, and improving customer satisfaction through text-based feedback analysis.

Ethnographic analysis involves immersion in a particular cultural or social setting to understand the perspectives, behaviors, and interactions of the people within that context. Researchers conduct observations, interviews, and participant observation to gain insights into the culture, practices, and social dynamics of the community under study. It is suitable when researchers aim to gain an in-depth understanding of a particular cultural or social setting, including the perspectives, behaviors, and interactions of the people within that context, and it is particularly useful for studying complex social phenomena in their natural environment, where observations and interactions occur organically.

When to use Ethnographic Analysis?

  • Cultural Understanding : Ethnographic analysis is ideal when researchers aim to gain deep insights into the culture, beliefs, and social practices of a particular group or community.
  • Behavioral Observation : It is useful when researchers want to observe and understand the behaviors, interactions, and daily activities of individuals in their natural environment.
  • Contextual Exploration : Ethnographic analysis is valuable for exploring the context and lived experiences of individuals, providing rich, detailed descriptions of their social and cultural worlds.
  • Complex Social Dynamics : It is suitable when studying complex social phenomena, or phenomena that are deeply embedded in social contexts, such as rituals, traditions, or community dynamics.
  • Qualitative Inquiry : Ethnographic analysis is preferred when researchers seek to conduct qualitative inquiry focused on understanding the subjective meanings and perspectives of individuals within their cultural context.

Ethnographic analysis offers a powerful approach for studying complex social phenomena in their natural context, yielding rich and nuanced insights into the cultural practices, social dynamics, and lived experiences of individuals within a particular community. However, researchers must carefully consider the time commitment, ethical considerations, and potential biases associated with ethnographic studies.

  • Clearly Defined Research Question : Ground the analysis in a clear and focused research question. This guides data collection and keeps the analysis on track.
  • Systematic Coding : Develop a coding scheme to categorize data into meaningful topics or concepts. Use software tools to help organize and manage codes.
  • Constant Comparison : Continuously compare new data with existing codes and themes to refine interpretations and ensure consistency.
  • Triangulation : Validate findings by using multiple data sources, methods, or researchers to corroborate results and enhance credibility.

Refine themes and interpretations by engaging in repeated cycles of data collection, coding, and analysis.
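As a rough sketch of how systematically coded quotes might be organized with software, the Python snippet below maps quotes to codes and groups them for constant comparison. Real qualitative coding is done by trained human coders; the codebook, trigger phrases, and quotes here are invented purely to illustrate the bookkeeping.

```python
from collections import defaultdict

# Hypothetical codebook: code name -> trigger phrases (illustration only).
CODEBOOK = {
    "time_pressure": ["no time", "move on", "busy"],
    "patient_expectation": ["expect", "satisfaction"],
}

quotes = [
    "Patients expect a prescription when they leave.",
    "You just have to move on to the next patient.",
]

def code_quote(quote, codebook):
    """Return every code whose trigger phrases appear in the quote."""
    q = quote.lower()
    return [code for code, phrases in codebook.items()
            if any(p in q for p in phrases)]

# Group quotes under each assigned code, as a coder might do before
# comparing new data against existing themes.
by_code = defaultdict(list)
for quote in quotes:
    for code in code_quote(quote, CODEBOOK):
        by_code[code].append(quote)

for code, qs in by_code.items():
    print(code, len(qs))
```

Dedicated qualitative analysis software (e.g., NVivo or ATLAS.ti) provides the same grouping and comparison functions at scale.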

Qualitative data analysis techniques are powerful means of revealing deep insights and understanding intricate phenomena in both practice and research. Through rigorous analytical approaches, researchers can convert qualitative data into meaningful concepts, interpretations, and narratives that advance knowledge and support evidence-based decision-making.

Is it possible to mix quantitative and qualitative methodologies for data analysis?

A: Yes. Researchers often use mixed-methods designs to triangulate results and gain a thorough understanding of their research questions.

How can I choose the best approach for analyzing qualitative data for my study?

A: To choose the best approach, consider the research topic, the characteristics of the data, and the theoretical framework guiding the study.

What strategies can I use to improve the reliability and validity of my qualitative data analysis?

A: Use peer debriefing and member checking to improve validity, and maintain transparency, reflexivity, and methodological coherence throughout the analytic process.


  • Open access
  • Published: 18 May 2024

Determinants of appropriate antibiotic and NSAID prescribing in unscheduled outpatient settings in the veterans health administration

  • Michael J. Ward 1 , 2 , 3 , 4 ,
  • Michael E. Matheny 1 , 4 , 5 , 6 ,
  • Melissa D. Rubenstein 3 ,
  • Kemberlee Bonnet 7 ,
  • Chloe Dagostino 7 ,
  • David G. Schlundt 7 ,
  • Shilo Anders 4 , 8 ,
  • Thomas Reese 4 &
  • Amanda S. Mixon 1 , 9  

BMC Health Services Research volume  24 , Article number:  640 ( 2024 ) Cite this article


Despite efforts to enhance the quality of medication prescribing in outpatient settings, potentially inappropriate prescribing remains common, particularly in unscheduled settings where patients can present with infectious and pain-related complaints. Two of the most commonly prescribed medication classes in outpatient settings with frequent rates of potentially inappropriate prescribing include antibiotics and nonsteroidal anti-inflammatory drugs (NSAIDs). In the setting of persistent inappropriate prescribing, we sought to understand a diverse set of perspectives on the determinants of inappropriate prescribing of antibiotics and NSAIDs in the Veterans Health Administration.

We conducted a qualitative study guided by the Consolidated Framework for Implementation Research and Theory of Planned Behavior. Semi-structured interviews were conducted with clinicians, stakeholders, and Veterans from March 1, 2021 through December 31, 2021 within the Veteran Affairs Health System in unscheduled outpatient settings at the Tennessee Valley Healthcare System. Stakeholders included clinical operations leadership and methodological experts. Audio-recorded interviews were transcribed and de-identified. Data coding and analysis were conducted by experienced qualitative methodologists adhering to the Consolidated Criteria for Reporting Qualitative Studies guidelines. Analysis was conducted using an iterative inductive/deductive process.

We conducted semi-structured interviews with 66 participants: clinicians ( N  = 25), stakeholders ( N  = 24), and Veterans ( N  = 17). We identified six themes contributing to potentially inappropriate prescribing of antibiotics and NSAIDs: 1) perceived versus actual Veteran expectations about prescribing; 2) the influence of a time-pressured clinical environment on prescribing stewardship; 3) limited clinician knowledge, awareness, and willingness to use evidence-based care; 4) prescriber uncertainties about the Veteran's condition at the time of the clinical encounter; 5) limited communication; and 6) technology barriers of the electronic health record and patient portal.

Conclusions

The diverse perspectives on prescribing underscore the need for interventions that recognize the detrimental impact of high workload on prescribing stewardship and the need to design interventions with the end-user in mind. This study revealed actionable themes that could be addressed to improve guideline concordant prescribing to enhance the quality of prescribing and to reduce patient harm.

Peer Review reports

Adverse drug events (ADEs) are the most common iatrogenic injury. [ 1 ] Efforts to reduce these events have primarily focused on the inpatient setting. However, the emergency department (ED), urgent care, and urgent primary care clinics are desirable targets for interventions to reduce ADEs because approximately 70% of all outpatient encounters occur in one of these settings. [ 2 ] Two of the most commonly prescribed drug classes during acute outpatient care visits that have frequent rates of potentially inappropriate prescribing include antibiotics and non-steroidal anti-inflammatory drugs (NSAIDs). [ 3 , 4 ]

An estimated 30% of all outpatient oral antibiotic prescriptions may be unnecessary. [ 5 , 6 ] The World Health Organization identified overuse of antibiotics and its resulting antimicrobial resistance as a global threat. [ 7 ] The Centers for Disease Control and Prevention (CDC) conservatively estimates that in the US there are nearly 3 million antibiotic-resistant infections that cause 48,000 deaths annually. [ 8 ] Antibiotics were the second most common source of adverse events with nearly one ADE resulting in an ED visit for every 100 prescriptions. [ 9 ] Inappropriate antibiotic prescriptions (e.g., antibiotic prescription for a viral infection) also contribute to resistance and iatrogenic infections such as C. difficile (antibiotic associated diarrhea) and Methicillin-resistant Staphylococcus aureus (MRSA) . [ 8 ] NSAID prescriptions, on the other hand, result in an ADE at more than twice the rate of antibiotics (2.2%), [ 10 ] are prescribed to patients at an already increased risk of potential ADEs, [ 4 , 11 ] and frequently interact with other medications. [ 12 ] Inappropriate NSAID prescriptions contribute to serious gastrointestinal, [ 13 ] renal, [ 14 ] and cardiovascular [ 15 , 16 ] ADEs such as gastrointestinal bleeding, acute kidney injury, and myocardial infarction or heart failure, respectively. Yet, the use of NSAIDs is ubiquitous; according to the CDC, between 2011 and 2014, 5% of the US population were prescribed an NSAID whereas an additional 2% take NSAIDs over the counter. [ 11 ]

Interventions to reduce inappropriate antibiotic prescribing commonly take the form of antimicrobial stewardship programs. However, no such national programs exist for NSAIDs, particularly in acute outpatient care settings. There is a substantial body of evidence supporting the effectiveness of such stewardship programs. [ 17 ] The CDC recognizes that such outpatient programs should consist of four core elements of antimicrobial stewardship, [ 18 ] including commitment, action for policy and practice, tracking and reporting, and education and expertise. However, the opportunities to extend antimicrobial stewardship in EDs are vast. Despite this effectiveness, there is a recognized need to understand which implementation strategies work and how to implement multifaceted interventions. [ 19 ] Given the unique time-pressured environment of acute outpatient care settings, not all antimicrobial stewardship strategies work in these settings, necessitating the development of approaches tailored to these environments. [ 19 , 20 ]

One particularly vulnerable population is within the Veterans Health Administration. With more than 9 million enrollees in the Veterans Health Administration, Veterans who receive care in Veteran Affairs (VA) hospitals and outpatient clinics may be particularly vulnerable to ADEs. Older Veterans have greater medical needs than younger patients, given their concomitant medical and mental health conditions as well as cognitive and social issues. Among Veterans seen in VA EDs and Urgent Care Clinics (UCCs), 50% are age 65 and older, [ 21 ] nearly three times the rate of non-VA emergency care settings (18%). [ 22 ] Inappropriate prescribing in ED and UCC settings is problematic with inappropriate antibiotic prescribing estimated to be higher than 40%. [ 23 ] In a sample of older Veterans discharged from VA ED and UCC settings, NSAIDs were found to be implicated in 77% of drug interactions. [ 24 ]

Learning from antimicrobial stewardship programs and applying to a broader base of prescribing in acute outpatient care settings, it is necessary to understand not only why potentially inappropriate prescribing remains a problem for antibiotics, but for medications (e.g., NSAIDs) which have received little stewardship focus previously. This understanding is essential to develop and implement interventions to reduce iatrogenic harm for vulnerable patients seen in unscheduled settings. In the setting of the Veterans Health Administration, we sought to use these two drug classes (antibiotics and NSAIDs) that have frequent rates of inappropriate prescribing in unscheduled outpatient care settings, to understand a diverse set of perspectives on why potentially inappropriate prescribing continues to occur.

Selection of participants

Participants were recruited from three groups in outpatient settings representing emergency care, urgent care, and urgent primary care in the VA: 1) Clinicians: VA clinicians such as physicians, advanced practice providers, and pharmacists; 2) Stakeholders: VA and non-VA clinical operational and clinical content experts, such as local and regional medical directors and national clinical, research, and administrative leadership in emergency care, primary care, and pharmacy including geriatrics; and 3) Veterans seeking unscheduled care for infectious or pain symptoms.

Clinicians and stakeholders were recruited using email, informational flyers, faculty/staff meetings, national conferences, and snowball sampling, in which existing participants identify additional potential research subjects for recruitment. [ 25 ] Snowball sampling is useful for identifying and recruiting participants who may not be readily apparent to investigators and/or hard to reach. Clinician inclusion criteria consisted of: 1) at least 1 year of VA experience; and 2) ≥ 1 clinical shift in the last 30 days at any VA ED, urgent care, or primary care setting in which unscheduled visits occur. Veterans were recruited in-person at the VA by key study personnel. Inclusion criteria consisted of: 1) clinically stable as determined by the treating clinician; 2) 18 years or older; and 3) seeking care for infectious or pain symptoms in the local VA Tennessee Valley Healthcare System (TVHS). TVHS includes an ED at the Nashville campus with over 30,000 annual visits, an urgent care clinic in Murfreesboro, TN with approximately 15,000 annual visits, and multiple primary care locations throughout the middle Tennessee region. This study was approved by the VA TVHS Institutional Review Board as minimal risk.

Data collection

Semi-structured interview guides (Supplemental Table 1) were developed using the Consolidated Framework for Implementation Research (CFIR) [ 26 ] and the Theory of Planned Behavior [ 27 , 28 ] to understand attitudes and beliefs as they relate to behaviors, and potential determinants of a future intervention. Interview guides were modified and finalized by conducting pilot interviews with three members of each participant group. Interview guides were tailored to each group of respondents and consisted of questions relating to: 1) determinants of potentially inappropriate prescribing; and 2) integration into practice (Table 1 ). Clinicians were also asked about knowledge and awareness of evidence-based prescribing practices for antibiotics and NSAIDs. The interviewer asked follow-up questions to elicit clarity of responses and detail.

Each interview was conducted by a trained interviewer (MDR). Veteran interviews were conducted in-person while Veterans waited for clinical care so as not to disrupt clinical operations. Interviews with clinicians and stakeholders were scheduled virtually. All interviews (including in-person) were recorded and transcribed in a manner compliant with VA information security policies using Microsoft Teams (Redmond, WA). The audio-recorded interviews were transcribed and de-identified by a transcriptionist and stored securely behind the VA firewall using Microsoft Teams. Study personnel maintained a recording log on a password-protected server and each participant was assigned a unique participant ID number. Once 15 interviews were conducted per group, we planned to review interviews with the study team to discuss content, findings, and to decide collectively when thematic saturation was achieved, the point at which no new information was obtained. [ 29 ] If not achieved, we planned to conduct at least 2 additional interviews prior to group review for saturation. We estimated that approximately 20–25 interviews per group were needed to achieve thematic saturation.

Qualitative data coding and analysis was managed by the Vanderbilt University Qualitative Research Core. A hierarchical coding system (Supplemental Table 2) was developed and refined using an iterative inductive/deductive approach [ 30 , 31 , 32 ] guided by a combination of: 1) Consolidated Framework for Implementation Research (CFIR) [ 26 ]; 2) the Theory of Planned Behavior [ 27 , 28 ]; 3) interview guide questions; and 4) a preliminary review of the transcripts. Eighteen major categories (Supplemental Table 3) were identified and were further divided into subcategories, with some subcategories having additional levels of hierarchical division. Definitions and rules were written for the use of each of the coding categories. The process was iterative in that the coding system was both theoretically informed and derived from the qualitative data. The coding system was finalized after it was piloted by the coders. Data coding and analysis met the Consolidated Criteria for Reporting Qualitative Research (COREQ) guidelines. [ 33 ]

Four experienced qualitative coders were trained by independently coding two transcripts from each of the three participant categories. Coding was then compared, and any discrepancies resolved by reconciliation. After establishing reliability in using the coding system, the coders divided and independently coded the remaining transcripts in sequential order. Each statement was treated as a separate quote and could be assigned up to 21 different codes. Coded transcripts were combined and sorted by code.

Following thematic saturation, the frequency of each code was calculated to understand the distribution of quotes. Quotes were then cross-referenced with coding as a barrier to understand potential determinants of inappropriate prescribing. A thematic analysis of the barriers was conducted and presented in an iterative process with the research team of qualitative methodologists and clinicians to understand the nuances and refine the themes and subthemes from the coded transcripts. Transcripts, quotations, and codes were managed using Microsoft Excel and SPSS version 28.0.
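The tallying and cross-referencing steps described above can be sketched in a few lines of Python. The coded-quote records below are invented for illustration; the study's actual data were managed in Microsoft Excel and SPSS.

```python
from collections import Counter

# Illustrative coded-quote records: each quote carries its assigned codes.
coded_quotes = [
    {"quote": "q1", "codes": ["communication", "barrier"]},
    {"quote": "q2", "codes": ["time_pressure", "barrier"]},
    {"quote": "q3", "codes": ["communication"]},
]

# Frequency of each code across all quotes, to understand the
# distribution of quotes by code.
freq = Counter(code for rec in coded_quotes for code in rec["codes"])

# Cross-reference: quotes that were also coded as a barrier, the
# candidates for thematic analysis of inappropriate-prescribing determinants.
barriers = [rec["quote"] for rec in coded_quotes if "barrier" in rec["codes"]]

print(freq["communication"])  # 2
print(barriers)               # ['q1', 'q2']
```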

We approached 132 individuals, and 66 (50%) agreed to be interviewed. Participants included 25 clinicians, 24 stakeholders, and 17 Veterans, whose demographic characteristics are presented in Table 2 . The clinicians were from 14 VA facilities throughout the US and included 20 physicians and five advanced practice providers. Of the clinicians, 21 (84%) worked in either an ED or urgent care, while the remainder practiced in primary care. The 24 stakeholders included 13 (54%) clinical service chiefs or deputy chiefs (including medical directors), five (21%) national directors, and six (25%) experts in clinical content and methodology. The 17 Veterans interviewed included 15 (88%) who were seen for pain complaints.

Results are organized by the six thematic categories with several subthemes in each category. Themes and subthemes are presented in Table 3  and are visually represented in Fig.  1 . The six themes were: 1) perceived versus actual Veteran expectations about prescribing, 2) the influence of a time-pressured clinical environment on prescribing stewardship, 3) limited clinician knowledge, awareness, and willingness to use evidence-based care, 4) uncertainties about the Veteran's condition at the time of the clinical encounter, 5) limited communication, and 6) technology barriers.

figure 1

Visual representation of themes and subthemes from 66 clinician, stakeholder, and Veteran interviews

Theme 1: Perception that Veterans routinely expect a medication from their visit, despite clinical inappropriateness

According to clinicians, Veterans frequently expect to receive a prescription even when this decision conflicts with good clinical practice.

Certainly lots of people would say you know if you feel like you’re up against some strong expectations from the patients or caregivers or families around the utility of an antibiotic when it’s probably not indicated…In the emergency department the bias is to act and assume the worst and assume like the worst for the clinical trajectory for the patient rather than the reverse. [Clinician 49, Physician, ED]

In addition, stakeholders further stated that patient prescription expectations are quite influential and are likely shaped by Veterans’ prior experiences.

I think the patients, particularly for antibiotics, have strong feelings about whether they should or shouldn’t get something prescribed. [Stakeholder 34] You know I think the biggest challenge, I think, is adjusting patients’ expectations because you know they got better the last time they were doing an antibiotic. [Stakeholder 64]

Patient satisfaction and clinician workload may also influence the clinician’s prescription decision.

We have a lot of patients that come in with back pain or knee pain or something. We’ll get an x-ray and see there’s nothing actually wrong physically that can be identified on x-ray at least and you have to do something. Otherwise, patient satisfaction will dip, and patients leave angry. [Clinician 28, Physician, urgent care clinic] For some clinicians it’s just easier to prescribe an antibiotic when they know that’s the patient’s expectation and it shortens their in-room discussion and evaluation. [Clinician 55, Physician, ED]

Despite clinician perception, Veterans communicated that they did not necessarily expect a prescription and were instead focused on the clinical interaction and the clinician’s decision.

I’m not sure if they’ll give me [unintelligible] a prescription or what they’ll do. I don’t care as long as they stop the pain. [Patient 40, urgent care clinic] I don’t expect to [receive a prescription], but I mean whatever the doctor finds is wrong with me I will follow what he says. [Patient 31, ED]

Theme 2: Hectic clinical environments and unique practice conditions in unscheduled settings provide little time to focus on prescribing practices

Clinicians and stakeholders reported that the time-constrained clinical environment and need to move onto the next patient were major challenges to prescribing stewardship.

The number one reason is to get a patient out of your office or exam bay and move on to the next one. [Stakeholder 28] It takes a lot of time and you have to be very patient and understanding. So, you end up having to put a fair bit of emotional investment and intelligence into an encounter to not prescribe. [Stakeholder 1]

Stakeholders also noted that unique shift conditions and clinician perceptions that their patients were “different” might influence prescribing practices.

A common pushback was ‘well my patients are different.’ [Stakeholder 4] Providers who worked different types of shifts, so if you happened to work on a Monday when the clinics were open and had more adults from the clinics you were more likely to prescribe antibiotics than if you worked over night and had fewer patients. Providers who worked primarily holidays or your Friday prescribing pattern may be very different if you could get them into a primary care provider the next day. [Stakeholder 22]

Clinicians also reported that historical practices in the clinical environment may contribute to inappropriate prescribing.

I came from working in the [outpatient] Clinic as a new grad and they’re very strict about prescribing only according to evidence-based practice. And then when I came here things are with other colleagues are a little more loose with that type of thing. It can be difficult because you start to adopt that practice to. [Clinician 61, Nurse Practitioner, ED]

Theme 3: Clinician knowledge, awareness, and willingness to use evidence-based care

Stakeholders felt that clinicians had a lack of knowledge about prescribing of NSAIDs and antibiotics.

Sometimes errors are a lack of knowledge or awareness of the need to maybe specifically dose for let’s say impaired kidney function or awareness of current up to date current antibiotic resistance patterns in the location that might inform a more tailored antibiotic choice for a given condition. [Stakeholder 37] NSAIDs are very commonly used in the emergency department for patients of all ages…the ED clinician is simply not being aware that for specific populations this is not recommended and again just doing routine practice for patients of all ages and not realizing that for older patients you actually probably should not be using NSAIDs. [Stakeholder 40]

Some clinicians may be unwilling to change their prescribing practices due to outright resistance, entrenched habits, or lack of interest in doing so.

It sounds silly but there’s always some opposition to people being mandated to do something. But there are some people who would look and go ‘okay we already have a handle on that so why do we need something else? I know who prescribes inappropriately and who doesn’t. Is this a requirement, am I evaluated on it? That would come from supervisors. Is this one more thing on my annual review?’ [Stakeholder 28] If people have entrenched habits that are difficult to change and are physicians are very individualistic people who think that they are right more often than the non-physician because of their expensive training and perception of professionalism. [Stakeholder 4]

Theme 4: Uncertainty about whether an adverse event will occur

Clinicians cited the challenge of understanding the entirety of a Veteran’s condition, potential drug-drug interactions, and existing comorbidities in knowing whether an NSAID prescription may result in an adverse event.

It’s oftentimes a judgement call if someone has renal function that’s right at the precipice of being too poor to merit getting NSAIDs that may potentially cause issues. [Clinician 43, Physician, inpatient and urgent care] It depends on what the harm is. So, for instance, you can’t always predict allergic reactions. Harm from the non-steroidals would be more if you didn’t pre-identify risk factors for harm. So, they have ulcer disease, they have kidney problems where a non-steroidal would not be appropriate for that patient. Or potential for a drug-drug interaction between that non-steroid and another medication in particular. [Clinician 16, Physician, ED]

Rather than be concerned about the adverse events resulting from the medication itself, stakeholders identified the uncertainty that clinicians experience about whether a Veteran may experience an adverse event from an infection if nothing is done. This uncertainty contributes to the prescription of an antibiotic.

My experience in working with providers at the VA over the years is that they worry more about the consequences of not treating an infection than about the consequences of the antibiotic itself. [Stakeholder 19] Sometimes folks like to practice conservatively and they’ll say even though I didn’t really see any hard evidence of a bacterial infection, the patient’s older and sicker and they didn’t want to risk it. [Stakeholder 16]

Theme 5: Limited communication during and after the clinical encounter

The role and type of communication about prescribing depended upon the respondent. Clinicians identified inadequate communication and coordination with the Veteran’s primary care physician during the clinical encounter.

I would like to have a little more communication with the primary doctors. They don’t seem to be super interested in talking to anyone in the emergency room about their patients… A lot of times you don’t get an answer from the primary doctor or you get I’m busy in clinic. You can just pick something or just do what you think is right. [Clinician 25, Physician, ED]

Alternatively, stakeholders identified the lack of post-encounter patient outcome and clinical performance feedback as a potential barrier.

Physicians tend to think that they are doing their best for every individual patient and without getting patient by patient feedback there is a strong cognitive bias to think well there must have been some exception and reason that I did it in this setting. [Stakeholder 34] It’s really more their own awareness of like their clinical performance and how they’re doing. [Stakeholder 40]

Veterans, however, prioritized communication during the clinical encounter. They expressed the need for clear and informative communication in which the clinician provides a rationale for the choice and medication-specific details, along with an opportunity to ask questions.

I expect him to tell me why I’m taking it, what it should do, and probably the side effects. [Patient 25, ED] I’d like to have a better description of how to take it because I won’t remember all the time and sometimes what they put on the bottle is not quite as clear. [Patient 22, ED]

Veterans reported their desire for a simple way to learn about medication information. They provided feedback on the current approaches to educational materials about prescriptions.

Probably most pamphlets that people get they’re not going to pay attention to them. Websites can be overwhelming. [Patient 3, ED] Posters can be offsetting. If you’re sick, you’re not going to read them…if you’re sick you may glance at that poster and disregard it. So, you’re not really going to see it but if you give them something in the hand people will tend to look at it because it’s in their hand. [Patient 19, ED] It would be nice if labels or something just told me what I needed to know. You know take this exactly when and reminds me here’s why you’re taking it for and just real clear and not small letters. [Patient 7, ED]

Theme 6: Technology barriers limited the usefulness of clinical decision support for order checking and patient communication tools

Following the decision to prescribe a medication, clinicians complained that electronic health record pop-ups with clinical decision support warnings for potential safety concerns (e.g., drug-drug interactions) were both excessive and not useful in a busy clinical environment.

The more the pop ups, the more they get ignored. So, it’s finding that sweet spot right where you’re not constantly having to click out of something because you’re so busy. Particularly in our clinical setting where we have very limited amount of time to read the little monograph. Most of the time you click ‘no’ and off you go. [Clinician 16, Physician, ED] Some of these mechanisms like the EMR [electronic medical record] or pop-up decision-making windows really limit your time. If you know the guidelines appropriately and doing the right thing, even if you’re doing the right thing it takes you a long time to get through something. [Clinician 19, Physician, Primary care clinic]

Building on Theme 5 regarding patient communication, patients reported that using the VA patient portal (MyHealtheVet) for post-encounter communication with their primary care physician and for reviewing the medications they were prescribed was challenging.

I’ve got to get help to get onto MyHealtheVet but I would probably like to try and use that, but I haven’t been on it in quite some time. [Patient 22, ED] I tried it [MyHealtheVet] once and it’s just too complicated so I’m not going to deal with it. [Patient 37, Urgent care]

Discussion

This work examined attitudes and perceptions of barriers to appropriate prescribing of antibiotics and NSAIDs in unscheduled outpatient care settings in the Veterans Health Administration. Expanding on prior qualitative work on antimicrobial stewardship programs, we also examined NSAID prescribing, a medication class that has received little attention in prescribing stewardship. This work seeks to advance the understanding of fundamental problems underlying prescribing stewardship to facilitate interventions designed not only to improve the decision to prescribe antibiotics and NSAIDs, but also to enhance the safety checks once a decision to prescribe is made. Specifically, we identified six themes during these interviews: perceived versus actual Veteran expectations about prescribing; the influence of a time-pressured clinical environment on prescribing stewardship; limited clinician knowledge, awareness, and willingness to use evidence-based care; uncertainties about the Veteran's condition at the time of the clinical encounter; limited communication; and technology barriers.

Sensitive to patient expectations, clinicians believed that Veterans would be dissatisfied if they did not receive an antibiotic prescription, [ 34 ] even though most patients presenting to the ED for upper respiratory tract infections do not expect antibiotics. [ 35 ] However, recent work by Staub et al. found that among patients with respiratory tract infections, receipt of an antibiotic was not independently associated with improved satisfaction. [ 36 ] Instead, they found that receipt of antibiotics had to match the patient’s expectations to affect patient satisfaction and recommended that clinicians communicate with their patients about prescribing expectations. This finding complements our results in the present study; communication about expectations is similarly important for NSAID prescribing.

A commitment to stewardship and modification of clinician behavior may be compromised by the time-pressured clinical environment, numerous potential drug interactions, comorbidities of a vulnerable Veteran population, and normative practices. The decision to prescribe medications such as antibiotics is a complex clinical decision and may be influenced by both clinical and non-clinical factors. [ 34 , 37 , 38 ] ED crowding, which occurs when the demand for services exceeds a system’s ability to provide care, [ 39 ] is a well-recognized manifestation of a chaotic clinical environment and is associated with detrimental effects on the hospital system and patient outcomes. [ 40 , 41 ] Congestion and wait times are unlikely to improve, as the COVID-19 pandemic has exacerbated the already existing crowding and boarding crisis in EDs. [ 42 , 43 ]

Another theme was uncertainty in anticipating adverse events, exacerbated by the lack of a feedback loop. Feedback on clinical care processes and patient outcomes is uncommonly provided in emergency care settings, [ 44 ] yet may provide an opportunity to change clinician behavior, particularly for antimicrobial stewardship. [ 45 ] However, the frequent use of ineffective feedback strategies [ 46 ] compromises the ability to implement effective feedback interventions; feedback must be specific [ 47 ] and address the Intention-to-Action gap [ 48 ] by including co-interventions that address recipient characteristics (i.e., beliefs and capabilities) and context to maximize impact. Without these, feedback may be ineffective.

An additional barrier identified in this work is the limited communication with primary care following discharge. A 2017 National Quality Forum report on ED care transitions [ 49 ] recommended that EDs and their supporting hospital systems expand infrastructure and enhance health information technology to support care transitions, as Veterans may not understand discharge instructions, may not receive post-ED or urgent care, [ 50 , 51 , 52 ] or may not receive a newly prescribed medication. [ 24 ] While mechanisms exist to communicate between the ED and primary care teams, such as notifications when a Veteran presents to the ED and when an emergency clinician copies a primary care physician on a note, these mechanisms are insufficient to address care transition gaps and vary in best practice use. To address this variability, the VA ED PACT Tool was developed using best practices (standardized processes, "closed-loop" communication, embedding into workflow) to facilitate and standardize communication between VA EDs and follow-up care clinicians. [ 53 ] While the ED PACT Tool is implemented at the Greater Los Angeles VA and can create a care coordination order upon ED discharge, its use is not yet widely adopted throughout the VA.

In the final theme about technology barriers, once the decision has been made to prescribe a medication, the electronic tools that are key components of stewardship interventions designed to curtail potentially inappropriate prescriptions may be compromised by their lack of usability. For example, clinician and stakeholder interview respondents described how usability concerns were exacerbated in a time-pressured clinical environment (e.g., electronic health record clinical decision support tools). Clinical decision support is an effective tool to improve healthcare process measures in a diverse group of clinical environments; [ 54 ] however, usability remains a barrier when alerts must be frequently overridden. [ 55 , 56 ] Alert fatigue, as expressed in our interviews for order checking and recognized within the VA’s EHR, [ 57 , 58 ] may contribute to excessive overrides that reduce the benefit of clinical decision support. [ 56 , 59 ] Notably, there was a lack of discussion about the decision to initiate appropriate prescriptions, which is a key action of the CDC’s outpatient antibiotic stewardship campaign. [ 18 ] Thus, a potentially more effective, albeit challenging, approach is to “nudge” clinicians towards appropriate prescribing and away from the initial decision to prescribe (e.g., inappropriate antibiotic prescribing for viral upper respiratory tract infections), either with default order sets for symptom management or by enhancing prescription decisions through reminders about potential contraindications for specific indications (e.g., high-risk comorbidities). Beyond EHR-based solutions that might change clinician behavior, the CDC’s outpatient antibiotic stewardship program provides a framework to change the normative practices around inappropriate prescribing and includes a commitment to appropriate prescribing, action for policy and change, tracking and reporting, and education and expertise. [ 18 ]

Patients face another technology barrier in patient-facing electronic tools such as the VA’s MyHealtheVet portal, which was developed to enhance patient communication following care transitions and to allow Veterans to review their medications and communicate with their primary care clinical team. Patient portals can be an effective tool for medication adherence [ 60 ] and offer promise for patient education [ 61 ] following a clinical encounter. However, they are similarly limited by usability concerns, representing an adoption barrier to broader Veteran use after unscheduled outpatient care visits, [ 62 ] particularly in an older patient population.

These interviews further underscored that the lack of usability of clinical decision support for order checking, which arises from ineffective design, is a key barrier preventing health information technology from reaching its promise of improving patient safety. [ 63 ] A common and recognized reason for these design challenges is the failure to place the user (i.e., the acute care clinician) at the center of the design process, resulting in underutilization, workarounds, [ 64 ] and unintended consequences, [ 65 ] all of which diminish patient safety practices and fail to change clinician behavior (i.e., prescribing). Complex adaptive systems work best when the relative strengths of humans (e.g., context sensitivity, situation specificity) are properly integrated with the information processing power of computerized systems. [ 66 ] Integrating user-centered design into technology development represents an opportunity to build more clinician- and patient-centric systems of care and to advance prescribing stewardship interventions that previously lacked broader adoption. As antimicrobial stewardship and other prescribing stewardship efforts focus on time-pressured environments where usability is essential to adoption, taking a user-centered design approach, both in developing electronic tools and in addressing the identified barriers to prescribing, represents a promising way to enhance the quality of prescribing.

Limitations

The study findings should be considered in light of several limitations. First, the setting for this work was the Veterans Health Administration, the largest integrated health system in the US, and while we focused on the stewardship of two drug classes, numerous additional drug classes are prescribed in these settings; our findings may not generalize to other settings or drug classes. Second, while clinician and stakeholder perspectives included diverse, national representation, the Veterans interviewed were local to the Tennessee Valley Healthcare System. Given the concurrent COVID-19 pandemic at the time of enrollment, most of the Veterans were seen for pain-related complaints, and only two infection-related complaints were included; however, we also asked these Veterans about antibiotic prescribing. Clinician and stakeholder narratives may not completely reflect their practice patterns, as their responses could be influenced by social desirability bias. Third, responses may be subject to recall bias, which may influence the data collected. Finally, the themes and subthemes identified may overlap and interact. While we used an iterative process to identify discrete themes and subthemes, prescription decisions represent a complex decision process influenced by numerous patient and contextual factors and may not be completely independent.

Conclusions

Despite numerous interventions to improve the quality of prescribing, the appropriate prescription of antibiotics and NSAIDs in unscheduled outpatient care settings remains a challenge. In the Veterans Health Administration, this study found that challenges to high-quality prescribing include perceived Veteran expectations about receipt of medications; a hectic clinical environment that deprioritizes stewardship; limited clinician knowledge, awareness, and willingness to use evidence-based care; uncertainty about the potential for adverse events; limited communication; and technology barriers. Findings from these interviews suggest that interventions should consider the detrimental impact of high workload on prescribing stewardship, clinician workflow, and the initial decision to prescribe medications, and should incorporate end-users into the intervention design process. Doing so is a promising approach to enhancing the adoption of high-quality prescribing practices and improving the quality of, and patient outcomes from, NSAID and antibiotic prescribing.

Availability of data and materials

De-identified datasets used and/or analysed during the current study will be made available from the corresponding author on reasonable request.

References

Leape LL, Brennan TA, Laird N, et al. The nature of adverse events in hospitalized patients. Results of the Harvard Medical Practice Study II. N Engl J Med. 1991;324(6):377–384.

Pitts SR, Carrier ER, Rich EC, Kellermann AL. Where Americans get acute care: increasingly, it’s not at their doctor’s office. Health Aff (Millwood). 2010;29(9):1620–9.

Palms DL, Hicks LA, Bartoces M, et al. Comparison of antibiotic prescribing in retail clinics, urgent care centers, emergency departments, and traditional ambulatory care settings in the United States. Jama Intern Med. 2018;178(9):1267–9.

Davis JS, Lee HY, Kim J, et al. Use of non-steroidal anti-inflammatory drugs in US adults: changes over time and by demographic. Open Heart. 2017;4(1):e000550.

Fleming-Dutra KE, Hersh AL, Shapiro DJ, et al. Prevalence of inappropriate antibiotic prescriptions among US ambulatory care visits, 2010–2011. JAMA. 2016;315(17):1864–73.

Shively NR, Buehrle DJ, Clancy CJ, Decker BK. Prevalence of Inappropriate Antibiotic Prescribing in Primary Care Clinics within a Veterans Affairs Health Care System. Antimicrob Agents Chemother. 2018;62(8):e00337–18. https://doi.org/10.1128/AAC.00337-18 .  https://pubmed.ncbi.nlm.nih.gov/29967028/ .

World Health Organization. Global antimicrobial resistance and use surveillance system (GLASS) report: 2022. 2022.

Centers for Disease Control and Prevention. COVID-19: U.S. Impact on Antimicrobial Resistance, Special Report 2022. Atlanta: U.S. Department of Health and Human Services, CDC; 2022.

Shehab N, Lovegrove MC, Geller AI, Rose KO, Weidle NJ, Budnitz DS. US emergency department visits for outpatient adverse drug events, 2013–2014. JAMA. 2016;316(20):2115–25.

Fassio V, Aspinall SL, Zhao X, et al. Trends in opioid and nonsteroidal anti-inflammatory use and adverse events. Am J Manag Care. 2018;24(3):e61–72.

Centers for Disease Control and Prevention. Chronic Kidney Disease Surveillance System—United States. http://www.cdc.gov/ckd . Accessed 21 March 2023.

Cahir C, Fahey T, Teeling M, Teljeur C, Feely J, Bennett K. Potentially inappropriate prescribing and cost outcomes for older people: a national population study. Br J Clin Pharmacol. 2010;69(5):543–52.

Gabriel SE, Jaakkimainen L, Bombardier C. Risk for Serious Gastrointestinal Complications Related to Use of Nonsteroidal Antiinflammatory Drugs - a Metaanalysis. Ann Intern Med. 1991;115(10):787–96.

Zhang X, Donnan PT, Bell S, Guthrie B. Non-steroidal anti-inflammatory drug induced acute kidney injury in the community dwelling general population and people with chronic kidney disease: systematic review and meta-analysis. BMC Nephrol. 2017;18(1):256.

McGettigan P, Henry D. Cardiovascular risk with non-steroidal anti-inflammatory drugs: systematic review of population-based controlled observational studies. PLoS Med. 2011;8(9): e1001098.

Holt A, Strange JE, Nouhravesh N, et al. Heart Failure Following Anti-Inflammatory Medications in Patients With Type 2 Diabetes Mellitus. J Am Coll Cardiol. 2023;81(15):1459–70.

Davey P, Marwick CA, Scott CL, et al. Interventions to improve antibiotic prescribing practices for hospital inpatients. Cochrane Database Syst Rev. 2017;2(2):CD003543.

Sanchez GV, Fleming-Dutra KE, Roberts RM, Hicks LA. Core Elements of Outpatient Antibiotic Stewardship. MMWR Recomm Rep. 2016;65(6):1–12.

May L, Martin Quiros A, Ten Oever J, Hoogerwerf J, Schoffelen T, Schouten J. Antimicrobial stewardship in the emergency department: characteristics and evidence for effectiveness of interventions. Clin Microbiol Infect. 2021;27(2):204–9.

May L, Cosgrove S, L'Archeveque M, et al. A call to action for antimicrobial stewardship in the emergency department: approaches and strategies. Ann Emerg Med. 2013;62(1):69–77 e62.

Veterans Health Administration Emergency Medicine Management Tool. EDIS GeriatricsAgeReport v3.

Cairns C, Kang K, Santo L. National Hospital Ambulatory Medical Care Survey: 2020 emergency department summary tables. NHAMCS Factsheets - EDs Web site. https://www.cdc.gov/nchs/data/nhamcs/web_tables/2020-nhamcs-ed-web-tables-508.pdf . Accessed 20 Dec 2022.

Lowery JL, Alexander B, Nair R, Heintz BH, Livorsi DJ. Evaluation of antibiotic prescribing in emergency departments and urgent care centers across the Veterans’ Health Administration. Infect Control Hosp Epidemiol. 2021;42(6):694–701.

Hastings SN, Sloane RJ, Goldberg KC, Oddone EZ, Schmader KE. The quality of pharmacotherapy in older veterans discharged from the emergency department or urgent care clinic. J Am Geriatr Soc. 2007;55(9):1339–48.

Goodman LA. Snowball sampling. Ann Math Stat. 1961;32(1):148–70.

Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009;4:50.

Ajzen I. The theory of planned behavior. Organ Behav Hum Decis Process. 1991;50(2):179–211.

Ajzen I. The theory of planned behaviour: reactions and reflections. Psychol Health. 2011;26(9):1113–27.  https://doi.org/10.1080/08870446.2011.613995 .  https://www.tandfonline.com/doi/full/10.1080/08870446.2011.613995 .

Morse JM. The significance of saturation. Qual Health Res. 1995;5(2):147–9.

Azungah T. Qualitative research: deductive and inductive approaches to data analysis. Qual Res J. 2018;18(4):383–400.

Tjora A. Qualitative research as stepwise-deductive induction. Routledge; 2018.  https://www.routledge.com/Qualitative-Research-as-Stepwise-Deductive-Induction/Tjora/p/book/9781138304499 .

Fereday J, Muir-Cochrane E. Demonstrating rigor using thematic analysis: A hybrid approach of inductive and deductive coding and theme development. Int J Qual Methods. 2006;5(1):80–92.

Tong A, Sainsbury P, Craig J. Consolidated criteria for reporting qualitative research (COREQ): a 32-item checklist for interviews and focus groups. Int J Qual Health Care. 2007;19(6):349–57.

Patel A, Pfoh ER, Misra Hebert AD, et al. Attitudes of High Versus Low Antibiotic Prescribers in the Management of Upper Respiratory Tract Infections: a Mixed Methods Study. J Gen Intern Med. 2020;35(4):1182–8.

May L, Gudger G, Armstrong P, et al. Multisite exploration of clinical decision making for antibiotic use by emergency medicine providers using quantitative and qualitative methods. Infect Control Hosp Epidemiol. 2014;35(9):1114–25.

Staub MB, Pellegrino R, Gettler E, et al. Association of antibiotics with veteran visit satisfaction and antibiotic expectations for upper respiratory tract infections. Antimicrob Steward Healthc Epidemiol. 2022;2(1): e100.

Schroeck JL, Ruh CA, Sellick JA Jr, Ott MC, Mattappallil A, Mergenhagen KA. Factors associated with antibiotic misuse in outpatient treatment for upper respiratory tract infections. Antimicrob Agents Chemother. 2015;59(7):3848–52.

Hruza HR, Velasquez T, Madaras-Kelly KJ, Fleming-Dutra KE, Samore MH, Butler JM. Evaluation of clinicians’ knowledge, attitudes, and planned behaviors related to an intervention to improve acute respiratory infection management. Infect Control Hosp Epidemiol. 2020;41(6):672–9.

American College of Emergency Physicians Policy Statement. Crowding. https://www.acep.org/globalassets/new-pdfs/policy-statements/crowding.pdf . Published 2019. Accessed 11 Oct 2023.

Bernstein SL, Aronsky D, Duseja R, et al. The effect of emergency department crowding on clinically oriented outcomes. Acad Emerg Med. 2009;16(1):1–10.

Rasouli HR, Esfahani AA, Nobakht M, et al. Outcomes of crowding in emergency departments; a systematic review. Arch Acad Emerg Med. 2019;7(1):e52.

Janke AT, Melnick ER, Venkatesh AK. Monthly Rates of Patients Who Left Before Accessing Care in US Emergency Departments, 2017–2021. JAMA Netw Open. 2022;5(9): e2233708.

Janke AT, Melnick ER, Venkatesh AK. Hospital Occupancy and Emergency Department Boarding During the COVID-19 Pandemic. JAMA Netw Open. 2022;5(9): e2233964.

Lavoie CF, Plint AC, Clifford TJ, Gaboury I. “I never hear what happens, even if they die”: a survey of emergency physicians about outcome feedback. CJEM. 2009;11(6):523–8.

Ivers N, Jamtvedt G, Flottorp S, et al. Audit and feedback: effects on professional practice and healthcare outcomes. Cochrane Database Syst Rev. 2012;(6):CD000259. https://doi.org/10.1002/14651858.CD000259.pub3 .

Hysong SJ, SoRelle R, Hughes AM. Prevalence of Effective Audit-and-Feedback Practices in Primary Care Settings: A Qualitative Examination Within Veterans Health Administration. Hum Factors. 2022;64(1):99–108.

Presseau J, McCleary N, Lorencatto F, Patey AM, Grimshaw JM, Francis JJ. Action, actor, context, target, time (AACTT): a framework for specifying behaviour. Implement Sci. 2019;14(1):102.

Desveaux L, Ivers NM, Devotta K, Ramji N, Weyman K, Kiran T. Unpacking the intention to action gap: a qualitative study understanding how physicians engage with audit and feedback. Implement Sci. 2021;16(1):19.

National Quality Forum. Emergency Department Transitions of Care: A Quality Measurement Framework—Final Report: DHHS contract HHSM‐500–2012–000091, Task Order HHSM‐500‐T0025. Washington, DC: National Quality Forum; 2017.

Kyriacou DN, Handel D, Stein AC, Nelson RR. Brief report: factors affecting outpatient follow-up compliance of emergency department patients. J Gen Intern Med. 2005;20(10):938–42.

Vukmir RB, Kremen R, Ellis GL, DeHart DA, Plewa MC, Menegazzi J. Compliance with emergency department referral: the effect of computerized discharge instructions. Ann Emerg Med. 1993;22(5):819–23.

Engel KG, Heisler M, Smith DM, Robinson CH, Forman JH, Ubel PA. Patient comprehension of emergency department care and instructions: are patients aware of when they do not understand? Ann Emerg Med. 2009;53(4):454–461 e415.

Cordasco KM, Saifu HN, Song HS, et al. The ED-PACT Tool Initiative: Communicating Veterans’ Care Needs After Emergency Department Visits. J Healthc Qual. 2020;42(3):157–65.

Bright TJ, Wong A, Dhurjati R, et al. Effect of clinical decision-support systems: a systematic review. Ann Intern Med. 2012;157(1):29–43.

Weingart SN, Toth M, Sands DZ, Aronson MD, Davis RB, Phillips RS. Physicians’ decisions to override computerized drug alerts in primary care. Arch Intern Med. 2003;163(21):2625–31.

van der Sijs H, Aarts J, Vulto A, Berg M. Overriding of drug safety alerts in computerized physician order entry. J Am Med Inform Assoc. 2006;13(2):138–47.

Shah T, Patel-Teague S, Kroupa L, Meyer AND, Singh H. Impact of a national QI programme on reducing electronic health record notifications to clinicians. BMJ Qual Saf. 2019;28(1):10–4.

Lin CP, Payne TH, Nichol WP, Hoey PJ, Anderson CL, Gennari JH. Evaluating clinical decision support systems: monitoring CPOE order check override rates in the Department of Veterans Affairs’ Computerized Patient Record System. J Am Med Inform Assoc. 2008;15(5):620–6.

Middleton B, Bloomrosen M, Dente MA, et al. Enhancing patient safety and quality of care by improving the usability of electronic health record systems: recommendations from AMIA. J Am Med Inform Assoc. 2013;20(e1):e2-8.

Han HR, Gleason KT, Sun CA, et al. Using Patient Portals to Improve Patient Outcomes: Systematic Review. JMIR Hum Factors. 2019;6(4): e15038.

Johnson AM, Brimhall AS, Johnson ET, et al. A systematic review of the effectiveness of patient education through patient portals. JAMIA Open. 2023;6(1):ooac085.

Lazard AJ, Watkins I, Mackert MS, Xie B, Stephens KK, Shalev H. Design simplicity influences patient portal use: the role of aesthetic evaluations for technology acceptance. J Am Med Inform Assoc. 2016;23(e1):e157-161.

IOM. Health IT and Patient Safety: Building Safer Systems for Better Care. Washington, DC: NAP;2012.

Koppel R, Wetterneck T, Telles JL, Karsh BT. Workarounds to barcode medication administration systems: their occurrences, causes, and threats to patient safety. J Am Med Inform Assoc. 2008;15(4):408–23.

Ash JS, Sittig DF, Poon EG, Guappone K, Campbell E, Dykstra RH. The extent and importance of unintended consequences related to computerized provider order entry. J Am Med Inform Assoc. 2007;14(4):415–23.

Hollnagel E, Woods D. Joint Cognitive Systems: Foundations of Cognitive Systems Engineering. Boca Raton: CRC Press; 2006.

Acknowledgements

This material is based upon work supported by the Department of Veterans Affairs, Veterans Health Administration, Office of Research and Development, Health Services Research and Development (I01HX003057). The content is solely the responsibility of the authors and does not necessarily represent the official views of the VA.

Author information

Authors and Affiliations

Geriatric Research, Education, and Clinical Center (GRECC), VA Tennessee Valley Healthcare System, 2525 West End Avenue, Ste. 1430, Nashville, TN, 37203, USA

Michael J. Ward, Michael E. Matheny & Amanda S. Mixon

Medicine Service, Tennessee Valley Healthcare System, Nashville, TN, USA

Michael J. Ward

Department of Emergency Medicine, Vanderbilt University Medical Center, Nashville, TN, USA

Michael J. Ward & Melissa D. Rubenstein

Department of Biomedical Informatics, Vanderbilt University Medical Center, Nashville, TN, USA

Michael J. Ward, Michael E. Matheny, Shilo Anders & Thomas Reese

Department of Biostatistics, Vanderbilt University Medical Center, Nashville, TN, USA

Michael E. Matheny

Division of General Internal Medicine & Public Health, Vanderbilt University Medical Center, Nashville, TN, USA

Department of Psychology, Vanderbilt University, Nashville, TN, USA

Kemberlee Bonnet, Chloe Dagostino & David G. Schlundt

Center for Research and Innovation in Systems Safety, Vanderbilt University Medical Center, Nashville, TN, USA

Shilo Anders

Section of Hospital Medicine, Vanderbilt University Medical Center, Nashville, TN, USA

Amanda S. Mixon

Contributions

Conceptualization: MJW, ASM, MEM, DS, SA. Methodology: MJW, ASM, MEM, DS, KB, SA, TR. Formal analysis: KB, DS, CD, MJW. Investigation: MJW, MDR, DS. Resources: MJW, MEM. Writing—Original Draft Preparation: MJW, ASM, KB, MDR. Writing—Review & Editing: All investigators. Supervision: MJW, ASM, MEM. Funding acquisition: MJW, MEM.

Corresponding author

Correspondence to Michael J. Ward .

Ethics declarations

Ethics approval and consent to participate

This study was approved by the VA Tennessee Valley Healthcare System Institutional Review Board as minimal risk (#1573619). A waiver of written informed consent was approved, and each subject provided verbal consent prior to interviews. The IRB determined that all requirements for human subjects research set forth in 38 CFR 16.111 have been satisfied. All methods were carried out according to the relevant guidelines and regulations.

Consent for publication

Not applicable.

Competing interests

The authors declare no competing interests.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Supplementary material 1.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ . The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

Reprints and permissions

About this article

Cite this article.

Ward, M.J., Matheny, M.E., Rubenstein, M.D. et al. Determinants of appropriate antibiotic and NSAID prescribing in unscheduled outpatient settings in the veterans health administration. BMC Health Serv Res 24, 640 (2024). https://doi.org/10.1186/s12913-024-11082-0

Download citation

Received : 11 October 2023

Accepted : 07 May 2024

Published : 18 May 2024

DOI : https://doi.org/10.1186/s12913-024-11082-0


  • Non-Steroidal Anti-Inflammatory Drugs
  • Antibiotics
  • Qualitative Methods
  • Emergency Department
  • Urgent Care
  • Primary Care
  • Prescribing Stewardship

BMC Health Services Research

ISSN: 1472-6963



NCBI Bookshelf. A service of the National Library of Medicine, National Institutes of Health.

StatPearls [Internet]. Treasure Island (FL): StatPearls Publishing; 2024 Jan-.


Qualitative Study

Steven Tenny ; Janelle M. Brannan ; Grace D. Brannan .


Last Update: September 18, 2022 .

  • Introduction

Qualitative research is a type of research that explores and provides deeper insights into real-world problems. [1] Instead of collecting numerical data points or intervening or introducing treatments as in quantitative research, qualitative research helps generate hypotheses for further investigation and a deeper understanding of quantitative data. Qualitative research gathers participants' experiences, perceptions, and behavior. It answers the hows and whys instead of how many or how much. It can be structured as a standalone study, purely relying on qualitative data, or as part of mixed-methods research that combines qualitative and quantitative data. This review introduces the reader to some basic concepts, definitions, terminology, and applications of qualitative research.

Qualitative research, at its core, asks open-ended questions whose answers are not easily put into numbers, such as "how" and "why." [2] Due to the open-ended nature of the research questions, qualitative research design is often not linear like quantitative design. [2] One of the strengths of qualitative research is its ability to explain processes and patterns of human behavior that can be difficult to quantify. [3] Phenomena such as experiences, attitudes, and behaviors can be complex to capture accurately and quantitatively. In contrast, a qualitative approach allows participants themselves to explain how, why, or what they were thinking, feeling, and experiencing at a particular time or during an event of interest. Quantifying qualitative data certainly is possible, but at its core, qualitative data is looking for themes and patterns that can be difficult to quantify, and it is essential to ensure that the context and narrative of qualitative work are not lost by trying to quantify something that is not meant to be quantified.

However, while qualitative research is sometimes placed in opposition to quantitative research, as though the two approaches and their associated philosophical paradigms must "compete" with each other, qualitative and quantitative work are neither necessarily opposites nor incompatible. [4] Although the two approaches differ, they are certainly not mutually exclusive. For instance, qualitative research can expand and deepen understanding of data or results obtained from quantitative analysis. Say a quantitative analysis has determined a correlation between length of stay and level of patient satisfaction; qualitative methods can then explore why that correlation exists. This dual-focus scenario shows one way in which qualitative and quantitative research can be integrated.

Qualitative Research Approaches

Ethnography

Ethnography as a research design originates in social and cultural anthropology and involves the researcher being directly immersed in the participant’s environment. [2] Through this immersion, the ethnographer can use a variety of data collection techniques to produce a comprehensive account of the social phenomena that occurred during the research period. [2] That is to say, the researcher’s aim with ethnography is to immerse themselves into the research population and come out of it with accounts of actions, behaviors, events, etc., through the eyes of someone involved in the population. Direct involvement of the researcher with the target population is one benefit of ethnographic research because it can then be possible to find data that is otherwise very difficult to extract and record.

Grounded theory

Grounded Theory is the "generation of a theoretical model through the experience of observing a study population and developing a comparative analysis of their speech and behavior." [5] Unlike quantitative research, which is deductive and tests or verifies an existing theory, grounded theory research is inductive and, therefore, lends itself to research aimed at social interactions or experiences. [3] [2] In essence, Grounded Theory’s goal is to explain how and why an event occurs or how and why people might behave a certain way. Through observing the population, a researcher using the Grounded Theory approach can then develop a theory to explain the phenomena of interest.

Phenomenology

Phenomenology is the "study of the meaning of phenomena or the study of the particular.” [5] At first glance, it might seem that Grounded Theory and Phenomenology are pretty similar, but the differences can be seen upon careful examination. At its core, phenomenology looks to investigate experiences from the individual's perspective. [2] Phenomenology is essentially looking into the "lived experiences" of the participants and aims to examine how and why participants behaved a certain way from their perspective. Herein lies one of the main differences between Grounded Theory and Phenomenology. Grounded Theory aims to develop a theory for social phenomena through an examination of various data sources. In contrast, Phenomenology focuses on describing and explaining an event or phenomenon from the perspective of those who have experienced it.

Narrative research

One of qualitative research’s strengths lies in its ability to tell a story, often from the perspective of those directly involved in it. Reporting on qualitative research involves including details and descriptions of the setting involved and quotes from participants. This detail is called a "thick" or "rich" description and is a strength of qualitative research. Narrative research is rife with the possibilities of "thick" description as this approach weaves together a sequence of events, usually from just one or two individuals, hoping to create a cohesive story or narrative. [2] While it might seem like a waste of time to focus on such a specific, individual level, understanding one or two people’s narratives for an event or phenomenon can help to inform researchers about the influences that helped shape that narrative. The tension or conflict of differing narratives can be "opportunities for innovation." [2]

Research Paradigm

Research paradigms are the assumptions, norms, and standards underpinning different research approaches. Essentially, research paradigms are the "worldviews" that inform research. [4] It is valuable for qualitative and quantitative researchers to understand what paradigm they are working within because understanding the theoretical basis of research paradigms allows researchers to understand the strengths and weaknesses of the approach being used and adjust accordingly. Different paradigms have different ontologies and epistemologies. Ontology is defined as the "assumptions about the nature of reality,” whereas epistemology is defined as the "assumptions about the nature of knowledge" that inform researchers' work. [2] It is essential to understand the ontological and epistemological foundations of the research paradigm researchers are working within to allow for a complete understanding of the approach being used and the assumptions that underpin the approach as a whole. Further, researchers must understand their own ontological and epistemological assumptions about the world in general because their assumptions about the world will necessarily impact how they interact with research. A discussion of the research paradigm is not complete without describing positivist, postpositivist, and constructivist philosophies.

Positivist versus postpositivist

To further understand qualitative research, we must discuss positivist and postpositivist frameworks. Positivism is a philosophy that the scientific method can and should be applied to social as well as natural sciences. [4] Essentially, positivist thinking insists that the social sciences should use natural science methods in their research. It stems from positivist ontology: that an objective reality exists wholly independent of our perception of the world as individuals. Quantitative research is rooted in positivist philosophy, which can be seen in the value it places on concepts such as causality, generalizability, and replicability.

Conversely, postpositivists argue that social reality can never be one hundred percent explained, but could be approximated. [4] Indeed, qualitative researchers have been insisting that there are “fundamental limits to the extent to which the methods and procedures of the natural sciences could be applied to the social world,” and therefore, postpositivist philosophy is often associated with qualitative research. [4] An example of positivist versus postpositivist values in research might be that positivist philosophies value hypothesis-testing, whereas postpositivist philosophies value the ability to formulate a substantive theory.

Constructivist

Constructivism is a subcategory of postpositivism. Most researchers invested in postpositivist research are also constructivists, meaning they think there is no objective external reality; instead, reality is constructed. Constructivism is a theoretical lens that emphasizes the dynamic nature of our world. "Constructivism contends that individuals' views are directly influenced by their experiences, and it is these individual experiences and views that shape their perspective of reality.” [6] Constructivist thought focuses on how "reality" is not a fixed certainty and how experiences, interactions, and backgrounds give people a unique view of the world. Unlike positivist views, constructivism contends that there is not necessarily an "objective" reality we all experience. This is the relativist ontological view that reality and our world are dynamic and socially constructed; "therefore, qualitative scientific knowledge can be inductive as well as deductive.” [4]

So why is it important to understand the differences in assumptions that different philosophies and approaches to research have? Fundamentally, the assumptions underpinning the research tools a researcher selects provide an overall base for the assumptions the rest of the research will have. It can even change the role of the researchers. [2] For example, is the researcher an "objective" observer, such as in positivist quantitative work? Or is the researcher an active participant in the research, as in postpositivist qualitative work? Understanding the philosophical base of the study undertaken allows researchers to fully understand the implications of their work and their role within the research and reflect on their positionality and bias as it pertains to the research they are conducting.

Data Sampling 

The better the sample represents the intended study population, the more likely the researcher is to encompass the varying factors. The following are examples of participant sampling and selection: [7]

  • Purposive sampling: selection based on the researcher’s rationale about who will be most informative.
  • Criterion sampling: selection based on pre-identified factors.
  • Convenience sampling: selection based on availability.
  • Snowball sampling: selection by referral from other participants or from people who know potential participants.
  • Extreme case sampling: targeted selection of rare cases.
  • Typical case sampling: selection based on regular or average participants.
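As an illustration only, the selection logic of two of these strategies (purposive and snowball) can be sketched in Python over a hypothetical participant pool; the names, "information-rich" flags, and referral links below are invented for the example.

```python
# Hypothetical participant pool: each person has an "information-rich" flag
# (the researcher's judgment) and a list of people they would refer.
pool = {
    "A": {"info_rich": True,  "refers": ["B", "C"]},
    "B": {"info_rich": False, "refers": ["D"]},
    "C": {"info_rich": True,  "refers": []},
    "D": {"info_rich": True,  "refers": ["E"]},
    "E": {"info_rich": False, "refers": []},
}

def purposive_sample(pool):
    """Purposive sampling: select the participants judged most informative."""
    return sorted(p for p, attrs in pool.items() if attrs["info_rich"])

def snowball_sample(pool, seed):
    """Snowball sampling: start from a seed participant and follow referrals."""
    selected, frontier = [], [seed]
    while frontier:
        person = frontier.pop(0)
        if person not in selected:
            selected.append(person)
            frontier.extend(pool[person]["refers"])
    return selected

print(purposive_sample(pool))      # ['A', 'C', 'D']
print(snowball_sample(pool, "A"))  # ['A', 'B', 'C', 'D', 'E']
```

The contrast is the point: purposive sampling is driven by the researcher's criteria, while snowball sampling is driven by the participants' own networks, so the two can reach quite different subsets of the same pool.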

Data Collection and Analysis

Qualitative research uses several techniques, including interviews, focus groups, and observation. [1] [2] [3] Interviews may be unstructured, with open-ended questions on a topic where the interviewer adapts to the responses, or structured, with a predetermined set of questions that every participant is asked. Interviews are usually one-on-one and are appropriate for sensitive topics or topics needing in-depth exploration. Focus groups are often held with 8-12 target participants and are used when group dynamics and collective views on a topic are desired. Researchers can be participant-observers, sharing the experiences of the subjects, or non-participant (detached) observers.

While quantitative research design prescribes a controlled environment for data collection, qualitative data collection may take place in a central location or in the participants' environment, depending on the study goals and design. Qualitative research can generate a large amount of data. Data are transcribed and may then be coded manually or with computer-assisted qualitative data analysis software (CAQDAS) such as ATLAS.ti or NVivo. [8] [9] [10]
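The coding step can be sketched in miniature. The following Python snippet is a deliberately crude illustration, not a substitute for CAQDAS tools: it tags each transcript excerpt with the codes whose keywords appear in it. The codebook and excerpts are hypothetical.

```python
# Hypothetical codebook: code -> keywords that signal it in an excerpt.
codebook = {
    "cost":          ["price", "expensive", "afford"],
    "peer pressure": ["friends", "everyone", "fit in"],
    "health":        ["lungs", "cancer", "breathing"],
}

# Hypothetical interview excerpts (invented for illustration).
excerpts = [
    "All my friends were doing it, I just wanted to fit in.",
    "Cigarettes got too expensive, I couldn't afford them.",
    "I worried about my lungs and getting cancer.",
]

def code_excerpt(text, codebook):
    """Return the sorted list of codes whose keywords appear in the excerpt."""
    lowered = text.lower()
    return sorted(code for code, keywords in codebook.items()
                  if any(k in lowered for k in keywords))

for e in excerpts:
    print(code_excerpt(e, codebook))
# ['peer pressure'], ['cost'], ['health']
```

Real coding is interpretive rather than keyword-driven, which is exactly why the narrative context emphasized above must not be lost; this sketch only shows the bookkeeping structure of assigning codes to excerpts.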

After the coding process, qualitative research results can take various formats: a synthesis and interpretation presented with excerpts from the data, [11] or themes leading to the development of a theory or model.

Dissemination

The healthcare team can use two reporting standards to standardize and facilitate the dissemination of qualitative research outcomes. The Consolidated Criteria for Reporting Qualitative Research or COREQ is a 32-item checklist for interviews and focus groups. [12] The Standards for Reporting Qualitative Research (SRQR) is a checklist covering a more comprehensive range of qualitative research. [13]

Applications

Many times, a research question will start with qualitative research. The qualitative research helps generate the research hypothesis, which can then be tested with quantitative methods. After the data are collected and analyzed with quantitative methods, qualitative methods can be used to dive deeper into the data to better understand what the numbers truly mean and their implications. The qualitative techniques can then help clarify the quantitative data and refine the hypothesis for future research. Furthermore, qualitative research lets researchers explore subjects that are poorly studied with quantitative methods, including opinions, individual actions, and social science research.

An excellent qualitative study design starts with a goal or objective. This should be clearly defined or stated. The target population needs to be specified. A method for obtaining information from the study population must be carefully detailed to ensure no omissions of part of the target population. A proper collection method should be selected that will help obtain the desired information without overly limiting the collected data because, often, the information sought is not well categorized or obtained. Finally, the design should ensure adequate methods for analyzing the data. An example may help better clarify some of the various aspects of qualitative research.

A researcher wants to decrease the number of teenagers who smoke in their community. The researcher could begin by asking current teen smokers why they started smoking through structured or unstructured interviews (qualitative research). The researcher can also get together a group of current teenage smokers and conduct a focus group to help brainstorm factors that may have prevented them from starting to smoke (qualitative research).

In this example, the researcher has used qualitative research methods (interviews and focus groups) to generate a list of ideas of why teens start to smoke and factors that may have prevented them from starting to smoke. Next, the researcher compiles this data. The research found that, hypothetically, peer pressure, health issues, cost, being considered "cool," and rebellious behavior all might increase or decrease the likelihood of teens starting to smoke.

The researcher creates a survey asking teen participants to rank how important each of the above factors is in either starting smoking (for current smokers) or not smoking (for current nonsmokers). This survey provides specific numbers (ranked importance of each factor) and is thus a quantitative research tool.
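The quantitative aggregation of such a ranking survey can be sketched as follows; the factors and responses are hypothetical, and mean rank (lower = more important) is just one simple summary statistic a researcher might use.

```python
from statistics import mean

# Hypothetical survey responses: each respondent ranks factors (1 = most important).
responses = [
    {"peer pressure": 1, "cost": 3, "health": 2, "rebellion": 4},
    {"peer pressure": 1, "cost": 4, "health": 2, "rebellion": 3},
    {"peer pressure": 2, "cost": 3, "health": 1, "rebellion": 4},
]

# Average rank per factor across respondents; lower means more important.
avg_rank = {f: mean(r[f] for r in responses) for f in responses[0]}

for factor, rank in sorted(avg_rank.items(), key=lambda kv: kv[1]):
    print(f"{factor}: {rank:.2f}")
# peer pressure ranks first, then health, cost, rebellion
```

In this invented dataset, peer pressure emerges as the top-ranked factor, which matches the direction the worked example takes next.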

The researcher can use the survey results to focus efforts on the one or two highest-ranked factors. Let us say the researcher found that health was the primary factor that keeps teens from starting to smoke, and peer pressure was the primary factor that contributed to teens starting smoking. The researcher can go back to qualitative research methods to dive deeper into these for more information. The researcher wants to focus on keeping teens from starting to smoke, so they focus on the peer pressure aspect.

The researcher can conduct interviews and focus groups (qualitative research) about what types and forms of peer pressure are commonly encountered, where the peer pressure comes from, and where smoking starts. The researcher hypothetically finds that peer pressure often occurs after school at the local teen hangouts, mostly in the local park. The researcher also hypothetically finds that peer pressure comes from older, current smokers who provide the cigarettes.

The researcher could further explore this observation made at the local teen hangouts (qualitative research) and take notes regarding who is smoking, who is not, and what observable factors are at play for peer pressure to smoke. The researcher finds a local park where many local teenagers hang out and sees that the smokers tend to hang out in a shady, overgrown area of the park. The researcher notes that smoking teenagers buy their cigarettes from a local convenience store adjacent to the park, where the clerk does not check identification before selling cigarettes. These observations fall under qualitative research.

If the researcher returns to the park and counts how many individuals smoke in each region, this numerical data would be quantitative research. Based on the researcher's efforts thus far, they conclude that local teen smoking and teenagers who start to smoke may decrease if there are fewer overgrown areas of the park and the local convenience store does not sell cigarettes to underage individuals.

The researcher could try to have the parks department reassess the shady areas to make them less conducive to smokers or identify how to limit the sales of cigarettes to underage individuals by the convenience store. The researcher would then cycle back to qualitative methods of asking at-risk populations their perceptions of the changes and what factors are still at play, and quantitative research that includes teen smoking rates in the community and the incidence of new teen smokers, among others. [14] [15]

Qualitative research functions as a standalone research design or combined with quantitative research to enhance our understanding of the world. Qualitative research uses techniques including structured and unstructured interviews, focus groups, and participant observation not only to help generate hypotheses that can be more rigorously tested with quantitative research but also to help researchers delve deeper into the quantitative research numbers, understand what they mean, and understand what the implications are. Qualitative research allows researchers to understand what is going on, especially when things are not easily categorized. [16]

  • Issues of Concern

As discussed in the sections above, quantitative and qualitative work differ in many ways, including the evaluation criteria. There are four well-established criteria for evaluating quantitative data: internal validity, external validity, reliability, and objectivity. Credibility, transferability, dependability, and confirmability are the correlating concepts in qualitative research. [4] [11] The corresponding quantitative and qualitative concepts can be seen below, with the quantitative concept on the left and the qualitative concept on the right:

  • Internal validity: Credibility
  • External validity: Transferability
  • Reliability: Dependability
  • Objectivity: Confirmability

In conducting qualitative research, ensuring these concepts are satisfied and well thought out can mitigate potential issues from arising. For example, just as a researcher will ensure that their quantitative study is internally valid, qualitative researchers should ensure that their work has credibility. 

Indicators such as triangulation and peer examination can help evaluate the credibility of qualitative work.

  • Triangulation: Triangulation involves using multiple data collection methods to increase the likelihood of getting a reliable and accurate result. For example, an account of a magician's trick would be more reliable if we interviewed the magician, the backstage hand, and the person who "vanished." In qualitative research, triangulation can include telephone surveys, in-person surveys, focus groups, and interviews, as well as surveying an adequate cross-section of the target demographic.
  • Peer examination: A peer can review results to ensure the data is consistent with the findings.

A "thick" or "rich" description can be used to evaluate the transferability of qualitative research, whereas an indicator such as an audit trail might help evaluate the dependability and confirmability.

  • Thick or rich description:  This is a detailed and thorough description of details, the setting, and quotes from participants in the research. [5] Thick descriptions will include a detailed explanation of how the study was conducted. Thick descriptions are detailed enough to allow readers to draw conclusions and interpret the data, which can help with transferability and replicability.
  • Audit trail: An audit trail provides a documented set of steps of how the participants were selected and the data was collected. The original information records should also be kept (eg, surveys, notes, recordings).

One issue of concern that qualitative researchers should consider is observation bias. Here are a few examples:

  • Hawthorne effect: The Hawthorne effect is the change in participant behavior when participants know they are being observed. Suppose a researcher wants to identify factors that contribute to employee theft and tells the employees they will be watched to see what factors affect employee theft. In that case, one would expect employee behavior to change when they know they are being watched.
  • Observer-expectancy effect: Some participants change their behavior or responses to satisfy the researcher's desired effect. This happens unconsciously for the participant, so it is essential to eliminate or limit the transmission of the researcher's views.
  • Artificial scenario effect: Some qualitative research occurs in contrived scenarios with preset goals. In such situations, the information may not be accurate because of the artificial nature of the scenario. The preset goals may limit the qualitative information obtained.
  • Clinical Significance

Qualitative or quantitative research helps healthcare providers understand patients and the impact and challenges of the care they deliver. Qualitative research provides an opportunity to generate and refine hypotheses and delve deeper into the data generated by quantitative research. Qualitative research is not an island apart from quantitative research but an integral part of research methods to understand the world around us. [17]

  • Enhancing Healthcare Team Outcomes

Qualitative research is essential for all healthcare team members as all are affected by qualitative research. Qualitative research may help develop a theory or a model for health research that can be further explored by quantitative research. Much of the qualitative research data acquisition is completed by numerous team members, including social workers, scientists, nurses, etc. Within each area of the medical field, there is copious ongoing qualitative research, including physician-patient interactions, nursing-patient interactions, patient-environment interactions, healthcare team function, patient information delivery, etc. 


Disclosure: Steven Tenny declares no relevant financial relationships with ineligible companies.

Disclosure: Janelle Brannan declares no relevant financial relationships with ineligible companies.

Disclosure: Grace Brannan declares no relevant financial relationships with ineligible companies.

This book is distributed under the terms of the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International (CC BY-NC-ND 4.0) ( http://creativecommons.org/licenses/by-nc-nd/4.0/ ), which permits others to distribute the work, provided that the article is not altered or used commercially. You are not required to obtain permission to distribute this article, provided that you credit the author and journal.

  • Cite this Page Tenny S, Brannan JM, Brannan GD. Qualitative Study. [Updated 2022 Sep 18]. In: StatPearls [Internet]. Treasure Island (FL): StatPearls Publishing; 2024 Jan-.


Not all data are created equal: some are structured, but most are unstructured. Structured and unstructured data are sourced, collected, and scaled in different ways, and each resides in a different type of database.

In this article, we will take a deep dive into both types so that you can get the most out of your data.

Structured data—typically categorized as quantitative data—is highly organized and easily decipherable by  machine learning algorithms .  Developed by IBM® in 1974 , structured query language (SQL) is the programming language used to manage structured data. By using a  relational (SQL) database , business users can quickly input, search and manipulate structured data.

Examples of structured data include dates, names, addresses, and credit card numbers. Their benefits are tied to ease of use and access, while liabilities revolve around data inflexibility:

  • Easily used by machine learning (ML) algorithms:  The specific and organized architecture of structured data eases the manipulation and querying of ML data.
  • Easily used by business users:  Structured data do not require an in-depth understanding of different types of data and how they function. With a basic understanding of the topic relative to the data, users can easily access and interpret the data.
  • Accessible by more tools:  Since structured data predates unstructured data, there are more tools available for using and analyzing structured data.
  • Limited usage:  Data with a predefined structure can only be used for its intended purpose, which limits its flexibility and usability.
  • Limited storage options:  Structured data are usually stored in data storage systems with rigid schemas (for example, “ data warehouses ”). Therefore, changes in data requirements necessitate an update of all structured data, which leads to a massive expenditure of time and resources.
Tools commonly used with structured data include:

  • OLAP:  Performs high-speed, multidimensional data analysis from unified, centralized data stores.
  • SQLite: (link resides outside ibm.com)  Implements a self-contained, serverless, zero-configuration, transactional relational database engine.
  • MySQL:  Embeds data into mass-deployed software, particularly mission-critical, heavy-load production systems.
  • PostgreSQL:  Supports SQL and JSON querying as well as high-tier programming languages (C/C++, Java, Python, among others).
Common use cases for structured data include:

  • Customer relationship management (CRM):  CRM software runs structured data through analytical tools to create datasets that reveal customer behavior patterns and trends.
  • Online booking:  Hotel and ticket reservation data (for example, dates, prices, and destinations) fits the “rows and columns” format indicative of the predefined data model.
  • Accounting:  Accounting firms or departments use structured data to process and record financial transactions.
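As a minimal sketch of the "rows and columns" model, the following Python snippet builds a small relational table in SQLite's in-memory database (the booking records are invented) and runs an aggregate SQL query over its fixed schema:

```python
import sqlite3

# An in-memory relational database: structured data must fit the
# predefined schema declared here (hypothetical booking records).
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE bookings (
    id INTEGER PRIMARY KEY,
    guest TEXT NOT NULL,
    destination TEXT NOT NULL,
    price REAL NOT NULL
)""")
conn.executemany(
    "INSERT INTO bookings (guest, destination, price) VALUES (?, ?, ?)",
    [("Ana", "Lisbon", 420.0), ("Ben", "Lisbon", 390.0), ("Cy", "Oslo", 510.0)],
)

# The rigid schema is what makes aggregation queries this straightforward.
for row in conn.execute(
    "SELECT destination, COUNT(*), AVG(price) FROM bookings "
    "GROUP BY destination ORDER BY destination"
):
    print(row)
# ('Lisbon', 2, 405.0)
# ('Oslo', 1, 510.0)
```

The trade-off described above is visible here: querying is trivial, but adding a field later (say, a free-text review) would require altering the schema for every row.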

Unstructured data, typically categorized as qualitative data, cannot be processed and analyzed through conventional data tools and methods. Since unstructured data does not have a predefined data model, it is best managed in  non-relational (NoSQL) databases . Another way to manage unstructured data is to use  data lakes  to preserve it in raw form.

The importance of unstructured data is rapidly increasing.  Recent projections  (link resides outside ibm.com) indicate that unstructured data is over 80% of all enterprise data, while 95% of businesses prioritize unstructured data management.

Examples of unstructured data include text, mobile activity, social media posts, and Internet of Things (IoT) sensor data. Their benefits involve advantages in format, speed, and storage, while liabilities revolve around expertise and available resources:

  • Native format:  Unstructured data, stored in its native format, remains undefined until needed. This adaptability lets the database accommodate a wider range of file formats, which widens the data pool and enables data scientists to prepare and analyze only the data they need.
  • Fast accumulation rates:  Since there is no need to predefine the data, it can be collected quickly and easily.
  • Data lake storage:  Allows for massive storage and pay-as-you-use pricing, which cuts costs and eases scalability.
  • Requires expertise:  Because it is undefined and unformatted, unstructured data requires data science expertise to prepare and analyze. This suits data scientists but can alienate unspecialized business users who may not fully understand specialized data topics or how to use their data.
  • Specialized tools:  Specialized tools are required to manipulate unstructured data, which limits product choices for data managers.
  • MongoDB :  Uses flexible documents to process data for cross-platform applications and services.
  • DynamoDB :  Delivers single-digit millisecond performance at any scale through built-in security, in-memory caching and backup and restore.
  • Hadoop :  Provides distributed processing of large data sets using simple programming models and no formatting requirements.
  • Azure :  Enables agile cloud computing for creating and managing apps through Microsoft’s data centers.
  • Data mining :  Enables businesses to use unstructured data to identify consumer behavior, product sentiment and purchasing patterns to better accommodate their customer base.
  • Predictive data analytics :  Alert businesses of important activity ahead of time so they can properly plan and accordingly adjust to significant market shifts.
  • Chatbots :  Perform text analysis to route customer questions to the appropriate answer sources.
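A flexible-document store such as MongoDB accepts records without a predefined model. The plain-Python sketch below imitates that idea with ordinary dictionaries (the example documents are invented) and shows schema-on-read: structure is imposed only at query time:

```python
# No predefined model: each "document" can carry different fields,
# mirroring how a NoSQL document store ingests data in native format.
documents = [
    {"type": "review", "text": "Great product!", "rating": 5},
    {"type": "tweet", "text": "Just tried it.", "hashtags": ["#new"]},
    {"type": "sensor", "device": "thermostat-7", "reading": 21.5},
]

# Schema-on-read: we decide what structure we care about only when
# we query, preparing just the data we need.
texts = [d["text"] for d in documents if "text" in d]
print(texts)  # ['Great product!', 'Just tried it.']
```

Nothing stopped the third document from omitting a "text" field; the cost, as noted above, is that someone with data expertise must know how to interpret each record.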

While structured (quantitative) data gives a “birds-eye view” of customers, unstructured (qualitative) data provides a deeper understanding of customer behavior and intent. Let’s explore some of the key areas of difference and their implications:

  • Sources:  Structured data is sourced from GPS sensors, online forms, network logs, web server logs,  OLTP systems , among others; whereas unstructured data sources include email messages, word-processing documents, PDF files, and others.
  • Forms:  Structured data consists of numbers and values, whereas unstructured data consists of sensor data, text files, audio files and video files, among others.
  • Models:  Structured data has a predefined data model and is formatted to a set data structure before being placed in data storage (for example, schema-on-write), whereas unstructured data is stored in its native format and not processed until it is used (for example, schema-on-read).
  • Storage:  Structured data is stored in tabular formats (for example, Excel sheets or SQL databases) that require less storage space. It can be stored in data warehouses, which makes it highly scalable. Unstructured data, on the other hand, is stored as media files or in NoSQL databases, which require more space. It can be stored in data lakes, which offer massive raw storage but make the data harder to manage and query.
  • Uses:  Structured data is used to train and drive machine learning (ML) algorithms, whereas unstructured data is used in  natural language processing  (NLP) and text mining.

Semi-structured data (for example, JSON, CSV, XML) is the “bridge” between structured and unstructured data. It does not have a predefined data model and is more complex than structured data, yet easier to store than unstructured data.

Semi-structured data uses “metadata” (for example, tags and semantic markers) to identify specific data characteristics and scale data into records and preset fields. Metadata ultimately enables semi-structured data to be better cataloged, searched and analyzed than unstructured data.

  • Example of metadata usage:  An online article displays a headline, a snippet, a featured image, image alt-text, slug, among others, which helps differentiate one piece of web content from similar pieces.
  • Example of semi-structured data vs. structured data:  A tab-delimited file containing customer data versus a database containing CRM tables.
  • Example of semi-structured data vs. unstructured data:  A tab-delimited file versus a list of comments from a customer’s Instagram.
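The metadata idea can be illustrated with Python's standard json module. The article fields below (headline, slug, tags) are hypothetical markers of the kind described above, attached to otherwise free-form body text:

```python
import json

# Semi-structured: no rigid table, but tags and semantic markers
# label each field, so the record can be cataloged and searched.
article = json.loads("""{
    "headline": "Structured vs. unstructured data",
    "slug": "structured-vs-unstructured",
    "tags": ["data-management", "databases"],
    "body": "Free-form article text goes here..."
}""")

# Metadata makes the content searchable without a fixed schema.
print(article["slug"])                 # structured-vs-unstructured
print("databases" in article["tags"])  # True
```

The body remains unstructured text, but the surrounding metadata is what lets semi-structured data be cataloged and analyzed more readily than fully unstructured data.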

Recent developments in  artificial intelligence  (AI) and machine learning (ML) are driving the future wave of data, which is enhancing business intelligence and advancing industrial innovation. In particular, the data formats and models that are covered in this article are helping business users to do the following:

  • Analyze digital communications for compliance:  Pattern recognition and email threading analysis software that can search email and chat data for potential noncompliance.
  • Track high-volume customer conversations in social media:  Text analytics and sentiment analysis tools that enable monitoring of marketing campaign results and identification of online threats.
  • Gain new marketing intelligence:  ML analytics tools that can quickly cover massive amounts of data to help businesses analyze customer behavior.
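As a deliberately simple illustration of the sentiment-analysis use case above, the sketch below scores comments by keyword matching. Production tools use trained ML models; the word lists here are assumptions chosen for the example:

```python
# Toy sentiment scorer: counts positive vs. negative keywords in a
# comment. Real sentiment analysis uses trained language models.
POSITIVE = {"great", "love", "excellent", "fast"}
NEGATIVE = {"slow", "broken", "bad", "terrible"}

def score(comment: str) -> str:
    words = set(comment.lower().split())
    balance = len(words & POSITIVE) - len(words & NEGATIVE)
    if balance > 0:
        return "positive"
    if balance < 0:
        return "negative"
    return "neutral"

print(score("Love how fast shipping was"))  # positive
print(score("The app is slow and broken"))  # negative
```

Even this crude version shows the pipeline shape: unstructured text in, a structured label out that can then be tracked as a metric.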

Furthermore, smart and efficient usage of data formats and models can help you with the following:

  • Understand customer needs at a deeper level to better serve them
  • Create more focused and targeted marketing campaigns
  • Track current metrics and create new ones
  • Create better product opportunities and offerings
  • Reduce operational costs

Whether you are a seasoned data expert or a novice business owner, being able to handle all forms of data is conducive to your success. By using structured, semi-structured and unstructured data options, you can perform optimal data management that will ultimately benefit your mission.

