
How to conduct effective user interviews for UX research

User interviews are a popular UX research technique, providing valuable insight into how your users think and feel. Learn about the different types of user interviews and how to conduct your own in this guide.


User research is fundamental for good UX. It helps you get to know your users and design products that meet their needs and solve their pain-points. 

One of the most popular UX research methods is user interviews. With this technique, you get to hear from your users first-hand, learning about their needs, goals, expectations, and frustrations—anything they think and feel in relation to the problem space.

But when should you conduct user interviews and how do you make sure they yield valuable results?

Follow this guide and you’ll be a user interview pro. We explain:

  • What are user interviews in UX research?
  • What are the different types of user interviews?
  • When should you conduct user interviews?
  • What data and insights do you get from user interviews?
  • How to conduct effective user interviews for UX research: a step-by-step guide
  • What happens next? How to analyse your user interview data

First things first: What are user interviews?


Interviews are one of the most popular UX research methods. They provide valuable insight into how your users think, feel, and talk about a particular topic or scenario—allowing you to paint a rich and detailed picture of their needs and goals. 

Interviews take place on a one-to-one basis, with a UX designer or UX researcher asking the user questions and recording their answers. They can last anywhere between 30 minutes and an hour, and they can be done at various stages of a UX design project.

There are several different types of user interviews. They can be:

  • Structured, semi-structured, or unstructured
  • Generative, contextual, or continuous
  • Remote or in-person

Let’s explore these in more detail.

Structured vs. semi-structured vs. unstructured user interviews

Structured interviews follow a set list of questions in a set order. The questions are usually closed—i.e. there’s a limit to how participants can respond (e.g. “Yes” or “No”). Structured interviews ensure that all research participants get exactly the same questions, and are most appropriate when you already have a good understanding of the topic/area you’re researching. 

Structured interviews also make it easier to compare the data gathered from each interview. However, a disadvantage is that they are rather restrictive; they don’t invite much elaboration or nuance. 

Semi-structured interviews are based on an interview guide rather than a full script, providing some pre-written questions. These tend to be open-ended questions, allowing the user to answer freely. The interviewer will then ask follow-up questions to gain a deeper understanding of the user’s answers. Semi-structured interviews are great for eliciting rich user insights—but, without a set script of questions, there’s a high risk of researcher bias (for example, asking questions that unintentionally lead the participant in a certain direction). 

Unstructured user interviews are completely unscripted. It’s up to the interviewer to come up with questions on the spot, based on the user’s previous answers. These are some of the trickiest types of user interviews—you’re under pressure to think fast while avoiding questions that might bias the user’s answer. Still, if done well, unstructured interviews are great if you have very little knowledge or data about the domain and want to explore it openly. 

Generative vs. contextual vs. continuous user interviews

Generative user interviews are ideal for early-stage exploration and discovery. They help you to uncover what you don’t know—in other words, what insights are you missing? What user problem should you be trying to solve? Which areas and topics can you identify for further user research? Generative interviews are usually unstructured or semi-structured. 

Contextual user interviews take place in a specific context—while the user is carrying out a certain task, for example. This allows you to not only observe the user’s actions/behaviour first-hand, but also to ask questions and learn more about why the user takes certain actions and how they feel in the process. Contextual interviews tend to be semi-structured. 

Continuous user interviews are conducted as part of continuous UX research. While traditional user research is done within the scope of a specific project, continuous UX research is ongoing, conducted at regular intervals (e.g. weekly or monthly) with the goal of continuous product improvement. Continuous interviews are like regular check-ins with your users, giving you ongoing insight into their needs, goals, and pain-points.


Remote vs. in-person user interviews

A final distinction to make is between remote and in-person interviews. 

In-person user interviews take place with the user and researcher in the same room. A big advantage of in-person interviews is that you’re privy to the user’s body language—an additional insight into how they feel. 

Remote user interviews take place via video call. Like any kind of remote work, they’re more flexible and may be more accessible for research participants as they don’t require any travel. 

When should you conduct user interviews?

User interviews provide value at various stages of a design project. You can use them for:

  • Discovery and ideation—when you want to learn more about your target users and the problems they need you to solve.
  • UX testing and product improvement—when you want to get user feedback on an existing design concept or solution.
  • Continuous UX research—you can run regular interviews as part of a continuous UX research framework.

Let’s take a closer look. 

User interviews for discovery and ideation

User interviews can be useful right at the beginning of a UX project, when you don’t know much (or anything) about the domain and don’t yet have a design direction. At this stage, everything is pretty open and your user interviews will be exploratory. 

Conducting user interviews early in the process will help you to answer questions such as “Who are our target users?”, “What problems do they need us to solve?” and “What are their goals and expectations in relation to the problem space?”

Here you’ll be focusing on generative user interviews (i.e. finding out what you don’t know), and they’ll likely be unstructured or semi-structured.

User interviews as part of UX testing and product improvement

User interviews also come in handy when you have an idea or concept you want to evaluate, or even a working product you want to test. 

At this stage, you might present the user with a prototype and ask them questions about it. If you're further along in the design process, you can run user interviews as an add-on to UX testing—having the user interact with a working prototype (or the product itself) and asking them questions at the same time. These are the contextual interviews we described earlier.

Conducting user interviews at this stage will help you gain insight into how your users feel about a concept/product/experience and to identify pain-points or usability issues within the existing design. 

User interviews as part of continuous UX research

User interviews are also valuable as part of a continuous UX research framework. Here, there is no project-specific goal—rather, you’re interviewing users regularly to gain ongoing user insights. This enables you to maintain a user-centric design process and to evolve your product continuously as you learn more about your users. 

You can learn more about the importance of continuous UX research here.

What data and insights do you get from user interviews?

User interviews allow you to hear from the user, in their own words, how they think and feel about a particular problem space/experience/task. This provides rich insights into their thoughts, beliefs, experiences, problems, goals, desires, motivations, and expectations, as well as the rationale or thought process behind certain actions.

As such, user interviews generate qualitative data. That is, data which tells you about a person's thoughts, feelings, and subjective experiences. It's the opposite of quantitative data, which is objective, numerical, and measurable. You can learn more about the difference between quantitative and qualitative user research data here.

Note that user interviews generate self-reported data. Self-reported data is based on what the user chooses to share with you (you're not observing it; rather, you're hearing it from the user). It's how they report feeling or thinking.

If you conduct contextual user interviews, you’ll gather a mixture of observational data (based on what you observe the user doing) and self-reported data. 

After conducting user interviews, you’ll end up with lots of data in the form of interview transcripts, audio or video recordings, and your own notes. We’ll look at how to analyse your user interview data in the final section of this guide. 

First, though, here’s a step-by-step plan you can follow to conduct effective user interviews. 

How to conduct effective user interviews for UX research: a step-by-step guide

Ready to conduct your own user interviews? Follow our step-by-step guide to get started.

  • Determine what type of user interviews you'll conduct
  • Write your user interview script (or guide)
  • Recruit your interview participants
  • Set up the necessary tools
  • Perfect your interview technique

Let’s walk through our plan step by step. 

1. Determine what type of user interviews you’ll conduct

Earlier in this guide, we outlined the different types of user interviews: Structured, semi-structured, and unstructured; generative, contextual, and continuous; and remote and in-person. 

The first step is to determine what format your user interviews will take. This depends on:

  • What stage you’re at in the project/process
  • What your research goals are

If you’re at the very early stages of a design project, you’ll likely want to keep your user interviews open and exploratory—opting for unstructured or semi-structured interviews. 

Perhaps you’ve already got a design underway and want to interview your users as they interact with it. In that case, structured or semi-structured contextual interviews may work best. 

Consider what you want to learn from your user interviews and go from there. 

2. Write your user interview script (or guide)

How you approach this step will depend on whether you’re conducting structured, semi-structured, or unstructured user interviews.

For structured interviews, you’ll need to write a full interview script—paying attention to the order of the questions. The script should also incorporate follow-up questions; you won’t have the freedom to improvise or ask additional questions outside of your script, so make sure you’re covering all possible ground. 

For semi-structured interviews, you’ll write an interview guide rather than a rigid script. Come up with a set list of questions you definitely want to ask and use these—and your users’ answers—as a springboard for follow-up questions during the interview itself. 

For unstructured user interviews, you can go in without a script. However, it’s useful to at least brainstorm some questions you might ask to get the interview started. 

Regardless of whether you’re conducting structured, semi-structured, or unstructured interviews, it’s essential that your questions are:

  • Open-ended. These are questions that cannot be answered with a simple “yes” or “no”. They require more elaboration from the user, providing you with much more insightful answers. An example of an open question could be “Can you tell me about your experience of using mobile apps to book train tickets?” versus a closed question such as “Have you ever used a mobile app to book train tickets?”
  • Unbiased and non-leading. You want to be very careful about how you word your questions. It’s important that you don’t unintentionally lead the user or bias their answer in any way. For example, if you ask “How often do you practise app-based meditation?”, you’re assuming that the user practises meditation at all. A better question would be “What are your thoughts on app-based meditation?”

It’s worth having someone else check your questions before you use them in a user interview. This will help you to remove any unintentionally biased or leading questions which may compromise the quality of your research data. 

3. Recruit your interview participants

Your user interviews should involve people who represent your target users. This might be existing customers and/or representative users who fit the persona you would be designing for. 

Some common methods for recruiting user research participants include:

  • Posting on social media
  • Working with a dedicated agency or platform which will connect you with suitable participants
  • Recruiting from your own customer or user database

The good thing about user interviews is that you don’t need loads of participants to gather valuable data. Focus on quality over quantity, recruiting between five and ten interviewees who closely match your target group. 

4. Set up the necessary tools

Now for the practical matter of getting your user interviews underway. If you’re conducting in-person user interviews, you’ll need to choose an appropriate setting—ideally somewhere quiet and neutral where the user will feel relaxed. 

For remote user interviews, you’ll need to set up the necessary software, such as Zoom, dscout, or Lookback. Consult this guide for more UX research tools.

You’ll also need to consider how you’re going to record the user’s answers. Will you use good old-fashioned pen and paper, a simple note-taking app, or recording and transcription software?

Make a list of all the tools you’ll need for a seamless user interview and get everything set up in advance. 

5. Perfect your interview technique

As the interviewer, you have an important role to play in ensuring the success of your user interviews. So what makes a good interviewer? Here are some tips to help you perfect your interview technique:

  • Practise active listening. Show the user that you’re listening to them; maintain eye contact (try not to be too distracted with taking notes), let them speak without rushing, and don’t give any verbal or non-verbal cues that you’re judging their responses.
  • Get comfortable with silence. In everyday conversations, it can be tempting to fill silences. But, in an interview situation, it’s important to lean into the power of the pause. Let the user think and speak when they’re ready—this is usually when you elicit the most interesting insights.
  • Speak the user’s language. Communication is everything in user interviews. Don’t alienate the user by speaking “UX speak”—they may not be familiar with industry-specific terms, and this can add unnecessary friction to the experience. Keep it simple, conversational, and accessible.

Ultimately, the key is to put your users at ease and create a space where they can talk openly and honestly. Perfect your interview technique and you’ll find it much easier to build a rapport with your research participants and uncover valuable, candid insights. 

What happens next? How to analyse your user interview data 

You’ve conducted your user interviews. Now you’re left with lots of unstructured, unorganised qualitative data—i.e. reams of notes. So how do you turn all those interview answers into useful, actionable insights? 

The most common technique for analysing qualitative data is thematic analysis. This is where you read through all the data you’ve gathered (in this case, your notes and transcripts) and use ‘codes’ to denote different patterns that emerge across the dataset.

You’ll then ‘code’ different excerpts within your interview notes and transcripts, eventually sorting the coded data into a group of overarching themes. 

At this stage, you can create an affinity diagram—writing all relevant findings and data points onto Post-it notes and ‘mapping’ them into topic clusters on a board. This is a great technique for physically working through your data and creating a visualisation of your themes, allowing you to step back and spot important patterns.
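The coding-and-theming pass described above is usually done in a dedicated research tool or on a wall of Post-its, but the mechanics are simple enough to sketch in a few lines of Python. In this illustrative sketch, the excerpts, codes, and themes are entirely made up; the point is just to show how coded excerpts roll up into theme counts:

```python
from collections import Counter

# First pass: each transcript excerpt is tagged with one or more codes.
# (Hypothetical data for illustration only.)
coded_excerpts = [
    {"participant": "P1", "excerpt": "I never know if my card details are safe",
     "codes": ["trust", "payment"]},
    {"participant": "P2", "excerpt": "The checkout had too many steps",
     "codes": ["checkout-friction"]},
    {"participant": "P3", "excerpt": "I gave up when it asked me to register",
     "codes": ["checkout-friction", "trust"]},
]

# Second pass: the researcher groups codes into overarching themes.
themes = {
    "Trust and security": ["trust", "payment"],
    "Checkout usability": ["checkout-friction"],
}

def theme_frequencies(excerpts, themes):
    """Count how many excerpts touch each theme."""
    counts = Counter()
    for item in excerpts:
        for theme, codes in themes.items():
            if any(code in item["codes"] for code in codes):
                counts[theme] += 1
    return counts

print(theme_frequencies(coded_excerpts, themes))
```

In practice the interesting work is deciding the codes and themes; the tally simply helps you step back and see which themes dominate, much like clustering Post-its on a board.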

With your research data organised and categorised, you can review your findings in relation to your original research objectives. What do the themes and patterns tell you? What actions can you take from your findings? What gaps still need to be filled with further UX research?

As a final step, you might write up a UX research report and present your findings to relevant stakeholders. 

Learn more about UX research

We hope you now have a clear understanding of what user interviews are, why they’re such a valuable UX research method, and how to conduct your own user interviews. If you’d like to learn more about user research, continue with these guides:

  • A complete introduction to card sorting: What is it and how do you do it?
  • What are UX personas and what are they used for?
  • What’s the future of UX research? An interview with Mitchell Wakefield, User Researcher at NHS Digital


4 June 2024


LogRocket blog


Mastering UX user interviews: Guide, tips, and examples


User interviews are a crucial part of the product design process because they allow us to gain insight into the needs, behaviors, and motivations of the product users.


The main goal of user interviews is to understand the user’s pain points and needs. These key insights will help us to make decisions about the UX design and ensure that the final product meets the user’s needs.

This article explains the user interview technique from all angles: why we use it during UX research, what kind of data you can collect, how to prepare your questions, what you can do with the data afterwards, and plenty of tips to help you run a great user interview process.

Table of contents

  • Quantitative and qualitative research
  • What are user interviews, and what are they not?
  • When and why you should conduct user interviews
  • User interviews have limitations
  • Preparing for a user interview
  • Onsite vs. remote interviews
  • Writing the script for the interview
  • During the interview
  • After the interview

Quantitative and qualitative research

We can collect two kinds of data during user research: qualitative and quantitative data.

Quantitative data comes from analytics tools that tell us what happens. For instance, they can show how many users add products to their cart and leave the page without buying anything.

Qualitative data comes from interviews, observations, and focus groups to understand why certain things happen. For example, if we saw that 35 percent of people abandoned their shopping carts with items in them, we could interview them to find out why.

Quantitative and qualitative data are valuable information that help us understand our users. Quantitative data tells us what happens, and qualitative data tells us why.
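As a concrete illustration of the quantitative side, here is a minimal Python sketch that computes a cart-abandonment rate from a made-up session log. An analytics tool would normally surface this number for you; the arithmetic is simply abandoned carts divided by all cart sessions:

```python
# Hypothetical event log: one record per session, noting whether the
# session added items to the cart and whether it completed a purchase.
sessions = [
    {"id": 1, "added_to_cart": True,  "purchased": True},
    {"id": 2, "added_to_cart": True,  "purchased": False},
    {"id": 3, "added_to_cart": False, "purchased": False},
    {"id": 4, "added_to_cart": True,  "purchased": False},
]

def cart_abandonment_rate(sessions):
    """Share of cart sessions that ended without a purchase."""
    carts = [s for s in sessions if s["added_to_cart"]]
    abandoned = [s for s in carts if not s["purchased"]]
    return len(abandoned) / len(carts) if carts else 0.0

print(f"{cart_abandonment_rate(sessions):.0%}")  # → 67%
```

This tells you *what* is happening (the rate); interviews with the users behind those abandoned sessions would tell you *why*.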


What are user interviews, and what are they not?

We can indeed gain insights from every conversation we have with a user. But an interview is not just a casual conversation along the way; it is a structured process carried out during UX research with a clear objective: to know the user and their needs better. Let’s take a closer look at what a user interview is not.

It is not a sales meeting

User interviews aren’t sales meetings, and we shouldn’t push our product during them. Product designers don’t make sales, but sometimes a sales rep is invited along; we should explain to them that in this meeting we are listening to the user, not selling them anything.

User interviews are not a features feedback meeting

User interviews are not the place to discuss upcoming features or potential solutions to the user’s problems.

First, users aren’t in a position to consider the product’s architecture or the company’s dependencies.

Aside from that, users tend to focus more on their existing solutions than on imagining new ones. As Henry Ford reputedly said, “If I had asked people what they wanted, they would have said faster horses.”


This is not a usability test

When product designers run a usability test, they check if the user understands the designer’s solution. Our objective when performing a usability test is to determine if the user understands the interface to perform some actions and if the flow is smooth.

Therefore, in the user interview, we ask the user about their needs and pain points, and in the usability test, we want to see if they understand the solution.

When and why you should conduct user interviews

Interviewing users is a very effective way to build products that solve people’s problems, and it can be used in different ways during a product’s lifecycle.

First, we can use user interviews to find a problem, for example, before we start building a startup company. We can focus on one topic, like buying online, and ask people what they don’t like about it. Maybe we’ll find a big problem many people suffer from, and we’ll figure out that there’s much to fix.

If we already have a product and want to grow the business and solve more problems, the approach is the same: we can conduct interviews and collect valuable information.

As a result of user interviews, you will have lots of data that you can use to create a user persona and user journey. That way, all the team members will be on the same page about the user’s needs and what the team wants to solve. Also, making decisions will be easier since everyone knows what the user needs.

User interviews have limitations

First, people often don’t remember exactly what they did, or they perform tasks automatically, so they fail to share critical details. In addition, since everyone is different, the interviewer needs to be professional and know how to interact with each person so that they open up and share their experience.

Preparing for a user interview

It is important to follow certain steps before conducting a user interview, so let’s go over them.

Setting clear goals for the user interview

You need to write down your objectives for the interview. You usually have 3–5 goals, but if you have more than 5, your objectives need to be more focused.

For instance, if you have a website that sells shoes, your goals could be:

  • Learn how people buy shoes online
  • Learn why people like/dislike buying shoes online
  • Learn why people abandon carts with products in them

Take the time to build this part well because if it’s done right, all the other parts will be easier.

Who and how many people to interview

It’s critical to consider your research goals and choose individuals who are relevant to those goals. For example, if you have an online store that sells running shoes, you must search for people who run because it will be more accurate. You won’t get accurate information if you choose users who don’t run.

The number of people you interview can vary, but generally, you should speak with between 3 and 8 people. If you interview fewer than three people, you might not get a representative sample, and interviewing more than eight people might take a lot of time and give you no additional insight. Aiming for a total of 5 interviews is often a good balance, as it allows for identifying patterns while still being manageable.

How to find users, and how to invite them

Finding people to interview can be tricky, but here are some places you can find them:

  • Get in touch with existing users : If the product has some users, you can email them and ask if they’d like to join. You can get help from the customer success team or the sales team since they talk with users daily
  • Forums and social media : Search for forums and other online communities that focus on the same topic and ask them to participate. If your product offers DevOps support, you can locate some developer communities specializing in DevOps and ask them to participate
  • Using your network : If you have an extensive personal network, you may find people to interview through it. For example, if you develop tools for DevOps and have a direct connection with several DevOps professionals, you can ask them to participate

It is best to find users for a user interview through personal connections or introductions rather than cold emails. For example, a sales team member can introduce you to a potential candidate, and this can increase the likelihood that potential candidates will be receptive to being interviewed.

Preparing some email templates will help you work faster. When you make first contact with a user, send an email explaining who you are and why you are conducting the interview. Follow up with another email explaining how the interview will proceed and setting up a time, and send a final email thanking the candidate.

To maintain a successful relationship with users, you must be honest with them and quickly respond to their emails.

Onsite vs. remote

You can observe the user’s body language better during an onsite interview. On the other hand, remote interviews have many advantages, such as interviewing people in different time zones and locations. In addition, it is much simpler since the other person doesn’t need to leave their comfort zone.

I prefer the remote option, but if some limitations require it to be done onsite, that is also an option.

The following tips will help you prepare for an onsite or online interview.

Interview onsite

  • Prepare a comfortable place for the interview
  • Make sure a water bottle is on the table
  • Provide directions to the office, so the user knows where it is

Interview online

  • Ensure your internet connection works well
  • Get a quiet place to do the interview
  • Ensure your headphones work and all your voice settings are set up correctly

Writing the script for the interview

A clear script with questions is essential to a successful interview. The script has four sections.

Introduction

First, we would like to introduce ourselves and the notetaker (a person who helps us take notes, such as a UX designer, a product manager, or a developer), explain why we are conducting the interview, how it will proceed, and that we are not testing the candidates or their knowledge. This will allow them to feel comfortable during the interview, so they will be more willing to share information.

If you want to record the meeting, now is the time to ask for their permission (I suggest you ask in writing ahead of time). Lastly, ask if they have any more questions before continuing.

Let’s break the ice

In this section, you will ask 3–5 questions to break the ice with your interviewee. For example, you might ask:

  • Can you tell me a little bit about yourself and your professional background?
  • When did you first become interested in the field you work in now?
  • In your current role or responsibilities, what do you enjoy most?
  • When you have free time, what do you like to do?
  • How does your typical day look?

The main interview questions

Here you ask the main research questions. Asking 10–12 questions will give you good results, but you should prepare 20–25, since it’s harder to draw information out of some people and extra questions will help. To make the interview more structured, decide what you want to ask at the beginning and what at the end. Here are some tips for writing the questions well:

Ask questions that align with your research goals. Don’t ask questions that are out of the scope of what you’re looking for.

Avoid leading questions because they can lead to biased or inaccurate responses and make the participant feel uncomfortable or pressured to give a certain answer. Let’s say we do user research for an online store that sells running shoes.

If you ask, “Can you tell me which websites you bought running shoes from?” you assume that the user bought the shoe online, but you don’t know that.

It would be better to ask, “Where do you buy your running shoes?”

Avoid vague questions because they can lead to unclear or confusing responses, making the participant feel frustrated. Instead, ask clear and specific questions.

An example of a vague question is: What do you think of the shopping experience on the website?

Better questions can be:

  • How easy or difficult was it to find the running shoes you were looking for on the website?
  • Which factors influenced your decision to buy or not buy a pair of shoes?

Prefer open questions to closed questions. Open questions are better in user interviews because they allow the participants to express their thoughts and give you more insight. A closed question can only be answered with yes or no, which rarely surfaces the information you need.

An example of a closed question is: “Do you buy your running shoes online?”

An open question would be: “Can you tell me how you buy running shoes?”

Sometimes a closed question is a useful stepping stone. It is fine to ask a closed question and then follow up with open questions, so you can better understand how the user behaves. For example, you can ask the user, “Do you buy your running shoes online?”

If the answer is YES, you may ask:

  • Why do you buy shoes online?
  • Where do you buy your shoes online?

If the answer is NO, you may ask:

  • Why do you not buy shoes online?
  • Where do you buy your shoes?

Try to start with a big question. Opening broad and then asking related follow-ups lets you dig deeper step by step. You can ask, for example:

“Can you tell me about the last time you bought new running shoes?”

Then ask subquestions like:

  • When was it?
  • How long did it take you to choose the shoe?
  • Where did you find them?
  • What made you choose this brand/shoe?

The last question: at the end of this part, you can ask a very open question that will give you more information about the user. For example, “If you had a magic wand that could help you choose your next running shoe, what would it do?”

At this point, we’d like to thank the interviewee and explain what we plan to do with the data. Before you end the interview, ask the interviewee if they have any questions. In that way, you give them the final say.

Conducting the interview

To get quality information from the interviewees, there are some key things you need to do during the interview.

  • Stay calm and relaxed during the interview. Think of it as a conversation between two people
  • The purpose of the session is to ask questions and get information from the participants, so speak less and listen more. A good rule of thumb is to speak about 15 percent of the time and listen 85 percent of the time
  • Don’t shy away from negative user feedback about the product; you can gain valuable insight from users’ complaints
  • If the user clearly wants to keep talking but you have finished all the questions in the script, don’t be afraid to improvise. The more interviews you run, the better you will get at improvising
  • Occasionally, people talk very little and the call may fall silent. This is fine; wait for the user to continue talking, but break the silence if it drags on
  • When doing a remote interview, turn on the automatic captioning feature (available in many online call apps). It helps you understand people better, especially if you’re interviewing in a foreign language or the participant has a strong accent
  • In my experience, recording the interview and watching it again is the most effective way to make sure you don’t miss anything
  • Look for pain points, because that is where you can start finding solutions for users

This is the time to organize all the information you collected. Summarize each interview and pull out its main points. Once you have summarized all the interviews, create a report you can share with the product manager and the developers. This way, the team can prioritize which solutions to develop next.

Further, if you are working on a new product, you can create a user persona and a user journey. As a result, all team members will be more knowledgeable about users and their needs.

All reports should be concise and to the point. Writing a long document is easy, but to effectively communicate the results, you must focus on the key points rather than getting bogged down in unnecessary detail. You can include links to the interview summaries if the team members need more explanation.

To conclude

User interviews are a crucial part of the product design process, as they allow designers to gain insight into the users’ needs, behaviors, and motivations. This article reviewed all the details you need to know to perform user interviews effectively.

We started by discussing what qualitative and quantitative research is and how user interviews fit into these categories. We also looked at when to conduct user interviews and how to prepare for them. This included setting clear goals, identifying users to interview, and deciding between onsite and online interviews. We also discussed how to create a script for the interview.

Next, we saw tips for conducting a smooth interview, and finally, we discussed what to do with the data you collected.

Featured image source: IconScout


Complete Guide to User Interviews: Definition, Benefits, and How to Conduct

User interviews are one of the best ways to gain a comprehensive understanding of your best customers. When conducted effectively, they give you the insights you need to accelerate your business growth.


Written by UserInput Team


User interviews are a valuable tool for gaining a comprehensive understanding of your users. When conducted effectively, interviews can provide deep insights into your users’ goals, perceptions, and experiences.

Conversely, conducting interviews poorly can lead to inaccurate information that steers the design of your site in the wrong direction.

In this article, we explore the fundamental principles of conducting successful user interviews.

What is a User Interview?

A user interview is a qualitative research method used to gather information and insights from users or potential users of a product, service, or system. It involves conducting one-on-one conversations with individuals who represent the target audience or user group.

The main objective of a user interview is to understand the users’ needs, behaviors, preferences, and challenges related to the product or service being developed. By directly engaging with users, you can gather valuable feedback, uncover user expectations, and gain a deeper understanding of their experiences and perspectives.

During a user interview, the interviewer typically asks open-ended questions to encourage participants to share their thoughts, feelings, and experiences. The questions can cover a wide range of topics, such as the users’ goals, tasks they want to accomplish, their pain points, satisfaction levels, and suggestions for improvement.

User interviews can be conducted in person, over the phone, or through video conferencing tools. They can be structured, semi-structured, or unstructured, depending on the research goals and the level of flexibility needed. The interviews are usually recorded or documented to capture insights accurately and allow for later analysis.

The findings from user interviews provide valuable input. They help uncover user needs and inform design decisions, leading to the creation of products and services that better meet user requirements and expectations.

Why are User Interviews Important?

User interviews play a pivotal role in the design and development process for several reasons.

Deep understanding of your users

User interviews facilitate a deep understanding of your actual users. By engaging in one-on-one conversations, you can gather valuable insights into users’ goals, motivations, behaviors, and challenges. This understanding forms the foundation for creating products and services that align with user needs.

Uncover needs and preferences

By asking open-ended questions and actively listening to users’ responses, user interviews help you identify pain points, unmet needs, and areas where current solutions fall short. This knowledge guides the decision-making process and ensures that the final product addresses the specific requirements of the target audience.

Obtain validation and feedback

By sharing prototypes or design concepts with users, you can obtain firsthand input and validate your original assumptions. This feedback loop enables you to iterate and refine your designs, resulting in more user-centric and effective solutions.

Minimize assumptions and biases

By actively involving users in the design process, you can gain an external perspective and challenge your own preconceptions. This leads to more objective decision-making and reduces the risk of creating products that are based solely on internal assumptions.

Moreover, user interviews foster iterative improvement. By conducting interviews at different stages of the design process, you can gather continuous feedback, test new concepts, and make iterative refinements. This iterative approach ensures that the final product is constantly evolving and improving based on user insights.

Limitations of User Interviews

In contrast to behavioral data, which captures users’ real interactions with a website, interview data is self-reported, reflecting users’ perceptions and emotions. Self-reported data, including interview data, has its limitations due to the following reasons:

  • Flawed human memory: People may not fully or accurately recall events, leading to potential gaps or inaccuracies in the information they provide during interviews.
  • Selective reporting: Participants may not always know what specific details are relevant to the interviewer, and as a result, they may omit certain information. Minor interactions might be considered insignificant and therefore left out.
  • Individual characteristics: Participants differ in their personality traits and tendencies. Some individuals may be proud or private, while others may be shy or easily embarrassed. These factors can influence their willingness to share certain details with a stranger, affecting the completeness and openness of the information provided.

Approach your interview data with a critical mindset and acknowledge its potential limitations. Combining multiple data sources from different research methods can help mitigate these limitations and provide a more comprehensive understanding of user experiences.

When Should You Conduct a User Interview?

Whether you’re developing a site, app, or product, user interviews can offer valuable insights throughout various stages of the development process. However, there are three specific times when they hold particular significance:

  • Initial project phase: Before a clear concept has been established, conducting interviews gives you a better understanding of potential users and their wants and needs. Gathering information upfront helps you create potential solutions.
  • Early stages of product development: When a product team has a preliminary model or concept, involving users in the process can provide valuable feedback. Sharing the prototype with users allows for user-centered improvements and refinements.
  • Post-launch stage: Interviews conducted after the product has been released can be combined with observation in what is known as a contextual inquiry. These interviews take place within the context of users interacting with the product, offering valuable insights into user behavior and usage patterns.

Each of these interview stages contributes to a holistic understanding of your users and helps you develop user-centric solutions.

How to Conduct User Interviews

We’ve covered the basics of user interviews. Now let’s walk through the steps of conducting one.

1. Find participants who represent your audience

It’s important to find test participants who closely match your actual users, otherwise the feedback you collect won’t be valuable. Conduct a preliminary survey prior to inviting test participants. Make sure their demographics, psychographics, backgrounds, and motivations match your audience.


2. Put yourself in a positive mood

Before starting the user interview, take a deep breath and put a smile on your face. Psychologically, smiling creates a more positive attitude , and positive attitudes are infectious. If you’re in a good mood, your interviewee will feel better about the experience as well.

3. Set a goal for the interview

Before starting the interview, it’s important to have a goal in mind. Don’t settle on something generic, such as “learn about our users.” That won’t focus your questions on what’s relevant.

Come up with a concise goal that’s related to specific user behavior or attitudes. For instance, you might aim to learn what terms people search for to find products like yours, whether they enjoy using your product customizer, or whether they can successfully follow your return process.

It’s okay to have multiple goals for a user interview, as well, provided you have enough time to address them during the session. Just make sure they are focused and relevant.

4. Choose your interview format

User interviews can be structured, unstructured, or somewhere in between (semi-structured). In a structured interview, all questions are predetermined, and there is no deviation from the script. In an unstructured interview, the interviewer and interviewee have an open dialogue that can differ from previous interviews.

Semi-structured interviews are most common. This is where you prepare a set of topics, but have the freedom to change the order of questions and add questions where appropriate. Semi-structured interviews feel more natural and offer the flexibility to dig into topics you didn’t expect to arise.

Focus groups are another possible option, but they usually aren’t suitable for user interviews. One person’s feedback can bias the rest of the group.


5. Build a rapport with the interviewee

At the beginning of the interview, it’s important to put the interviewee at ease. You need to earn their confidence and make them feel comfortable so they give you honest information. Here are some tips to quickly build a rapport:

  • Greet the interviewee by name and offer a warm welcome. Thank them for their contribution. Engage in some small talk if you think that’s appropriate.
  • Use positive body cues, such as maintaining eye contact (but not in a creepy way), smiling, and nodding as they speak. Avoid negative cues, such as crossing your arms, frowning, fidgeting, or seeming distracted.
  • Explain the purpose of the interview and how you plan to use the results. Give the interviewee more context about why you need this information and the types of questions you’ll ask, so nothing catches them off guard.
  • Stick to the product. You don’t want the interviewee to feel like they are being tested, criticized, or evaluated in some way.

6. Ask permission before recording (video or audio)

While note-taking is the prevailing method of data collection in user interviews, audio or video recording can also be an effective way to gather information. However, it is crucial to obtain the participants’ consent before recording.

Prior to commencing the interview, always ask for permission to record, and be prepared to cease the recording immediately if the interviewee expresses discomfort or refuses. Respecting the participants’ privacy and comfort should remain a top priority throughout the entire interview process.

7. Test your visuals and technology

Before starting the interview, make sure that all of the hardware and applications you plan to use are operational. Ensure that your internet connection is reliable. You may find it helpful to ask a friend to do a quick rehearsal with you.

If you plan to use any visuals, links, or other materials during the user interview, pull them up ahead of time on your computer. Make sure to hide anything confidential.

8. Design your interview questions

Never start an interview without preparing your questions beforehand! Before the interview, design a full discussion guide: a document that includes all of the questions you might ask.

Your discussion guide should be closely tied to the purpose of your research, and the questions should be selected according to your learning goal. It should also include questions you might ask if you have time, as well as follow-up questions for cases where the test participant gives an unexpected answer.

Discussion guides typically have two types of questions: general and product-specific. A general question might be something like:

  • Could you tell me about your habits?
  • What does your daily schedule look like?
  • How do you typically solve this problem?

Product-specific questions focus on your product, such as your website or app. Here are some examples:

  • What did you find hard about [the task]?
  • What could we do to make [the task] easier for you?
  • How long does it take to use [the feature]?
  • How did you feel when using [the feature]?

UX Knowledge Base has a great guide on asking good questions.

Your discussion guide is a living document. Iterate on it over time as you get better at conducting user interviews. For instance, if a test participant provides an especially useful piece of information, you might ask future participants questions on the same topic. It’s also okay to abandon questions that continually fail to work.

9. Focus on problems, not solutions

Generally speaking, users don’t know what they want. They aren’t the right people to ask for suggested features or solutions because they aren’t designers and developers. Think of the possibly apocryphal quote from Henry Ford: “If I had asked people what they wanted, they would have said faster horses.”

Instead, focus your questions on their problems. What do they want to achieve? What do they consider a “win?” How do they envision their life after making their problem vanish?

10. Ask follow-up questions

As your participants answer questions, don’t be afraid to ask follow-up questions based on their responses. (This is the major benefit of the semi-structured format.) Try using the Five Whys technique: ask “Why?” multiple times (up to five) until you get to the real meat of their answer.

If you aren’t sure what the participant means, don’t be afraid to ask for clarification. An answer you don’t understand is useless to you. Ask for elaboration, context, or even an anecdote to help you understand what they mean.

If a user mentions something particularly interesting, ask them to show you on their device. For instance, if they report being unable to find your discounted products in your navigation, have them share their screen and show you where they looked.

11. Offer some closing remarks

Towards the end of the interview, it is crucial to spend a few minutes wrapping up and allowing for any unexpected information to surface. Invite the user to share anything they haven’t had the chance to mention yet.

Try concluding the interview with a question like, “Is there anything else you’d like to share with us?” This creates an opportunity to uncover new insights that may be valuable for your research. If the user responds with enthusiasm and says, “Yes, I’d love to tell you about this product I use all the time…” it opens up a new source of inspiration for ideation. Providing space for users to share unprompted information allows for unexpected discoveries.

Some effective questions to ask at the end of an interview include:

  • “Is there anything else you’d like to share with us?”
  • “Is there anything we didn’t discuss today that you think is important?”
  • “What’s one thing that I haven’t mentioned yet but you believe I should know about?”

Then, express gratitude to the users for their time. Take a moment to review your notes or jot down any notable observations that stood out during the interview. Ideally, have another person present to assist with note-taking, or record the session for later review.

What to Avoid During a User Interview

Now that you know how to conduct a user interview, here’s what not to do.

Don’t judge or educate the interviewee

The purpose of the user interview is to learn from them, not teach them. If they feel judged, they may withhold information. Besides, you have a limited amount of time to conduct the interview. Don’t waste it on activities that don’t help you learn.

Don’t ask leading questions

Leading questions  can influence the interviewee’s response by guiding them towards a specific answer. It is important to avoid the use of leading questions during interviews to prevent bias and ensure more objective results. When questions are framed in a leading manner, there is a risk of only hearing what you want to hear, distorting the outcomes of the interview sessions.


For instance, consider the question, “How often do you watch Hulu?” This question assumes that the interviewee is an active Hulu user, potentially influencing their response. To maintain neutrality and obtain unbiased insights, ask questions in a more neutral manner.

A better approach would be to start with an open and neutral question such as, “Do you have any experience with any video streaming services?” This allows the interviewee to provide their own experiences and opinions without any preconceived assumptions, leading to more genuine and valuable responses.

Don’t be afraid of silence

It’s not uncommon to experience moments of silence during a user interview. Don’t feel like you have to fill this space with unnecessary words. Instead, use these moments as an opportunity to allow the interviewee to provide additional information willingly. By giving them space to gather their thoughts or share more insights, you create a conducive environment for richer and more meaningful responses.

Don’t make assumptions – ask the stupid questions

Asking basic questions is crucial in avoiding the oversight of important information that might be assumed or overlooked. Basic questions also serve as a helpful technique to ease interviewees into the conversation, fostering comfort and engagement.

For example, you might initiate user interviews with a question like, “What does your organization do, and how does your role contribute to it?” Although you may already possess some knowledge of the answers, you’ll often discover additional details and nuances that prove valuable for follow-up inquiries or when constructing user scenarios.

Avoid asking closed questions

Closed questions are questions that elicit a “yes” or “no” response. They don’t naturally solicit any valuable information. For instance, if you ask, “Do you use our product every morning?” the interviewee might respond “yes” without elaboration. A better question would be, “Could you tell me how often you use our product?”

Avoid vague questions

Vague questions are ambiguous and difficult to understand. They tend to confuse your test participants. In some cases, they make interviewees feel guilty for not understanding. To determine if any of your questions are vague, test them on random people to see if they understand the question’s meaning and the kind of answer you’re looking for.

Never mention other users

A quick way to prejudice answers is to compare the interviewee to other users. Here’s an example of a bad question: “A lot of people say our navigation is simple. Do you agree?”

In this case, you just told them that people find the navigation easy. If they find the navigation confusing, they may hold back for fear that you’ll think they’re stupid. People want to fit in, so they’ll often agree with the crowd, even if they really don’t.

Instead, focus on open-ended questions that don’t assume anything. Here’s a good alternative: “Please describe your experience with the website’s navigation.”

User Interviews Made Easy

User interviews play a vital role in gaining valuable insights into user behavior, preferences, and needs. By conducting thorough interviews and understanding the purpose behind them, we can gather the necessary information to improve our products and designs. Ultimately, user interviews empower us to create better user experiences and drive the success of our projects.

Like most businesses, you don’t have time to conduct your own user interviews. Recruiting people in your target market to provide feedback on your shopping experience is hard.

Let UserInput handle it for you. We let you set up a new research project in just a few clicks, so you can get the input you need while still making progress on your other goals. Our managed customer research service pairs you with real people in your target market to collect unfiltered feedback about your shopping experience.

The company that understands its customers the best always wins. Our done-with-you research services make sure that company is yours. Start a research project today.



The complete guide to user interviews

Building a better product or service requires one critical insight: a deep understanding of the people you’re building it for. What could be a more effective tactic to achieve this goal than talking to them directly? In this guide, we’ll discuss how to plan, run, and analyze user interviews to get actionable insights.


User interviews reveal in-depth insights that no other research method can, helping you gather qualitative data about your audience’s needs, pain points, and experiences. Typically lasting around 30 to 60 minutes, these sessions give you space to ask an individual user about virtually any topic related to your product—then explore their answers in depth.

These types of interviews provide invaluable insights for product teams, UX designers, marketers, and researchers. But they can also be useful to customer support, sales, and even business leadership.


Types of user interviews

There are three main ways to categorize interviews.

Unstructured vs. semi-structured vs. structured

Structured interviews follow the same format each time, asking participants the same questions with little (if any) space to go in unexpected directions.

Unstructured interviews are free-flowing conversations around a specific topic. While they still answer a specific user research question, they don’t follow a script.

Semi-structured interviews are somewhere in between—they may have a basic format, but the interviewer has the freedom to go ‘off-script’ depending on what the interviewee says.

Generative vs. contextual vs. continuous

Generative interviews help researchers ‘generate’ new insights and ideas. They typically involve exploring a user’s thoughts and experiences in open conversation.

Contextual interviews are conducted in a specific place or scenario—like the participant’s workplace. The interviewer asks questions and observes the participant completing tasks in that specific context.

Continuous interviews involve interviewing the same participant multiple times to see how their perspectives evolve.

Remote vs. in-person

In-person interviews place the interviewer and interviewee in the same (physical) room. Some researchers choose this method because it allows them to observe the interviewee better, and because there are fewer distractions.

Remote interviews are now common thanks to internet-based interview tools. They cost less to conduct than in-person interviews, and can incorporate transcription tools that speed up analysis.


User interviews vs. user testing: what’s the difference?

Both user interviews and user testing are invaluable for teams who need feedback on their work. However, there are a few key differences between them.

  • What it is: user interviews are one-on-one interviews with individual users (or non-users from a target audience); user testing involves research sessions where participants interact with a product while researchers observe their actions and gather feedback.
  • When to do it: user interviews work at any stage of the product development process (market research, concept testing, pre- and post-launch evaluation); user testing happens after developing a prototype, or after a product launch or beta launch.
  • Insights it provides: user interviews reveal user experiences and perspectives on any topic; user testing reveals user experiences and actions when using a specific product.

5 reasons to start interviewing your users

User interviews help you make better decisions at every stage of the product life cycle, from researching users to marketing your finished product.

1. Understand your market

In the early stages of developing a product, you need to understand where the opportunities are in the market. By interviewing potential users from that very market, you learn valuable insights like:

Your target audience’s needs and pain points

How they view other solutions on the market

What their decision-making process looks like

Direct insights like these help you identify gaps in the market and develop ideas for how your product could better serve target audiences.

2. Develop user personas

User personas are fictional characters that represent a typical user from your target audience. 

They give you a framework for understanding those users so you can craft products, services, and marketing campaigns for them. Plus, they’re handy when you want to draw up a customer journey map for an audience segment.

Creating user personas is simple: interview a handful of users about their needs, asking a small number of set questions. Then, create a profile of a person who represents an ‘average’ of the common answers you receive.

👉 Read our guide to user personas to learn more

3. Get feedback on ideas and concepts

When you’ve got an idea for a new product or feature, user feedback can help you decide whether to invest in it. 

Conduct user interviews with people from your target audience to see if the product has potential and to learn what you could improve. Finally, use your findings to report back to your team and get buy-in from stakeholders.

4. Improve existing products

After launching a product, you’ll be able to gather data on how users respond to it—like conversion, sales, and product usage figures. However, this kind of data won’t tell you why users act the way they do.

User interviews let you probe users about their likes, dislikes, experiences, and behaviors, so you can identify their frustrations or unmet needs.

5. Create and optimize marketing campaigns

Your users probably see hundreds of marketing messages each day, and they’ll ignore most of them. To create successful campaigns, you need to make sure your marketing materials resonate deeply with your target audiences.

Interviews help you understand user pain points, needs, and perspectives. This way, your marketing messages join the conversation in users’ heads. 

👉 Ready to experience these benefits for yourself? Check out our tips for mastering user interviews.

Combine user interviews with other research methods for a complete picture of your audience

User interviews yield even more valuable insights when combined with other UX research methods. Consider these powerful combinations for an in-depth understanding of your audience.

Usability testing + user interviews

Many product and UX teams regularly conduct usability testing to get feedback on existing products. This research method involves asking participants to complete tasks with a product, then observing their behavior. 

To get further clarity, conduct an interview with the participant immediately after each usability test. This gives you a clearer picture of how real people experience your product, what they struggle with, and what they would change.

Surveys + user interviews

Surveys are a tried-and-tested tactic to help you learn from your customers—but they only let you scratch the surface of a topic. When you discover something significant from your survey responses, follow up with interviews to learn more.

🔥 If you’re using Hotjar

Hotjar Surveys and Interviews are a match made in user research heaven. Conduct interviews to discover what new features your users need, and narrow the list of suggestions down to a top three. Then, decide which feature to build first by sending a survey to your users and getting feedback at scale.


Hotjar Surveys lets you gather qualitative data in the form of open-ended question responses, plus quantitative data from scale-based questions

User behavior data + recordings + user interviews

When conducting UX audits or launching conversion rate optimization (CRO) initiatives, you’ll look at several types of data. For example, you might analyze web analytics data to see which pages users leave from, then view session recordings to find out what users do before they leave.

This kind of data exploration can reveal some interesting behavior trends—but the reasons behind user behavior aren’t always clear. User interviews are an illuminating final step in the process, where users reveal exactly how they perceive problems on your website or product, in their own words. 

Who should you recruit for user interviews?

According to Sara Hefny, a Senior Product Researcher at Hotjar, getting the right audience is key. “Make sure you have a clear idea of who is best equipped to respond to the questions you need answered,” she says. 

Depending on your user research goals, you’ll need to recruit existing users, or members of your target audience who aren’t currently users. 

Here are some common goals and the users you should recruit to achieve them:

Sara also advises creating a screening question to filter out people who don’t fit the criteria you need for your target audience.

Not sure where to start? Recruit participants directly from Hotjar Engage, tapping into our pool of 175,000 users from a wide range of demographics, and automatically screening them.

Try it out! Our free Basic plan gives you one free interview from the Hotjar participant pool, and up to five per month from your own network.

How to do user interviews in 6 steps

Follow these steps to plan, run, and learn from your interviews. We’ll discuss these in greater detail in a later chapter of this guide, so keep a lookout. 👀

🤖 Want to take your user interview game to the next level? Use AI to improve user interviews . Artificial intelligence can help you:

Enhance understanding and connection during interviews

Improve interview inclusivity and accessibility

Streamline administrative interviewing tasks

Find the hidden insights in your user interviews 

When it comes to building great products for your customers, research is key—and the devil is in the details. Often, the most valuable insights will come from passing thoughts your interview participants share, or trends you’ll only notice when analyzing multiple interviews.

For this reason, developing great interview skills is invaluable for teams. By asking the right questions and listening for those important small details, you’ll uncover what your users need most. And when you equip your team with the right tools, you can ensure you don’t miss any of those business-boosting user insights.

Use Recordings and Heatmaps to learn what your users are doing, then understand their experiences with Surveys and user interviews.

User interviews FAQs

What is the best way to conduct user interviews?

The best way to conduct user interviews will depend on your research goals, but in general:

Choose participants that represent your audience 

Set a research goal and plan questions accordingly 

Speak with interview participants for 30 to 60 minutes

Ask open-ended questions and avoid leading questions

Record and transcribe your call for later analysis

How long do user interviews take?

User interviews typically take between 30 and 60 minutes. Aiming for this length of time means you can explore relevant topics deeply without overwhelming your participants.

How many people should I interview?

Many product teams find that around five user interviews are enough to answer research questions about a single audience segment. If you’re trying to learn about several audience segments, you’ll probably need around five interviews per segment.

Nielsen Norman Group

Writing an Effective Guide for a UX Interview


February 28, 2021


In This Article:

  • Introduction
  • Step 1: Write Your Research Questions
  • Step 2: Brainstorm Interview Questions
  • Step 3: Broaden Your Questions
  • Step 4: Fill In for Unaccounted Research Questions
  • Step 5: Arrange Your Questions
  • Step 6: Prepare Additional Probing and Followup Questions
  • Step 7: Pilot Your Guide

In the discovery phase of product development, user interviews are often used to capture important information about users: their backgrounds, beliefs, motivations, desires, or needs. Typically, the interviews carried out at this stage are semistructured (referred to as “depth interviews” by market researchers) — they generally have a predefined structure, but also allow the interviewer the flexibility to follow up on significant statements made by participants.

In a semistructured interview, the interviewer uses an interview guide (also referred to as a discussion guide). Unlike an interview script — which is used in structured interviews — an interview guide can be used flexibly: interviewers can ask questions in any order they see fit, omit questions, or ask questions that are not in the guide.

A good interview guide paves the way for a deep, free-flowing conversation with participants. (Obviously, the way you conduct the interview will also be important, but that’s another article!) Interview guides should include a few concise, open-ended questions to encourage participants to tell their stories. These will be followed by carefully crafted, probing questions to uncover motivations or beliefs behind certain reported behaviors or utterances.

Constructing a good interview guide can be tricky and time-consuming. It’s not uncommon to spend a full day crafting one. However, it’s important to have one to ensure you get the most out of your interviews. Without an interview guide you run the risk of:

  • Asking leading questions as you try to think of questions on the spot
  • Not covering topics relevant to your research questions in each interview

Ultimately, without an interview guide, you are in danger of compromising the validity of your data. Here are some steps meant to help you create an interview guide if you’re new to this practice.

Step 1: Write Your Research Questions

Sometimes the research questions are clear and obvious. However, sometimes they’re not. Maybe you’ve realized you need to speak to users because you don’t know anything about them. Good! But what do you want to find out from them? These are your research questions. Write them out first, before working on your interview guide, as they will shape your interview questions. Below are some examples of research questions:

  • What are users’ expectations in this situation?
  • How do users make a decision in this situation?
  • How have users managed to solve this problem in the past?
  • What aspects of this product do users care most about, and why?

Step 2: Brainstorm Interview Questions

Next, note down all interview questions that come to mind. It doesn’t matter whether they are good or poor — you’ll deal with that later. You can use mind maps, digital whiteboards, or a simple list — whatever works for you! Sometimes, further research questions pop up here. That’s fine; add them to your list of research questions.

Step 3: Broaden Your Questions

It’s typical after step 2 to have a long list of mostly closed questions. Those kinds of questions wouldn’t make for a good interview because they won’t allow for unanticipated stories and statements to emerge and can limit your ability to build rapport with the participant. (Rapport is important if you’re looking to gather accurate, in-depth information from your participants.)

Review your list of questions and ask yourself, for each interview question, is there a broader, more open-ended version of that question that you can ask instead?

For example, consider the following closed questions that could be asked in an interview with an employee.

  • Do you work in an office?
  • Is the work mostly desk-based or paper-based?
  • Do you have to attend meetings during the workday?
  • Do you work in a team?

The above questions could be answered by asking the participant to describe a typical day at work. It’s likely that in doing so, the participant may cover all or many of the above. If the participant has not covered everything, then some of these can be asked as follow-up questions.

Example questions that prompt the user to recall a certain event, similar to those used in the critical-incident method, are excellent for gathering stories and unanticipated statements. For instance, imagine you’re conducting an interview to learn about people’s experiences cooking at home. The following example questions give participants the opportunity to tell many different stories and give you a glimpse into their lives.

  • Tell me about the last time you cooked at home.
  • Tell me about a time where you cooked something new.
  • Tell me about a time when you cooked something that turned out well.
  • Tell me about a time when you cooked something that didn’t turn out as you hoped.
  • Tell me about a time when you were thinking about cooking something but decided to get takeout instead.

Step 4: Fill In for Unaccounted Research Questions

Align each interview question to your research questions. If you have research questions that are not addressed by any of your interview questions, fill in the gap by crafting some more interview questions. Repeat step 3 if needed.
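This alignment step is essentially a coverage check, which can be sketched in a few lines of code. A minimal illustration — all research questions, interview questions, and tags below are invented examples, not part of any real guide:

```python
# Hypothetical coverage check: every research question should be addressed
# by at least one interview question before the guide is finalized.

research_questions = {
    "expectations": "What are users' expectations in this situation?",
    "decision": "How do users make a decision in this situation?",
    "past_solutions": "How have users solved this problem in the past?",
}

# Each interview question is tagged with the research questions it addresses.
interview_questions = [
    ("Tell me about a typical day at work.", {"expectations"}),
    ("Walk me through the last time you chose a tool for this task.", {"decision"}),
]

# Union of all tags tells us which research questions are covered.
covered = set().union(*(tags for _, tags in interview_questions))
gaps = set(research_questions) - covered

for key in sorted(gaps):
    print("Not yet covered:", research_questions[key])
```

Running this flags the "past solutions" research question as uncovered, prompting you to draft another interview question for it.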

The interview guide can include your research questions. Some researchers like to remind themselves of the aims of the research by displaying these either at the top of the guide or alongside the interview questions.

Step 5: Arrange Your Questions

To make the conversation flow in a logical order and seem natural, think about the best order for your questions. For example, if you’re talking about an experience people have had, it makes sense to move in chronological order. If the experience has set phases (such as discover, choose, purchase, use, review) that you might have documented in a user-journey map, service blueprint, or experience map, then you may want to align your questions to these phases. That’s not to say you can’t depart from this order in the interview if you need to!
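Ordering questions by journey phase is mechanical enough to sketch in code. In this illustration, the phase names follow the example above, while the questions and their phase tags are hypothetical:

```python
# Hypothetical sketch: sort interview questions by the journey phase they
# cover, using the canonical phase order from the journey map.
PHASES = ["discover", "choose", "purchase", "use", "review"]

# (question, phase) pairs -- all example content is invented.
questions = [
    ("How do you decide when it's time to replace it?", "review"),
    ("Where did you first hear about this kind of product?", "discover"),
    ("Walk me through the day you bought it.", "purchase"),
]

# Sort by each question's position in the canonical phase order.
ordered = sorted(questions, key=lambda q: PHASES.index(q[1]))
for text, phase in ordered:
    print(f"[{phase}] {text}")
```

The sorted guide then walks the participant through discovery, purchase, and review in the order they actually experienced them.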

You should also think about preparing some warmup questions that are open-ended and easy to answer, to build rapport at the beginning of your interview. For example, “Tell me a little about yourself” is a typical opening question that gets the participant talking. Any questions that require reflection should come later in your guide; introducing them too early could be overwhelming, and you might get stereotypical responses because participants haven’t had a chance to recall events and feelings and form judgments.

Step 6: Prepare Additional Probing and Followup Questions

Once you’ve ordered your questions, go through each one and prepare followup questions meant to provide you with additional detail and clarification, such as “Where were you when this happened?”, “When did that happen?”, “Tell me why you did that?”, and so on.

You can include probing questions , too, to help you to remember to ask them — for example, “Tell me more about that”, “Tell me why you felt that way”, “Why is that important to you?”

Step 7: Pilot Your Guide

Piloting your guide will give you an idea of:

  • Questions you should ask but aren’t yet included in your guide
  • Questions that need rewording
  • Whether the question order works
  • Whether you will have time for all your questions

Recruit a pilot participant and give yourself enough time to make some changes. It’s okay to make updates to your guide throughout your interviews, but the point of piloting your guide is to fix any glaring issues before commencing research.

A guide will provide focus to your interviews and ensure that they are successful. Your interview guide should consist of broad, open-ended questions that allow participants to tell you about their experience in detail. These questions will be accompanied by many probing and followup questions, used to capture further details and gain clarification. You can download an example of an interview guide to refer to as you create your own interview guides.

Example Interview Guide (PDF)

Free Downloads

Related Courses

User Interviews

Uncover in-depth, accurate insights about your users

ResearchOps: Scaling User Research

Orchestrate and optimize research to amplify its impact

Survey Design and Execution

Learn how to use surveys to drive and evaluate UX design

Related Topics

  • Research Methods

Learn More:


How to Present UX Research Results Responsibly

Caleb Sponheim · 3 min


Using the Funnel Technique in User Interviews

Maria Rosala · 3 min


Probing in User Interviews

Related Articles:

Why User Interviews Fail

Maria Rosala · 7 min

Should You Run a Survey?

Maddie Brown · 6 min

The Funnel Technique in Qualitative User Research

Maria Rosala and Kate Moran · 7 min

Open-Ended vs. Closed Questions in User Research

Maria Rosala · 5 min

Quantitative Research: Study Guide

Kate Moran · 8 min

Interviewing Users

Jakob Nielsen · 6 min



How to Prepare for a User Interview and Ask the Right Questions

Getting great interview results requires careful preparation. You need to be clear about the purpose of your research, decide whom to recruit, do all the practical preparations, and—finally—you need to design a great interview guide detailing the questions you want to ask. Here, you will learn how to prepare for user interviews and how to ensure that you ask the right questions, in the right order.

Preparing for an interview requires you to think through each step of the project: what you need to find out, whom you need to ask, where you will conduct the interview, how you will record the interview and what the best way is to ask questions in order to get a good flow. Here, we will start by discussing the practical preparations; then we will discuss how you decide how many participants you need. Finally, we will elaborate on how to prepare an interview guide with relevant questions.

Interview Preparation

It’s important that you start your interview project with a clear idea of the purpose of your research: why you want to do the research and what you want to find out. Just as you can’t pick a tool before knowing whether you need to hammer a nail, turn a screw or drill a hole, you shouldn’t choose your research method before you know what you want to find out. In a qualitative study such as semi-structured interviews, it’s perfectly fine for your overall research question to be broad and exploratory—e.g., “We want to find out how people use video streaming in their everyday lives and how they feel about the services they use.”

When defining the purpose of your research, remember to involve the most important stakeholders in the design project you are working on. It’s vital to ensure you have stakeholder buy-in for your interview project, and a good way to do that is to make sure that you all agree on what you will get out of the research and the effort it will require.

In relation to more practical considerations, you need to think about what user group(s) you want to involve, how you are going to recruit them, where the interviews should take place, and how you will record data from the interview.


In this video, Ann Blandford, professor of Human-Computer Interaction at University College London and an expert in qualitative user studies, talks about different ways in which the interview setting affects the atmosphere of the interview.

As you answer these questions, you will find yourself with a list of practical tasks you need to perform before you do the interviews—e.g., arrange travel, book a suitable location, recruit participants, and decide on a recording device. Many researchers just use their smartphones for audio or video recording. That’s fine, but remember to ensure you have enough storage space and battery power to record the duration of the interview.

Recruiting Participants

There are no set rules as to how many participants you should include in an interview project—it depends on what you are studying, how many user groups you want to involve, and how many resources you have for your project.

The grounded theory methodology recommends that you don’t decide beforehand how many participants you need, but let the results guide you as to when you should stop. In that case, you start out interviewing (e.g.) five participants. You then look at what your participants have told you so far to see how similar their answers are and how many new topics appear with each new interview. If you notice by participant 5 that you didn’t get much information you didn’t already have from the previous interviews, you probably don’t need to recruit any more participants. However, if participants give very different answers and new topics continue to arise, you should recruit more participants—ideally, continuing the process until you reach information saturation.

Similarly, some of the results might indicate that you need to include a user group you had not previously considered, so you have to modify your recruitment criteria. Let’s say you have interviewed one female and four male participants, and you discover that the female participant gives completely different information than the male participants. In this situation, you would probably want to recruit more female participants to ensure that you explore both perspectives.

In most interview projects, practical considerations such as time and resource constraints also play a big role in deciding how many participants you end up recruiting. That often means interviewing fewer participants than you would ideally prefer. In that case, you need to narrow down the focus of your project and refrain from following up on all potentially interesting themes in your research. This may seem hard when you’re on the cusp of discovering fascinating areas you hadn’t first noticed, but practicability is essential—you need to home in on a localized area and take things from there.
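The saturation stopping rule described above can be made concrete: treat each interview as the set of topics it surfaced, and stop recruiting once a new interview adds nothing you haven’t already heard. A rough sketch, with entirely invented topic sets:

```python
# Hypothetical sketch of an information-saturation check: stop recruiting
# once a new interview surfaces no topics we haven't already seen.

interviews = [
    {"price", "ads", "offline mode"},          # participant 1
    {"price", "recommendations"},              # participant 2
    {"ads", "recommendations", "downloads"},   # participant 3
    {"price", "downloads"},                    # participant 4: nothing new
]

seen = set()
saturated_at = None
for i, topics in enumerate(interviews, start=1):
    new_topics = topics - seen   # topics this participant added
    seen |= topics
    if not new_topics:
        saturated_at = i
        break

print(saturated_at)
```

In this toy data, participant 4 raises no new topics, which is the signal that you may have enough participants for this user group; in practice you would apply the same judgment qualitatively rather than with literal topic sets.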

10 stick figures, 9 of them shaded black and one in red. © Clker-Free-Vector-Images, CC0

Aside from having a clear idea of the purpose of your research and ensuring that you make all the practical arrangements, writing your interview guide is the most important preparation for the interview itself.

The Interview Guide

The interview guide is a document in which you formulate the questions you want to ask your participants, in the order that you expect to ask them. In other words, your interview guide is your script for the interview. When you write your interview guide, think about what you want to know and then formulate concrete questions based on that. Your interview guide is closely tied to the purpose of your research, and it ensures that you can deliver the insights you promised to deliver.

It’s a good idea to test the interview guide—try asking yourself or somebody else the questions, and then determine whether or not you can deliver the insights you promised based on the answers you get. Your interview guide is also a good tool to help you think through the best way of asking questions before the interviews.

During the interviews, the guide serves as a reminder of the themes and questions you want to make sure you cover. Although you write down the questions as you would ask them, in the order you think makes the most sense, you don’t necessarily end up following that order during the interview itself. In a semi-structured interview, you have a carefully laid plan and you are responsible for the overall structure of the conversation, but you also let the flow of the conversation decide how and when to ask questions. In other words, you have to safeguard the interview process from being derailed by off-topic points, while at the same time allowing a natural style so that the interview can carry itself. This takes a good ear and a sharp eye (and memory) for detail.

In this video, Ann Blandford talks about the best way to order questions in a semi-structured interview session and how much you should stick to your theme when you do an interview.

You can download our overview of how to structure a user interview here:

How to Structure a User Interview

How to ask questions

“The best scientists and explorers have the attributes of kids! They ask questions and have a sense of wonder. They have curiosity.” —Sylvia Earle, Marine biologist, explorer, and author

When you conduct an interview, make sure you ask questions in a way that gives you the information you are looking for and makes it easy for participants to answer. Questions should be relatively brief and easy to understand, and you should speak in a vocabulary that is familiar to your participant. If you think of the video streaming example from earlier, it’s not a good idea to ask participants directly how video streaming fits into their everyday lives. You must turn your overall research question into more concrete questions that will, together, answer your overall question. Instead, you could ask questions like “Can you tell me about the last time you used video streaming?” or “How have your movie/TV-watching habits changed since you started using video streaming services?”.

In their book InterViews: Learning the Craft of Qualitative Research Interviewing, psychology researchers Steinar Kvale and Svend Brinkmann state that you should ask the concrete “how” and “what” questions before you ask the more abstract “why” questions. Even if you are mostly interested in why people are doing something, that can be difficult for them to answer. “How…?” and “What…?” prompt direct responses from an interviewee, but “Why…?” requires getting behind the scenes and looking at the factors driving a person. A “why” question can also sound rhetorical in some cases, so the person asked may feel a little defensive in addition to feeling confused. Given that, you’re sometimes better off finding answers by deducing them from what your interviewees do in concrete situations.

A good way to help people recollect how they normally do something is to ask for concrete examples. In this video, you will see Ann Blandford explain how to use concrete examples and the critical incidents technique in interviews.

You can download our template explaining how to use concrete examples and critical incidents in interviews here:

Concrete Examples and Critical Incidents

You can also download an example of an interview guide here:

Interview Guide

The Take Away

Before you can conduct user interviews, you need to make practical preparations and design a good interview guide. When you design your interview guide, think about what questions are most suited for the beginning, middle, and end of your interviews—but be prepared to change the order of questions to suit the flow of the conversation during the interview. When you ask questions, be as concrete as possible and ask them in a way that makes it as easy as possible for people to recollect. An effective way to help people recollect past experiences is to use the critical incidents technique.

References & Where to Learn More

Ann Blandford, Dominic Furniss and Stephann Makri, Qualitative HCI Research: Going Behind the Scenes. Morgan & Claypool Publishers, 2016

Steinar Kvale and Svend Brinkmann, InterViews: Learning the Craft of Qualitative Research Interviewing. SAGE Publications, 2009

You can read more about why it’s a great idea to ask lots of open-ended questions in this article from the Nielsen Norman Group: Open-Ended vs. Closed-Ended Questions in User Research

Hero Image: © SteveRaubenstine, CC0

What You Should Read Next

  • A Simple Introduction to Lean UX
  • How to Do a Thematic Analysis of User Interviews
  • How to Conduct User Interviews
  • 7 Great, Tried and Tested UX Research Techniques
  • User Research: What It Is and Why You Should Do It
  • How to Conduct User Observations
  • How to Visualize Your Qualitative User Research Results for Maximum Impact
  • Shadowing in User Research - Do You See What They See?
  • Creating Personas from User Research Results
  • The Basics of Recruiting Participants for User Research



Share Knowledge, Get Respect!

or copy link

Cite according to academic standards

Simply copy and paste the text below into your bibliographic reference list, onto your blog, or anywhere else. You can also just hyperlink to this article.

New to UX Design? We’re giving you a free ebook!

The Basics of User Experience Design

Download our free ebook The Basics of User Experience Design to learn about core concepts of UX design.

In 9 chapters, we’ll cover: conducting user interviews, design thinking, interaction design, mobile UX design, usability, UX research, and many more!

New to UX Design? We’re Giving You a Free ebook!

The Complete Guide to Conducting UX Research Interviews

Forbes reports that every dollar invested in UX yields $100, an impressive 9,900% ROI. However, you can’t create a product for the user if you don’t know what they need. One of the best ways to get data that improves product design is to ask the user.

UX research interviews help researchers, product teams, and UX designers create better user experiences. The insight you gather helps you understand the needs, wants, and pain points of your target audience.

So, how do you determine who will use your product? Which demographics should you target? How do you design questions that generate the most insight for you? These are some of the questions we’ll be answering.

two people doing user interviews

In this guide, we’ll cover:

  • What is a UX research interview?
  • When and why to conduct user interviews
  • How to recruit participants for user interviews
  • How to prepare for a user interview
  • How to conduct user interviews
  • Framing interview questions to uncover insight
  • Turning interviews into research findings

What is UX research?

UX research is the study of user interactions to obtain insights that improve the design process. With UX research, you can create products and solutions that cater to users’ needs. The primary goal of UX research is to build products for the end user based on real data, not on what you think the user wants.

For example, United Airlines increased online ticketing by 200% and doubled the number of daily sessions by conducting UX research to better understand their audience.

United Airlines increased online ticketing by 200% from user research

UX researchers employ various research methods to gather data and uncover design opportunities. Most researchers start the UX research process with qualitative measures to determine the user’s needs and motivations. However, they also use quantitative methodologies to test their findings.

What Is A User Interview?

A user interview is a qualitative UX research method performed with prospective users of a product, often during early concept development. It’s a popular technique that allows researchers to explore topics around users’ motivations, feelings, and even how they use various products.

During user interviews, the UX researcher asks participants questions around behavior, use of a system, and habits to learn more about a specific topic.

Whereas focus groups interview multiple users at once, user interviews are one-on-one sessions, typically run by a single interviewer who is sometimes joined by a note-taker.

To ensure success, you must define the topic of the interview and choose the right participants for your target group.

When Should You Conduct User Interviews?

Exploration.

According to the Interaction Design Foundation, organizations that invest in UX during a project’s concept phase reduce product development cycles by 33 to 50%. Also, fixing errors after development costs up to 100x more than fixing them before development.

Conduct user interviews at the start of the project, before you have a clear concept. Data from UX interviews gives you a better understanding of the aspects of your users’ daily lives that relate to the project.

User interviews can also be used when you have an early model. You learn about flaws in the product and use the feedback to improve the user experience.

In this scenario, context shows how the product will be used in the user’s domain such as the office, workstation, or daily routine. The insight could form the basis for future user research such as questions to ask in larger surveys.

Observations

After launch, you can combine interviews with observing user actions to see how users interact with your product. Interviews don’t always provide sufficient insight because it’s difficult for users to explain how to use your solution in daily activities.

Contextual inquiry integrates observation and interviews. You ask participants a question and observe them perform an activity with your product. The participant also walks you through their interaction so you can ask follow-up questions after.

Why Should You Conduct User Interviews?

6 reasons to do ux and user interviews

The User Knows Best

According to Acquia, 53% of consumers feel that brands do not meet their experience standards. Dozens of UX research methods focus on validating ideas, concepts, and designs, all aiming to answer the question: “Does this product work?”

The answer is based on a deep understanding of the human psyche, previous knowledge of the problem, and rationality.

Interviews, however, are more flexible and informal, while still structured enough to ensure you’re getting answers from users that inform product design.

Also, user interviews help you to plug knowledge gaps in your product. Finding information that you weren’t necessarily looking for (good or bad) could make all the difference in a successful product launch.

Gain Deeper Insights Into Topics

If you have a problem that requires further research, user interviews help you to understand the user’s experience or opinion about a concept or product.

It’s a more intimate setting to listen to your target audience talk about an issue and gain a deeper understanding of the topic. They highlight the best features of the product and areas of improvement.

More than just words, you’re listening for clues around how they express themselves when talking about pain points, wishes, and needs.

Humanize Your Product

One of the key steps in conducting user research interviews is to create personas. Each user segment is represented by a name, face, job, and other key demographics you associate with your ideal user.

Each time you conduct an interview, you can check if personas accurately represent your target audience or if there was an important feature you left out when considering demographics.

More importantly, you connect with your audience as humans and understand how the problem affects their daily lives. The insight from such personalized interactions leads to better product integration in their lives.

How to Recruit Participants For UX Research Interviews

how to recruit for user interviews ux research

1. Decide Whom to Recruit

As a rule of thumb, you should recruit participants that represent your end-users. Characteristics could be as narrow or broad as the scope of the project. However, it’s crucial to have a diverse group and to consider accessibility, such as how disabled users would interact with your design.

To prevent bias, avoid recruiting your colleagues, family members, or close friends. They may feel obliged to say nice things about the product which defeats the purpose of the research.

Also, avoid recruiting too many participants from a single profession. The Interaction Design Foundation describes this risk as déformation professionnelle.

For example, if you’re creating a robot vacuum and you only invite women aged 35 and older, you risk alienating men and younger people who may also need the product. A narrow demographic gives feedback based only on its own knowledge of similar solutions, which can lead to products that fail to satisfy the wider audience. Make sure you’re capturing participants across all segments of your audience.

None of this is possible, however, unless you’ve mapped out buyer personas to guide recruitment.

2. Build Research Panels

Building a database of prospective research participants requires time investment. However, it creates a sustainable process for finding research candidates quickly. A research panel also ensures that you’re contacting people who are interested in your product.

Use the following research methodologies to help:

3. Through Customer Support

Since customer support teams have direct interactions with customers, they can help you identify participants for research projects. They regularly get feedback from customers who want to suggest future product improvements or complain about features that aren’t working for them.

4. Live Intercepts

Live intercepts are an affordable way to capture users in real time as they’re doing tasks. You can use tools like Ethnio to identify and screen website users who may be a good fit for your research.

The Nielsen Norman Group used this technique to recruit participants for a moderated usability study aimed at evaluating the success of their content.

Nielsen Norman Group intercept popup

Image Source: Nielsen Norman Group

If users qualify, you can schedule a conversation and invite other researchers to observe remotely.

5. Social Media

Social media is a great way to show your target audience that you’re invested in improving product experience as well as promoting the role of research in your company.

If you have a decent following on social media platforms, ask your followers whether they’d like to participate in your research project.

For example, if your target audience is a group of mums with toddlers, you could search Facebook for relevant communities. Here’s a sample result:

social media groups for ux research

Likewise, if the product is for SaaS founders, you could find participants in both Facebook and LinkedIn communities.

linkedin groups for ux research

Where does your target audience hang out? That’s where you want to be.

6. Search Online Communities

Similar to social media, online communities like Reddit and Slack channels have ready-made participants waiting for you.

If you’re creating software to improve product design, you’d want a group of product designers for your user interview. I searched Google for “product design Slack” and this featured snippet came up.

slack groups for ux research

How to Conduct User Interviews

user interview checklist

Before the interview

Screen participants.

At this stage, you have a list of participants, but not everyone is going to be a great fit for your project. Screening user research participants helps you find candidates who represent your ideal target audience.

A few tips to help ensure screening success include:

  • Define the attributes you want to see in participants such as behaviors, psychographics, and demographics
  • Ensure you have a diverse pool of candidates
  • Write down your screener questions based on the attributes above
  • Ask qualifying questions at the beginning of the survey
  • Use a survey tool to build a library of screening questions
  • Use accessible language and avoid jargon
  • Limit the number of open-ended questions
  • Keep the screener short and precise
  • Start broad and get narrow as you progress
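
Once the screener responses are in, qualifying candidates is essentially filtering a list against the attributes you defined. The sketch below is a minimal, hypothetical illustration in Python; the attribute names (`uses_product_weekly`, `role`) and target roles are invented for the example, not part of any real screener tool.

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    name: str
    uses_product_weekly: bool  # behavioural attribute (hypothetical)
    role: str                  # demographic attribute (hypothetical)

def passes_screener(c: Candidate) -> bool:
    """Qualify candidates who match the attributes defined for the study."""
    target_roles = {"product designer", "ux researcher"}
    return c.uses_product_weekly and c.role.lower() in target_roles

pool = [
    Candidate("Ada", True, "Product Designer"),
    Candidate("Ben", False, "Accountant"),
    Candidate("Chen", True, "UX Researcher"),
]

qualified = [c.name for c in pool if passes_screener(c)]
print(qualified)  # → ['Ada', 'Chen']
```

The same filtering logic applies whether you run it in a spreadsheet, a survey tool, or a script: define the qualifying attributes first, then apply them consistently to every respondent.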

Determine Incentives for Participants

Incentives are a great way to entice participants into UX research. When determining incentives, make sure they’re commensurate with participants’ time and effort, or you’ll risk a low response rate.

Ensure that people aren’t signing up just for the money, or their responses might not be as insightful. NN Group research found that 63% of participants received monetary incentives, 13% received a mix of monetary and non-monetary incentives, and 9% received no incentive.

ux research incentives guide

For remote interviews, digital incentives tend to be the norm: an Amazon gift card or prepaid Visa card helps ensure participants show up and stay engaged during the interview.

Set Clear Goals

It’s crucial that you have a clear purpose for each interview project.

A few questions to guide you here include:

  • What information do I need from our users?
  • How will the knowledge inform the UX design process?
  • What do stakeholders want to learn from the research?

Make sure stakeholder goals are realistic. Overly broad goals make it very difficult to get feedback that’s relevant to your UX design needs.

Prepare for the Interview

Allocate sufficient time for each interview and prep time between interviews. You’ll need to go through your interview guide to feel confident when starting the interview.

Write Down Your UX Research Interview Questions

Never go into a user interview without a discussion guide. This is not the place to “wing it”. A discussion guide is a document that contains a list of questions to ask research participants. It must be tied to the purpose of the research and chosen according to your learning goal.

Preparing a list of questions ensures that you will:

  • Include your team’s feedback in the interview process
  • Write clear and concise questions
  • Cover all the questions you wanted to ask, which wouldn’t be possible on the spot

Here are Some User Interview Questions to Get the Most Insight

Discovery Questions

  • Tell us about yourself and your background.
  • How did you feel before using this product?
  • What problems do you want this product to solve for you?
  • How did you feel after you started using this product?
  • If this company went out of business, what alternative would you use instead?
  • What do you like about the product?
  • What do you dislike about the product?
  • What apps do you use regularly for these tasks?
  • What is the hardest part of completing the task?
  • Please describe your experience with… or how you use the product.
  • How much do you know about this topic?
  • How often do you use similar products?
  • What exactly do you use the product for?
  • Why do you use the product?
  • What could be done to make the product better for you?

Questions to Gather User Behavior

  • How would you describe your current or past experience with the product, app or website?
  • What is the most important task you need to perform with the product?
  • How do you navigate to the product? If it’s a website or app, do you use search engines, enter the URL directly or bookmark the site?
  • What do you often look for that's missing or hard to find when using this product or application?
  • If you had a question about this product do you know who to contact? If yes, whom would you contact?

Questions About Past and Future Use

  • Can you recall a past situation when you faced a challenge with this problem? What did you do?
  • What’s your most memorable interaction with the product?
  • If you could picture it, what does the ideal product experience look like?
  • What are the most important features of the product?
  • How do you usually access the product? Via desktop, tablet, or mobile? (If the product is a software or website)
  • What would make you stop using this product?

Specific Task Questions

  • Could you show me how you use the product to perform the task?
  • Assume that I’ve never used this product before. How would you guide me so I can do it myself next time?
  • Walk me through your process for using the product to complete a task
  • Do you include other tools alongside the product?
  • If yes, can you show me how you integrate the tools and what functions they perform?

Follow Up Questions

Use follow-up questions to dig deeper into a topic. Often, a participant’s first answer won’t be clear enough, and follow-up questions help you better understand their point of view.

Use the Five Whys technique to drill down to the root of a problem by asking “why” five times. Without asking “why”, you may misinterpret the participant’s motivations.

five whys technique for ux research

However, it’s also important to know when to stop: you’ve reached the root of the problem when asking “why” no longer yields a useful response.

Other examples of follow-up questions include:

  • You said… can you explain a bit more about that point?
  • What do you mean by…
  • Interesting, could you give me an example or elaborate to help me understand better?
  • To be clear, it sounds like you’re saying… is that correct?
  • What was it about the product that made you say…
  • Why don’t you like…

Questions to Close the Interview

  • Would you recommend this product to someone else? If yes, why? If not, why?
  • Is there any question we haven’t asked that you think would be valuable to our research?
  • Is it okay if I reach out with more questions regarding this project?
  • Do you have any questions for me?

During the interview

Make your interviewee feel comfortable.

Take a deep breath and smile before you enter the interview room. According to Psychology Today, a smile is a powerful tool to improve your mood and make you feel happier. Since positivity is contagious, research participants will feel more at ease during the interview.

Also, dress casually rather than wearing a formal outfit so it doesn’t feel like a job interview. Tell them you’re here to test a product, not the participant.

With remote interviews, start with small talk to loosen them up. Introduce yourself, tell them about the research, and summarize their rights and terms of participation.

Use a Semi-Structured Interview Format to Encourage Dialogue

In a semi-structured interview, you don’t rigidly follow a formal list of questions. Rather than yes-or-no questions, you ask open-ended questions that allow for discussion.

A semi-structured interview encourages a two-way conversation that leads to a comprehensive understanding of the topic. Since the interviewee is at ease, they are more likely to expand on experiences and techniques that offer better insight.

Resist the Urge to Educate

Curiosity and an open mind are key ingredients of a good interview. You’re there to learn, not to teach. Do not judge or correct your interviewee, no matter how silly their answers seem. Your goal is to get as much information as possible in the limited time available.

Build Rapport with Participants

Building rapport encourages participants to share their thoughts and opinions. Put yourself in their shoes for a moment. You’re walking into a room to speak with strangers. Subconsciously, you censor your thoughts and withhold information because you’re not sure what to say.

Rapport equals a good connection. If participants trust you, they are more forthcoming in their answers.

A few tips to help you build rapport during UX interviews include:

  • Be accommodating
  • Keep your body language open at all times
  • Remove barriers between you and the participant
  • Make eye contact
  • Show interest

Avoid Leading Questions that Frame the User’s Answer

Leading questions prompt the interviewee to provide a predetermined answer. The question contains information that leads to a biased answer.

Let participants tell you what they think with their words. Don’t fear the silence that takes longer than normal. Sometimes, people need a minute to gather their thoughts and provide a cohesive answer.

It’s important to have a partner with you during the interview: one person takes notes and stays out of camera focus, while the other asks questions and listens to the interviewee.

Practice asking follow-up questions to show the participant that you’re paying attention. Use smiles and head nods to engage physically.

The division of roles between the note-taker and interviewer should be clear. During remote sessions, you could turn off the note-taker’s camera so it’s easier for the interviewer and interviewee to focus on each other.

After the interview

Ask permission to use the recording.

If you’re recording the conversation, make sure participants consent both to being recorded and to how you plan to use the recording. They should also know they can stop the recording at any point in the interview if they become uncomfortable.

Say Thank You and Wrap Up

After the interview, leave time for questions and thank the research participant for their time. This is polite and offers them a chance to ask questions. If they have any feedback on how you could better conduct interviews in the future, now’s a good time to ask.

How to Analyze User Interviews with Aurelius

After conducting interviews with all the participants, the next step is to analyze the research. You probably have dozens of notes, videos, and audio recordings to go through. Qualitative data like this can be overwhelming if you don’t know what to do with it.

This is where Aurelius shines best. Here are a few ways Aurelius helps you make sense of user interview data:

Turn Spreadsheets, Videos, Audio Files and More Into Notes

Create a new project, then upload your research notes, spreadsheets, videos, and audio recordings with the magic uploader. If you’ve got data elsewhere, use the bulk input feature to copy and paste it in.

Transcribe your video/audio recording into notes automatically.

adding research documents in Aurelius

Here’s a video showing you how to get notes into Aurelius.

Find Information Quickly With Tags

Make sense of your research data with our powerful project tags. Use this feature to quickly find or describe your user interviews.

A few examples of tags to use include:

  • Name of interviewee
  • Research goals
  • Questions asked during the interview
  • Name of product or topic
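
Conceptually, tagging builds an index from each tag to the notes that carry it, so any interview, topic, or participant can be pulled up instantly. This is a minimal, hypothetical sketch of that idea in Python; the note texts, ids, and tag names are invented for illustration and this is not Aurelius’s actual implementation.

```python
from collections import defaultdict

# Hypothetical notes from a few interviews; ids and tags are illustrative only.
notes = [
    {"id": 1, "text": "Exports the report manually every Friday", "tags": ["jane", "reporting"]},
    {"id": 2, "text": "Can't find the export button", "tags": ["ahmed", "reporting", "pain-point"]},
    {"id": 3, "text": "Uses the mobile app on the commute", "tags": ["jane", "mobile"]},
]

# Build an index from tag -> note ids so any tag can be looked up quickly.
index = defaultdict(list)
for note in notes:
    for tag in note["tags"]:
        index[tag].append(note["id"])

print(index["reporting"])  # → [1, 2]
print(index["jane"])       # → [1, 3]
```

The same inverted-index idea is what makes tag lookups fast regardless of the tool you use: tagging once at capture time pays off every time you search later.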

Look for Patterns With Keywords

Looking for patterns or words that come up repeatedly? Use the keyword feature to highlight patterns and find the most used words across all your user interviews.

Automatic keyword analysis for ux research in Aurelius
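
Under the hood, this kind of keyword analysis amounts to counting word frequencies across transcripts after dropping filler words. Here is a minimal, hypothetical sketch in Python; the transcripts and the tiny stopword list are invented for the example, and a real analysis would use fuller text processing.

```python
from collections import Counter
import re

# Hypothetical interview transcripts; real data would come from your recordings.
transcripts = [
    "I always export the report before the weekly meeting, but export is slow.",
    "The export button is hard to find, so I search for it every time.",
]

# Tiny stopword list for the sketch; a real analysis would use a fuller one.
stopwords = {"i", "the", "is", "a", "to", "for", "it", "but", "so", "every", "before"}

words = []
for text in transcripts:
    words += [w for w in re.findall(r"[a-z']+", text.lower()) if w not in stopwords]

# The most frequent terms across all interviews hint at recurring themes.
print(Counter(words).most_common(3))  # "export" ranks first with a count of 3
```

Words that recur across several different participants (here, “export”) are the signal worth investigating further, whether you count them by hand, in a script, or with a tool’s built-in keyword feature.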

Highlight Your Major Findings With Key Insights

Use key insights to write down summaries of what you learned from each user interview. It’s a great way to share high level points of the research without asking stakeholders to read all the data.

analyzing ux research and creating key insights in Aurelius

Make Suggestions With Recommendations

Want to suggest next steps based on key insights? Use the recommendations feature to capture suggestions, action items, or outcomes. Link key insights to provide more context for your recommendations. Aurelius automatically adds your recommendations to the report.

Share Reports And Presentations with Project Stakeholders

Reporting is a breeze with Aurelius. We automatically turn recommendations and key insights from your project into reports that you can edit, design, and share with teammates and project stakeholders. You can customize your reports with documents, text, boxes, and lines.

ux research reports automatically created in Aurelius

You can also share a live link to your report, download a PDF, or send it via email.

Done Right, User Interviews Help Design Teams to Build User-Friendly Products

You can't build a product for the user without gathering their input. Conduct user interviews to uncover meaningful answers that improve user experience.

Combine user interviews with other research techniques, such as user surveys and usability testing, to validate or disprove your hypotheses. When you’ve gathered sufficient data, use Aurelius to sift through the information, find insights, make recommendations, and create shareable reports.

Learn how Aurelius can help you organize research data and analyze information from your UX interviews.

user research interviews

My Expert Guide to User Interviews

Techniques & Tips with my Interview Cards

User interviews are an important method in our UX researcher toolbox. They help us gain insight into our users and their needs. Well-written and well-conducted interviews are the key to gathering useful qualitative data. To help with that, I created a set of user interview and follow-up question cards and a digital user interview guide workshop board (Miro and Figma). These tools make writing better interview guides easier, and they help you plan and conduct user interviews more effectively.

  • What are user interviews
  • How to plan and conduct user interview sessions
  • Focus on the content of an interview guide
  • Tips to write good and effective interview questions
  • Using my User Interview Cards to plan and conduct interviews more effectively
  • Facilitating user interview sessions like experts
  • Tips on how to analyze the data

What are User Interviews

A user interview is a method of collecting information via direct conversation with users. Interviews can be face-to-face or remote.

It is one of the main user research methods in our researcher toolbox. It helps us understand user needs, as well as motivations, opinions, frustrations, and pain points. Unlike surveys, user interviews are live: an interactive conversation.

Types of user interviews

There are different types of user interviews:

  • Structured : the researcher prepares a guide and strictly follows it.
  • Semi-structured : the researcher prepares a guide as a set of pointers, with the rest being follow-ups, for a more natural conversation.
  • Contextual interviews : the researcher observes and interviews the participant in their environment. This brings context and gives more insight into the user’s environment. The researcher asks questions but also asks the participant to perform tasks.

A screen with teams open, a user and a note taker, a second screen shared by participant, and some paper notes

Example of a remote contextual interview, where we asked participants questions about their tasks and activities and asked them to share their screen so we could observe them performing those tasks

When and why do you do user interviews

Most people think user interviews are only done at the beginning of a project. That is indeed a good time, but not the only one. There are different reasons to do user interviews at different points in a project:

  • Discovery phase : at the beginning of a project, to understand users, their goals, tasks, activities, pain points and needs
  • Before you start the design : to inform user journeys and flows, and future design decisions (which need to be tested later)
  • During usability testing : you ask participants to perform tasks, but you can also add a few interview questions, especially follow-up ones
  • After the product is on the market : observation and interviews to understand usage of the product or feature you launched, so you can keep improving it

How to plan and conduct user interviews

Setup of user interviews.

3 people in a triangle setup: one is behind a computer and is the interviewer, another one answers she is the participant, the third is taking notes in the corner, she is the note taker

The ideal user interview involves two UX researchers / designers and one user:

  • 1 interviewer who will conduct the interview, ask questions and follow-ups
  • 1 note taker who will take notes and usually not participate in asking the questions
  • 1 participant in the target audience

You can also have some observers, but make sure they understand they should NOT say anything. If you can, record the interview and show them the recording instead.

Prepare & plan

Preparing user interviews entails recruiting participants from your target audience. Here are the main steps:

  • Set the goals of your research (aka prepare a research plan)
  • Prepare your interview guide. The interview guide is a structured document with the list of questions you are going to ask. This is the part where my cards will help you. An interview guide is also super useful to note-takers, because it gives them a structure for their notes (more tips on questions in the next section)
  • Test the guide with colleagues. This part is very important to avoid biases in research
  • Schedule the research (dates in the calendar, etc.)

If you want to learn more about biases and how to mitigate them when creating user interviews, check my “UX Cognitive Bias Cards & Workshop”. It’s a fun way to raise awareness of biases when designing products and services.

Recruit participants

I won’t go into the full details of recruiting, as it deserves an article of its own. But here is the too long, didn’t read version of what not to forget when recruiting participants:

  • Explain the goal (without giving too much info sometimes to not bias)
  • Explain compensation
  • Don’t forget to have a screener (a list of questions to know if they are in your target audience or not)
  • AVOID group sessions at all costs: focus groups tend to surface the loudest person’s opinion and a bunch of group biases
  • Schedule session with the participant
  • Send reminders a few days before the session

Structure of the interview guide

user research interviews

Example of an interview guide in word

An interview guide will help you structure your interviews. Remember: it’s a guide. If you feel the need, it’s okay to go beyond it to dig into specific topics that come up with participants. Nevertheless, you still need a solid structure. Mine usually have the following parts:

  • I remind participants about the context of the interview
  • I thank them again
  • I make it clear that there are no right or wrong answers
  • I ask permission for recording
  • I remind them that we can take a break or stop anytime
  • Warm up / general questions : often questions about who the participant is and what their relationship is with the research/products. These are here to put the participant at ease and help them get used to talking to an interviewer. They can also be used for rescreening.
  • Main questions : they will depend on your research topic. This is the core of your interview and I’ll detail how to write good interview questions in the next section.
  • Wrap up and end of the interview : I usually ask participants if they have things to add. And if they have questions for me. I also explain how to get the compensation and thank them. Sometimes, I might ask if they can refer other participants to me.

If you need help, you can get my user interview guide template.

Get User Interview Guide Template

How to write good and effective interview questions

A cheatsheet of user interview questions (in yellow) and follow-ups (in purple)

If you need help with the questions, I put together a cheat sheet of interview and follow-up questions. These are fragments of questions that should help you get started writing your own. I organized them into different categories depending on the type of question I want to ask.

Do this when preparing user interview questions

Let’s start with the tips on what you should do, when writing and preparing interview questions.

  • Start with warm up questions . Often questions about who the participant is, what is their relationship with the research/products. The goal is to put people at ease, warm them up.
  • Start with broader themes then break those down into answerable questions
  • Try to focus on past behaviour rather than asking participants to predict the future
  • Anticipate answers and prepare follow ups based on the goal of the research
  • Ask Why , 5 times
  • It’s okay to have more questions than needed
  • It’s a guide : it’s okay to explore outside it when participants say interesting things

Avoid those questions when preparing user interviews

  • Avoid leading questions. For example, “How satisfied are you with our support team?” assumes the person was satisfied. What if she was not? Better: “How was your experience with our support team?”
  • Avoid double questions. For example, “Was the product easy to use and affordable?” Better: split it into 2 different questions.
  • Avoid questions built on assumptions. For example: “Would you prefer to subscribe to a car insurance online or via one of our agents?” This question assumes the person owns a car and needs car insurance. What about people who don’t? Fix: make sure you checked that assumption earlier in the interview; if it’s not true, skip the question.
  • Avoid vague questions. For example, “How do you feel about our product?” could lead to an interesting discussion, or a very short closed answer. Fix: prepare less vague questions and/or some follow-up question ideas in case the person isn’t super chatty about that topic.
  • Avoid yes/no questions where you can. Sometimes you will use them as screeners for extra interview questions that only apply to a specific set of users. If you can’t find a more open question and go for yes/no, try to anticipate follow-up questions: if the person says yes, ask this follow-up; if they say no, ask that one.
  • Avoid asking people what they want or asking them for solutions. Rather, try to understand what they are trying to do and accomplish, their tasks and activities. Then you can find a way to improve that.
  • Be careful about bringing your own biases to the research. Check 52 UX Cards to Discover Cognitive Biases for more on the topic.

43 User Interview and Follow-up question cards

A PDF with user interview questions, some purple and yellow user interview question and follow-up cards, and some card examples in Figma

It’s not always easy to come up with effective user interview questions. I was always checking the same books and articles whenever I wanted to build my own guides. So, I came up with a cheat sheet for user interviews and follow-up questions . This was great, but it’s also a long list. To make this more digestible, I turned it into a set of 43 User Interview Questions . They include questions for the interview and follow-up questions. Those are fragments of questions that will help you get started writing your own questions.

I mostly use my cards in 3 ways, in paper and digital formats. If you get them and find other ways, please share by email or on social media!

A pocket-sized mini library of ready-to-use starter questions

Multiple cards on a table with sticky notes on top of them to write the full question

I use the cheatsheet and cards as starting points,  to build my own interview guide questions . Both interview and follow-up cards are organized in different categories. I can pick and choose question fragments from those categories. Then, I complete the question based on my research needs. It’s like a mini library of ready to use questions, so it helps me save a lot of time.

You can also use the cards as a small reminder on your desk before or  during the interviews. They can help you remember things you might  forget, especially the follow-up questions.

I created a printable version of those cards, with 2 different printing layouts: side by side (if your printer plays nice) or foldable (if your printer doesn’t align the side-by-side layout properly).

Get the 43 Printable Interview Cards

A pop-up workshop, to write user interview guides more easily

miro and figma board with some digital cards to build interview guides

When working with teams to build user interview guides, I find it’s easier not to jump straight into MS Word. I prefer to first brainstorm questions in a more visual format . My cards are really helpful for this. It can be a “physical” or a digital brainstorming session. In both cases, here is how I do it:

  • I lay out the 3 parts of the guide: warm up questions, main interview questions, and wrap up questions.
  • I prepare the cards and arrange them by theme (interview vs follow-up) and question categories, on the table or on the board.
  • I ask people (colleagues, students) to brainstorm questions for all 3 parts, based on our research goals and objectives. The cards act as “starters”.
  • Usually, they pick a card, then complete the full question on a sticky note.
  • People can either brainstorm questions on their own, and then we put them together. Or, it can be a group session from the start where everyone adds their questions along the way. This is really up to you, you know your team dynamics better than I do.
  • Once everyone agrees on the questions that will make it into the guide, we create a “clean” MS Word version of it. (If you need help with guides, check my user interview guide template.)

If you want to save time, I have already prepared some Miro and Figjam boards. You can import them into your accounts (you need a paid Miro account, though) and work with your teams on those.

Get the Miro / Figjam workshop boards

I also try to have my research guide proofread by someone (a researcher or designer) who didn’t participate in the question writing session. This helps mitigate the different researcher biases we might have brought into the guide.

You usually build your user interview guide questions with other researchers and designers . Those are the people you want to invite to the brainstorming session. I advise you not to invite your stakeholders, or untrained people with zero UX research background, because you will end up with biased and poorly written questions . You want to avoid them bringing their own biases into the research. Also, I have years of experience with shitty questions written by untrained people. Trust me, it will just bring unhelpful data as results.

A study guide, to teach how to write better interview questions

The last way I use those cards is as a teaching tool . I teach how to build better products to design students and professionals in the UX industry.

When I was in school, I used to make small cards with the most important information from my lessons. My friends would borrow them because they liked how short and clear they were. They could keep them in their pockets and study them between classes.  If you are a junior or a student, you can use my user interview cards the same way. Those cards are a very nice little study guide to help you get better and more effective at writing your own questions.

The digital board workshop can also be used as a teaching tool for juniors, mentees and students.

Facilitating user interview sessions like an expert

Wow, you made it here. You are ready to actually facilitate the interview! Congrats! If there is one thing you need to remember: a user interview is a conversation! It’s NOT a survey. It’s all about active listening.

Here are a few tips to help conduct good user interview sessions:

  • Start with the introduction , explain there is no right or wrong answer
  • After the introduction, have some warm up questions, like background or previous experience. Those are here to help the participant get comfortable with the process.
  • Try to put people at ease. Don’t judge , but also don’t fake empathy: be friendly but professional
  • Don’t be shy to ask for clarification . Or to play coy: ask about things that might seem obvious. Sometimes the answers can be surprising.
  • Answer questions with questions : if a user asks you something, try to return the question.
  • Try to pick up non-verbal cues . It’s harder remotely, though: you have to rely on the person’s webcam, tone and voice.
  • Get comfortable with silence : don’t rush, pause, give people time to think and answer. If you struggle with this, my fun advice: count in your head, the same way you count when lightning strikes: 1 Mississippi, 2 Mississippi… up to 5 Mississippis. Eventually, you will get comfortable with silence.
  • If possible, have a note taker , so that you can be present and actively listen to the participant.
  • If you can’t get a note taker, ask if you can record the session . Take minimal notes that help you follow-up, but don’t write a full transcript while interviewing. Be present.
  • Make notes of the questions that didn’t work, so you can improve them for future interviews (don’t change the guide in the middle of a study)

Report and analysis of user interview data

After the interview, I like to make a short note about the most important things I learned while it is still fresh in my mind. Here are some tips to help with reporting and analysis:

  • Transcribe the notes (you can use automated tools if you recorded)
  • Organize interview content on sticky notes
  • Search for patterns and themes
  • Review, refine
  • You can also build user journeys out of the data, and other UX deliverables that will help you improve your product or service.
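Once notes carry a theme tag, the clustering step above is mechanical. As a rough illustration only, here is a minimal Python sketch that groups tagged interview snippets into themes and sorts them by frequency, the way an affinity map surfaces the most common patterns. The snippets and tags are invented examples, not real research data.

```python
from collections import defaultdict

# Hypothetical coded notes a researcher might produce while reviewing
# transcripts: (snippet, theme tag) pairs. All examples are invented.
coded_notes = [
    ("Couldn't find the hotel booking button", "navigation"),
    ("Prefers to call support for big purchases", "trust"),
    ("Got lost between search results and checkout", "navigation"),
    ("Wants price alerts by email", "notifications"),
]

# Cluster snippets by theme, like grouping sticky notes on a board.
themes = defaultdict(list)
for snippet, tag in coded_notes:
    themes[tag].append(snippet)

# Surface the most frequent themes first, to spot patterns.
for tag, snippets in sorted(themes.items(), key=lambda kv: -len(kv[1])):
    print(f"{tag} ({len(snippets)} notes)")
    for snippet in snippets:
        print("  -", snippet)
```

In practice the tagging itself is the hard, human part; a script like this only replaces the final counting and sorting, not the researcher’s judgment.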


Example of an analysis of user interview sessions: a Miro board with an affinity map of the main interview findings clustered by category, and a user journey map that mixes in some task analysis content

I won’t enter into the details of data analysis, but here are some resources to help you:

  • 4 methods for analysing user interviews – Rick Dzekman
  • A Strategic Storytelling Tutorial – Modernist Studio
  • Qualitative Coding for UX Research Analysis
  • Techniques to Identify Themes in Qualitative Data
  • How to Analyze Qualitative Data from UX Research: Thematic Analysis
  • Analysing Qualitative User Data – Fr Maria Panagiotidi

User interviews sound like an “easy and fast” method to a lot of people, because many articles these days sell them as “just go grab a few people and ask them a couple of questions”. But if you want quality qualitative data at the end, to improve your product, you need quality interviews . Having a good research process helps a lot here, and you need unbiased questions. And if you need help, I have a lot of User Interview Resources available .

The cards I created helped me save a lot of time when building interview guides. They help me be more efficient, but also more consistent. Finally, they encourage me to write better, less biased questions. And I think they can help you, and any UX professional out there.

More resources on the topic

If you want to dig further on the topic of user interviews, you can check those:

  • On my blog: A Cheatsheet for User Interview and Follow Ups Questions
  • Just Enough Research – Erika Hall’s book; she has a whole chapter on interviews and great advice
  • If you speak French: Méthodes de design UX (2nd ed., 2018) – Dr Carine Lallemand
  • 12 Ways To Improve User Interview Questions
  • User Research Methods : usability.gov still has some of the best resources around interviews and testing
  • Interviewing for research
  • User Interviews: How, When, and Why to Conduct Them
  • 16 tips for better user interviews
  • A Guide to Interviewing Users
  • User Interviews – a guide to user interviews, by (fittingly) User Interviews
  • How to ask questions like a UX Researcher
  • The Optimizer’s Guide to Conducting User Interviews & Analyzing the Data
  • Learning to build the right thing; how you can apply user research to make your product successful
  • IDF’s article on User Interviews presents and details 8 types of User Interviews

Other articles you might enjoy:

  • A Cheatsheet for User Interview and Follow Ups Questions
  • User Experience – Starter Crash Course for Startups
  • A User Research and UX Design Starter Kit
  • UX Cognitive Bias Cards & Workshop
  • A Designer’s Guide to Documenting Accessibility & User Interactions


Published on 24 March 2023

in Expert UX Templates & Resources

By Stéphanie Walter



12 Tips For Conducting Better User Research Interviews

You can’t develop successful business products in a vacuum. User research is among the most effective ways to ensure your customers get the results they want from your products. Done correctly, user research is a groundswell of useful and creative feedback you can apply to product development.

But you won’t reap the benefits of users’ experiences without interviewing them effectively. That’s why becoming the best interviewer you can during the user research process is so critical to your products’ success.

In this article, we’ve expanded on 12 tips you can employ in the three stages of the user research process—before, during, and after each interview—to ensure you get the greatest value from the user research phase of product development.

Before the User Research Interview

Our first four tips focus on critical steps you should take before a user interview begins. Consider these best practices when finding your interview candidates, preparing questions for each interview, and scheduling each interview in a way that maximizes output.

1. Develop an interview hypothesis.

A hypothesis is simply an assumption that a problem or improvement area exists, which you are setting out to prove. You will spend the interview process attempting to understand the nature of this problem or improvement area; later, you will uncover ways to resolve it.

For example, let’s assume your product helps business users identify the best deals for booking travel arrangements. Your users frequently book their flights using your tool, but stop short when prompted to book hotels. In this case, a reasonable hypothesis is, “Our hotel booking feature is turning off users.” By focusing on this key cause for concern, you ensure you produce results on that specific topic—results that can drive actionable change.

2. Determine the scope of research.

Now that you’ve established your hypothesis, define the scope of your research. According to PhD Thesis , your scope “defines clearly the extent of content that will be covered by the means of the research in order to come to more logical conclusions and give conclusive and satisfactory answers.”

Let’s consider your hypothetical travel booking product. You must determine which aspects of this product you want your research to cover based on what’s actionable by your product development team. Here are a few qualities you might consider targeting when asking users about their experience using the product:

  • Attractive interface
  • Intuitive navigation
  • Clear value to the customer
  • Processing speed
  • Comparability to similar products

These qualities cover very different areas of the user experience, namely design, functionality, appeal, and market value. You may want general feedback on each of these subjects. However, you may also want specifics on only one or two in a key area, such as functionality. The scope of your research should set you up for success when improving your product later.

3. Prepare interview questions.

Now that you have a scope, use those focus areas as material for your interview questions. The following are a few best practices when preparing your interview questions, and when preparing personally for a user interview in general.

Start each line of inquiry with general questions about the subject. Opening each area of focus with general questions establishes a cadence with your interviewees, allows for qualitative feedback, and gets users thinking about that topic specifically. Then, follow up with more specific questions on the subject; these will provide more actionable information during later analysis.

Use open questions and follow-up questions. Successful interviews establish the right balance between flexibility and adherence to your specific questions. With this in mind, don’t be too rigid with your questioning. You may find new lines of inquiry or a particular user’s feedback valuable.

Keep it professional. Avoid becoming too close to your subjects, which could lead to biased responses. You want objective insights from your users, not superficial praise or indifference.

4. Schedule the user interview.

If interviewees feel you’re wasting their time or inconveniencing them, they’re unlikely to enter the conversation in good faith. While scheduling interviews, offer interviewees plenty of flexibility in selecting a time to meet. Use specialized tools, such as Calendly , which make the scheduling process easier and more personalized. You may also want to offer interviewees incentives as well.

You should also manage your interviewees’ expectations. Be clear about how long you expect the interview to take and what types of questions you will be asking. If you plan to record the interview—as you should—you must tell them so ahead of time.

During the User Research Interview

The following tips will help you prepare for all aspects of the interview itself. This includes recording, responding to different user responses, your attention to detail during product testing, and best practices to keep your bias, experience, and knowledge out of the interview.

Conducting your interview remotely? Don’t sweat it. We’ve got plenty of video interview tips to help you along the way.

5. Record the interview.

Ensure you have a dependable audio or video recording tool. Protect yourself from missteps, such as failing to start a recording, by automating the recording process or assigning a dedicated team member to the task. Ensure the recordings are produced in digital format. You may also need to have interviewees vocally approve of being recorded during the interview. Be sure this is among the very first things they do during the interaction. 

6. Be deliberate in your testing.

Remember, you enter each interview with specific goals in mind. The subjects of your questions should fit within the scope of what you’re testing. Consider doing interviews more frequently if you want data on lots of different subjects.

7. Be an active listener.

Even though you have a clear hypothesis and scope, your job is to glean insights from your interviewees. Let the interviewee arrive naturally at subjects within the scope of your research, or guide them there with minimal prompting.

Don’t be too quick to come to their rescue if they are struggling, either. Much of the value you gain from these interviews is contextual—if an interviewee struggles to explain a problem, you may benefit from witnessing that struggle.

8. Stay neutral.

You want users’ feedback to come only from their own experiences. That’s why, during the interview, they should not have immediate access to resources beyond what they would normally have when using your product. The less information that comes from you, the better.

9. Keep it relevant.

Your goal is to identify areas of improvement within your product by parsing user interview data. Your goal is not to get ideas for improvements from users during their interview. Keep users talking about what they are currently doing, why they are doing it, how they are doing it, and what is frustrating about it. Avoid talking about what the product could or should do, or even what you are planning internally.

After the User Research Interview

These final steps address what to do after each interview—and the entire interview process—is over. This includes simple courtesies, like thanking the user, but also how to process and disseminate interview data.

10. Thank the user for participating in the interview.

Establish ahead of time exactly what you’d like to say at the end of each interview. Establish a protocol for following up as well, whether it’s with a personalized thank you or even an incentive. Most importantly, ensure that your closing statements make clear that your relationship with one another in this capacity has come to an end.

11. Transcribe all your recordings.

You should have start-to-finish digital recordings of each interview once your research is complete. Organize these recordings in an intuitive way before submitting them to a reliable transcription service.

Transcribed quotes are a powerful tool for convincing stakeholders of necessary changes to your products. They are also critical to qualitative analysis, which you will need as you disseminate interview data into actionable reports. Rev provides a variety of transcription services so you spend less time on busy work and more time on deriving value from your research.

12. Share your findings.

Following your successful qualitative analyses, you will have a full-length, actionable report or reports to share internally, with stakeholders, or with business partners responsible for your product. Ensure you have all necessary approvals before sharing any personal or sensitive information. With the right insights, you’ll have a valuable resource for enhancing your product and driving long-term business success.

Growing as a User Research Interviewer

User research interviews are a boon for product development; they can be a boon for your professional growth as well. Use your transcripts and findings to identify where you can improve as a user research interviewer. Consider how you can improve your questioning, your personability, or your success honing in on actionable subjects of interest. With accurate, in-depth interview transcripts on hand, interviews become an opportunity for you as well as your organization.



User interviews: definition & questions to ask.

What do your users need — and how should you engage with them? Here we explain how you can get to know your users, and their expectations of your products and services, with our complete guide to user interviews.

When you need to get user feedback quickly and efficiently, whether to improve a product or service or to bring a new solution to market, one of the most effective methods is user interviews.

With user interviews, you can learn about how prospects and customers perceive your product or service, as well as its usability, and discover key insights that can help you make what you do (at every stage of the journey) better.

Free eBook: Three principles for delivering experience-led product innovation

What are user interviews?

User interviews are a qualitative UX research method designed to better understand those who interact with your company, products, or services.

Researchers will select and interview one user (a target audience member who interacts with your company to achieve their goals) and find out about their views on a topic by asking questions and recording the responses.

Unlike focus groups, research interviews are one-to-one and can take place in person, over a video call, or via a telephone call.

User interviews can be used to find out about:

  • The user experience
  • User habits and interactions
  • Likes and dislikes

As well as the above, the interview questions can explore a user’s immediate reaction to a stimulus or gain their reasoning as to why they would act in a certain way.

And by taking a semi-structured or structured approach to user interviews, researchers can quickly and easily obtain user experience insights to help improve products and services.

What types of user research can you do?

There are two types of user research:

  • Qualitative research:  Interviewers collect non-numerical data through open questions and face-to-face interactions. This kind of research provides emotional or experience data (X data) that can be used to explain a user’s emotional decisions and how they perceive the company.
  • Quantitative research:  Interviewers use numerical data from sources that collect information over time and convert it into datasets or closed questions on a demographic survey. This kind of research provides economical or operational data (O data) like user demographic metrics and historical census reports.

Both types of data are useful for supporting business decisions and UX research — as long as you follow best practices.

Qualitative research provides answers to the thinking behind user decisions and explains ‘why’ users behave in a certain way.

Quantitative research can tell you about a user’s past activities and ‘what’ happened.

One can support the other – quantitative data provides context to qualitative experience information, while qualitative data provides meaning to the fact-driven findings of quantitative research.

Why conduct user interviews?

As user interviews give critical insight into what your prospects and customers think about your product, service, website, or any other element of your company, they can be a great way to improve the overall user experience .

User interviews are used for several reasons:

  • They provide general insights about what a user thinks about your products or services, giving your company information as to why users connect or don’t connect with your brand .
  • The information you gain about a user can be placed into your marketing personas or user journey mapping models for better user engagement planning.
  • You can investigate how a user will react to a new product change or a solution to a known problem, which can give development teams the confidence to move ahead with designs.
  • By getting user feedback, you get the users’ perspective on your product or service and ensure that they meet the user’s needs.
  • The insights from analyzing user responses can help management to approve business decisions, as they understand the views of the user.

What are the advantages of user interviews?

As well as the points mentioned above, the main benefits of using user interviews as a user research method are that these interviews can be quick and cheap to do.

Using digital tools and end-to-end software, data can be collected and analyzed using integrated dashboards, helping researchers to keep organized and find results faster. New data can be uploaded in real-time to help guide new strategies and changes, resulting in a proactive — not reactive approach to user experience.

Lastly, user interviews are simple to conduct. Whether they are structured interviews or semi-structured interviews, all that needs to be done is form questions ahead of time, interview a user and record their answers. No special training is required.

What are the disadvantages of user interviews?

Interviewers need to be able to remember and recall user answers accurately to understand the attitudinal information being received. However, researchers may recall an answer incorrectly, which could impact the accuracy of the result.

Bias is another challenge in face-to-face interviews. As the user is aware of the researcher’s presence, they could feel embarrassed or decide to be agreeable in their answers, leading to acquiescence bias . Likewise, the interviewer can also interpret answers and bias their own recall.

If interviewers make notes during the interview, their attention will be diverted and they could miss some vital user insights. The jotting down of notes could distract the user or make them feel uncomfortable speaking.

Asking users about their thoughts on how a product can be used in the future is also fruitless as it asks the user to speculate beyond their knowledge. The same is true for design questions about a product or service. Users are not familiar with design processes so their responses won’t give accurate reasons why a design should be changed from one version to another. In these cases, watching the user’s behavior with a designed solution will help researchers understand if the new design worked.

Sample size is also limited to the size of your interviewing staff, the area in which interviews are conducted and the number of qualified respondents in the area. You may have to conduct your interview across multiple areas to get the quantity and quality of data you are looking for.

When do you conduct user interviews? Where in the research process?

There are many times when user interviews can be useful for internal research, marketing, and development teams. Usually, these opportunities are early on in the development cycle for a product or service, though they can be used at every point in the journey.

For example, if they’re used at the beginning of a project, they can help teams understand the user’s perspective of their thoughts, behavior, and desires.

On the other hand, if done during product development , user interviews can give answers that could help the creation of a new product idea, help improve current product designs or solve bad user experiences.

How do you conduct a user interview?

There are three stages to consider when conducting a user interview:

  • Before the user interview
  • During the user interview
  • At the end of the user interview

1. Before the user interview

You’ve decided to conduct a user interview to help you hear more about what your users think of your product or service. How do you begin?

Define what you want to achieve in the user interview

What is the overall purpose of the research, and what will help you feel you have met that by the end of the user interview?

Consider if you need an additional interviewer in the room

Having an extra person can help spread the workload. One person can interview the user and the other can take notes. This can also provide a nervous interviewer with additional support at the time of the interview.

Target the users you want to interview in advance, and ensure you have consent to record their answers

This could be consent to be interviewed, recorded, and for use of their opinions. Missing this step can slow down the analysis of the answers.

Prepare your interview questions

You must get the questions right before the user interview. See the ‘How do you write good user interview questions?’ section below for more information.

Prepare a list of follow-up questions and alternative format questions that ask for the same information

If you’re worried about not getting the right information from the user, a set of follow-up questions, or alternative phrasings that ask for the same information, can help you discover more without making the user feel uncomfortable.

Decide your formality level for the interview

Completely structured interviews stick to a rigid script and don’t allow for conversation or digressions onto related topics. At the opposite end are unstructured interviews, which have no script and take the form of a natural conversation. With these, however, there’s a chance you won’t collect the information you want.

We recommend sticking to a semi-structured format that uses prepared questions but still gives the interviewer the freedom to ask follow-up questions and probe answers further.

2. During the user interview

Here is a list of things to consider for when you start the interview:

Welcome the interviewee

Introduce yourself and give the interviewee an overview of what the research is for, how it will be used, and their role in the process. Not only does this help start building rapport between you, but it is also polite and gives reassurance to a nervous user who may not feel comfortable just yet.

Start with easier questions

You can ask them about themselves and have a short off-topic conversation, or ask simple questions like ‘How old are you?’ Simple questions with easy answers will give interviewees confidence and help them relax into the interview.

As you ask questions, ask follow-up questions

If what the user said is interesting, feel free to ask more questions about the answer to see whether it gives you more useful information. Likewise, if you didn’t understand the answer, ask the user to rephrase it or explain it to you further.

Keep positive throughout the interview

It can be hard to run an interview when you or the interviewee is uncomfortable. It’s important to keep a positive attitude and use open body language and gestures. Also, don’t feel obliged to fill in pauses or silence. If you give your best impression, the user will respond in kind and give you a positive interview experience.

3. At the end of the user interview

Wrap up the interview.

Tell the user what the next steps are and how their data will be used. Then ask if they have any questions you can answer. Most importantly, thank the interviewee for their time and participation.

Best practice interviewer tips for conducting the interview

If you’re comfortable with the process and plan but still unsure how to make a user interview effective and be a good interviewer, these tips can help:

Engage with your interviewee

Make eye contact, nod, and give verbal cues like ‘Yes’ and ‘I understand’ so that the interviewee knows that you’re listening to them and engaged.

Give the user being interviewed space to finish their thoughts before you move on, interrupt or change the topic

Rushing the user will make them feel unappreciated, or like a burden. Instead, allow them to take their time, be patient, and listen to what’s being said. You can always book another session.

Ask neutral questions

By asking neutral questions, you avoid adding information to the question that could sway the interviewee’s response. Leading questions, by contrast, build in bias by assuming the interviewee agrees with a position or opinion. They can make interviewees uncomfortable and won’t give you honest, clear answers.

Make notes about how the user responds as well as what they say

In addition to noting down the answers that users give, you can also note what they’re doing (descriptive), how they outwardly appear when answering (inferential), and the judgments you form about their answers (evaluative).

Treat the interview as a conversation if you’re nervous

If you’re worried about the semi-structured format of the interview, you can fall back on a friendly tone and aim simply to have a conversation. The answers you need will most likely still come out.


How do you write good user interview questions?

In preparing your question list, here are some useful dos and don’ts:

Make the questions with your research goal in mind

If you’re looking for insights into the user experience of a new mobile cooking recipe app, for example, ask questions about the user’s experience of cooking, how they use apps to help them cook, and where they get their recipes from.

Give your questions to another team member to look for errors or misunderstandings

Is the question phrased in the best way for the user to understand what they are answering? The team member can also provide their thoughts on the grouping or order of questions to get the maximum amount of information.

Include a list of follow-on questions that carry on the line of questioning into a theme or topic

These questions can be used if you get stuck while asking questions or to prepare you so that you don’t have to think on your feet. In constructing these questions, think of the responses you’re likely to receive and how you can probe deeper for each one.

Ask for one piece of information in each question

Combining multiple questions into multi-clause questions (like ‘What is your experience of this toy, and if it was bad, what could make it better?’) can be confusing for the user and you may not get enough information back that answers each part sufficiently. Instead, use follow-up questions to delve deeper.

Create extra questions

Some users may speak fast or have little to offer to some of your questions, which means that you run through the original questions quickly. It’s always better to have some spare questions ready to ask or to review your notes and take time to learn more about responses that interest you.

Don’t ask vague questions, or repeat the same question continuously

Asking ‘How was your experience of that?’ can be tiring and monotonous after the first few instances. Keep your user engaged by varying question formats and asking for the right information the first time.

Don’t forget to practice

Conducting an interview is hard: there is a lot to do, and you must stay alert throughout. If you’ve never run one before, take time to rehearse the interview stages and say the questions out loud. This will show you where you need more support or practice to keep to interview times and connect with your user.

Don’t feel you have to stick to the questions you have created

Having a set of predefined questions (or a structured interview) ensures that you don’t miss a line of questioning that would help you reach your research goal. However, don’t feel confined by them. You never know what information may come up, and being flexible with your time and questions can surface insights you hadn’t considered.

Don’t rely on closed questions

Limit the use of closed questions that can only be answered with ‘Yes’ or ‘No’ responses as we want the user to open up about their experiences. Begin your questions with who, what, when, where, why, and how to encourage feedback. (You can use shorter closed questions that focus on small-talk at the beginning of the interview to create a rapport with the user. For example, asking ‘Did you have to travel far?’ at the start.)

Don’t directly ask users what they want

Chuck Liu, Head of User Research at Chime, said that users don’t know what is possible when they’re asked what they want from an experience. Instead, he suggested three better questions to ask:

  • What are you trying to get done? (Gather context)
  • How do you currently do this? (Analyze workflow)
  • What could be better about how you do this? (Find opportunities)

In addition to the three Chuck shared, here are some more questions you may want to use:

  • When talking about [a topic], what habits do you have?
  • What would make [this process/product] better?
  • How did you find this experience?
  • What is your favorite part about [this product]?
  • How would this [product/service] benefit you daily?
  • What was the hardest part of using [product/service]?
  • What did you enjoy interacting with?
  • What product(s) or service(s) do you use frequently?
  • Why did you use the [product] in that way?
  • Where do you see yourself using this [product/service]?
  • What would stop you from using this [product/service]?

Where do you conduct a user interview?

The environment of the interview room can set the tone for the rest of the interview. If you were being interviewed, you’d like to be in a comfortable, warm, and inviting environment that is semi-formal in appearance where you can relax.

Whether the interview is in your office, in a rented space, in the user’s own environment, or in public, choosing the right environment can help you have a good user interview. Here are some considerations to remember when choosing your location:

Is the space comfortable?

Is there access to a toilet and heating? Can you spend an hour sitting comfortably? Can you provide a glass of water or tea as a refreshment?

Is the location convenient for your target users?

If your user is vulnerable or unable to commute to the interview location easily, this will delay the interview process and may prevent you from getting the information you need from your target audience.

Is the location a brand-neutral zone?

If your location shows favoritism to a particular brand or company, it can unconsciously bias the interviewee, who may feel inclined to favor that brand in their answers.

If users are being interviewed remotely, consider whether the technology will be a challenge for them and which communication method best suits the session.

How do you analyze user interview data?

After the interview with your user, you’ll have a lot of qualitative notes, and these need to be captured and organized in a central location.

Once you have the information stored, you can start considering what the information tells you, by following these steps:

  • Consolidate your notes (descriptive, inferential, and evaluative) in one place. Fill in any gaps and group the information by theme or question.
  • Look through the results for common threads of user experience. Ask yourself what these tell you about the user’s profile, and what that could mean for your marketing personas.
  • Pull out key quotes that illustrate the issues or experiences users face; these can serve as evidence in stakeholder conversations.
  • Use semantic content analysis to understand the emotional drivers and language choices in the responses.
  • Create keyword clouds or mind maps to visualize the information and common words. This can help you identify themes or trends faster.
  • Identify the actions you can take from the feedback to improve the product or service offering.
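The keyword-cloud step above can be sketched with a simple frequency count. This is a minimal illustration using hypothetical note data and a toy stopword list, not a full text-analysis pipeline:

```python
import re
from collections import Counter

# Hypothetical consolidated interview notes, grouped by theme.
notes = {
    "onboarding": [
        "Sign-up felt slow and the form asked for too much information.",
        "I gave up on the sign-up form halfway through.",
    ],
    "navigation": [
        "The menu was easy to find but the search was slow.",
    ],
}

# A toy stopword list; real analysis would use a fuller one.
STOPWORDS = {"the", "and", "for", "was", "but", "i", "on", "a", "too", "felt"}

def keyword_counts(texts):
    """Count non-stopword tokens across a list of note strings."""
    words = []
    for text in texts:
        words += [w for w in re.findall(r"[a-z']+", text.lower())
                  if w not in STOPWORDS]
    return Counter(words)

for theme, texts in notes.items():
    print(theme, keyword_counts(texts).most_common(3))
```

The per-theme counts are exactly what a keyword cloud visualizes: the most frequent terms, sized by count.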

After this initial stage, you can start thinking about:

Combining this data with other user data

This can help you compare, validate, and support user trends or provide evidence of issues to resolve. With more data sets, you may want to consider using a spreadsheet or an integrated technology platform that can handle text and sentiment analysis for you.
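As a rough illustration of what lightweight text and sentiment analysis looks like, here is a toy lexicon-based tally. The word lists and responses are hypothetical; production tooling would use a proper sentiment model rather than hand-picked lexicons:

```python
# Hypothetical sentiment lexicons for interview responses.
POSITIVE = {"easy", "love", "helpful", "fast", "clear"}
NEGATIVE = {"slow", "confusing", "frustrating", "broken", "hard"}

def sentiment_score(text):
    """Return (positive_hits, negative_hits) for one response."""
    words = text.lower().split()
    pos = sum(w.strip(".,!?") in POSITIVE for w in words)
    neg = sum(w.strip(".,!?") in NEGATIVE for w in words)
    return pos, neg

print(sentiment_score("The app is fast but the checkout was confusing."))
# -> (1, 1): one positive hit ("fast"), one negative ("confusing")
```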

Investigating the data further with more research

If your user interview has identified areas of poor customer satisfaction or poor user journeys , then following up with a customer satisfaction survey (or usability testing or user journey map), can help develop these insights into actions.

Visualizing your data into charts and reports

These can be stakeholder-friendly methods for sharing results and allow a smooth transfer of knowledge to interested parties.

What is the difference between user interviews and usability testing?

Researchers may confuse user interviews with usability testing, as both methods focus on the user. However, some key differences make the two methods distinct:

Usability testing involves a user interacting with a product or service to see how well it functions and helps them meet their goals. By observing the end user’s reaction to a product or service’s interface at the mid-to-late stages of the design and development cycle, the company can evaluate the user-friendliness of the system.

Based on this definition, a user interview can be on any topic and occurs at the early stages of user research, while a usability test requires a user to be involved in testing the design and function of the product or service.

Usability testing avoids bias as the user’s interactions are not influenced by an interviewer; instead, it’s based on observing the user’s actions. In a user interview, however, insights are reported directly from the user with the help of an interviewer asking questions.

The continuous face-to-face rapport between user and interviewer is kept to a minimum during a usability test, while a user interview relies on that rapport to create trust and openness.

Lastly, and most importantly, usability tests are best used to evaluate whether a design is useful and easy to interact with. This can be useful in understanding what design elements to improve or remove.

Get started with user interviews

No matter what kind of approach you want to take to your user interviews (structured or semi-structured), we can help you create and conduct sessions that capture the information you need, as well as provide the platform to analyze the data for your research projects.

Our survey software powers more than 1 billion surveys every year and is used by more than 11,000 top brands and 99 of the top 100 business schools. With it, you can get answers to your most important market, brand, customer, and product questions.

You can also use our intuitive survey design software (featuring powerful logic and more than 100 question types) to create the perfect questionnaires for user interviews.

It’s sophisticated, easy to use, and readily scalable. Get answers and insights from your audience — wherever they are — and turn that insight into action.


InterviewPrep

Top 20 User Research Interview Questions & Answers

Master your responses to User Research related interview questions with our example questions and answers. Boost your chances of landing the job by learning how to effectively communicate your User Research capabilities.


User research is the cornerstone of creating products that not only meet but exceed user expectations. As someone who delves deep into understanding user behaviors, needs, and motivations, you’re on the front lines of helping businesses tailor their offerings to the demands of the market. And when it comes to landing a role in this insightful field, your ability to communicate the value of your findings is just as important as the research itself.

Preparing for an interview in user research means readying yourself to discuss methodologies, share how your insights have influenced decision-making, and demonstrate your empathy for users. To assist you in making a compelling case for why you’re the best candidate for the job, we’ve gathered a collection of typical user research interview questions. These will help you articulate your approach and past experiences effectively, ensuring that potential employers grasp the depth of your expertise.

Common User Research Interview Questions

1. How do you ensure representativeness in your user research sample?

Delving into the demographics and behaviors of a product’s user base is essential for creating solutions that cater to actual user needs and problems. Ensuring representativeness in user research means that the insights gathered will likely reflect the broader population, leading to more inclusive and effective product development. This question reveals the researcher’s knowledge of sampling methods and their commitment to diversity and inclusion, ensuring that the product doesn’t just serve a narrow or biased segment of users.

When responding to this question, highlight your experience with various sampling techniques such as random sampling, stratified sampling, or quota sampling. Explain how you determine the criteria for representativeness based on the product’s target audience and goals. Discuss any tools or frameworks you use to ensure a diverse group of participants and how you adjust your approach when faced with limitations in recruitment or inherent biases in the sample. Share examples from past projects where your attention to representativeness in your sample led to meaningful insights and outcomes.

Example: “ Ensuring representativeness in user research begins with a clear understanding of the target user base and the specific objectives of the research. I employ stratified sampling when the user base is diverse and can be segmented into distinct groups, ensuring that each segment is proportionally represented. For instance, if the product is used by both teenagers and adults, I would stratify the sample to reflect the actual user distribution across these age groups.

In cases where recruitment poses challenges or the population is less defined, I might opt for quota sampling to guarantee that critical user characteristics are represented in the sample. I’ve utilized tools like online panels and recruitment databases that allow for filtering participants based on specific criteria to achieve this. Moreover, I am always vigilant for potential biases such as over-representation of more accessible or willing participants. In a past project, by recognizing and adjusting for a tech-savvy bias in our initial sample, we uncovered crucial usability issues that would have otherwise been missed, leading to significant product improvements that enhanced user satisfaction across a broader audience.”
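The proportional allocation described in this answer can be sketched in a few lines. The user-base figures are hypothetical; largest-remainder rounding is one common way to make the per-stratum counts sum exactly to the target sample size:

```python
import math

# Hypothetical user-base breakdown and a target sample size.
population = {"teenagers": 300, "adults": 700}
sample_size = 20

def proportional_allocation(population, sample_size):
    """Allocate interview slots to each stratum in proportion to its
    share of the user base, with largest-remainder rounding so the
    allocations sum exactly to sample_size."""
    total = sum(population.values())
    raw = {g: sample_size * n / total for g, n in population.items()}
    alloc = {g: math.floor(x) for g, x in raw.items()}
    # Hand leftover slots to the strata with the largest remainders.
    leftover = sample_size - sum(alloc.values())
    for g in sorted(raw, key=lambda g: raw[g] - alloc[g],
                    reverse=True)[:leftover]:
        alloc[g] += 1
    return alloc

print(proportional_allocation(population, sample_size))
# -> {'teenagers': 6, 'adults': 14}
```

With a 30/70 split and 20 interview slots, the allocation mirrors the population: 6 teenagers and 14 adults.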

2. Describe a technique you use to uncover latent user needs.

Delving beyond the surface, user research aims to understand not just what users say they want but also what they actually need, which they might not even be aware of themselves. Techniques to identify these latent needs are essential for creating products that resonate deeply with users, often satisfying desires they hadn’t articulated. This question discerns whether a candidate has the skill set to reveal insights that can drive innovative design and development, ensuring the end product is not just functional but also emotionally engaging and genuinely useful.

When responding, you should outline a specific technique such as ethnographic field studies, diary studies, or jobs-to-be-done interviews. Explain the process of how you apply the method, emphasizing the importance of empathy and observation in understanding user behavior. Illustrate your answer with a concise example from your experience where this technique led to a meaningful insight that informed product design or improvement.

Example: “ One technique I frequently employ to unearth latent user needs is the Jobs-to-Be-Done (JTBD) framework. By conducting interviews focused on the users’ objectives and the context in which they use a product or service, I can identify the ‘jobs’ they are hiring the product to perform. This approach goes beyond superficial feature requests and digs into the underlying motivations and desired outcomes.

For instance, in a recent study, I used the JTBD framework to understand why users were abandoning an online learning platform. Through in-depth interviews, I discovered that users were not just seeking knowledge; they were looking to achieve personal growth and career advancement. This insight led to a pivot in the platform’s feature set, focusing on personalized learning paths and tangible skill accreditation, which significantly reduced churn. The key was to listen not just to what users said, but to interpret the emotional and social drivers of their behavior, aligning product offerings with their true aspirations.”

3. What metrics do you prioritize when evaluating the success of a new feature through user research?

Playing a pivotal role in understanding how well a new feature is meeting user needs, user research prioritizes certain metrics to discern not only the usability and functionality of a feature but also its impact on user satisfaction, engagement, and retention. The metrics chosen often reflect the specific goals of the feature, whether they’re to streamline a process, increase time spent on a platform, or enhance the overall aesthetic appeal. Moreover, these metrics can reveal discrepancies between expected and actual user behavior, providing actionable insights that drive iterative design improvements and strategic decision-making.

When responding, it’s essential to illustrate a comprehensive understanding of various user research metrics, such as task success rate, error rate, abandonment rate, time on task, Net Promoter Score (NPS), customer satisfaction (CSAT), and system usability scale (SUS). Articulate how each metric can offer a different lens on user interaction and why it’s important for the specific feature in question. Discuss how you would correlate quantitative data with qualitative feedback to form a holistic view of the user’s experience. Demonstrating a thoughtful approach to selecting and interpreting these metrics can show your aptitude for making data-informed decisions that enhance product design and user satisfaction.

Example: “ When evaluating the success of a new feature, I prioritize a blend of quantitative and qualitative metrics tailored to the feature’s objectives. For instance, task success rate and time on task are critical for understanding the feature’s efficiency and effectiveness; users should be able to complete their intended actions with minimal friction. Error rate and abandonment rate provide insights into potential usability issues, signaling where users encounter problems or choose to disengage.

Complementing these, I look at Net Promoter Score (NPS) and customer satisfaction (CSAT) to gauge overall sentiment and satisfaction, which reflect the feature’s perceived value and impact on user loyalty. The System Usability Scale (SUS) offers a standardized measure of usability, which is particularly useful for benchmarking against industry standards or previous iterations of the product.

By correlating quantitative data like NPS with qualitative feedback from user interviews or usability tests, I can form a nuanced understanding of user experience. This holistic approach ensures that decisions are not solely based on numbers but are enriched with context and user narratives, leading to more user-centric feature enhancements and product development.”
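The NPS and SUS figures mentioned in this answer follow standard formulas, which a short sketch makes concrete (the response data is hypothetical):

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6),
    on a 0-10 'How likely are you to recommend us?' scale."""
    promoters = sum(s >= 9 for s in scores)
    detractors = sum(s <= 6 for s in scores)
    return 100 * (promoters - detractors) / len(scores)

def sus(responses):
    """System Usability Scale for one respondent: ten answers on a
    1-5 scale. Odd-numbered items score (answer - 1), even-numbered
    items (5 - answer); the sum is scaled by 2.5 to give 0-100."""
    assert len(responses) == 10
    total = sum((r - 1) if i % 2 == 0 else (5 - r)
                for i, r in enumerate(responses))
    return total * 2.5

print(nps([10, 9, 8, 6, 10]))  # 3 promoters, 1 detractor of 5 -> 40.0
print(sus([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # best possible -> 100.0
```

Task success rate, error rate, and abandonment rate are simple proportions over observed sessions, so the same tallying pattern applies.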

4. Walk us through how you’d design a diary study for a mobile app.

Employing diary studies in user research is a method to understand user behavior and experiences over time, particularly effective for mobile apps where user interaction can be frequent, yet sporadic, and influenced by context. This approach assesses your methodological knowledge and your ability to empathize with users, including how you plan to capture the naturalistic use of the app, your considerations for participant selection, your strategies for encouraging consistent and honest entries, and how you’ll analyze the rich qualitative data you collect.

When responding, start by outlining the goals of the diary study and how they align with the overall research objectives. Discuss your process for participant recruitment, ensuring a diverse and representative sample. Explain how you’ll design the diary study template, keeping it intuitive and not overly burdensome for participants. Detail the tools and technologies you’ll use to facilitate the diary entries, considering the mobile platform’s constraints and advantages. Share your plans for maintaining participant engagement throughout the study duration, possibly including incentives or reminders. Lastly, touch on your approach to data analysis, emphasizing how you’ll extract actionable insights while respecting user privacy and the confidentiality of the diary entries.

Example: “ In designing a diary study for a mobile app, the primary goal is to understand user behaviors, experiences, and pain points over a certain period. This longitudinal insight aligns with our research objectives to refine user experience and inform feature development. Initially, I would recruit a diverse and representative sample of users to capture a broad range of interactions and contexts. This entails strategic selection to encompass varying demographics, usage patterns, and expertise levels with the app.

The diary study template would be crafted to be user-friendly and succinct, encouraging regular and honest entries without causing fatigue. I would leverage in-app prompts and notifications, utilizing the mobile platform’s immediacy to facilitate timely diary entries. To ensure a high response rate and quality data, I’d integrate micro-incentives for completed entries and periodic reminders to sustain engagement.

For data analysis, I’d employ a mixed-methods approach, combining quantitative metrics with qualitative thematic analysis to distill patterns and narratives. This would involve coding entries for recurring themes and sentiment analysis, while also respecting user privacy by anonymizing data. The insights garnered would directly feed into iterative design improvements, ensuring that the app evolves in a user-centered manner.”

5. In what ways have you adapted traditional user research methods for remote testing?

Adapting to digital environments has significantly impacted user research, necessitating the adaptation of traditional methods to suit remote contexts. This question highlights a candidate’s innovation and flexibility in applying user research techniques when direct, in-person interactions aren’t possible. It delves into the researcher’s ability to maintain the integrity and efficacy of their methods while overcoming the limitations posed by distance, speaking to their problem-solving skills, their aptitude for leveraging technology, and their commitment to obtaining actionable insights despite logistical constraints.

When responding, focus on specific adaptations you’ve implemented, such as using online platforms for interviews and surveys, employing remote usability testing tools, or harnessing social media for unmoderated research. Share a brief case study that demonstrates how you’ve translated a traditional method into a remote one, including the challenges you faced and how you overcame them. Emphasize the results and how they compared to in-person research to show that your adaptations were successful.

Example: “ In adapting traditional user research methods for remote testing, I’ve leveraged online platforms to conduct in-depth interviews and surveys, ensuring that the richness of qualitative data is not lost in the absence of physical presence. For instance, I transitioned what would typically be a face-to-face card sorting exercise into an interactive online session using digital tools that allow participants to drag and drop items, closely mimicking the in-person experience. This adaptation not only maintained the integrity of the data collected but also expanded the geographic diversity of our participant pool.

During a recent project, I employed remote usability testing tools that enabled users to share their screens and think aloud as they navigated our product, which was crucial in identifying usability issues that might not have been as evident through traditional lab settings. The challenge was ensuring that users felt comfortable and uninhibited in this remote setup, which I mitigated by providing clear instructions and building rapport at the beginning of each session. The results were highly comparable to in-person research, with the added benefit of increased convenience for participants and cost-effectiveness for our team.”

6. Share an example where you had to pivot your research strategy due to unexpected findings.

Encountering unexpected findings in user research is a test of a researcher’s ability to adapt and reassess their approach. This question seeks to understand a candidate’s problem-solving skills, their capacity to think critically about data, and their willingness to shift directions when the evidence requires it. It also touches on the researcher’s humility and openness to learning—key traits for anyone dedicated to understanding user behavior and needs.

When responding, provide a clear and concise example that demonstrates your adaptability in the face of unexpected results. Detail the original goals and methods, the surprising findings, and the thought process behind your strategic pivot. Emphasize your analytical thinking, your collaborative approach to problem-solving if others were involved, and the positive outcome or learning experience that resulted from the adaptation.

Example: “ In a recent study aimed at understanding user behavior on a new e-commerce platform, the initial research plan was to conduct a series of structured interviews and surveys to gather quantitative data on user preferences and purchasing patterns. However, early into the data collection phase, it became evident that users were experiencing significant navigation issues that were not anticipated, leading to high abandonment rates.

Recognizing the need to pivot, I shifted the focus to a more ethnographic approach, utilizing usability testing sessions and heuristic evaluations to observe users in real-time and identify the specific pain points in their journey. This change in strategy allowed for a more nuanced understanding of the user experience, revealing that the interface’s complexity was the root cause of frustration. The insights gained led to a redesign of the platform’s navigation system, which resulted in a measurable decrease in abandonment rates and an increase in user satisfaction. This experience underscored the importance of agility in research methodologies and the value of qualitative data in complementing quantitative metrics.”

7. Detail your approach to synthesizing qualitative data from user interviews.

Dealing with qualitative data in user research demands a meticulous and analytical approach, as it is rich in detail but often lacks the straightforwardness of quantitative data. The ability to synthesize this type of information is crucial, involving distilling large volumes of diverse user statements, behaviors, and feedback into actionable insights that can guide the design and development of products. Candidates who can demonstrate a systematic and creative approach to this task ensure that the synthesized data accurately reflects user needs and drives innovation.

When responding, outline a structured method that typically begins with data collection, followed by careful organization of the data into themes or categories. Describe how you interpret patterns and sentiments within the data, possibly using specific frameworks or tools like affinity diagrams or journey mapping. Highlight your attention to detail and the ability to remain unbiased while interpreting user feedback. It’s also beneficial to mention how you prioritize findings and translate them into recommendations that stakeholders can understand and act upon, demonstrating your capacity to bridge the gap between user research and practical application.

Example: “In synthesizing qualitative data from user interviews, my approach is methodical and iterative, ensuring that the insights gleaned are both comprehensive and actionable. Initially, I transcribe and anonymize the data to preserve user confidentiality and prepare for analysis. I then employ thematic analysis to identify recurring patterns and themes, meticulously coding the data to organize it into meaningful categories. This process often involves creating affinity diagrams to visually map out relationships between different user statements and sentiments.

Once the themes are established, I analyze them to understand the underlying user needs and pain points. I pay particular attention to the frequency and intensity of the sentiments expressed, as these can indicate areas of particular importance or concern for users. I also use journey mapping to contextualize findings within the user’s experience, providing a holistic view of their interactions and emotions throughout their engagement with the product or service.

In the final stage, I prioritize the findings based on their potential impact on user satisfaction and business goals, crafting clear and concise recommendations. I present these insights to stakeholders through compelling narratives and visualizations, ensuring they are accessible and can inform strategic decision-making. My approach is designed to provide a deep understanding of the user experience, driving improvements that are firmly rooted in user research.”
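The coding step described in this answer can be sketched programmatically. Below is a minimal illustration, with a hypothetical code book and made-up interview statements, of how coded statements roll up into themes whose frequency hints at priority; a real thematic analysis would develop codes inductively rather than from a fixed map.

```python
from collections import defaultdict

# Hypothetical code book mapping interview codes to broader themes
CODE_BOOK = {
    "confusing-nav": "navigation",
    "too-many-steps": "navigation",
    "unclear-pricing": "trust",
    "slow-loading": "performance",
}

def group_by_theme(coded_statements):
    """Roll coded interview statements up into themes."""
    themes = defaultdict(list)
    for statement, code in coded_statements:
        themes[CODE_BOOK.get(code, "uncategorized")].append(statement)
    return dict(themes)

coded = [
    ("I couldn't find the settings page", "confusing-nav"),
    ("Checkout took forever to load", "slow-loading"),
    ("Why are there five screens to sign up?", "too-many-steps"),
]
themes = group_by_theme(coded)
# themes["navigation"] now holds two statements; frequency hints at priority
```

The same grouping is what an affinity diagram does visually; the code form simply makes theme frequencies countable.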

8. What’s your experience with A/B testing, and how do you interpret conflicting results?

Navigating the complexities of data interpretation is a critical skill in user research, especially when employing A/B testing to compare two versions of a product. Results from A/B tests can sometimes be conflicting or counterintuitive, challenging researchers to consider user psychology, environmental variables, and statistical significance. This question assesses the candidate’s ability to reconcile data discrepancies to make informed decisions.

When responding to this question, it’s essential to discuss your systematic approach to setting up and conducting A/B tests, ensuring that they are methodologically sound. Share specific examples where you’ve encountered conflicting results and articulate the steps you took to analyze the data, such as segmenting the user groups, checking for external factors, or running follow-up experiments. Emphasize your critical thinking in discerning the actionable insights from the data and how you used these insights to inform product development or strategy decisions.

Example: “My experience with A/B testing is extensive, encompassing a variety of contexts from user interface design to feature optimization. When interpreting conflicting results, my approach is to first verify the integrity of the test setup. This includes ensuring that the sample size is adequate, the segmentation is appropriate, and the duration of the test is sufficient to mitigate anomalies such as novelty effects or external events.

Upon confirming the test’s validity, I delve into a deeper analysis. For instance, I segment the results to identify patterns that could explain the discrepancies, such as differences in user behavior across devices or demographics. If the conflict persists, I consider qualitative data to complement the quantitative findings, which may involve user interviews or usability studies to uncover motivations behind the behavior. In cases where the conflicting results remain inconclusive, I may run follow-up experiments with refined hypotheses. The insights gleaned from this rigorous process are critical in making informed decisions that align with user needs and business goals.”
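Verifying "the integrity of the test setup" often comes down to checking statistical significance. As a minimal sketch using only Python's standard library, here is a two-proportion z-test comparing conversion rates between two variants; the counts are made up for illustration, and a production analysis would also consider test duration and segmentation.

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference in conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF (via erf)
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical results: variant B converts 150/2400 vs. A's 120/2400
z, p = two_proportion_z_test(conv_a=120, n_a=2400, conv_b=150, n_b=2400)
print(f"z = {z:.2f}, p = {p:.3f}")
```

A p-value hovering near a threshold like 0.05 is exactly the kind of "conflicting" result that warrants segmentation or a follow-up experiment rather than a snap decision.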

9. Outline a plan for conducting usability tests on a limited budget.

Conducting usability testing allows for direct feedback on how real users interact with a product, which can drive crucial design improvements. It’s a common misconception that usability testing requires a hefty budget. Interviewers are keen to learn if candidates can creatively leverage limited resources to still extract meaningful data, ensuring that even when finances are tight, product development remains user-centered.

When responding, emphasize the importance of prioritizing key functionalities and features to test based on the product’s goals and user needs. Discuss the use of low-fidelity prototypes to minimize costs and the value of recruiting a small but diverse group of participants. Explain how remote testing tools can be cost-effective, and how guerrilla testing methods, like approaching people in coffee shops or public areas, can gather quick and inexpensive feedback. Highlight the importance of a clear testing plan that maximizes the limited resources and time available. Finally, stress the value of iterative testing, starting small and using findings to inform subsequent rounds of more focused usability tests.

Example: “To effectively conduct usability tests on a limited budget, the first step is to prioritize the most critical features and user flows that align with the product’s core objectives and user needs. By focusing on these elements, we ensure that our limited resources are directed towards areas with the highest impact on user experience and business goals. We can use low-fidelity prototypes, such as paper sketches or wireframes, to test these key functionalities early on, which helps to minimize costs associated with high-fidelity prototyping.

Recruiting a small, diverse group of participants is essential for obtaining valuable insights while keeping expenses low. We can leverage remote testing tools that reduce the need for a physical lab and allow users to participate from their own environments, which can also provide more authentic usage data. Additionally, guerrilla testing methods can be employed to gather quick feedback by informally approaching potential users in public spaces. This approach is not only cost-effective but also allows for immediate and spontaneous reactions to the product.

A clear and concise testing plan is crucial to maximize the efficiency of the testing sessions. This plan should outline specific goals, tasks, and questions for participants, ensuring that every minute is used effectively. After initial testing, we analyze the data to identify patterns and actionable insights. Iterative testing is then conducted, refining the focus based on previous findings to delve deeper into usability issues. This cyclical approach ensures that each round of testing is more targeted, making the most of our limited budget while continuously improving the product’s usability.”

10. How would you balance user feedback with business objectives that seem at odds?

Harmonizing the needs and desires of the user with the strategic goals and constraints of the business is a delicate balance in user research. When faced with feedback that seemingly conflicts with business objectives, it’s essential to demonstrate an ability to critically analyze the feedback, prioritize it based on impact, and integrate it in a way that supports the company’s vision without compromising user satisfaction. This question tests a candidate’s capacity for creative problem-solving and their ability to influence product strategy.

In responding, highlight a structured approach to problem-solving that starts with acknowledging the importance of user feedback. Then, discuss how you would evaluate the feedback’s relevance and potential impact on both the user experience and the business goals. Share a specific example where you faced such a conflict and outline the steps taken to find a solution, such as data-driven decision-making, stakeholder collaboration, or iterative testing. Demonstrating a balance between user advocacy and business acumen will show that you can navigate the tension between these two critical aspects of product development.

Example: “Balancing user feedback with business objectives requires a strategic approach that aligns user needs with company goals. Initially, I prioritize understanding the user feedback in-depth, categorizing it into themes and assessing its frequency and impact. This analysis provides insight into the user’s pain points and desires. Concurrently, I map these findings against the business objectives to identify areas of synergy and tension. For instance, if users are requesting a feature that seems at odds with current business priorities, I would conduct a cost-benefit analysis to evaluate the potential return on investment and the long-term value to user satisfaction and retention.

In a previous situation where user feedback indicated a demand for a customization option that initially seemed contrary to our streamlined product strategy, I facilitated a cross-functional workshop with stakeholders to explore the feasibility and implications. We utilized A/B testing to validate the concept on a smaller scale, which provided empirical data on user engagement and conversion rates. The results revealed that the customization feature not only met user expectations but also increased user engagement, leading to higher lifetime value. By presenting this data to stakeholders, I was able to align the user feedback with the business objective of increasing revenue, thus integrating user-centric innovation within the business framework.”

11. Tell us about a time you used ethnographic research in a tech context.

In a tech context, ethnographic research allows for a deep dive into user behaviors, routines, and the cultural context that shapes their interaction with technology. This question evaluates whether a candidate can skillfully observe and analyze user behavior beyond mere numbers and data points, thus ensuring that products are not only functional but also resonate with users on a personal and cultural level.

To respond effectively, share a specific instance where you employed ethnographic methods—such as participant observation, interviews, or diary studies—to gather rich, qualitative data. Explain the rationale behind choosing this approach, the process of integrating into the users’ environment, and how the insights gained directly informed the design or improvement of a tech product or service. Highlight what you learned about the user experience and how it influenced the project’s outcomes, showcasing your ability to translate human behaviors into actionable design decisions.

Example: “In a recent project, we were tasked with improving the user experience of a mobile health application designed for elderly users. Recognizing the unique challenges faced by this demographic, I conducted ethnographic research to immerse myself in their daily routines and understand their interactions with technology. I utilized participant observation and in-depth interviews, spending time with users in their homes to observe their habits and the barriers they encountered when using the app.

The insights gained were illuminating. For instance, I discovered that subtle design elements, such as button size and contrast, significantly affected usability for users with diminished eyesight and fine motor skills. Additionally, I found that many users were reliant on family members for initial app setup, which indicated a need for a simplified onboarding process. These findings were instrumental in guiding the redesign of the app, leading to an increase in user engagement and a reduction in support calls. This experience underscored the value of ethnographic research in uncovering the nuanced ways in which users interact with technology, allowing us to create solutions that are truly user-centered.”

12. When is quantitative research more valuable than qualitative, and vice versa?

Blending different methodologies is a requirement in user research to understand user behavior, preferences, and needs. Quantitative research offers solid data that can support or refute hypotheses with numerical evidence, while qualitative research provides the rich context and deep understanding that numbers alone can’t offer. This question ensures that candidates not only know the difference between the two but also understand when to leverage each type to inform design decisions, product improvements, and business strategies.

When responding, it’s important to articulate that quantitative research is valuable when you need to quantify problems or answers and gather data that can be scaled to a larger population, such as through surveys or analytics. It’s particularly useful in A/B testing scenarios or when assessing the prevalence of an issue. Conversely, qualitative research is crucial when exploring new territories without preconceived notions, gathering user stories, and understanding the nuances of user experience, which can be achieved through methods like interviews or observation studies. Highlighting your ability to choose the appropriate methodology based on the research goals, project stage, and available resources will demonstrate your strategic approach to user research.

Example: “Quantitative research is particularly valuable when we need to measure user behavior, attitudes, and opinions across a large sample size to generalize findings. For instance, when validating a hypothesis such as the effectiveness of a new feature, we might employ quantitative methods like surveys or usage data analysis to statistically ascertain its impact on user engagement or satisfaction. This approach is also indispensable for A/B testing, where quantifiable metrics can definitively demonstrate which version performs better in terms of conversion rates or click-throughs.

On the other hand, qualitative research shines when we aim to delve deep into the user’s psyche to uncover motivations, pain points, and contextual usage scenarios that numbers alone cannot reveal. It’s the go-to approach during the early stages of product development or when addressing complex issues that require a nuanced understanding of user behavior. Methods like in-depth interviews or ethnographic studies enable us to capture the richness of user experiences, providing insights that can guide the ideation and design process in a way that quantitative data cannot. The choice between qualitative and quantitative methods hinges on the specific research questions at hand, and a seasoned researcher knows when to leverage the depth of qualitative insights or the breadth of quantitative evidence to inform decision-making.”

13. What strategies do you employ to mitigate researcher bias in studies?

Minimizing personal biases is a critical aspect of user research, as they can skew data and lead to less effective or even misleading outcomes. This question tests a candidate’s awareness of their inherent biases and whether they have concrete strategies to minimize their impact, ensuring that the research findings are reliable and actionable. It also evaluates a candidate’s commitment to objectivity and their understanding of methodological rigor in user research.

When responding, it’s crucial to demonstrate your understanding of various types of researcher biases, such as confirmation bias or cultural bias. Discuss your familiarity with techniques like triangulation, using a mix of qualitative and quantitative methods, or involving a diverse team in the research process to provide different perspectives. You can also mention employing tools like blind studies or establishing clear protocols for data collection and analysis. Show that you’re proactive in seeking feedback on your research approach from peers or mentors to further counteract potential biases.

Example: “To mitigate researcher bias in studies, I employ a multi-faceted approach that begins with the acknowledgment that bias can manifest in various forms. To counteract confirmation bias, I ensure that the research design includes a hypothesis-testing approach, where both supporting and opposing evidence for our assumptions is rigorously sought and considered. This involves actively searching for disconfirming evidence and being open to results that may challenge preconceived notions.

I also utilize methodological triangulation, combining qualitative and quantitative data to cross-validate findings and draw a more comprehensive picture of user behavior and attitudes. This not only strengthens the validity of the research but also helps in identifying and mitigating biases that might arise from relying on a single method. Furthermore, I advocate for diversity within the research team, ensuring that multiple perspectives are represented in both the planning and analysis stages. This diversity can provide a check against cultural biases and helps in creating a more inclusive research process.

Lastly, I implement blind study techniques when feasible, especially in user testing scenarios, to prevent biases related to participant selection and treatment. Establishing clear, standardized protocols for data collection and analysis is also crucial, as it minimizes the risk of subjective interpretation of data. I maintain a practice of seeking peer feedback on research plans and findings, which serves as an additional layer of defense against personal biases potentially skewing the research outcomes.”

14. How do you handle sensitive information obtained during user research?

Ethical handling of sensitive information is paramount in user research, which often involves gathering personal, sometimes confidential data from participants. Researchers must be adept at navigating privacy concerns while ensuring the information they collect is secure and used responsibly, impacting the integrity of the research process and the validity of the results.

When responding to this question, address the protocols and measures you employ to protect sensitive information. This can include discussing informed consent, data anonymization, secure storage practices, and adherence to relevant laws and ethical guidelines. Emphasize your commitment to confidentiality and the steps you take to ensure that all team members handle data appropriately. It’s also beneficial to mention any experience you have with ethics committees or institutional review boards, which often oversee research involving sensitive information.

Example: “In handling sensitive information obtained during user research, I prioritize establishing trust and maintaining ethical standards. This begins with obtaining informed consent from participants, clearly communicating the scope of the research, how their data will be used, and their right to withdraw at any time. I ensure that all data is anonymized or de-identified to protect participants’ identities, employing rigorous data encryption and secure storage solutions to safeguard the information collected.

I strictly adhere to relevant data protection laws, such as GDPR or HIPAA, and ethical guidelines set forth by recognized bodies. I also work closely with ethics committees or institutional review boards to ensure that our research protocols meet the highest standards of confidentiality and integrity. Any team members involved in the project are thoroughly briefed on these protocols, and I regularly conduct audits to ensure compliance with our established practices. This comprehensive approach ensures that sensitive information is handled with the utmost care throughout the research process.”

15. Describe how you’ve integrated user personas into product development cycles.

Integrating user personas in product development humanizes data, guiding product development to align with user needs, behaviors, and pain points. This question assesses the candidate’s ability to translate research insights into actionable strategies that steer product design and functionality. It also evaluates the candidate’s experience with collaborative processes and their influence on cross-functional teams to ensure user-centricity throughout the product lifecycle.

When responding, highlight specific instances where you’ve crafted user personas based on thorough research and how these personas informed the product development stages. Discuss collaboration with design, engineering, and marketing teams, ensuring your examples reflect a clear link between the personas and product decisions. Share metrics or feedback that demonstrate the success of using these personas in achieving a more user-focused end product.

Example: “In one project, I developed user personas by conducting ethnographic studies, surveys, and in-depth interviews to gather comprehensive user data. These personas were then workshopped with cross-functional teams to ensure a shared understanding of our target users’ needs, behaviors, and pain points. By integrating these personas into our agile development cycles, we were able to prioritize features and tailor the UX design to address the specific goals and challenges of our most critical user segments.

The personas became a touchstone for decision-making throughout the product development process. For instance, when debating feature sets, we referred back to the personas to evaluate which features would deliver the most value to our primary users. This approach not only streamlined our prioritization but also fostered empathy within the team, leading to more intuitive design choices. The impact was measurable; after launch, we saw a 30% increase in user engagement and a significant uptick in positive user feedback, validating the effectiveness of the personas in guiding a user-centric development approach.”

16. How have you measured long-term user satisfaction post-product launch?

Post-launch assessment is vital for continuous improvement and ensuring that the product remains relevant and valuable. This question reveals whether the candidate has a comprehensive approach to user research that includes post-launch assessment, their ability to design research that can capture data over longer periods, and their commitment to iterating on a product based on real-world usage.

When responding, it’s crucial to detail specific methodologies used to track long-term satisfaction, such as follow-up surveys, interviews, usage data analysis, and Net Promoter Scores (NPS). Discuss any tools you’ve employed to gather and analyze this data, like analytics platforms or customer feedback tools. Highlight a particular instance where this long-term data led to significant product improvements or pivots, showcasing your dedication to the user’s evolving needs and the product’s success.

Example: “In measuring long-term user satisfaction post-product launch, I’ve implemented a mixed-method approach to gather both quantitative and qualitative data. By deploying longitudinal surveys at regular intervals, I’ve been able to track changes in user sentiment over time, which provides a more nuanced understanding of the user experience as it evolves. I’ve complemented these surveys with in-depth interviews to delve into the reasons behind the satisfaction scores, allowing for a richer interpretation of the data.

I’ve utilized Net Promoter Scores (NPS) as a key metric to gauge overall satisfaction and loyalty, and I’ve monitored usage data through analytics platforms to identify patterns and potential pain points. This combination of methods has been instrumental in identifying trends and informing product iteration cycles. For example, a consistent drop in NPS six months post-launch once highlighted an emerging issue with feature discoverability, leading to a targeted update that improved the user interface and ultimately restored user satisfaction levels. This iterative process, informed by ongoing user feedback, ensures that the product remains aligned with user needs and expectations over time.”
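The NPS metric referenced in this answer has a simple, standard formula: the percentage of promoters (scores of 9-10) minus the percentage of detractors (scores of 0-6), with passives (7-8) excluded from the subtraction. A quick sketch with made-up survey ratings:

```python
def net_promoter_score(ratings):
    """NPS = % promoters (9-10) minus % detractors (0-6), on 0-10 ratings."""
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return round(100 * (promoters - detractors) / len(ratings))

# Hypothetical survey wave: 4 promoters, 3 detractors, 3 passives
survey_wave = [10, 9, 8, 7, 6, 10, 3, 9, 8, 5]
print(net_promoter_score(survey_wave))  # → 10
```

Computing the score per survey wave is what makes the longitudinal comparison in the answer possible: a consistent drop across waves, like the six-month dip described, is a trend rather than noise.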

17. Give examples of non-traditional user research methods you’ve found effective.

Innovative approaches beyond standard surveys and focus groups are often required to capture authentic user behavior and feedback. This question assesses whether candidates can think outside the box and adapt their research strategies to meet unique project demands, considering the limitations of conventional techniques.

When responding, highlight specific instances where you’ve implemented creative research methods, such as guerrilla testing, diary studies, or utilizing social listening tools. Explain the rationale behind choosing these methods, how they were executed, and what insights they provided that might not have been uncovered through more traditional means. Discuss the impact these findings had on the project’s direction, design decisions, or product development to underscore your ability to translate unconventional research into actionable results.

Example: “In exploring non-traditional user research methods, I’ve found guerrilla testing to be particularly effective when needing quick, real-world feedback on prototypes or concepts. For instance, setting up impromptu testing stations in public spaces relevant to the product allowed for immediate user interactions and reactions. This method, while less structured, provided raw and authentic insights into user behaviors and preferences that might have been moderated in a more formal setting.

Another technique I’ve utilized is social listening, which has been invaluable for capturing the unfiltered voice of the customer. By analyzing conversations and trends on social media platforms, I was able to identify pain points and desires that users themselves might not have explicitly articulated in a survey or interview. This approach led to the discovery of a niche user need that was not previously considered, driving a pivot in feature prioritization that significantly increased user satisfaction post-launch.

Lastly, diary studies have proven to be a treasure trove of contextual information. By having users document their experiences over time, I was able to observe the natural usage patterns and the evolution of their interactions with the product. This longitudinal data helped in identifying specific moments of delight and frustration, which informed subtle yet impactful refinements in the user interface that enhanced the overall user experience.”

18. What role does competitor analysis play in your user research process?

Competitor analysis in user research sharpens understanding by highlighting what others in the industry are doing right or wrong, revealing gaps in the market, and helping to position a new product effectively. It also provides a benchmark against which to measure user satisfaction and expectations, informing a more strategic approach to product development.

To respond, outline a systematic approach to competitor analysis within your user research process. Describe how you gather data on competitors, such as through feature comparisons, user reviews, or market trends, and how this information informs user personas, journey mapping, or usability testing. Emphasize how competitor insights help prioritize features, inspire innovation, and avoid common pitfalls. Provide a concrete example of how a past competitor analysis impacted a project, demonstrating an understanding of its strategic value in user-centered design.

Example: “Competitor analysis is a critical component of user research as it provides a landscape view of where our product stands in relation to the market and user expectations. By conducting a thorough analysis of competitors, including feature comparisons, user reviews, and market trends, we can identify gaps in the market and opportunities for differentiation. This process not only informs the development of user personas and journey mapping by highlighting the strengths and weaknesses of competing products but also guides usability testing by pinpointing areas where users may experience friction.

For instance, in a past project, competitor analysis revealed that while our product was superior in performance, competitors were excelling in user onboarding. This insight led to a focused redesign of our onboarding process, which significantly reduced user drop-off rates and increased engagement. By integrating competitor insights into our user research, we were able to prioritize features that delivered real value to users, inspire innovation within our team, and circumvent common design pitfalls that could have hindered user satisfaction.”

19. How do you determine when enough research has been conducted on a specific feature or product?

Determining the ‘enough’ point in user research hinges on understanding the research goals, user needs, and the diminishing returns of additional information. Interviewers look for candidates with a nuanced appreciation of research scope: those who recognize when the gathered data has reached saturation and who are adept at maximizing the value of the research within the constraints of time and budget.

When responding to this question, articulate how you set clear objectives at the outset, employ milestones to evaluate progress, and use interim findings to assess whether the research is yielding actionable insights or if it’s veering into the territory of over-analysis. Discuss how you weigh the potential impact of additional data against project deadlines and resource availability. It’s also essential to mention how you stay attuned to the product development lifecycle, ensuring that research findings are timely and can be integrated effectively into the design and decision-making processes.

Example: “Determining when enough research has been conducted hinges on the clarity of the research objectives and the actionable insights derived from the data. At the outset, I establish specific, measurable goals that align with the project’s needs, which serve as a benchmark for assessing the sufficiency of the research. As the study progresses, I evaluate milestones to gauge whether the insights are contributing to a deeper understanding of user needs and behaviors, and if they’re informing design decisions effectively.

In parallel, I maintain a keen awareness of the diminishing returns on additional data. This involves a critical assessment of whether further information will have a substantial impact on improving the product or if it’s simply confirming existing knowledge. I balance this against project constraints such as deadlines and resource availability. If the research reaches a point where it’s no longer yielding significant new insights or the cost of further investigation outweighs the benefits, I consider the research phase complete. Throughout, I ensure that the findings are integrated into the product development lifecycle in a timely manner, allowing for the research to have a meaningful impact on the final outcome.”
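The "diminishing returns" check described in this answer is often operationalized as code saturation: stop when recent sessions stop surfacing new codes. A rough sketch of that heuristic, with hypothetical session codes and an illustrative three-session window; real studies would pick the window to suit their cadence.

```python
def reached_saturation(sessions, window=3):
    """Flag saturation when the last `window` sessions yield no new codes."""
    seen = set()
    new_per_session = []
    for codes in sessions:
        fresh = set(codes) - seen  # codes not observed in any earlier session
        new_per_session.append(len(fresh))
        seen |= fresh
    return (len(new_per_session) >= window
            and all(n == 0 for n in new_per_session[-window:]))

# Hypothetical codes observed in five consecutive interview sessions
sessions = [
    {"nav", "pricing"}, {"nav", "speed"}, {"speed"}, {"nav"}, {"pricing"},
]
print(reached_saturation(sessions))  # last three sessions added nothing new → True
```

The heuristic is deliberately crude; it supplements, rather than replaces, the judgment about whether further data would change a design decision.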

20. What’s your method for prioritizing user pain points when making recommendations?

Prioritizing user pain points is essential for creating products that deliver a satisfying user experience. A methodical approach to this prioritization signals to employers that you can discern the most impactful problems from the myriad of feedback and data, focusing on what will drive meaningful improvements for the user and the business.

When responding, detail your process, which might include categorizing pain points by frequency and severity, assessing the potential impact of resolving each issue, and considering the resources required. Discuss how you weigh user feedback against business goals and technical feasibility. Use specific examples from past experiences to illustrate how your method has led to successful outcomes. It’s important to demonstrate that your approach is both data-driven and empathetic to user needs.

Example: “In prioritizing user pain points, I first categorize them based on frequency and severity, as these dimensions often indicate the urgency and potential impact of the issue on the user experience. I employ a mix of quantitative data from analytics and qualitative insights from user interviews and usability testing to validate the prevalence and depth of each pain point. Then, I assess the potential impact of addressing each issue, not only on the user experience but also on business metrics such as conversion rates, retention, and customer satisfaction scores.

I balance this assessment with a consideration of the resources required to address each pain point, including development time, costs, and potential technical constraints. This helps in creating a cost-benefit analysis to prioritize issues that can deliver the most value to both users and the business. Additionally, I align the prioritization with the product roadmap and business goals, ensuring that the recommendations for resolving pain points are feasible within the given constraints and timelines. For example, a high-frequency, high-severity issue that aligns with the business objective of increasing user retention would be prioritized over less critical issues. This method has consistently led to targeted improvements that enhance the user experience while supporting business objectives.”
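The cost-benefit weighting described in this answer can be sketched as a simple scoring model. This is a hypothetical illustration — the pain points, units, and formula are invented for the example, not a standard method:

```python
from dataclasses import dataclass

@dataclass
class PainPoint:
    name: str
    frequency: int  # e.g., users affected per week (invented unit)
    severity: int   # 1 (minor) to 5 (blocking)
    effort: int     # estimated dev-days to resolve

def priority_score(p: PainPoint) -> float:
    """Impact (frequency x severity) per unit of effort; higher means fix sooner."""
    return (p.frequency * p.severity) / p.effort

# Invented backlog for illustration.
backlog = [
    PainPoint("Checkout button hidden on mobile", frequency=120, severity=5, effort=2),
    PainPoint("Typo on settings page", frequency=200, severity=1, effort=1),
    PainPoint("Slow search results", frequency=80, severity=3, effort=8),
]

for p in sorted(backlog, key=priority_score, reverse=True):
    print(f"{priority_score(p):7.1f}  {p.name}")
```

In practice the weights would also fold in business metrics such as retention impact, as the answer notes; the point is simply that frequency, severity, and effort can be made explicit and comparable.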



User Research

Sep 1, 2022

The best user research questions and how to ask them

To get the right insights, you need to ask the right questions. Here’s the best user research questions to start gathering feedback right away.

Lorelei Bowman

Content Editor at Maze

Knowing the right user research questions to ask is vital to the success of your UX research. Research is an invaluable source of input for product development, but before you can get started, you need to make sure the questions lined up will get the insights you need, without influencing the data.

Think of this article as your guide to all-things user research questions: what to ask, how to ask it, and how to create your own questions. Let’s get started.

What kind of user research questions are there?

The kind of questions you ask will depend on your research goals—are you looking to gather user feedback, or find out if a particular feature is (or would be) useful? Are you trying to discover what problems bother your user, or whether they’d prefer one solution over another?

Before planning your questions and diving head-first into research, look at your overarching research plan and objectives. Consider this on a project-by-project basis, as your end questions will be drastically different depending on where you are in the product development process. For instance, if you’re in early product discovery, you may want to uncover user intent and pain points. Or, if you’re working on a high-fidelity prototype, you might want to see how users interact with it, and how easy it is to use. Asking questions at different stages of your process is a big part of continuous product discovery and ensuring your product remains the best it can be.

💡 If you’re looking to understand the question formats used in surveys or user questionnaires, take a look at our guide on how to write survey questions.

User research questions can be categorized in many ways—by objective, research scenario, or point in the product journey, to name a few. Since different questions may apply in multiple situations, we’re going to consider questions organized by their research focus.

Questions for user research can typically be categorized in three ways:

  • Questions about the problem, e.g., what are users’ pain points, what task are they trying to complete, what solution do they want
  • Questions about the people, e.g., who they are, how they use products, what they want to accomplish, how likely they are to use the product
  • Questions about the product, e.g., how users feel about content or design, whether they can navigate the product, how usable it is, what features they like or dislike

Now we know what kinds of questions there are, let’s delve into the value of pre-made questions, and some examples of each question type.

Using pre-made user research questions

To elevate your research, you can opt to use pre-existing questions from a question bank. As with all research tools, there are many benefits to this, including saving time and effort, and having many questions to choose from. Using a question bank also ensures questions are always carefully considered, easily understandable for users, and unbiased.

Meet the Maze question bank

An open-source question repository for product teams, our question bank is aimed at helping you ask the best user research questions to gather insight that will help build truly user-centered products.


A good question bank will be multifunctional, with questions you can use whether you’re running moderated or unmoderated testing, conducting generative or evaluative research, or gathering quantitative or qualitative data—so you have one place to go for all your user research question needs.

🚀 Boost your research with Maze templates

If you’re a Maze user, you can also use the question bank as a handy companion to fuel your team’s research with Maze—check out the templates column and question block suggestions for maximum efficiency when building mazes.

Ultimately, a pre-made question bank can help save you a lot of time, and allow you to focus on conducting the research and processing analysis.

If you’d like to create your own questions, let’s get into the different user research question types, what questions they include, and how to ask them.


Questions about the problem

To support product and design decisions behind any solution, you need to be familiar with the problem you (and your users) are trying to solve. Whether you’re starting product discovery and want to understand user pain points, or you’re testing new features and want to gauge which will be most popular—you can’t begin working on a solution until you’ve homed in on what the problem is.

What’s bothering your users? How can you make their lives easier? What’s their key challenge, and what are they trying to achieve that’s being blocked by that problem?

Only once you’ve narrowed down a key problem statement can you translate solutions into the user experience, and identify opportunities for product development.

Questions focusing on the problem you’re trying to solve are key in product discovery stages and concept validation. The reason for using a particular product or feature may vary between users—consider Instagram’s Explore tab: it could be used to find friends, connect with like-minded people, or find inspiration.

Questions that can help home in on the problem at hand include:

  • What problems do you face when you do [task]?
  • Please complete this sentence: "The most frustrating part of trying to complete [task] is…”
  • What is your main goal when trying to complete [task]?
  • What is your personal measure of success when it comes to [goal]?
  • How are you currently solving [problem]?
  • Describe your ideal solution to [problem]
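The bracketed placeholders in the list above lend themselves to a small script for generating a tailored interview guide: fill in [task] and [problem] once per study. A minimal sketch — the template wording is taken from the list, while the example task and problem are invented:

```python
TEMPLATES = [
    "What problems do you face when you do [task]?",
    "What is your main goal when trying to complete [task]?",
    "How are you currently solving [problem]?",
    "Describe your ideal solution to [problem].",
]

def build_script(task: str, problem: str) -> list[str]:
    """Fill the [task]/[problem] placeholders for one study."""
    return [t.replace("[task]", task).replace("[problem]", problem) for t in TEMPLATES]

for question in build_script(task="booking a flight", problem="comparing fares"):
    print(question)
```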

Questions about the people

Understanding the problem you’re trying to solve goes hand in hand with the people who are facing this problem—who they are and how they think, how they adopt and use products, their wants, needs and dislikes.

Put simply, there’s no point building a product that solves your user’s problem—but not in the way they wanted it solved.

To really understand how your users think, and the way they approach a product, you need to understand their mental models. Broadly speaking, a mental model determines how someone forms their thinking process—it might impact the way they approach a problem, the kind of solution they’d like, and how they expect certain features to work.

UX research methods like card sorting are a good way to understand people’s mental models, but you can also gather this insight through thoughtful user interviews or research questions.

User-focused questions also cover understanding attitudes towards product adoption, use habits and circumstances, pricing models, and demographics.

Some example questions you could ask to learn more about your target users include:

  • Are there any products that you prefer to use to do [task]?
  • What does your job look like in your company?
  • How do you prefer to be trained on new products?
  • How much would you pay for [product]?
  • Please describe who you might collaborate with when doing [task]
  • How often do you do [task]?

Questions about the product

Once you understand the problem your product will solve, and the people who’ll be using it, it’s time to circle back to the product itself. Questions about the product may be about its usability, what features you’re including, how users feel about content or design, and whether it does what they need it to.

Just like all research, it’s a good idea to ask product-related questions multiple times during the research phase, as both questions and answers will vary depending on what development stage you’re at—from prioritizing which feature to focus on developing first, to assessing how navigable a certain product section is, or reviewing the appeal of specific design aspects.

To gain a well-rounded understanding of how users experience your product or feature, usability testing is imperative. And, if you’re trying to nail down product navigation and identify any bumps in the user journey, tree testing is the research method of choice.

Whatever your focus, questions relating to the product are useful in both evaluative and generative research, and critical for creating a user-centered, solution-focused product.

Sample questions you can use to learn more about the product and features could include:

  • How did you find the language (including but not limited to copy, phrasing, tone) used on the website?
  • What’s the single most important thing we could do to make [product] better?
  • On a scale of 1-10, how was your experience when [task]?
  • Was the product navigation as expected?
  • If you could change one thing about the design, what would it be and why?
  • Thinking about what [product] offers, which of the following best describes your feelings about it?
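A scale question like the 1-10 one above produces numbers you can summarize as soon as responses come in. A small standard-library sketch — the sample scores are invented:

```python
import statistics
from collections import Counter

# Invented answers to "On a scale of 1-10, how was your experience when [task]?"
scores = [9, 7, 8, 10, 6, 9, 4, 8, 9, 7]

mean = statistics.mean(scores)
median = statistics.median(scores)
top_box = sum(1 for s in scores if s >= 9) / len(scores)  # share answering 9 or 10

print(f"mean={mean:.1f} median={median} top-box={top_box:.0%}")
print(Counter(scores).most_common(3))
```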

🤔 To dive into the questions you should be asking during usability testing, check out how to ask effective usability testing questions .

Regardless of what questions you ask, it’s worth bearing in mind that this information should be considered a guide, not a rule—sometimes what people think they’ll do is not what they actually do in practice. This is why it’s so important to continue research and testing at all stages of product development, so you’re always working from the most reliable and up-to-date insight.

Guidelines for crafting the right user research questions

Research questions set the standard for the data you’ll gather, so it’s crucial to properly craft each question to maximize insight and ensure accurate results.

Using a pre-made question bank is one way to keep questions effective, but if you’re writing your own questions, bear in mind that everything from the language you use to the structure or format of questions can influence the user’s answer.

The best questions for user interviews and research are clear, precise, and unbiased. Let’s go over some tips for crafting questions that fulfill these criteria.


Stay neutral: avoid leading questions

One of the most important points when it comes to any research is being a neutral party, which means removing cognitive bias from your work. Research isn’t helpful if it’s biased, so ensure your questions are as impartial as possible—after all, just because you like Concept A over Concept B, doesn’t mean everyone will.

The key to staying neutral is avoiding leading questions where you subconsciously favor one thing over another, or plant an opinion or idea in the user’s mind, such as “How would you use concept A?”—this assumes they preferred concept A, which they may not have. Instead, try asking which concept they would use, followed by how they would use it.

Take it one question at a time

The majority of us think best when our minds are clear and able to focus on one thing, so avoid bombarding research participants with multiple questions phrased together.

Rather than asking a question like “What did you think about the design, copy and layout of the page?”, ask individually about the design, copy, and layout. Otherwise, you risk users merging their thoughts into one answer, when in fact they may feel very differently about each element.

Of course, some questions lend themselves to being combined (e.g., “Which concept did you prefer and why?”), but it’s best to keep things separate when possible, and ask “Why?” in follow-up questions, to allow users space to think and form individual answers for each question.

Ask open-ended questions

Similar to ensuring questions are unbiased, it’s also a good idea to ask open-ended questions—that is, to avoid questions which result in simply a ‘yes’ or ‘no’ answer.

The benefit of open-ended questions is that they give participants an opportunity to expand on their answer, work through their experience, and share details with you that may otherwise be missed. Consider that, while asking “Did you like the product?” may answer whether a user liked it, you’ll be left wondering what it is they like about it. Instead, try framing questions in a way that provides space for additional information, e.g. “What did you think about the product?”.
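The guidelines so far—stay neutral, one topic per question, prefer open-ended phrasing—are mechanical enough that a draft script can be given a rough automated once-over. A heuristic sketch; the word lists are illustrative, not exhaustive, and no such check replaces human review:

```python
CLOSED_STARTERS = ("did ", "do ", "does ", "is ", "was ", "were ", "are ", "would you ")
LEADING_HINTS = ("don't you", "wouldn't you", "how much do you like")

def review_question(q: str) -> list[str]:
    """Return heuristic warnings for a draft research question."""
    warnings = []
    lowered = q.lower().strip()
    if lowered.startswith(CLOSED_STARTERS):
        warnings.append("closed-ended: likely invites a yes/no answer")
    if any(hint in lowered for hint in LEADING_HINTS):
        warnings.append("leading: suggests a preferred answer")
    if lowered.count(",") >= 2 and " and " in lowered:
        warnings.append("double-barreled: may bundle several questions into one")
    return warnings

print(review_question("Did you like the product?"))
print(review_question("What did you think about the design, copy, and layout?"))
print(review_question("What did you think about the product?"))  # no warnings
```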

Pro tip ✨ If you do ask closed-ended questions, always keep follow-up questions aside to dig deeper and gather extra insight from your participants.

Help users find their own voice

The language we use is incredibly powerful. Used well, words can move us, sway our opinions, educate us, and more.

By helping your research participants to find their own voice, you can unlock powerful statements and user insights which will truly impact your product. Formatting questions with the user at the center—using ‘you’ and asking emotive questions—builds empathy with the user and encourages them to find and share their own opinions through honest answers.

Ask questions you think you know the answer to

Our final question-crafting tip is to use research questions to test and validate your own assumptions and opinions. Ask questions you think you know the answer to—if you believe all users will prefer one new feature over the other, see if you’re right. If you think a certain design element works better on a different page, ask research participants to determine where they prefer it.

As with any research, while you may be user-adjacent, you are not your users. You are the expert in your product; they are the expert in using your product. Trust their opinions, and use their knowledge and experience to confirm your suspicions, or disprove them. Either way, you gain valuable insights.

User research is as effective as the questions you ask

Whether you’re investigating user preferences or conducting usability testing, research is only as effective as the questions you ask—and how you ask them.

Focus on questions that fit your research objectives, phrase your questions in the best way possible, and work to build empathy with your user; you’ll be able to gather valuable insights in no time.

Frequently asked questions about user research questions

What makes a good user research question?

A good research question is open-ended, unbiased, clear, and precise. It helps research participants share their thoughts, feedback, and opinions with researchers, without influencing or limiting their responses.

What type of user research questions are there?

User research questions can broadly be broken down into three categories: questions about the problem, questions about the people, and questions about the product.

How do you create a user research question?

There are several ways to create a user research question: you can either write your own question, or select premade questions from an existing research question bank.

If you choose to write your own research questions, it’s important to keep them clear and precise above all else—focus on asking questions that encourage users to open up, share additional information, and speak honestly.


Best 21 UX Research Tools in 2024


Selecting the right UX research tools transforms how product managers, CPOs, and product designers tackle the complex field of user experience (UX) design.

These tools streamline the path through team collaboration, deep user understanding, precise feedback collection, and the crucial task of feature prioritization. Navigating this process effectively means distinguishing between a product that merely meets the mark and one that surpasses expectations.

Solution: Remote user research or remote usability testing

Enter the era of all-in-one feedback tools crafted with UX research in mind. These platforms offer a unified solution, making them indispensable for anyone looking to enhance the user experience without the hassle of juggling multiple tools. 

An all-in-one user research tool simplifies every phase of UX research, from initial data gathering to the final stages of analysis and insight generation, ensuring that every product decision is informed and impactful.

1. Usersnap

In this guide:

  • 2. UserTesting
  • 3. Optimal Workshop
  • 4. Lookback
  • 5. Hotjar
  • 6. UsabilityHub
  • 7. Crazy Egg
  • 8. Maze
  • 9. Qualtrics
  • 10. Dovetail
  • 11. UserZoom
  • 12. Validately
  • 13. Indigo.design
  • 14. FullStory
  • 15. Usabilla
  • 17. Userlytics
  • 19. SessionCam
  • 21. User Interviews
  • Step 1. Understanding users: user research
  • Step 2. Connecting with users: user interviews
  • Step 3. Remote UX research: bridging the gap
  • Step 4. Data analysis and actionable insights
  • Try Usersnap for streamlined feedback collection


Usersnap stands out as a top-tier, all-in-one feedback and UX research tool designed to cater to the dynamic needs of product managers, CPOs, and CTOs. This platform revolutionizes how teams capture feedback, understand user needs, and prioritize features through a seamless, integrated approach.

Best Features:

  • UX Rating Micro Surveys: Quickly gauge user satisfaction levels to identify poor experiences for improvements.
  • Polls for New Features: Engage specific user groups to gather insights and research ideas for new features.
  • Capture and annotate screenshots to pinpoint issues and optimize user experience.
  • Record user interactions for specific user groups to gain deeper insights into their needs and behaviors.
  • Seamlessly connect with popular project/product management tools (Jira, Azure DevOps, etc.) to ensure feedback and tasks flow smoothly into existing workflows for further action.
  • Create tailored feedback forms and surveys to address specific research goals and gather high-quality insights.
  • Utilize templates for efficient feedback collection and analysis.
  • Gain insights about new features to inform product development decisions.
  • Track survey response rates and user engagement metrics to gain UX insights faster and make data-driven decisions.

Usersnap offers several pricing plans tailored to meet the needs of different organizations:

  • Startup Plan: $69/month annually or $79 monthly. 
  • Company Plan: $129/month annually. 
  • Premium Plan: $249/month annually.

2. UserTesting

UserTesting empowers teams with real-time feedback from actual users, making it an indispensable tool for UX research that demands depth and authenticity. Its platform facilitates the gathering of rich, qualitative insights, enabling product managers and UX researchers to make informed decisions confidently.

  • Live Video Feedback: Access to live sessions where users interact with your product, offering unfiltered reactions and valuable insights.
  • Audience Targeting: Ability to target feedback from specific user demographics, ensuring relevance and precision in the research process.
  • Rapid Results: Offers quick turnaround times for feedback, enabling agile response to user needs and market demands.
  • Additionally, UserTesting excels in facilitating qualitative research by capturing deep user insights, which is crucial for understanding user behavior and feelings in UX design and user experience.
Pricing:

  • Validation: $199 per month to validate products with micro surveys
  • Essentials: $799 per month to log feedback on behalf of customers
  • Pro: $999 per month to collect ideas through an in-app widget and web portal

3. Optimal Workshop

Optimal Workshop offers tools that empower UX researchers and designers to make informed, data-driven decisions. It simplifies user research through a comprehensive set of tools designed for testing and improving website usability.

  • Variety of Testing Tools: From card sorting to tree testing, Optimal Workshop provides multiple methods to uncover user insights and improve information architecture.
  • Real-Time Analytics: Instantly access data and insights from your research, allowing for agile adjustments to your UX strategies.
  • Participant Recruitment: Offers an integrated participant recruitment service, making it easier to connect with target users for research studies.

Pricing: 

  • Team Plan: This plan starts at $191 per month per user and provides features suitable for teams to collaborate and conduct UX research​​.
  • Individual Plan: This plan, priced at $208 per month per user, is tailored for professionals who need access to advanced UX research tools​​.

4. Lookback

Lookback provides a powerful platform for UX researchers aiming to capture and analyze user interactions in real time. This tool is specially designed for teams that prioritize direct observation of user behaviors and feedback, facilitating remote user testing sessions with ease.

  • Live Remote Testing: Conduct live interviews and usability tests, allowing teams to observe user reactions and behaviors as they happen.
  • User Session Recording: Automatically records all user sessions, providing invaluable insights into user experience and identifying potential usability issues.
  • Interactive Prototypes: Supports testing of interactive prototypes, making it ideal for validating design concepts with real users before full-scale development. Lookback excels in facilitating remote user research, offering features like live remote testing and session recording to support moderated or unmoderated interviews and usability testing conducted remotely.
Pricing:

  • Freelance: $25 per month, 10 sessions per year, with a 24/7 help center and free 60-day trial
  • Team: $149 per month, 100 sessions per year, and includes core features.

5. Hotjar

Hotjar is an intuitive tool for understanding user behavior on websites and apps. Blending analytics with feedback tools, it offers a clear window into how users interact with digital products, making it indispensable for UX researchers focused on optimizing user experience.

  • Heatmaps: Visualize user clicks, taps, and scrolling behavior, uncovering how users navigate and interact with your site.
  • Session Recordings: Capture real user journeys through your website or app, identifying usability issues and opportunities for improvement.
  • Feedback Polls and Surveys: Directly engage with users to collect qualitative data, providing deeper insights into user satisfaction and areas for enhancement.
Pricing:

  • Basic: Free, includes essentials like unlimited heatmaps and 5k monthly sessions.
  • Plus: Aimed at small teams for $32/month, offering 7k sessions and more insights.
  • Business: Best for growing companies at $56/month, with 10k sessions and advanced features, including funnels and all integrations.

6. UsabilityHub

UsabilityHub, now known as Lyssna, emerges as a vital tool for UX researchers focused on making informed design decisions. This platform enables quick and practical testing of design concepts, ensuring that products are user-friendly and aligned with audience needs from the outset.

  • Five-Second Tests: Measure first impressions of your designs to gauge what users remember and see in just five seconds.
  • Preference Tests: Help determine which design variations resonate most with your audience, guiding your design choices.
  • Task Analysis: Pinpoint how users navigate your designs and where they encounter issues, enabling targeted improvements.
Pricing:

  • Free Plan: Ideal for getting started with user research. Lets you run basic tests, surveys, and interviews, with limits on length, collaborators, and responses.
  • Basic Plan: Starts at $75/month (billed annually) and is suitable for small teams. It extends limits on test duration, collaborators, and storage.
  • Pro Plan: Starts at $175/month (billed annually). Ideal for scaling research. Includes everything in the Basic plan, with increased limits, custom branding, and more management features.
  • Enterprise Plan: Requires contacting Lyssna for a custom quote. Geared towards large organizations with specific needs.
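A preference test of the kind described above ends with a vote count for each design; whether the split reflects a real preference rather than chance can be checked with an exact binomial test against a 50/50 null, using only the standard library. A sketch — the vote counts are invented:

```python
from math import comb

def binomial_two_sided_p(k: int, n: int) -> float:
    """Exact two-sided binomial test of k successes in n trials against p = 0.5."""
    pmf = [comb(n, i) * 0.5 ** n for i in range(n + 1)]
    # Sum the probability of every outcome at least as extreme (as unlikely) as k.
    return sum(p for p in pmf if p <= pmf[k] + 1e-12)

# Invented result: 34 of 50 participants preferred design A over design B.
p = binomial_two_sided_p(34, 50)
print(f"p = {p:.4f}")  # a small p suggests the preference is not just noise
```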

7. Crazy Egg

Crazy Egg is a powerful tool for UX researchers and website owners who want to understand and enhance user engagement. With its suite of visual tools, Crazy Egg demystifies user behavior, making it more straightforward to optimize websites for better performance and user satisfaction.

  • Heatmaps: Offer a color-coded representation of how users interact with your site, highlighting high and low engagement areas.
  • Scroll Maps: Show how far down users scroll, helping identify where you’re losing their attention.
  • A/B Testing: Allows testing different page versions to see which performs better, ensuring data-driven design decisions.
Pricing:

  • Basic: $24/month
  • Standard: $49/month
  • Plus: $99/month
  • Pro: $249/month
  • Enterprise: Custom pricing for large organizations
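Behind an A/B test like the one Crazy Egg offers, declaring a winner usually comes down to comparing two conversion rates; a two-proportion z-test is the textbook way to do that. A standard-library sketch — the traffic and conversion numbers are invented:

```python
from math import sqrt, erf

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-sided z-test for a difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided normal tail
    return z, p_value

# Invented data: version A converts 120/2000 visitors, version B 165/2000.
z, p = two_proportion_z(120, 2000, 165, 2000)
print(f"z = {z:.2f}, p = {p:.4f}")
```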

8. Maze

Maze empowers UX researchers and designers with rapid testing and actionable research insights, bridging the gap between ideation and validation. This platform stands out for its agility in facilitating user testing, ensuring products meet and exceed user expectations.

  • Unmoderated User Testing: Conduct tests without real-time supervision, allowing participants to complete tasks conveniently.
  • Prototype Testing: Seamlessly integrates with design tools like Sketch and InVision, making it easy to test designs at any stage of development.
  • Insightful Reporting: Generates clear, actionable reports highlighting user behavior patterns, preferences, and obstacles, guiding informed design decisions.
Pricing:

  • Free Plan: Suitable for individuals and teams starting with light research.
  • Professional Plan: Priced at $99 per month, or $75 per month when billed annually.
  • Organization Plan: Custom pricing is available for enterprise-level teams.

9. Qualtrics

Qualtrics specializes in experience management, offering UX researchers unparalleled insights into customer, product, brand, and employee experiences. Its comprehensive suite, powered by sophisticated analytics and predictive intelligence, enables organizations to drive improvements across all touchpoints.

  • Advanced Survey Capabilities: Tailor surveys with complex logic to gather deep insights, ensuring high response rates and actionable feedback.
  • Experience Management (XM) Platform: Leverages AI and predictive analytics to measure and optimize experiences across interactions.
  • Integration Ecosystem: Easily integrates with existing business systems, allowing for streamlined data flow and insights application across platforms.

Qualtrics is a subscription-based service that requires annual renewal. Plans start at $1,500 yearly and go up to $5,000 yearly.

10. Dovetail

Dovetail emerges as a beacon for UX researchers seeking to dive deep into user insights and drive impactful product decisions. With its robust qualitative data analysis capabilities and survey tools, this platform enhances the understanding of user needs, behaviors, and experiences, fostering a user-centered design approach.

  • Rich Text Analysis: Offers powerful tools for analyzing user interviews, survey responses, and feedback, turning qualitative data into actionable insights.
  • Collaborative Workspace: This enables teams to collaborate seamlessly, sharing insights, tagging themes, and building a cohesive understanding of user data.
  • Intuitive Design Repository: Stores all research data in one place, making it easy to access, search, and leverage insights across projects.
  • Broad Method Support: Dovetail is compatible with various UX research methods and includes wikis to centralize research, document learnings, and manage data from user interviews, enhancing the analysis of qualitative data.

Dovetail offers a single pricing plan at $99.00 per month.

11. UserZoom

UserZoom offers a comprehensive solution for UX researchers looking to streamline their processes from start to finish. Known for its versatility, this tool supports a wide range of user research methods, from usability testing to surveys and card sorting, ensuring that every aspect of user experience is thoroughly examined and optimized.

  • Versatile Research Methods: Supports various UX research techniques, allowing teams to adapt their approach based on project needs.
  • Integrated Analytics: Delivers deep insights with advanced analytics tools, enabling researchers to measure and track user behavior and satisfaction quickly.
  • Scalable Participant Recruitment: This service offers access to a global panel of participants, simplifying gathering diverse and relevant user feedback.
Pricing:

  • UserZoom GO: The basic plan starts at $250 per month and includes a free trial.
  • Custom Plans: These cater to enterprises with more features and functionalities; pricing varies depending on the features chosen.

12. Validately

Validately stands as a key player for UX researchers aiming to simplify the collection and analysis of user feedback. Specializing in usability testing, online surveys, and user interviews, this tool makes it easier to connect with users and gather meaningful insights that drive product improvement and innovation.

  • Easy Usability Testing: Facilitates moderated and unmoderated usability tests, enabling researchers to capture authentic user interactions with prototypes or live sites.
  • Efficient User Recruitment: A built-in user recruitment feature reduces the time and effort needed to find and schedule participants for research studies.
  • Insightful Analysis Tools: Offers robust tools for analyzing test results, helping to quickly identify trends, pain points, and opportunities for enhancement.

The starting price is $250 per month for one researcher or $3,000 per year. It does not offer a free plan, but it does have a free trial. 


Indigo.Design provides a unique solution that bridges the gap between design and development. By integrating UX research tools with a powerful design-to-code platform, it enables teams to create, test, and iterate on designs before seamlessly translating them into production-ready code.

  • Usability Testing on Real Designs: This method allows for testing interactive prototypes with real users and collecting feedback directly on the design.
  • Design to Code Conversion: Innovatively converts design elements into usable code, significantly speeding up development.
  • Collaborative Workspace: This workspace facilitates collaboration, enabling designers, developers, and researchers to work efficiently on projects.
  • Design Essentials: This plan starts at $39 per month.
  • Indigo Studio: This plan costs $99 per month.


FullStory stands at the forefront of digital experience insights, offering UX researchers a comprehensive toolkit to understand and optimize user interactions. Its platform goes beyond traditional analytics, providing a deep dive into user behavior, frustrations, and successes across websites and apps.

  • Session Replay: Captures every click, scroll, and interaction, allowing teams to replay user sessions and understand the nuances of user behavior.
  • Digital Experience Intelligence: Leverages machine learning to identify trends, anomalies, and areas of friction, guiding targeted improvements.
  • Conversion Analysis: Tracks user paths to uncover what drives conversions and where potential users drop off, informing strategic design and development decisions.
  • Tiered Pricing: FullStory uses a tiered system based on the number of recorded user sessions per month.
  • Estimated Costs: Full-featured plans start around $2,000 annually. Enterprise plans with advanced features and support can range from $10,000 to over $50,000 annually.
  • Additional Factors: Pricing is also influenced by the number of users (seats) and how long you want to retain data (data retention period).


Usabilla empowers UX researchers and marketers with real-time feedback capabilities to enhance digital experiences across web, mobile, and email channels. This platform excels in gathering actionable insights directly from users, facilitating swift optimizations to improve overall satisfaction and performance.

  • Voice of Customer Insights: This service collects user feedback through targeted surveys and feedback buttons, enabling a deep understanding of user needs and preferences.
  • Visual Feedback Tools: Allows users to highlight specific elements on a webpage or app, providing clear, context-rich feedback that teams can act on immediately.
  • Integration Ease: Seamlessly integrates with various analytics and marketing tools, ensuring feedback data enriches broader business insights and strategies.

Usabilla offers a freemium model that includes basic features with limited functionality. Paid plans start at around $49 monthly, while enterprise plans are custom-priced.


Morae sets the benchmark for usability testing, offering a comprehensive suite of tools designed for UX researchers focused on in-depth analysis and improvement of user interactions. It’s the go-to solution for those who require robust, detailed insights into user behavior, preferences, and pain points.

  • Rich Recording Capabilities: This feature captures detailed user interactions, including clicks, keystrokes, and spoken comments, providing a complete picture of the user experience.
  • Powerful Analysis Tools: Features advanced tools for analyzing usability tests, including automated graphs, charts, and video highlights that pinpoint critical moments in the user journey.
  • Real-Time Remote Observation: Enables stakeholders and team members to observe usability tests in real time from anywhere, facilitating immediate feedback and collaborative analysis.
  • Additionally, Morae is perfectly suited for conducting focus groups, thanks to its rich recording capabilities and real-time observation features, making it an essential tool for both remote and in-person research settings.

Morae offers tailored pricing based on your project’s specific needs and scope.


Userlytics emerges as a pivotal tool in UX research, offering an intuitive usability and user experience testing platform. It enables brands to connect with their target audience, gathering valuable insights through interactive testing sessions that cover websites, apps, prototypes, and more.

  • Flexible Testing Framework: This framework supports various tests, including usability, prototype, and app testing, ensuring comprehensive insights across all user touchpoints.
  • Global Participant Pool: Access a diverse global audience for testing, ensuring feedback comes from a representative sample of your user base.
  • Intuitive Analytics Dashboard: This dashboard delivers detailed analytics and video feedback from tests, making it easy to uncover user pain points, preferences, and behaviors.
  • Pay As You Go: They have two options here:
  • Quick & Easy: This plan starts at $49 per user.
  • Enterprise: This plan offers more features and costs $69 per user.
  • Subscription Plans: These offer a set monthly price with a certain number of users included.
  • Starter: This plan starts at $399 per month.
  • Premium: This plan starts at $499 per month.
  • Advanced: This plan starts at $999 per month.


Ethnio emerges as a groundbreaking tool for UX researchers seeking to engage directly with participants in real time. Specializing in intercepts, recruitment, and scheduling remote user interviews, it streamlines connecting with users for various research activities, from interviews to usability tests and surveys.

  • Screening and Recruitment: Filters potential research participants in real-time, directly from your product or any website, ensuring you connect with the most relevant users.
  • Automated Scheduling: Simplifies the logistics of setting up research sessions with a computerized scheduling system, reducing the administrative burden on teams.
  • Versatile Research Support: Facilitates a broad range of research methods, enabling teams to conduct everything from remote usability testing to in-person interviews and everything in between.
  • Little: This plan costs $79 or $66 monthly if billed annually.
  • Big: This plan is priced at $179 or $149 monthly with annual billing.
  • Pro: The Pro plan costs $349 or $289 monthly if billed annually.


SessionCam offers advanced session replay and website optimization tools, allowing businesses to gain deep insights into user behavior. This platform is ideal for UX researchers seeking to understand user interactions at a granular level, enabling them to identify pain points and optimize the user experience effectively.

  • Session Replay: Provides detailed replays of user sessions, allowing researchers to see exactly how users navigate through their website or application.
  • Heatmaps: Offers heatmaps that visualize user interactions, helping identify areas of interest, engagement, and friction.
  • Conversion Funnels: Tracks user journeys through conversion funnels, enabling researchers to pinpoint drop-off points and optimize conversion rates.
  • Glassbox Basic: Up to 50,000 monthly sessions, 10 user seats (web or mobile) – priced at $50,000 annually.
  • Glassbox Standard: Up to 100,000 monthly sessions, 15 user seats (web or mobile) – priced at $75,000 annually.
  • Glassbox Enterprise: Starts from 100,000 sessions (web and mobile), with a custom quote provided upon contacting them.


Pendo stands out in the UX research tool landscape by offering a comprehensive research platform that gathers user feedback and provides in-depth analytics, in-app guidance, and road mapping capabilities. It’s tailored for product teams eager to understand and improve the user journey within digital products.

  • In-Depth User Analytics: Delivers granular insights into how users interact with your product, identifying trends, drop-off points, and opportunities for engagement.
  • Guided Tours: Enables the creation of in-app messages and walkthroughs to help users navigate and adopt new features more effectively.
  • Feedback Collection and Prioritization: This process integrates user feedback directly into the product development process, facilitating prioritization based on user needs and behaviors.

Pendo offers a freemium pricing model with three paid tiers. 


User Interviews specializes in streamlining the process of finding and managing participants for UX and research projects. This platform matches researchers with the ideal participants, ensuring that every study yields meaningful, actionable insights. It’s designed for UX professionals who value depth and quality in the feedback they collect.

  • Robust Participant Recruitment: Offers access to a vast pool of participants, allowing for precise targeting based on demographics, behaviors, and more.
  • Efficient Project Management: Facilitates the entire research process, from participant recruitment to scheduling and compensation, all within one intuitive platform.
  • Insightful Participant Profiles: Provides detailed participant profiles, ensuring researchers can select individuals who best match their study criteria.
  • Free Plan: Lets you screen and recruit a limited number of participants and conduct basic interviews with limited features.
  • Pay As You Go (Recruit): This plan costs $40 per interview session and is suitable for occasional research needs.
  • Subscription Plans (Recruit & Hub): These plans include a set number of interview sessions plus additional features; 60 sessions per year cost $125/month (Recruit) or $175/month (Recruit & Hub).

4 Steps for an Easy-to-Understand UX Research Process

Undertaking a UX research journey provides teams with the necessary insights to create user-centered products. This process is crucial for comprehending the essence of user experience and comprises four fundamental steps. 

User research lays the foundation for a solid UX strategy. Engage in diverse user research methods and techniques to grasp the full spectrum of user needs and behaviors. 

Focus on both qualitative and quantitative data to obtain a well-rounded understanding of your users. This approach illuminates user preferences, pain points, and the overall user journey, guiding the subsequent phases of product development.

Conduct user interviews to investigate the insights gathered during the research phase. Compelling interviews reveal the why behind user actions and decisions. 

Approach these conversations with open-ended questions and a keen ear for detail to extract valuable insights that quantitative data alone cannot provide. This connection with users uncovers underlying motivations and unmet needs, enriching the design process with nuanced user perspectives.

In today’s digital-first world, remote UX research plays a pivotal role in understanding users across geographies. Utilize remote usability testing to evaluate how users interact with your product in their natural environment. 

Leverage tools designed for remote UX research to conduct studies efficiently, overcoming the constraints of distance and time zones. This flexibility ensures diverse user feedback, making your UX research more comprehensive and inclusive.

Data analysis translates raw data into actionable insights. Meticulously analyze qualitative and quantitative data to identify patterns, trends, and anomalies. 

This step demands a critical eye and a strategic mindset to distill vast data into coherent, actionable insights. Such insights inform design decisions, prioritize feature development, and guide iterative improvements, ensuring your product evolves in response to real user needs.
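
As a small illustration of the quantitative side of this step, a simple outlier check can surface anomalies worth a closer qualitative look. The sample data and the 2-standard-deviation threshold below are illustrative assumptions, not a prescribed method.

```python
# Flag anomalous task-completion times with a simple z-score rule.
# The data and the 2-sigma threshold are illustrative assumptions.
import statistics

completion_times = [34, 41, 38, 36, 120, 39, 42, 37, 35, 40]  # seconds

mean = statistics.mean(completion_times)
stdev = statistics.stdev(completion_times)

# Sessions more than 2 standard deviations from the mean deserve a closer look
anomalies = [t for t in completion_times if abs(t - mean) > 2 * stdev]
```

An outlier like the 120-second session is not an insight by itself; it is a pointer back to the session recording or interview notes to find out why that user struggled.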

Integrating Usersnap into your toolkit revolutionizes how you collect and manage user feedback. This all-in-one platform enables seamless feedback collection through diverse features, catering to research needs and user scenarios. 

Centralize feedback for effortless management, making tracking, analyzing, and acting on user insights easier. Analyze feedback with powerful tools, drawing actionable insights that drive UX enhancements. 

By harnessing Usersnap, you enhance the user experience through iterative improvements, ensuring your product meets and exceeds user expectations. Embrace Usersnap to streamline your customer feedback collection process, elevating your product’s user experience.


And if you’re ready to try out a customer feedback software, Usersnap offers a free trial. Sign up today or book a demo with our feedback specialists.


  • Vitaly Friedman
  • May 7, 2024

How To Run UX Research Without Access To Users



UX research without users isn’t research. We can shape design ideas with bias, assumptions, guesstimates, and even synthetic users, but it’s anything but UX research. Yet some of us might find ourselves in situations where we literally don’t have access to users — because of legal constraints, high costs, or perhaps users just don’t exist yet. What do we do then?

Luckily, there are some workarounds that help us better understand pain points and issues that users might have when using our products. This holds true even when stakeholders can’t give us time or resources to run actual research, or strict NDAs or privacy regulations prevent us from speaking to users.

Let’s explore how we can make UX research work when there is no or only limited access to users — and what we can do to make a strong case for UX research.

This article is part of our ongoing series on design patterns. It’s also an upcoming part of the 10h-video library on Smart Interface Design Patterns 🍣 and the upcoming live UX training as well. Use code BIRDIE to save 15% off.

Find Colleagues Who Are The Closest To Your Customers

When you don’t have access to users, try to establish a connection with the colleagues who are closest to your customers. Connect with people in the organization who speak with customers regularly, especially people in sales, customer success, support, and QA. Ultimately, you could convey your questions indirectly via them.

As Paul Adams noted, there has never been more overlap between designers and salespeople than today. Since many products are subscription-based, sales teams need to maintain relationships with customers over time. This requires a profound understanding of user needs — and meeting these needs well over time to keep retention and increase loyalty.

That’s where research comes in — and that’s exactly where the overlap between UX and sales emerges. In fact, it’s not surprising to find UX researchers sitting within marketing teams under the guise of Customer Success teams. So whenever you can, befriend colleagues from sales and Customer Success.

Gaining Insights Without Direct Access To Users

If you can’t get users to come to you, perhaps you could go where they are. You could ask to silently observe and shadow them at their workplace. You could listen in to customer calls and interview call center staff to uncover pain points that users have when interacting with your product. Analytics, CRM reports, and call center logs are also a great opportunity to gain valuable insights, and Google Trends can help you find product-related search queries.

To learn more about potential issues and user frustrations, also turn to search logs, Jira backlogs, and support tickets. Study reviews, discussions, and comments for your or your competitor’s product, and take a look at TrustPilot and app stores to map key themes and user sentiment. Or get active yourself and recruit users via tools like UserTesting, Maze, or UserInterviews.

These techniques won’t always work, but they can help you get off the ground. Beware of drawing big conclusions from very little research, though. You need multiple sources to reduce the impact of assumptions and biases — at a very minimum, you need five users to discover patterns.
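
The "five users" minimum echoes Nielsen and Landauer's well-known problem-discovery model, in which the share of usability problems found by n users is 1 − (1 − L)^n, with L the average probability (~31% in their data) that one user exposes a given problem. This is a community rule of thumb, not a claim made in this article; a quick sketch:

```python
# Nielsen & Landauer's problem-discovery model (rule of thumb, not exact):
# proportion of usability problems found by n users = 1 - (1 - L)^n,
# where L ≈ 0.31 is the chance one user exposes a given problem.

def problems_found(n_users: int, l: float = 0.31) -> float:
    """Expected share of usability problems uncovered by n test users."""
    return 1 - (1 - l) ** n_users

if __name__ == "__main__":
    for n in (1, 3, 5, 10):
        print(f"{n:2d} users -> {problems_found(n):.0%} of problems")
```

Under these assumptions, five users surface roughly 84% of problems, which is why small, repeated studies beat one large one.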

Making A Strong Case For UX Research

Ironically, as H Locke noted, the stakeholders who can’t give you time or resources to talk to users are often the first to demand evidence to support your design work. Tap into that demand and explain what you need. Research doesn’t have to be time-consuming or expensive; ask for a small but steady commitment to gather evidence. Explain that you don’t need much to get started: 5 users × 30 minutes once a month might already be enough to make a positive change.

Sometimes, the reason why companies are reluctant to grant access to users is simply the lack of trust. They don’t want to disturb relationships with big clients, which are carefully maintained by the customer success team. They might feel that research is merely a technical detail that clients shouldn’t be bothered with.

Typically, if you work in B2B or enterprise, you won’t have direct access to users. This might be due to strict NDAs or privacy regulations, or perhaps the user group is very difficult to recruit (e.g., lawyers or doctors).

Show that you care about that relationship. Show the value that your work brings. Explain that design without research is merely guesswork and that designing without enough research is inherently flawed.

Once your impact becomes visible, it will be so much easier to gain access to users that seemed almost impossible initially.

Key Takeaways

  • Ask for reasons for no access to users: there might be none.
  • Find colleagues who are the closest to your customers.
  • Make friends with sales, customer success, support, and QA.
  • Convey your questions indirectly via your colleagues.
  • If you can’t get users to come to you, go where they are.
  • Ask to observe or shadow customers at their workplace.
  • Listen in to customer calls and interview call center staff.
  • Gather insights from search logs, Jira backlogs, and support tickets.
  • Map key themes and user sentiment on TrustPilot, app stores, etc.
  • Recruit users via UserTesting, Maze, UserInterviews, etc.
  • Ask for small but steady commitments: 5 users × 30 mins, once a month.
  • Avoid ad-hoc research: set up regular check-ins and timelines.

Useful Resources

  • The End Of Navel Gazing, by Paul Adams
  • UX Research Cheat Sheet, by Susan Farrell
  • Overcoming Limited Access To Users in UX Research, by Debbie Levitt
  • What Can You Do When You Have No Access To Users?, by H Locke
  • UX Research When You Can’t Talk To Users, by Chris Myhill
  • How To Conduct UX Research Without Users, by Mariia Kasym
  • User Research When You Can’t Talk to Your Users, by Jon Peterson

Meet Smart Interface Design Patterns

If you are interested in similar insights around UX, take a look at Smart Interface Design Patterns, our 10h-video course with 100s of practical examples from real-life projects — with a live UX training later this year. Everything from mega-dropdowns to complex enterprise tables — with 5 new segments added every year. Jump to a free preview.

100 design patterns & real-life examples. 10h-video course + live UX training. Free preview.


  • Open access
  • Published: 18 May 2024

Determinants of appropriate antibiotic and NSAID prescribing in unscheduled outpatient settings in the veterans health administration

  • Michael J. Ward,
  • Michael E. Matheny,
  • Melissa D. Rubenstein,
  • Kemberlee Bonnet,
  • Chloe Dagostino,
  • David G. Schlundt,
  • Shilo Anders,
  • Thomas Reese &
  • Amanda S. Mixon

BMC Health Services Research, volume 24, Article number: 640 (2024)


Despite efforts to enhance the quality of medication prescribing in outpatient settings, potentially inappropriate prescribing remains common, particularly in unscheduled settings where patients can present with infectious and pain-related complaints. Two of the most commonly prescribed medication classes in outpatient settings with frequent rates of potentially inappropriate prescribing include antibiotics and nonsteroidal anti-inflammatory drugs (NSAIDs). In the setting of persistent inappropriate prescribing, we sought to understand a diverse set of perspectives on the determinants of inappropriate prescribing of antibiotics and NSAIDs in the Veterans Health Administration.

We conducted a qualitative study guided by the Consolidated Framework for Implementation Research and the Theory of Planned Behavior. Semi-structured interviews were conducted with clinicians, stakeholders, and Veterans from March 1, 2021 through December 31, 2021 within the Veterans Affairs health system, in unscheduled outpatient settings at the Tennessee Valley Healthcare System. Stakeholders included clinical operations leadership and methodological experts. Audio-recorded interviews were transcribed and de-identified. Data coding and analysis were conducted by experienced qualitative methodologists adhering to the Consolidated Criteria for Reporting Qualitative Research guidelines. Analysis was conducted using an iterative inductive/deductive process.

We conducted semi-structured interviews with 66 participants: clinicians (N = 25), stakeholders (N = 24), and Veterans (N = 17). We identified six themes contributing to potentially inappropriate prescribing of antibiotics and NSAIDs: 1) perceived versus actual Veteran expectations about prescribing; 2) the influence of a time-pressured clinical environment on prescribing stewardship; 3) limited clinician knowledge, awareness, and willingness to use evidence-based care; 4) prescriber uncertainties about the Veteran’s condition at the time of the clinical encounter; 5) limited communication; and 6) technology barriers of the electronic health record and patient portal.

Conclusions

The diverse perspectives on prescribing underscore the need for interventions that recognize the detrimental impact of high workload on prescribing stewardship and the need to design interventions with the end-user in mind. This study revealed actionable themes that could be addressed to improve guideline concordant prescribing to enhance the quality of prescribing and to reduce patient harm.


Adverse drug events (ADEs) are the most common iatrogenic injury [1]. Efforts to reduce these events have primarily focused on the inpatient setting. However, the emergency department (ED), urgent care, and urgent primary care clinics are desirable targets for interventions to reduce ADEs because approximately 70% of all outpatient encounters occur in one of these settings [2]. Two of the most commonly prescribed drug classes during acute outpatient care visits that have frequent rates of potentially inappropriate prescribing include antibiotics and non-steroidal anti-inflammatory drugs (NSAIDs) [3, 4].

An estimated 30% of all outpatient oral antibiotic prescriptions may be unnecessary [5, 6]. The World Health Organization identified overuse of antibiotics and its resulting antimicrobial resistance as a global threat [7]. The Centers for Disease Control and Prevention (CDC) conservatively estimates that in the US there are nearly 3 million antibiotic-resistant infections that cause 48,000 deaths annually [8]. Antibiotics were the second most common source of adverse events, with nearly one ADE resulting in an ED visit for every 100 prescriptions [9]. Inappropriate antibiotic prescriptions (e.g., an antibiotic prescription for a viral infection) also contribute to resistance and iatrogenic infections such as C. difficile (antibiotic-associated diarrhea) and methicillin-resistant Staphylococcus aureus (MRSA) [8]. NSAID prescriptions, on the other hand, result in an ADE at more than twice the rate of antibiotics (2.2%) [10], are prescribed to patients at an already increased risk of potential ADEs [4, 11], and frequently interact with other medications [12]. Inappropriate NSAID prescriptions contribute to serious gastrointestinal [13], renal [14], and cardiovascular [15, 16] ADEs such as gastrointestinal bleeding, acute kidney injury, and myocardial infarction or heart failure, respectively. Yet the use of NSAIDs is ubiquitous; according to the CDC, between 2011 and 2014, 5% of the US population were prescribed an NSAID, while an additional 2% took NSAIDs over the counter [11].

Interventions to reduce inappropriate antibiotic prescribing commonly take the form of antimicrobial stewardship programs. However, no such national programs exist for NSAIDs, particularly in acute outpatient care settings. There is a substantial body of evidence supporting the effectiveness of such stewardship programs [17]. The CDC recognizes that outpatient programs should consist of four core elements of antimicrobial stewardship [18]: commitment, action for policy and practice, tracking and reporting, and education and expertise. The opportunities to extend antimicrobial stewardship in EDs are vast. Despite this effectiveness, there is a recognized need to understand which implementation strategies work and how to implement multifaceted interventions [19]. Given the unique time-pressured environment of acute outpatient care settings, not all antimicrobial stewardship strategies work in these settings, necessitating the development of approaches tailored to these environments [19, 20].

One particularly vulnerable population is within the Veterans Health Administration. With more than 9 million enrollees, Veterans who receive care in Veterans Affairs (VA) hospitals and outpatient clinics may be particularly vulnerable to ADEs. Older Veterans have greater medical needs than younger patients, given their concomitant medical and mental health conditions as well as cognitive and social issues. Among Veterans seen in VA EDs and Urgent Care Clinics (UCCs), 50% are age 65 and older [21], nearly three times the rate of non-VA emergency care settings (18%) [22]. Inappropriate prescribing in ED and UCC settings is problematic, with inappropriate antibiotic prescribing estimated to be higher than 40% [23]. In a sample of older Veterans discharged from VA ED and UCC settings, NSAIDs were implicated in 77% of drug interactions [24].

Learning from antimicrobial stewardship programs and applying those lessons to a broader base of prescribing in acute outpatient care settings requires understanding not only why potentially inappropriate prescribing remains a problem for antibiotics, but also why it persists for medications (e.g., NSAIDs) that have previously received little stewardship focus. This understanding is essential to develop and implement interventions to reduce iatrogenic harm for vulnerable patients seen in unscheduled settings. In the setting of the Veterans Health Administration, we used these two drug classes (antibiotics and NSAIDs), which have frequent rates of inappropriate prescribing in unscheduled outpatient care settings, to understand a diverse set of perspectives on why potentially inappropriate prescribing continues to occur.

Selection of participants

Participants were recruited from three groups in outpatient settings representing emergency care, urgent care, and urgent primary care in the VA: 1) clinicians: VA physicians, advanced practice providers, and pharmacists; 2) stakeholders: VA and non-VA clinical operational and clinical content experts, such as local and regional medical directors and national clinical, research, and administrative leadership in emergency care, primary care, and pharmacy, including geriatrics; and 3) Veterans seeking unscheduled care for infectious or pain symptoms.

Clinicians and stakeholders were recruited using email, informational flyers, faculty/staff meetings, national conferences, and snowball sampling, in which existing participants identify additional potential research subjects for recruitment [25]. Snowball sampling is useful for identifying and recruiting participants who may not be readily apparent to investigators and/or are hard to reach. Clinician inclusion criteria consisted of: 1) at least 1 year of VA experience; and 2) ≥ 1 clinical shift in the last 30 days at any VA ED, urgent care, or primary care setting in which unscheduled visits occur. Veterans were recruited in person at the VA by key study personnel. Inclusion criteria consisted of: 1) clinically stable, as determined by the treating clinician; 2) 18 years or older; and 3) seeking care for infectious or pain symptoms in the local VA Tennessee Valley Healthcare System (TVHS). TVHS includes an ED at the Nashville campus with over 30,000 annual visits, an urgent care clinic in Murfreesboro, TN with approximately 15,000 annual visits, and multiple primary care locations throughout the middle Tennessee region. This study was approved by the VA TVHS Institutional Review Board as minimal risk.

Data collection

Semi-structured interview guides (Supplemental Table 1) were developed using the Consolidated Framework for Implementation Research (CFIR) [26] and the Theory of Planned Behavior [27, 28] to understand attitudes and beliefs as they relate to behaviors, as well as potential determinants of a future intervention. Interview guides were modified and finalized by conducting pilot interviews with three members of each participant group. Interview guides were tailored to each group of respondents and consisted of questions relating to: 1) determinants of potentially inappropriate prescribing; and 2) integration into practice (Table 1). Clinicians were also asked about knowledge and awareness of evidence-based prescribing practices for antibiotics and NSAIDs. The interviewer asked follow-up questions to elicit clarity and detail in responses.

Each interview was conducted by a trained interviewer (MDR). Veteran interviews were conducted in person while Veterans waited for clinical care so as not to disrupt clinical operations. Interviews with clinicians and stakeholders were conducted virtually. All interviews (including in-person interviews) were recorded and transcribed in a manner compliant with VA information security policies using Microsoft Teams (Redmond, WA). The audio-recorded interviews were transcribed and de-identified by a transcriptionist and stored securely behind the VA firewall using Microsoft Teams. Study personnel maintained a recording log on a password-protected server, and each participant was assigned a unique participant ID number. Once 15 interviews were conducted per group, we planned to review them with the study team to discuss content and findings and to decide collectively when thematic saturation was achieved, the point at which no new information is obtained [29]. If saturation was not achieved, we planned to conduct at least 2 additional interviews prior to the next group review. We estimated that approximately 20–25 interviews per group would be needed to achieve thematic saturation.

Qualitative data coding and analysis was managed by the Vanderbilt University Qualitative Research Core. A hierarchical coding system (Supplemental Table 2) was developed and refined using an iterative inductive/deductive approach [ 30 , 31 , 32 ] guided by a combination of: 1) Consolidated Framework for Implementation Research (CFIR) [ 26 ]; 2) the Theory of Planned Behavior [ 27 , 28 ]; 3) interview guide questions; and 4) a preliminary review of the transcripts. Eighteen major categories (Supplemental Table 3) were identified and were further divided into subcategories, with some subcategories having additional levels of hierarchical division. Definitions and rules were written for the use of each of the coding categories. The process was iterative in that the coding system was both theoretically informed and derived from the qualitative data. The coding system was finalized after it was piloted by the coders. Data coding and analysis met the Consolidated Criteria for Reporting Qualitative Research (COREQ) guidelines. [ 33 ]

Four experienced qualitative coders were trained by independently coding two transcripts from each of the three participant categories. Coding was then compared, and any discrepancies were resolved by reconciliation. After establishing reliability with the coding system, the coders divided the remaining transcripts and coded them independently in sequential order. Each statement was treated as a separate quote and could be assigned up to 21 different codes. Coded transcripts were combined and sorted by code.

Following thematic saturation, the frequency of each code was calculated to understand the distribution of quotes. Quotes coded as barriers were then cross-referenced to identify potential determinants of inappropriate prescribing. A thematic analysis of the barriers was conducted and presented iteratively to the research team of qualitative methodologists and clinicians to explore nuances and refine the themes and subthemes from the coded transcripts. Transcripts, quotations, and codes were managed using Microsoft Excel and SPSS version 28.0.
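As a minimal sketch of the frequency calculation and barrier cross-referencing steps (the quote IDs and code names below are hypothetical; the actual analysis was performed in Microsoft Excel and SPSS):

```python
from collections import Counter

# Hypothetical coded quotes: (quote_id, assigned codes). Each statement could
# carry multiple codes; "barrier" marks a quote coded as a barrier.
coded_quotes = [
    ("Q1", {"patient_expectations", "barrier"}),
    ("Q2", {"time_pressure", "barrier"}),
    ("Q3", {"patient_expectations"}),
    ("Q4", {"time_pressure", "technology", "barrier"}),
]

# Frequency of each code across all quotes (the distribution of quotes)
code_counts = Counter(code for _, codes in coded_quotes for code in codes)

# Cross-reference: quotes coded as barriers, collected for thematic analysis
barrier_quotes = [qid for qid, codes in coded_quotes if "barrier" in codes]

print(code_counts["time_pressure"])  # 2
print(barrier_quotes)                # ['Q1', 'Q2', 'Q4']
```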

We approached 132 individuals, and 66 (50%) agreed to be interviewed. Participants included 25 clinicians, 24 stakeholders, and 17 Veterans, whose demographic characteristics are presented in Table 2. The clinicians, drawn from 14 VA facilities throughout the US, comprised 20 physicians and five advanced practice providers; 21 (84%) worked in either an ED or urgent care, while the remainder practiced in primary care. The 24 stakeholders included 13 (54%) clinical service chiefs or deputy chiefs (including medical directors), five (21%) national directors, and six (25%) experts in clinical content and methodology. The 17 Veterans interviewed included 15 (88%) who were seen for pain complaints.
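The reported proportions follow directly from the counts; as a quick arithmetic check (rounding to whole percentages):

```python
def pct(part: int, whole: int) -> int:
    """Percentage rounded to the nearest whole number."""
    return round(100 * part / whole)

# Counts as reported in the results
assert pct(66, 132) == 50   # response rate
assert pct(21, 25) == 84    # clinicians in ED or urgent care
assert pct(13, 24) == 54    # service chiefs or deputy chiefs
assert pct(5, 24) == 21     # national directors
assert pct(6, 24) == 25     # clinical content and methodology experts
assert pct(15, 17) == 88    # Veterans seen for pain complaints
```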

Results are organized by six thematic categories, each with several subthemes. Themes and subthemes are presented in Table 3 and visually represented in Fig. 1. The six themes were: 1) perceived versus actual Veteran expectations about prescribing; 2) the influence of a time-pressured clinical environment on prescribing stewardship; 3) limited clinician knowledge, awareness, and willingness to use evidence-based care; 4) uncertainties about the Veteran’s condition at the time of the clinical encounter; 5) limited communication; and 6) technology barriers.

Fig. 1 Visual representation of themes and subthemes from 66 clinician, stakeholder, and Veteran interviews

Theme 1: Perception that Veterans routinely expect a medication from their visit, despite clinical inappropriateness

According to clinicians, Veterans frequently expect to receive a prescription even when this decision conflicts with good clinical practice.

Certainly lots of people would say you know if you feel like you’re up against some strong expectations from the patients or caregivers or families around the utility of an antibiotic when it’s probably not indicated…In the emergency department the bias is to act and assume the worst and assume like the worst for the clinical trajectory for the patient rather than the reverse. [Clinician 49, Physician, ED]

Stakeholders further stated that patient prescription expectations are quite influential and are likely shaped by Veterans’ prior experiences.

I think the patients, particularly for antibiotics, have strong feelings about whether they should or shouldn’t get something prescribed. [Stakeholder 34]

You know I think the biggest challenge, I think, is adjusting patients’ expectations because you know they got better the last time they were doing an antibiotic. [Stakeholder 64]

Patient satisfaction and clinician workload may also influence the clinician’s prescription decision.

We have a lot of patients that come in with back pain or knee pain or something. We’ll get an x-ray and see there’s nothing actually wrong physically that can be identified on x-ray at least and you have to do something. Otherwise, patient satisfaction will dip, and patients leave angry. [Clinician 28, Physician, urgent care clinic]

For some clinicians it’s just easier to prescribe an antibiotic when they know that’s the patient’s expectation and it shortens their in-room discussion and evaluation. [Clinician 55, Physician, ED]

Despite clinician perception, Veterans communicated that they did not necessarily expect a prescription and were instead focused on the clinical interaction and the clinician’s decision.

I’m not sure if they’ll give me [unintelligible] a prescription or what they’ll do. I don’t care as long as they stop the pain. [Patient 40, urgent care clinic]

I don’t expect to [receive a prescription], but I mean whatever the doctor finds is wrong with me I will follow what he says. [Patient 31, ED]

Theme 2: Hectic clinical environments and unique practice conditions in unscheduled settings provide little time to focus on prescribing practices

Clinicians and stakeholders reported that the time-constrained clinical environment and need to move onto the next patient were major challenges to prescribing stewardship.

The number one reason is to get a patient out of your office or exam bay and move on to the next one. [Stakeholder 28]

It takes a lot of time and you have to be very patient and understanding. So, you end up having to put a fair bit of emotional investment and intelligence into an encounter to not prescribe. [Stakeholder 1]

Stakeholders also noted that unique shift conditions and clinician perceptions that their patients were “different” might influence prescribing practices.

A common pushback was ‘well my patients are different.’ [Stakeholder 4]

Providers who worked different types of shifts, so if you happened to work on a Monday when the clinics were open and had more adults from the clinics you were more likely to prescribe antibiotics than if you worked over night and had fewer patients. Providers who worked primarily holidays or your Friday prescribing pattern may be very different if you could get them into a primary care provider the next day. [Stakeholder 22]

Clinicians also reported that historical practices in the clinical environment may contribute to inappropriate prescribing.

I came from working in the [outpatient] Clinic as a new grad and they’re very strict about prescribing only according to evidence-based practice. And then when I came here things are with other colleagues are a little more loose with that type of thing. It can be difficult because you start to adopt that practice to. [Clinician 61, Nurse Practitioner, ED]

Theme 3: Clinician knowledge, awareness, and willingness to use evidence-based care

Stakeholders felt that clinicians lacked knowledge about prescribing NSAIDs and antibiotics.

Sometimes errors are a lack of knowledge or awareness of the need to maybe specifically dose for let’s say impaired kidney function or awareness of current up to date current antibiotic resistance patterns in the location that might inform a more tailored antibiotic choice for a given condition. [Stakeholder 37]

NSAIDs are very commonly used in the emergency department for patients of all ages…the ED clinician is simply not being aware that for specific populations this is not recommended and again just doing routine practice for patients of all ages and not realizing that for older patients you actually probably should not be using NSAIDs. [Stakeholder 40]

Some clinicians may be unwilling to change their prescribing practices due to outright resistance, entrenched habits, or lack of interest in doing so.

It sounds silly but there’s always some opposition to people being mandated to do something. But there are some people who would look and go ‘okay we already have a handle on that so why do we need something else? I know who prescribes inappropriately and who doesn’t. Is this a requirement, am I evaluated on it? That would come from supervisors. Is this one more thing on my annual review?’ [Stakeholder 28]

If people have entrenched habits that are difficult to change and are physicians are very individualistic people who think that they are right more often than the non-physician because of their expensive training and perception of professionalism. [Stakeholder 4]

Theme 4: Uncertainty about whether an adverse event will occur

Clinicians cited the challenge of understanding the entirety of a Veteran’s condition, potential drug-drug interactions, and existing comorbidities when judging whether an NSAID prescription may result in an adverse event.

It’s oftentimes a judgement call if someone has renal function that’s right at the precipice of being too poor to merit getting NSAIDs that may potentially cause issues. [Clinician 43, Physician, inpatient and urgent care]

It depends on what the harm is. So, for instance, you can’t always predict allergic reactions. Harm from the non-steroidals would be more if you didn’t pre-identify risk factors for harm. So, they have ulcer disease, they have kidney problems where a non-steroidal would not be appropriate for that patient. Or potential for a drug-drug interaction between that non-steroid and another medication in particular. [Clinician 16, Physician, ED]

Rather than adverse events resulting from the medication itself, stakeholders identified the uncertainty that clinicians experience about whether a Veteran may suffer an adverse event from an infection if nothing is done. This uncertainty contributes to the prescription of an antibiotic.

My experience in working with providers at the VA over the years is that they worry more about the consequences of not treating an infection than about the consequences of the antibiotic itself. [Stakeholder 19]

Sometimes folks like to practice conservatively and they’ll say even though I didn’t really see any hard evidence of a bacterial infection, the patient’s older and sicker and they didn’t want to risk it. [Stakeholder 16]

Theme 5: Limited communication during and after the clinical encounter

The role and type of communication about prescribing depended upon the respondent. Clinicians identified inadequate communication and coordination with the Veteran’s primary care physician during the clinical encounter.

I would like to have a little more communication with the primary doctors. They don’t seem to be super interested in talking to anyone in the emergency room about their patients… A lot of times you don’t get an answer from the primary doctor or you get I’m busy in clinic. You can just pick something or just do what you think is right. [Clinician 25, Physician, ED]

Alternatively, stakeholders identified post-encounter patient outcome and clinical performance feedback as potential barriers.

Physicians tend to think that they are doing their best for every individual patient and without getting patient by patient feedback there is a strong cognitive bias to think well there must have been some exception and reason that I did it in this setting. [Stakeholder 34]

It’s really more their own awareness of like their clinical performance and how they’re doing. [Stakeholder 40]

Veterans, however, prioritized communication during the clinical encounter. They expressed the need for clear and informative communication with the clinician: a rationale for the medication choice, medication-specific details, and an opportunity to ask questions.

I expect him to tell me why I’m taking it, what it should do, and probably the side effects. [Patient 25, ED]

I’d like to have a better description of how to take it because I won’t remember all the time and sometimes what they put on the bottle is not quite as clear. [Patient 22, ED]

Veterans reported their desire for a simple way to learn about medication information. They provided feedback on the current approaches to educational materials about prescriptions.

Probably most pamphlets that people get they’re not going to pay attention to them. Websites can be overwhelming. [Patient 3, ED]

Posters can be offsetting. If you’re sick, you’re not going to read them…if you’re sick you may glance at that poster and disregard it. So, you’re not really going to see it but if you give them something in the hand people will tend to look at it because it’s in their hand. [Patient 19, ED]

It would be nice if labels or something just told me what I needed to know. You know take this exactly when and reminds me here’s why you’re taking it for and just real clear and not small letters. [Patient 7, ED]

Theme 6: Technology barriers limited the usefulness of clinical decision support for order checking and patient communication tools

Following the decision to prescribe a medication, clinicians complained that electronic health record pop-ups with clinical decision support warnings for potential safety concerns (e.g., drug-drug interactions) were both excessive and not useful in a busy clinical environment.

The more the pop ups, the more they get ignored. So, it’s finding that sweet spot right where you’re not constantly having to click out of something because you’re so busy. Particularly in our clinical setting where we have very limited amount of time to read the little monograph. Most of the time you click ‘no’ and off you go. [Clinician 16, Physician, ED]

Some of these mechanisms like the EMR [electronic medical record] or pop-up decision-making windows really limit your time. If you know the guidelines appropriately and doing the right thing, even if you’re doing the right thing it takes you a long time to get through something. [Clinician 19, Physician, Primary care clinic]

Building on Theme 5 about patient communication, patients reported that using the VA patient portal (MyHealtheVet) for post-encounter communication with their primary care physician and for reviewing prescribed medications was challenging.

I’ve got to get help to get onto MyHealtheVet but I would probably like to try and use that, but I haven’t been on it in quite some time. [Patient 22, ED]

I tried it [MyHealtheVet] once and it’s just too complicated so I’m not going to deal with it. [Patient 37, Urgent care]

This work examined attitudes and perceptions of barriers to appropriate prescribing of antibiotics and NSAIDs in unscheduled outpatient care settings in the Veterans Health Administration. Expanding on prior qualitative work on antimicrobial stewardship programs, we also examined NSAID prescribing, a medication class that has received little stewardship attention. This work seeks to advance understanding of the fundamental problems underlying prescribing stewardship in order to facilitate interventions that improve not only the decision to prescribe antibiotics and NSAIDs but also the safety checks once a decision to prescribe is made. Specifically, we identified six themes during these interviews: perceived versus actual Veteran expectations about prescribing; the influence of a time-pressured clinical environment on prescribing stewardship; limited clinician knowledge, awareness, and willingness to use evidence-based care; uncertainties about the Veteran’s condition at the time of the clinical encounter; limited communication; and technology barriers.

Sensitive to patient expectations, clinicians believed that Veterans would be dissatisfied if they did not receive an antibiotic prescription, [ 34 ] even though most patients presenting to the ED for upper respiratory tract infections do not expect antibiotics. [ 35 ] However, recent work by Staub et al. found that among patients with respiratory tract infections, receipt of an antibiotic was not independently associated with improved satisfaction. [ 36 ] Instead, receipt of antibiotics had to match the patient’s expectations to affect satisfaction, and the authors recommended that clinicians communicate with their patients about prescribing expectations. This finding complements our results, and communication about expectations is similarly important for NSAID prescribing.

A commitment to stewardship and modification of clinician behavior may be compromised by the time-pressured clinical environment, numerous potential drug interactions, comorbidities of a vulnerable Veteran population, and normative practices. The decision to prescribe medications such as antibiotics is complex and may be influenced by both clinical and non-clinical factors. [ 34 , 37 , 38 ] ED crowding, which occurs when the demand for services exceeds a system’s ability to provide care, [ 39 ] is a well-recognized manifestation of a chaotic clinical environment and is associated with detrimental effects on the hospital system and patient outcomes. [ 40 , 41 ] Congestion and wait times are unlikely to improve, as the COVID-19 pandemic has exacerbated the existing crowding and boarding crisis in EDs. [ 42 , 43 ]

Another theme was uncertainty in anticipating adverse events, exacerbated by the lack of a feedback loop. Feedback on clinical care processes and patient outcomes is uncommonly provided in emergency care settings, [ 44 ] yet may provide an opportunity to change clinician behavior, particularly for antimicrobial stewardship. [ 45 ] However, the frequent use of ineffective feedback strategies [ 46 ] compromises the ability to implement effective feedback interventions; feedback must be specific [ 47 ] and address the Intention-to-Action gap [ 48 ] by including co-interventions that address recipient characteristics (i.e., beliefs and capabilities) and context to maximize impact. Without these, feedback may be ineffective.

An additional barrier identified from this work is the limited communication with primary care following discharge. A 2017 National Quality Forum report on ED care transitions [ 49 ] recommended that EDs and their supporting hospital systems expand infrastructure and enhance health information technology to support care transitions, as Veterans may not understand discharge instructions, may not receive post-ED or urgent care, [ 50 , 51 , 52 ] or may not receive a newly prescribed medication. [ 24 ] While there are existing mechanisms to communicate between the ED and primary care teams, such as notifications when a Veteran presents to the ED and when an emergency clinician copies a primary care physician on a note, these mechanisms are insufficient to address care transition gaps and vary in best-practice use. To address this variability, the VA ED PACT Tool was developed using best practices (standardized processes, "closed-loop" communication, embedding into workflow) to facilitate and standardize communication between VA EDs and follow-up care clinicians. [ 53 ] While the ED PACT Tool is implemented at the Greater Los Angeles VA and can create a care coordination order upon ED discharge, its use is not yet widely adopted throughout the VA.

In the final theme about technology barriers, once the decision has been made to prescribe a medication, the electronic tools that are key components of existing stewardship interventions designed to curtail potentially inappropriate prescriptions may be compromised by their lack of usability. For example, clinician and stakeholder respondents described how usability concerns with tools such as EHR clinical decision support were exacerbated in a time-pressured clinical environment. Clinical decision support is an effective tool to improve healthcare process measures in a diverse group of clinical environments; [ 54 ] however, usability remains a barrier when alerts must be frequently overridden. [ 55 , 56 ] Alert fatigue, as expressed in our interviews about order checking and recognized within the VA’s EHR, [ 57 , 58 ] may contribute to excessive overrides that reduce the benefit of clinical decision support. [ 56 , 59 ] Notably, there was little discussion about the decision to initiate appropriate prescriptions, which is a key action of the CDC’s outpatient antibiotic stewardship campaign. [ 18 ] Thus, a potentially more effective, albeit challenging, approach is to “nudge” clinicians away from the initial decision to prescribe inappropriately (e.g., antibiotic prescribing for viral upper respiratory tract infections) with default order sets for symptom management, or to enhance prescription decisions through reminders about potential contraindications for specific indications (e.g., high-risk comorbidities). Beyond EHR-based solutions that might change clinician behavior, the CDC’s outpatient antibiotic stewardship program provides a framework to change the normative practices around inappropriate prescribing and includes a commitment to appropriate prescribing, action for policy and change, tracking and reporting, and education and expertise. [ 18 ]

Patients face another technical barrier in patient-facing electronic tools such as the VA’s MyHealtheVet portal, which was developed to enhance patient communication following care transitions and to allow Veterans to review their medications and communicate with their primary care clinical team. Patient portals can be an effective tool for medication adherence [ 60 ] and offer promise for patient education [ 61 ] following a clinical encounter. However, they are similarly limited by usability concerns, which represent an adoption barrier to broader Veteran use after unscheduled outpatient care visits, [ 62 ] particularly in an older patient population.

These interviews further underscored that the lack of usability of clinical decision support for order checking arises from ineffective design and is a key barrier preventing health information technology from reaching its promise of improving patient safety. [ 63 ] A common and recognized reason for these design challenges is the failure to place the user (i.e., the acute care clinician) at the center of the design process, resulting in underutilization, workarounds, [ 64 ] and unintended consequences, [ 65 ] all of which diminish patient safety practices and fail to change clinician behavior (i.e., prescribing). Complex adaptive systems work best when the relative strengths of humans (e.g., context sensitivity, situation specificity) are properly integrated with the information processing power of computerized systems. [ 66 ] Integrating user-centered design into technology development is one approach to addressing these usability concerns, offering an opportunity to build more clinician- and patient-centric systems of care and to advance prescribing stewardship interventions that previously lacked broad adoption. As antimicrobial and other prescribing stewardship efforts focus on time-pressured environments where usability is essential to adoption, a user-centered design approach, applied both to the development of electronic tools and to the other identified prescribing barriers, represents a promising way to enhance the quality of prescribing.

Limitations

The study findings should be considered in light of several limitations. First, the setting for this work was the Veterans Health Administration, the largest integrated health system in the US, and we focused on the stewardship of only two drug classes; our findings may not generalize to other settings or to the many other drug classes prescribed in these settings. Second, while clinician and stakeholder perspectives included diverse, national representation, the Veterans interviewed were local to the Tennessee Valley Healthcare System. Given the concurrent COVID-19 pandemic at the time of enrollment, most Veterans were seen for pain-related complaints and only two for infection-related complaints, although we also asked them about antibiotic prescribing. Clinician and stakeholder narratives may not completely reflect their practice patterns, as their responses could be influenced by social desirability bias. Third, responses may be subject to recall bias, which may have influenced the data collected. Finally, the themes and subthemes identified may overlap and interact; while we used an iterative process to identify discrete themes and subthemes, prescription decisions represent a complex decision process influenced by numerous patient and contextual factors and may not be completely independent.

Despite numerous interventions to improve the quality of prescribing, appropriate prescription of antibiotics and NSAIDs in unscheduled outpatient care settings remains a challenge. In the Veterans Health Administration, this study found that challenges to high-quality prescribing include perceived Veteran expectations about receipt of medications; a hectic clinical environment that deprioritizes stewardship; limited clinician knowledge, awareness, and willingness to use evidence-based care; uncertainty about the potential for adverse events; limited communication; and technology barriers. These findings suggest that interventions should account for the detrimental impact of high workload on prescribing stewardship, attend to clinician workflow and the initial decision to prescribe, and incorporate end-users into the intervention design process. Doing so is a promising approach to increasing the adoption of high-quality prescribing practices and improving the quality of, and patient outcomes from, NSAID and antibiotic prescribing.

Availability of data and materials

De-identified datasets used and/or analysed during the current study will be made available from the corresponding author on reasonable request.

Leape LL, Brennan TA, Laird N, et al. The nature of adverse events in hospitalized patients. Results of the Harvard Medical Practice Study II. N Engl J Med. 1991;324(6):377–384.


Pitts SR, Carrier ER, Rich EC, Kellermann AL. Where Americans get acute care: increasingly, it’s not at their doctor’s office. Health Aff (Millwood). 2010;29(9):1620–9.


Palms DL, Hicks LA, Bartoces M, et al. Comparison of antibiotic prescribing in retail clinics, urgent care centers, emergency departments, and traditional ambulatory care settings in the United States. Jama Intern Med. 2018;178(9):1267–9.


Davis JS, Lee HY, Kim J, et al. Use of non-steroidal anti-inflammatory drugs in US adults: changes over time and by demographic. Open Heart. 2017;4(1):e000550.

Fleming-Dutra KE, Hersh AL, Shapiro DJ, et al. Prevalence of inappropriate antibiotic prescriptions among US ambulatory care visits, 2010–2011. JAMA. 2016;315(17):1864–73.

Shively NR, Buehrle DJ, Clancy CJ, Decker BK. Prevalence of Inappropriate Antibiotic Prescribing in Primary Care Clinics within a Veterans Affairs Health Care System. Antimicrob Agents Chemother. 2018;62(8):e00337–18. https://doi.org/10.1128/AAC.00337-18 .  https://pubmed.ncbi.nlm.nih.gov/29967028/ .

World Health Organization. Global antimicrobial resistance and use surveillance system (GLASS) report: 2022. 2022.

Centers for Disease Control and Prevention. COVID-19: U.S. Impact on Antimicrobial Resistance, Special Report 2022. Atlanta: U.S. Department of Health and Human Services, CDC; 2022.


Shehab N, Lovegrove MC, Geller AI, Rose KO, Weidle NJ, Budnitz DS. US emergency department visits for outpatient adverse drug events, 2013–2014. JAMA. 2016;316(20):2115–25.

Fassio V, Aspinall SL, Zhao X, et al. Trends in opioid and nonsteroidal anti-inflammatory use and adverse events. Am J Manag Care. 2018;24(3):e61–72.


Centers for Disease Control and Prevention. Chronic Kidney Disease Surveillance System—United States. http://www.cdc.gov/ckd . Accessed 21 March 2023.

Cahir C, Fahey T, Teeling M, Teljeur C, Feely J, Bennett K. Potentially inappropriate prescribing and cost outcomes for older people: a national population study. Br J Clin Pharmacol. 2010;69(5):543–52.

Gabriel SE, Jaakkimainen L, Bombardier C. Risk for Serious Gastrointestinal Complications Related to Use of Nonsteroidal Antiinflammatory Drugs - a Metaanalysis. Ann Intern Med. 1991;115(10):787–96.

Zhang X, Donnan PT, Bell S, Guthrie B. Non-steroidal anti-inflammatory drug induced acute kidney injury in the community dwelling general population and people with chronic kidney disease: systematic review and meta-analysis. BMC Nephrol. 2017;18(1):256.

McGettigan P, Henry D. Cardiovascular risk with non-steroidal anti-inflammatory drugs: systematic review of population-based controlled observational studies. PLoS Med. 2011;8(9): e1001098.


Holt A, Strange JE, Nouhravesh N, et al. Heart Failure Following Anti-Inflammatory Medications in Patients With Type 2 Diabetes Mellitus. J Am Coll Cardiol. 2023;81(15):1459–70.

Davey P, Marwick CA, Scott CL, et al. Interventions to improve antibiotic prescribing practices for hospital inpatients. Cochrane Database Syst Rev. 2017;2(2):CD003543.

Sanchez GV, Fleming-Dutra KE, Roberts RM, Hicks LA. Core Elements of Outpatient Antibiotic Stewardship. MMWR Recomm Rep. 2016;65(6):1–12.

May L, Martin Quiros A, Ten Oever J, Hoogerwerf J, Schoffelen T, Schouten J. Antimicrobial stewardship in the emergency department: characteristics and evidence for effectiveness of interventions. Clin Microbiol Infect. 2021;27(2):204–9.

May L, Cosgrove S, L'Archeveque M, et al. A call to action for antimicrobial stewardship in the emergency department: approaches and strategies. Ann Emerg Med. 2013;62(1):69–77 e62.

Veterans Health Administration Emergency Medicine Management Tool. EDIS GeriatricsAgeReport v3.

Cairns C KK, Santo L. National Hospital Ambulatory Medical Care Survey: 2020 emergency department summary tables. NHAMCS Factsheets - EDs Web site. https://www.cdc.gov/nchs/data/nhamcs/web_tables/2020-nhamcs-ed-web-tables-508.pdf . Accessed 20 Dec 2022.

Lowery JL, Alexander B, Nair R, Heintz BH, Livorsi DJ. Evaluation of antibiotic prescribing in emergency departments and urgent care centers across the Veterans’ Health Administration. Infect Control Hosp Epidemiol. 2021;42(6):694–701.

Hastings SN, Sloane RJ, Goldberg KC, Oddone EZ, Schmader KE. The quality of pharmacotherapy in older veterans discharged from the emergency department or urgent care clinic. J Am Geriatr Soc. 2007;55(9):1339–48.

Goodman LA. Snowball sampling. Ann Math Stat. 1961;32(1):148–70.

Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009;4:50.

Ajzen I. The theory of planned behavior. Organ Behav Hum Decis Process. 1991;50(2):179–211.

Ajzen I. The theory of planned behaviour: reactions and reflections. Psychol Health. 2011;26(9):1113–27. https://doi.org/10.1080/08870446.2011.613995.

Morse JM. The significance of saturation. Qual Health Res. 1995;5(2):147–9.

Azungah T. Qualitative research: deductive and inductive approaches to data analysis. Qual Res J. 2018;18(4):383–400.

Tjora A. Qualitative research as stepwise-deductive induction. Routledge; 2018. https://www.routledge.com/Qualitative-Research-as-Stepwise-Deductive-Induction/Tjora/p/book/9781138304499.

Fereday J, Muir-Cochrane E. Demonstrating rigor using thematic analysis: A hybrid approach of inductive and deductive coding and theme development. Int J Qual Methods. 2006;5(1):80–92.

Tong A, Sainsbury P, Craig J. Consolidated criteria for reporting qualitative research (COREQ): a 32-item checklist for interviews and focus groups. Int J Qual Health Care. 2007;19(6):349–57.

Patel A, Pfoh ER, Misra Hebert AD, et al. Attitudes of High Versus Low Antibiotic Prescribers in the Management of Upper Respiratory Tract Infections: a Mixed Methods Study. J Gen Intern Med. 2020;35(4):1182–8.

May L, Gudger G, Armstrong P, et al. Multisite exploration of clinical decision making for antibiotic use by emergency medicine providers using quantitative and qualitative methods. Infect Control Hosp Epidemiol. 2014;35(9):1114–25.

Staub MB, Pellegrino R, Gettler E, et al. Association of antibiotics with veteran visit satisfaction and antibiotic expectations for upper respiratory tract infections. Antimicrob Steward Healthc Epidemiol. 2022;2(1):e100.

Schroeck JL, Ruh CA, Sellick JA Jr, Ott MC, Mattappallil A, Mergenhagen KA. Factors associated with antibiotic misuse in outpatient treatment for upper respiratory tract infections. Antimicrob Agents Chemother. 2015;59(7):3848–52.

Hruza HR, Velasquez T, Madaras-Kelly KJ, Fleming-Dutra KE, Samore MH, Butler JM. Evaluation of clinicians’ knowledge, attitudes, and planned behaviors related to an intervention to improve acute respiratory infection management. Infect Control Hosp Epidemiol. 2020;41(6):672–9.

American College of Emergency Physicians Policy Statement. Crowding. https://www.acep.org/globalassets/new-pdfs/policy-statements/crowding.pdf. Published 2019. Accessed 11 Oct 2023.

Bernstein SL, Aronsky D, Duseja R, et al. The effect of emergency department crowding on clinically oriented outcomes. Acad Emerg Med. 2009;16(1):1–10.

Rasouli HR, Esfahani AA, Nobakht M, et al. Outcomes of crowding in emergency departments; a systematic review. Arch Acad Emerg Med. 2019;7(1):e52.

Janke AT, Melnick ER, Venkatesh AK. Monthly Rates of Patients Who Left Before Accessing Care in US Emergency Departments, 2017–2021. JAMA Netw Open. 2022;5(9):e2233708.

Janke AT, Melnick ER, Venkatesh AK. Hospital Occupancy and Emergency Department Boarding During the COVID-19 Pandemic. JAMA Netw Open. 2022;5(9):e2233964.

Lavoie CF, Plint AC, Clifford TJ, Gaboury I. “I never hear what happens, even if they die”: a survey of emergency physicians about outcome feedback. CJEM. 2009;11(6):523–8.

Ivers N, Jamtvedt G, Flottorp S, et al. Audit and feedback: effects on professional practice and healthcare outcomes. Cochrane Database Syst Rev. 2012;(6):CD000259. https://doi.org/10.1002/14651858.CD000259.pub3 .

Hysong SJ, SoRelle R, Hughes AM. Prevalence of Effective Audit-and-Feedback Practices in Primary Care Settings: A Qualitative Examination Within Veterans Health Administration. Hum Factors. 2022;64(1):99–108.

Presseau J, McCleary N, Lorencatto F, Patey AM, Grimshaw JM, Francis JJ. Action, actor, context, target, time (AACTT): a framework for specifying behaviour. Implement Sci. 2019;14(1):102.

Desveaux L, Ivers NM, Devotta K, Ramji N, Weyman K, Kiran T. Unpacking the intention to action gap: a qualitative study understanding how physicians engage with audit and feedback. Implement Sci. 2021;16(1):19.

National Quality Forum. Emergency Department Transitions of Care: A Quality Measurement Framework—Final Report: DHHS contract HHSM‐500–2012–000091, Task Order HHSM‐500‐T0025. Washington, DC: National Quality Forum; 2017.

Kyriacou DN, Handel D, Stein AC, Nelson RR. Brief report: factors affecting outpatient follow-up compliance of emergency department patients. J Gen Intern Med. 2005;20(10):938–42.

Vukmir RB, Kremen R, Ellis GL, DeHart DA, Plewa MC, Menegazzi J. Compliance with emergency department referral: the effect of computerized discharge instructions. Ann Emerg Med. 1993;22(5):819–23.

Engel KG, Heisler M, Smith DM, Robinson CH, Forman JH, Ubel PA. Patient comprehension of emergency department care and instructions: are patients aware of when they do not understand? Ann Emerg Med. 2009;53(4):454–461 e415.

Cordasco KM, Saifu HN, Song HS, et al. The ED-PACT Tool Initiative: Communicating Veterans’ Care Needs After Emergency Department Visits. J Healthc Qual. 2020;42(3):157–65.

Bright TJ, Wong A, Dhurjati R, et al. Effect of clinical decision-support systems: a systematic review. Ann Intern Med. 2012;157(1):29–43.

Weingart SN, Toth M, Sands DZ, Aronson MD, Davis RB, Phillips RS. Physicians’ decisions to override computerized drug alerts in primary care. Arch Intern Med. 2003;163(21):2625–31.

van der Sijs H, Aarts J, Vulto A, Berg M. Overriding of drug safety alerts in computerized physician order entry. J Am Med Inform Assoc. 2006;13(2):138–47.

Shah T, Patel-Teague S, Kroupa L, Meyer AND, Singh H. Impact of a national QI programme on reducing electronic health record notifications to clinicians. BMJ Qual Saf. 2019;28(1):10–4.

Lin CP, Payne TH, Nichol WP, Hoey PJ, Anderson CL, Gennari JH. Evaluating clinical decision support systems: monitoring CPOE order check override rates in the Department of Veterans Affairs’ Computerized Patient Record System. J Am Med Inform Assoc. 2008;15(5):620–6.

Middleton B, Bloomrosen M, Dente MA, et al. Enhancing patient safety and quality of care by improving the usability of electronic health record systems: recommendations from AMIA. J Am Med Inform Assoc. 2013;20(e1):e2-8.

Han HR, Gleason KT, Sun CA, et al. Using Patient Portals to Improve Patient Outcomes: Systematic Review. JMIR Hum Factors. 2019;6(4):e15038.

Johnson AM, Brimhall AS, Johnson ET, et al. A systematic review of the effectiveness of patient education through patient portals. JAMIA Open. 2023;6(1):ooac085.

Lazard AJ, Watkins I, Mackert MS, Xie B, Stephens KK, Shalev H. Design simplicity influences patient portal use: the role of aesthetic evaluations for technology acceptance. J Am Med Inform Assoc. 2016;23(e1):e157-161.

Institute of Medicine. Health IT and Patient Safety: Building Safer Systems for Better Care. Washington, DC: National Academies Press; 2012.

Koppel R, Wetterneck T, Telles JL, Karsh BT. Workarounds to barcode medication administration systems: their occurrences, causes, and threats to patient safety. J Am Med Inform Assoc. 2008;15(4):408–23.

Ash JS, Sittig DF, Poon EG, Guappone K, Campbell E, Dykstra RH. The extent and importance of unintended consequences related to computerized provider order entry. J Am Med Inform Assoc. 2007;14(4):415–23.

Hollnagel E, Woods D. Joint Cognitive Systems: Foundations of Cognitive Systems Engineering. Boca Raton: CRC Press; 2006.

Acknowledgements

This material is based upon work supported by the Department of Veterans Affairs, Veterans Health Administration, Office of Research and Development, Health Services Research and Development (I01HX003057). The content is solely the responsibility of the authors and does not necessarily represent the official views of the VA.

Author information

Authors and affiliations

Geriatric Research, Education, and Clinical Center (GRECC), VA Tennessee Valley Healthcare System, 2525 West End Avenue, Ste. 1430, Nashville, TN, 37203, USA

Michael J. Ward, Michael E. Matheny & Amanda S. Mixon

Medicine Service, Tennessee Valley Healthcare System, Nashville, TN, USA

Michael J. Ward

Department of Emergency Medicine, Vanderbilt University Medical Center, Nashville, TN, USA

Michael J. Ward & Melissa D. Rubenstein

Department of Biomedical Informatics, Vanderbilt University Medical Center, Nashville, TN, USA

Michael J. Ward, Michael E. Matheny, Shilo Anders & Thomas Reese

Department of Biostatistics, Vanderbilt University Medical Center, Nashville, TN, USA

Michael E. Matheny

Division of General Internal Medicine & Public Health, Vanderbilt University Medical Center, Nashville, TN, USA

Department of Psychology, Vanderbilt University, Nashville, TN, USA

Kemberlee Bonnet, Chloe Dagostino & David G. Schlundt

Center for Research and Innovation in Systems Safety, Vanderbilt University Medical Center, Nashville, TN, USA

Shilo Anders

Section of Hospital Medicine, Vanderbilt University Medical Center, Nashville, TN, USA

Amanda S. Mixon

Contributions

Conceptualization: MJW, ASM, MEM, DS, SA. Methodology: MJW, ASM, MEM, DS, KB, SA, TR. Formal analysis: KB, DS, CD, MJW. Investigation: MJW, MDR, DS. Resources: MJW, MEM. Writing—Original Draft Preparation: MJW, ASM, KB, MDR. Writing—Review & Editing: All investigators. Supervision: MJW, ASM, MEM. Funding acquisition: MJW, MEM.

Corresponding author

Correspondence to Michael J. Ward .

Ethics declarations

Ethics approval and consent to participate

This study was approved by the VA Tennessee Valley Healthcare System Institutional Review Board as minimal risk (#1573619). A waiver of documented informed consent was approved, and each subject provided verbal consent prior to the interview. The IRB determined that all requirements for human subjects research set forth in 38 CFR 16.111 were satisfied. All methods were carried out in accordance with the relevant guidelines and regulations.

Consent for publication

Not applicable.

Competing interests

The authors declare no competing interests.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Supplementary Material 1.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ . The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

Reprints and permissions

About this article

Cite this article

Ward, M.J., Matheny, M.E., Rubenstein, M.D. et al. Determinants of appropriate antibiotic and NSAID prescribing in unscheduled outpatient settings in the Veterans Health Administration. BMC Health Serv Res 24, 640 (2024). https://doi.org/10.1186/s12913-024-11082-0

Download citation

Received: 11 October 2023

Accepted: 07 May 2024

Published: 18 May 2024

DOI: https://doi.org/10.1186/s12913-024-11082-0


Keywords

  • Non-Steroidal Anti-Inflammatory Drugs
  • Antibiotics
  • Qualitative Methods
  • Emergency Department
  • Urgent Care
  • Primary Care
  • Prescribing Stewardship

BMC Health Services Research

ISSN: 1472-6963
