When I began my career as a UX designer, many product developers shared the aspiration of building great products with numerous features. While building great products is a worthy goal, teams often paid little attention to users’ real needs when deciding what features to build, which is a great shame. App development is not just about creating a product but about solving a problem for users. As an essential part of human-centered design, user research helps us to crystallize users’ problems and create solutions that directly address them.
As a design lead, I make user research an integral part of my team’s design process. We use various approaches to interacting with users to help us tailor the end product to the audience’s needs. In this article, I’ll share some of my experiences conducting different types of user research, focusing mainly on in-depth user interviews, usability testing, and surveys, but we’ve used all of the approaches that Figure 1 depicts. You’ll learn how to use each of these types of user research and discover useful methods of collecting and analyzing users’ thoughts.
User Interviews
The user interview is a research method that gives you deep insights into users’ needs, pain points, and desires while also building empathy with them. Interviewing users requires a lot of attention to detail and well-developed interviewing skills. The following best practices will help you to conduct user interviews successfully:
Warm up before each interview. The first seconds of an interview can be awkward, so your first goal is to become familiar with the participant and create a relaxed atmosphere. Set the context for the interview, describing its purpose and indicating its approximate duration. If you want to record the interview, this is the best time to ask for permission.
Ask follow-up questions. Follow-up questions are those you didn’t plan. Ask them based on the information you’ve obtained during the interview. I love asking follow-up questions because this is where I find the most insights.
Ask the same questions in different ways. Another way to follow up is to restate your initial question. I often do this to get at the root of a problem and determine the user’s actual opinion. Using synonyms, injecting a perspective, or pointing to the user’s past experiences can be really helpful here.
Keep your script in mind, but be flexible. You might sound a bit like a robot if you follow your script strictly, and this is definitely not the way to build empathy. Plus, by sticking too closely to your script, you might miss out on insights and other valuable information. So keep your script close, but be ready to improvise.
Create a safe environment for your participants. A user interview is most effective when it turns into a frank talk. If you create a nonjudgmental atmosphere and convey your trustworthiness, participants will feel comfortable sharing their opinions. So put the necessary effort into creating a safe environment.
Wrap up in a friendly way. Your final words are important. It’s not enough to just say Thank you and leave. Use the time at the end of your interviews to communicate your appreciation of the participants’ time, let them know the next steps, and ask whether you can contact them later, if necessary. You might also ask them to recommend other people who might be interested in participating in an interview.
When to Conduct User Interviews
Key goals for conducting user interviews are as follows:
discovery research—This research usually occurs during the first stage of product development, when no product yet exists. You might have a basic idea, but you need to dig deeper into the market need and the problem you’re trying to solve. The goal of conducting user interviews during the Discovery stage is to learn about your users’ experiences and how they currently solve their problems.
gathering in-depth feedback about an existing product—Once you’ve released the product with all its features and user flows, it’s time to ask users for feedback. The goal here is to ask about their experience using the product and figure out which user needs and pain points remain unaddressed.
Asking Good Questions
During user interviews, it’s vital that you engage participants and get them to give you truthful answers to your questions. To ensure that I learn about real user needs, expectations, and thoughts, I ask only open-ended questions. If you ask open-ended questions, participants can’t just provide Yes or No answers; they have to tell you a story. Plus, you can ask additional questions to clarify the information they’ve provided.
Good questions ask about the participant’s previous experience. Don’t ask participants to imagine a hypothetical situation. Instead, ask them to tell you about an actual situation in their life. Here are some examples of good questions:
Tell me about how you started donating to charity projects.
Why did you donate to charity projects?
Avoiding Bad Questions
Don’t ask participants questions about the future. If they haven’t experienced something in their real life, they’ll need to imagine the situation to answer the question. So what would you get in response? A fake, constructed answer.
Asking closed questions restricts a person to a small set of predefined answers. As a consequence, instead of focusing on what matters to users, we simply confirm our own assumptions. Here are some examples of bad questions:
Will you be donating to charity projects?
Why would you donate to charity projects: to make a social impact or clear your karma?
The Optimal Sample Size
When conducting user interviews, the optimal sample size is typically five participants. The more users you interview, the less new information you’ll learn. Additional users will just say the same things.
However, there is one case in which you need to interview additional users: when your product has several distinct user groups. But, in this case, you don’t need to interview five people in each group. If you have two user groups, interview three or four participants from each group; if you have three or more user groups, interview three participants from each.
Next Steps
After gathering all of our interview notes and transcribing the recordings, we add the data to a table. Then, our design team creates an affinity diagram—a visualization of the information that groups our notes by category. Based on the affinity diagram, we render the data as a value proposition canvas (VPC), as Figure 2 shows.
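If you keep your interview notes in a spreadsheet or CSV file, a small script can handle the first, mechanical pass of grouping before the team refines the categories together. The following sketch is purely illustrative; the file name, column names, and tagging scheme are assumptions rather than our actual tooling.

```python
import csv
from collections import defaultdict

def build_affinity_groups(csv_path):
    """Group interview notes by the theme tag assigned during note-taking.

    Assumes a CSV with columns: participant, note, theme (a hypothetical format).
    """
    groups = defaultdict(list)
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            groups[row["theme"].strip().lower()].append(
                {"participant": row["participant"], "note": row["note"]}
            )
    return groups

if __name__ == "__main__":
    for theme, notes in build_affinity_groups("interview_notes.csv").items():
        print(f"{theme} ({len(notes)} notes)")
        for item in notes:
            print(f"  [{item['participant']}] {item['note']}")
```

The script only does the rough sorting; deciding which categories matter and how they map to the value proposition canvas still happens as a team exercise.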
Usability Testing
Usability testing is an important part of the development process and provides user feedback on the usability of an existing product or a prototype of a new product. Usability testing lets UX designers look at a product from the user’s perspective, enabling them to create a customer journey map (CJM) similar to that shown in Figure 3.
There are two basic types of usability testing, as follows:
moderated testing—This type of testing involves one participant, a facilitator, and ideally, someone who takes notes. The facilitator provides test tasks to a participant, observes how the participant interacts with the product or design prototype in real time, and asks follow-up questions.
unmoderated testing—In this type of testing, the facilitator’s work is fully automated, so a session involves only a participant. During a test session, instructions, test tasks, and follow-up questions appear on the participant’s screen, so there’s no human impact on the process.
Moderated testing provides more flexibility and opportunities to interact with participants. You can warm up participants at the beginning of their test session, ask follow-up questions, get direct feedback from participants, and tell them they’re free to criticize the product. So you may gain valuable information that you’d miss if you were conducting unmoderated testing.
When to Conduct Usability Testing
Key goals of conducting usability testing are as follows:
testing user-interface design solutions when creating a new product
improving the user-interface design for an existing product
Types of Test Tasks
For usability testing, the test tasks that you create must represent realistic activities that the participant would perform in real life. Test tasks might be either of the following types:
open-ended tasks—The participant receives a task to perform without any guidance or tips on how to complete the task. The moderator only observes the participant and asks additional questions. Here is an example of such a task: Donate to a charity project.
closed tasks—The participant receives a detailed task with the steps to take to complete the task. Usually, the goal of a closed task is to see how quickly a participant can complete the task. The moderator observes the participant and asks follow-up questions. Here is an example of such a task: Donate one dollar to the WWF via PayPal.
Asking Good Follow-up Questions
The follow-up questions that you ask participants can be very specific or open ended, depending on the type of research you’re conducting. Here are a couple of examples:
Why did you do that?
What do you understand about this screen?
During usability-test sessions, asking such follow-up questions gives you a better understanding of the participant’s intentions and thoughts.
Avoiding Bad Follow-up Questions
Avoid asking participants questions about the future because you’d be asking them to imagine a situation rather than share their actual experience. Here’s an example of such a question: Will you be using this filter function in searching for a charity to which to donate?
The Optimal Sample Size
When conducting usability testing, the optimal sample size is typically five participants. Nielsen Norman Group’s research indicates that testing with five users is sufficient to discover the most common problems that users encounter when using a product, even for a product with a large audience. So, for a typical qualitative usability study for a single user group, I recommend using five participants.
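The reasoning behind the five-participant guideline comes from a simple probability model that Jakob Nielsen and Tom Landauer published: if each participant uncovers a given usability problem with probability L (about 31 percent on average in their data), then n participants uncover roughly 1 − (1 − L)^n of the problems. Here is a quick calculation of what that model predicts; treat the 31-percent figure as an assumption that varies by product and task.

```python
# Proportion of usability problems found with n participants, using the
# Nielsen/Landauer model: found(n) = 1 - (1 - L) ** n.
L = 0.31  # assumed average chance that one participant uncovers a given problem

for n in range(1, 9):
    found = 1 - (1 - L) ** n
    print(f"{n} participants -> ~{found:.0%} of problems found")

# With five participants the model predicts roughly 84% of problems found;
# each additional participant adds progressively less new information.
```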
Next Steps
Your analysis starts with prioritizing the feedback you’ve gathered from participants. At Uptech, we use a feedback-prioritization framework. Each type of feedback gets the appropriate color, as shown in the legend in Figure 4.
Then we fix bugs, decide what changes to implement, and determine what to put in the backlog. We create a type of affinity diagram, an impact/value map similar to that shown in Figure 5, to determine what issues to focus on.
This step is pretty similar to that for user interviews, with one difference: we create an affinity diagram for a specific feature or user-flow stage rather than for the whole product.
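When the list of issues grows long, a rough numeric pass can help seed the impact/value discussion before the team plots items on the map. The sketch below is a hypothetical illustration rather than our actual framework; the scoring scale and example items are assumptions.

```python
from dataclasses import dataclass

@dataclass
class FeedbackItem:
    title: str
    impact: int  # estimated user impact, 1 (low) to 5 (high): an assumed scale
    effort: int  # estimated implementation effort, 1 (low) to 5 (high)

def prioritize(items):
    """Sort feedback so high-impact, low-effort items come first."""
    return sorted(items, key=lambda item: (-item.impact, item.effort))

feedback = [
    FeedbackItem("Donation button is hard to find", impact=5, effort=2),
    FeedbackItem("Confusing wording on the receipt screen", impact=3, effort=1),
    FeedbackItem("Add a dark mode", impact=2, effort=4),
]

for item in prioritize(feedback):
    print(f"impact={item.impact} effort={item.effort}  {item.title}")
```

The scores themselves come from the team’s judgment; the script only keeps the discussion focused on the items most likely to matter.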
Surveys
Getting conversational is good, but sometimes you need to find out how a large number of people feel about an application. After all, you’ll potentially release the application to an audience of millions. This is what surveys are for!
A survey is a quantitative user-research tool and often takes the form of a questionnaire. Surveys are an economical way of acquiring user feedback for app development. You can conduct a survey verbally, on paper, or digitally by asking participants to answer a series of questions.
You must plan your questionnaire properly and make it easy to complete. If you ask clear questions that are essential to your research, you’ll receive meaningful answers. Start with simple, closed questions, then continue with open-ended questions to get in-depth answers. This approach helps keep participants engaged until the end of the survey.
When to Conduct Surveys
Key goals of conducting surveys are as follows:
screening participants—Screeners help ensure that you find the right research participants and filter out people who don’t belong to the target audience for the product.
market research—Such questionnaires measure brand awareness, customers’ level of loyalty to your business, and customers’ ratings of your products and services. We’ve all received email messages that ask us to rate our satisfaction with a product or service. These surveys are very popular among marketers.
Asking Good Questions
The following questions are from a user survey that we conducted at Uptech:
What smartphone do you use?
iOS
Android
Other
None
Did you make any charitable donations in the last 12 months?
Yes
No
How much did you donate in the last 12 months and to which events, charities, or causes?
For context, the first two questions are screening questions because we were interested only in iOS users who donate to charity projects. We placed the third, open-ended question at the end of the questionnaire. It was important to get the bigger picture on users’ charitable-giving habits and learn what social issues matter to them.
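If your survey tool lets you export responses, applying the screening logic is mostly mechanical. Here is a minimal sketch, assuming the responses arrive as a list of dictionaries; the field names and values are hypothetical.

```python
# Hypothetical export of screener responses; field names and values are assumptions.
responses = [
    {"email": "a@example.com", "smartphone": "iOS", "donated_last_12_months": "Yes"},
    {"email": "b@example.com", "smartphone": "Android", "donated_last_12_months": "Yes"},
    {"email": "c@example.com", "smartphone": "iOS", "donated_last_12_months": "No"},
]

# Keep only iOS users who have donated to charity in the last 12 months.
qualified = [
    r for r in responses
    if r["smartphone"] == "iOS" and r["donated_last_12_months"] == "Yes"
]

for respondent in qualified:
    print(respondent["email"])  # candidates to invite to further research
```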
Avoiding Bad Questions
Always avoid answer ranges that overlap. In the responses to the following question, the same age appears in multiple age ranges, which is very confusing. What option would you choose if you were 20 years old: 10–20 or 20–30?
How old are you?
0–10
10–20
20–30
30–40
40 +
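If you collect exact ages and bucket them yourself during analysis, half-open bins guarantee that each respondent falls into exactly one range. The following sketch is a hypothetical illustration using pandas; the sample ages and labels are assumptions.

```python
import pandas as pd

# Hypothetical respondent ages collected as exact numbers.
ages = pd.Series([8, 10, 20, 20, 35, 47])

# Half-open bins [0, 10), [10, 20), [20, 30), [30, 40), [40, 120):
# each age falls into exactly one bucket.
buckets = pd.cut(
    ages,
    bins=[0, 10, 20, 30, 40, 120],
    right=False,
    labels=["0–9", "10–19", "20–29", "30–39", "40+"],
)
print(buckets.value_counts().sort_index())
# An age of exactly 20 now matches only the 20–29 option.
```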
How would you respond if you came across the following confusing question?
How satisfied were you with the Search field in the charitable-giving app?
Provide a rating from 1 to 7, as follows:
1—More dissatisfied than satisfied
2
3
4—Neutral
5
6
7—More satisfied than dissatisfied
What does it mean to be more dissatisfied than satisfied? What if I am fully satisfied? Which answer should I choose? Simplify this question, as follows:
1—Very dissatisfied
2—Somewhat dissatisfied
3—Neither satisfied nor dissatisfied
4—Somewhat satisfied
5—Very satisfied
Create simple, straightforward answers to your questions.
The Optimal Sample Size and Length
When conducting a survey, the optimal sample size is typically about twenty participants. Keep your online surveys short. Your goal is to maximize the response rate and, with a brief survey, you’re more likely to do that. My recommendation is that a survey should take no more than 10 minutes to complete.
Next Steps
Before conducting a survey, be sure to define a clear goal for the survey and outline your top research questions. Then, once your survey is complete, take a look at the results for your top research questions. Finally, analyze and compare your findings for specific user groups within your target audience. Let’s say you wanted to compare how people from the USA and Europe have answered the question about the amount of their donations. To figure this out, you need to filter and cross-tabulate the results.
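If your survey tool exports responses to a spreadsheet, this kind of filtering and cross-tabulation takes only a few lines. The following sketch uses pandas; the column names, regions, and answer options are hypothetical.

```python
import pandas as pd

# Hypothetical export of survey responses; column names and values are assumptions.
df = pd.DataFrame({
    "region": ["USA", "Europe", "USA", "Europe", "USA"],
    "donation_amount": ["$1–10", "$11–50", "$11–50", "$1–10", "$51+"],
})

# Counts of each donation-amount answer by region.
print(pd.crosstab(df["region"], df["donation_amount"]))

# Row percentages make it easier to compare groups of different sizes.
print(pd.crosstab(df["region"], df["donation_amount"], normalize="index").round(2))
```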
It’s important to pay attention to the quality of your data and to understand statistical significance. Based on your data, you can draw conclusions about benchmarks and trends.
Tools for Conducting User Research
The best tool for user research is the one that suits you and your team. There’s no need to stick with one tool just because you’ve been using it for years or to choose one that has sophisticated capabilities. Find the balance between convenience, purpose, and functionality. At Uptech, we use the following tools for specific purposes:
Remote, unmoderated usability testing: UserTesting
Collaborative sessions and whiteboarding: FigJam, Miro
Usability-test results: Notion, Google Sheets
Surveys and screeners: Google Forms, Typeform
Final Words
User research is an integral part of the development process and helps ensure that you create a product that is better adapted to user needs. You can learn more about your users’ needs, behaviors, and pain points by conducting user interviews, usability studies, or surveys.
Being a good UX researcher demands that you possess a number of skills, including active listening, asking the right questions, building empathy, going beyond the surface, being patient, and much more. It also requires skill in connecting with people. I hope that, through this article, I’ve encouraged you to value user research even more and to conduct your research meticulously.
Nikolay has five years of experience in UX design. Working within the Product Development Studio at Uptech, Nikolay designs user interfaces for startups. He and his team have designed more than twenty mobile apps and Web applications. Nikolay eagerly shares his expertise in articles, presentations, and other materials. He holds a Master’s degree from the Cherkasy Institute of Banking and is a Nielsen Norman Group Certified User Experience Specialist.