Preparing for User Research Interviews: Seven Things to Remember

By Michael Hawley

Published: July 7, 2008

“A researcher’s skill in conducting interviews has a direct impact on the quality and accuracy of research findings and subsequent decisions about design.”

Interviewing is an artful skill that is at the core of a wide variety of research methods in user-centered design, including stakeholder interviews, contextual inquiry, usability testing, and focus groups. Consequently, a researcher’s skill in conducting interviews has a direct impact on the quality and accuracy of research findings and subsequent decisions about design. Skilled interviewers can conduct interviews that uncover the most important elements of a participant’s perspective on a task or a product in a manner that does not introduce interviewer bias. Companies hire user researchers and user-centered designers because they possess this very ability.

There is a wide variety of literature regarding best practices for user research interviews. For example, in their book User and Task Analysis for Interface Design, Hackos and Redish devote an entire section to the formulation of unbiased questions. They advise interviewers to avoid asking leading questions, to ask questions that are based on a participant’s experience, and to avoid overly complex, lengthy questions.

“In many interview formats, a significant portion of each session involves ad-hoc, probing, follow-up questions that require researchers to think quickly to maximize their time with participants.”

Writing interview scripts in advance of a session lets researchers review and revise wording to elicit useful and unbiased responses from participants. However, in many interview formats, a significant portion of each session involves ad-hoc, probing, follow-up questions that require researchers to think quickly to maximize their time with participants. In my experience, this is where the potential to introduce bias is the greatest. In addition, conducting a successful interview involves more than just asking questions. There are also a number of guidelines for how researchers should interact with participants to enable successful interviews. These include monitoring body language, recognizing self-censoring, and understanding the correct balance between leading an interview and listening to a participant.

Experienced researchers may become more comfortable in different kinds of interview situations and have an easier time interacting with participants during interview sessions. But, over time, researchers may also develop familiar patterns for asking questions and ways of interacting with participants that could prevent them from uncovering a unique perspective in the context of a particular interview. Also, the introduction of bias in an interview is often subtle, and it may be difficult even for researchers with years of experience to notice it during one of their own sessions.

Seven Interview Best Practices

Given everything there is to remember to ensure we conduct successful interviews, I find it helpful to remind myself of the following seven key best practices immediately before an interview session:

  1. Set proper expectations. Generally, interview participants are not experienced with the user-centered design process. A recruiter may have given them a brief description of the purpose of an interview during the recruiting process, but it’s very likely participants don’t have a clear sense of why they are there. They may be apprehensive, nervous, or skeptical about your intentions. Business stakeholders especially may come to a session with a negative attitude if they believe a researcher is there to check up on them. All of this will serve to influence the responses they give to interview questions. To minimize this impact, be sure to describe the intent of the interview, your role in the design process, and how the interview process will proceed. Include details such as why you will be taking notes and how you will compile the results.
  2. Shut up and listen. As a researcher, it is easy to get wrapped up in the interview script you developed, all of the questions you want to ask, and your own ideas about the salient points to uncover. It is easy to dominate the conversation and move through the interview at a pace that is too fast for a participant to keep up. In my experience, participants often raise the most interesting points only once they’ve had a chance to internalize and think about a researcher’s question. Listening appropriately involves minimizing interruptions and slowing down the pace of the interview to give participants an opportunity to qualify their statements or provide additional insights.
  3. Minimize biased questions. Asking leading or biased questions is all too easy. Even a simple question such as How did you like that process? subtly suggests to participants that they should like the process rather than dislike it. In our attempts to be conversational, questions like these roll off the tongues of even the most experienced interviewers. I’ve found the best way of minimizing such leading questions is to read a set of good and bad examples before an interview session. Examples might include:

Bad: How did you like the login screen?

Good: What do you think about the login screen?

Bad: Is the feature helpful to you?

Good: Is the feature helpful or not helpful to you? Why?

Bad: Would this be a good idea?

Good: How valuable would this be to you in your job?

  4. Be friendly. Interview scripts are useful, because they help researchers remember all of the topics they need to cover. But reading directly from interview scripts can have a negative effect on the dialogue between an interviewer and a participant. The result: formal, unengaged conversations in which participants give the shortest, simplest possible response to a question so they can move on to the next one. Developing a friendly relationship and an open style with participants starts with the initial greeting and continues through the interview to the closing. Establish eye contact, remember each participant’s name, and develop a casual, conversational style to elicit the most thoughtful, considered responses from each participant.
  5. Turn off your assumptions. It is human nature to let your perceptions of a given topic influence your questions and even the responses you hear from participants. You may also be biased by responses you’ve heard from other participants, perhaps earlier in the same study. While it may be impossible to avoid these influences altogether, reminding yourself that they exist before the start of an interview session helps minimize their effects. Especially during the last interview in a series of interviews, make it a point to be open-minded and responsive to alternate points of view.
  6. Avoid generalizations. In rare circumstances, it may be appropriate to ask participants to speak on behalf of others or predict how certain groups of people would react to particular experiences. However, for the most part, the best research interviews are those in which the participants speak about their own experiences and preferences. Researchers must recognize when participants are generalizing their responses and attempting to answer on behalf of others. In such cases, a researcher should politely ask participants to speak about their own experiences.
  7. Don’t forget the non-verbal cues. Participants communicate through more than just their verbal responses. Body language and tone of voice convey a great deal about participants’ comfort levels with the interview session in general, their perspectives on a task or product domain, or their opinions of a researcher or the goal of a project. Researchers who focus too intently on their interview scripts may miss the non-verbal cues that suggest they should adjust some aspect of the interview. Customers might be nervous or apprehensive and limit their answers. Business stakeholders might be skeptical about a project or the context of an interview. So, researchers need to recognize the clues that indicate such emotional responses and be flexible enough to adjust an interview session to ensure they can properly interpret participants’ responses and get the maximum return on their effort.

Conclusion

“All of us are prone to bias or can fall into bad habits that can limit the reliability and effectiveness of results.”

As Dumas and Loring note in their excellent new book Moderating Usability Tests, it’s difficult to conduct a perfect interview session. All of us are prone to bias or can fall into bad habits that can limit the reliability and effectiveness of results. This is especially true for the last sessions in a series of interviews, when you’re likely to be tired or already have formulated opinions on the outcome of a study. But, by reviewing a checklist of best practices before each interview session to remind yourself of the things you should avoid, you can minimize the impact of these pitfalls and maximize the return on your research effort.

Additional Resources

Dumas, J. and Loring, B. Moderating Usability Tests. San Francisco: Morgan Kaufmann, 2008.

Hackos, J. and Redish, J. User and Task Analysis for Interface Design. New York: Wiley, 1998.

8 Comments

Great article on avoiding bias in testing. Is there an industry best practice on who the interviewers should be? In your experience, isn’t the possibility of bias the highest where the UX designers are the interviewers? If so, how do you avoid this problem? How can neutrality be ensured?

If UX designers are conducting interviews on their own designs, bias is very likely. I like to use other UX designers on my team to conduct tests on my designs, or better yet, use an available UX researcher. If you are the only one available, be sure the participant never finds out that you are discussing your own design.

Thank you, Michael, for this article. Your seven best practices are very helpful. I’m always trying to remind myself to adopt a non-biased attitude before conducting a session. One other thing I always try to remind myself to do is remember to always answer a question with another question. To the “What happens if I click this?” question, I usually respond with a “What do you think may happen?” type counter-question. The other thing I like to do is listen to myself while watching the playback of sessions—if I’ve recorded them using Morae, for example—and grade myself on where I did well and where I need improvement. Thanks again for an interesting article!

I am a graduate student at DMU. Before reading this article, I was confused in regard to conducting senior research. But now, I am really motivated to do it very well.
