Ask and Ask Again: Critical Interviewing Is an Essential Component of Usability Testing

By Kate Lawrence

Published: November 1, 2012

“Your role as a usability test facilitator … should be to facilitate according to your understanding of test participants. And what better way is there to understand them than to engage them in conversation before the testing begins?”

In a previous professional life, I had a boss who insisted that my job as a usability researcher was to “serve it up cold” when conducting usability tests. “Be professional, yet impersonal,” he advised me, “and you’ll get solid results.” I suppose I did get solid results—if by solid you mean responses that are stale and forced. My test participants were never comfortable because I did nothing to help them feel at ease. Sure, I was professional, but professional without being warm and conversational.

I wasn’t getting test participants to open up and really talk to me, because I wasn’t engaging them in the basic human exchange of conversation. Plus, while I was busy directing users in completing test tasks, I was missing a valuable opportunity to understand users’ view of the world, which would have ultimately given me a more holistic view of how users interact with our products.

While your role as a usability test facilitator isn’t to play Oprah or Dr. Phil, it should be to facilitate according to your understanding of test participants. And what better way is there to understand them than to engage them in conversation before the testing begins?

Of course, random topics like How’s the weather? or debates about who should be the new manager of the Red Sox won’t fulfill your conversational goals. Instead, your focus should be on asking a series of questions that reveal what a participant thinks—about the Web and the products you are testing—and give you their general sense of the world around them. This is an opportunity to ask probing questions on certain points, as well as to gain a participant’s trust. In other words, it’s an abbreviated critical interview session.

Critical interviewing is an important research technique in its own right, but in combination with usability testing, it is a powerful tool for gaining an understanding of your users and what they might think about your products.

Make Users Feel Comfortable

“Critical interviewing is an important research technique in its own right, but in combination with usability testing, it is a powerful tool for gaining an understanding of your users and what they might think about your products.”

Typically, when participants arrive at a usability test session, the test facilitator is unknown to them—except perhaps for a brief email exchange to recruit them or confirm their test session. Since, as a test facilitator, you are essentially meeting a stranger for the first time, here are some ways to help a participant become comfortable sharing information with you:

  • Play the gracious host or hostess. Welcome the participant, making eye contact; introduce yourself in a clear, strong voice; and offer him a seat and a glass of water. A warm and welcoming attitude goes a long way toward helping someone feel at ease.
  • Confirm that the participant knows he has some control. Make sure the participant understands that he can stop the test at any time if he is uncomfortable. While I have never had anyone take me up on this offer, I do notice that users relax once they feel the facilitator is on their side.
  • Don’t forget your introductory script. Many test facilitators have been conducting test sessions so long that they don’t need to read from a script during the introduction before testing begins—but this is not necessarily a good thing. You must not forget to relay important information to participants—such as the fact that the testing reflects on the product or application, not the user. Simple statements like “We are not testing your abilities in any way” or “There are no right or wrong answers” help participants to focus on the test tasks rather than on how the facilitator perceives them.
  • Offer participants their reward up front. Handing a participant a cash payment or gift card up front, before the test session begins, is an act of good faith that tells him you have confidence that the test session will go well. When I gave participants their reward at the completion of a test session, they would fret about offering criticism of a design during the session—sometimes asking me, “Does this mean I won’t get my reward?”

What to Ask?

“Your goal for critical interview questions is to probe deep enough to gain insight and to be genuinely inquisitive, without making participants feel that you’re interrogating them.”

Once you’ve made a participant feel comfortable, you’ll need to decide what questions to ask to understand her perspective on the world within which your product or application exists. Your goal for critical interview questions is to probe deep enough to gain insight and to be genuinely inquisitive, without making participants feel that you’re interrogating them. To get you started, here are four interview questions that I include in many of my usability test sessions:

  • What are five Web sites you use on a daily or weekly basis? Knowing how people use the Internet is critical, because it helps you to understand what kind of information they seek and the format in which they like to consume it. For example, when doing usability testing with college students for a library database product, many participants indicated that Facebook was a frequently visited site. This helped provide context for their overwhelming preference for an endless-scroll feature that we tested. Given that Facebook has this feature, it makes perfect sense that it tested so well.
    Tip—I always follow up this question by saying, “My intent in asking is not to pry into your personal business. I am asking this because it will help me to understand the context of your responses during the test.” This helps participants to be more honest about their top-visited sites.
  • What would be your usual process for what you’ll be shopping for, looking at, or evaluating today? Always ask a participant to explain to you how she typically performs the task you’re asking her to complete during the test session. For example, if you are testing a car shopping Web site, you need to understand how a participant approaches this type of shopping. Would she shop for a car on the Web from home or while at work? How many weeks or months before purchasing a car would she start shopping? Her answers may help you to make recommendations for the user interface that you might not have been able to make had you relied only on task feedback. Or, if you are testing an online travel site, you’ll want to understand how a participant books travel—whether from home or work or on a desktop computer or mobile device. Is the participant typically alone, or does she search with her friends or spouse? These types of questions provide detail about the why, while typical usability testing addresses the how.
  • What do you like or dislike about the Web? When you understand what users like or dislike on the Web, you can frame their feedback from a test session appropriately. When I tested an online travel site, many participants told me that they would shop for a cruise from work, which meant their time was limited, so the number of clicks was important. I heard over and over, “I don’t like being made to click to a million screens to find what I want.” Such comments ultimately helped me to make the case that we should reduce the number of pages in the booking path.
  • What words or phrases would you use to describe an ideal experience with this process? Asking participants how they would describe a positive interaction with your site or application lets you tap into their idealized vision of the experience. When I was testing the EBSCO library product, participants expressed the need to feel supported and secure—two feelings that would not have been easy to reveal simply by watching test participants complete their assigned task scenarios.

Weaving Interview Responses into Your Final Report

“One benefit of my asking critical interview questions is that my report writing has fortunately evolved from a whitepaper style to more of a do-this-not-that style of PowerPoint deck.”

A test report should not read like a dry whitepaper that discusses the results of test tasks. One benefit of my asking critical interview questions is that my report writing has fortunately evolved from a whitepaper style to more of a do-this-not-that style of PowerPoint deck. And while seeing images of what worked and what didn’t may resonate with project stakeholders, the comments that I’ve elicited by asking my interview questions and woven into the report are always the most compelling.

Telling a development team that users might become distracted when looking at a cruise search results page is not as persuasive as using a participant’s own comment about shopping for a cruise while making dinner, starting homework, and watching television. Participants’ comments shed light on their own context, adding much more color to your report than relying on test task success metrics.

With critical interview questions as a complement to traditional usability testing task scenarios, you can illuminate your users’ backgrounds and the contexts in which they use your products, providing a valuable tool for helping you to create a positive, satisfying user experience.

