
Unmoderated, Remote Usability Testing: Good or Evil?

January 18, 2010

Conducting traditional synchronous, or moderated, usability testing requires a moderator to communicate with test participants and observe them during a study, either in person or remotely. Unmoderated usability testing, also called automated or asynchronous testing, occurs remotely and, as the name implies, without a moderator. A usability testing tool that automatically gathers participants' feedback and records their behavior makes this possible. Such tools typically let participants view the Web site they are testing in a browser, with test tasks and related questions in a separate panel on the screen.

Recently, there has been a surge in the number of tools that are available for conducting unmoderated, remote usability testing—and this surge is changing the usability industry. Whether we want to or not, it forces us to take a closer look at the benefits and drawbacks of unmoderated testing and decide whether we should incorporate it into our usability toolbox.


To clarify, there are a lot of tools out there that label themselves as usability testing tools, but don’t actually offer the capability of doing usability testing with users through task elicitation. Some of these tools are nothing more than survey tools, Web analytics tools with new and improved visuals—such as CrazyEgg and clickdensity—or Web analytics tools that turn analytics data into videos of actual user sessions—such as Userfly, ClickTale, TeaLeaf, and Clixpy. All of these tools provide a wealth of data about your Web site’s users. However, such tools are not the focus of this article. Instead, this article focuses on unmoderated usability testing tools that actually simulate traditional usability testing by asking participants to complete a series of tasks using your user interface and answering questions about their experience.

What You Can Learn

Unmoderated usability testing lets you do test sessions with hundreds of people simultaneously, in their natural environment, which, in turn, provides quantitative and even some qualitative data. The exact metrics and feedback you can collect vary, depending on the tool you use. (I’ll provide a list of unmoderated usability testing tools later.) Most unmoderated testing tools can gather the following quantitative data:

  • task-completion rate
  • time on task
  • time on page
  • clickstream paths
  • satisfaction ratings or opinion rankings
  • Web analytics data—such as browser, operating system, and screen resolution
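As a rough illustration, most of these quantitative metrics can be computed directly from raw session logs once a test completes. The record format below is hypothetical, not the export format of any particular tool:

```python
from statistics import median

# Hypothetical per-participant session records from an unmoderated test
sessions = [
    {"participant": "P1", "task": "find-phone", "completed": True,  "seconds": 48},
    {"participant": "P2", "task": "find-phone", "completed": False, "seconds": 112},
    {"participant": "P3", "task": "find-phone", "completed": True,  "seconds": 65},
    {"participant": "P4", "task": "find-phone", "completed": True,  "seconds": 59},
]

# Task-completion rate: share of participants who finished the task
completed = [s for s in sessions if s["completed"]]
completion_rate = len(completed) / len(sessions)

# Time on task: use the median of successful attempts, which is less
# sensitive to one distracted participant than the mean
median_time = median(s["seconds"] for s in completed)

print(f"Task-completion rate: {completion_rate:.0%}")  # 75%
print(f"Median time on task: {median_time}s")          # 59s
```

The median is used here deliberately: as discussed later in this article, a long time on task does not necessarily mean the task was hard, so a single outlier should not dominate the summary.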

Most of these tools can also capture qualitative feedback as users complete their tasks—such as users’ suggestions and comments. This is where the true value of unmoderated usability testing can come into play.

Some unmoderated testing tools can recruit users for tests by intercepting them on your live Web site. This lets you collect invaluable data on participants’ true intent and motivation for visiting your Web site.

How Actionable Is the Data?

How actionable your data is depends heavily on the types of tasks you ask participants to perform. If you have participants perform scavenger-hunt tasks—asking them to find specific content on a Web site—you may miss out on important feedback. Just because someone was able to find the information you requested doesn’t mean they understood it. To elicit more valuable information, you should try to make finding tasks more meaningful by having participants answer a question about the information they were asked to find. For example: Using the Web site, please find out which smartphones are available on Verizon Wireless. Where can you purchase these phones locally?

The self-reported feedback and comments you get in response to open-ended questions can be the most valuable data you collect during an unmoderated test. Sometimes users’ direct quotations can be just as impactful as videos, especially when you start to see a consensus building among different participants.

Take satisfaction ratings and opinions with a grain of salt. Pay closer attention to what users actually do—not what they say they do. Participants can have a terrible experience using a user interface and still give it a high satisfaction rating. For this reason, I suggest asking participants open-ended questions about their experience rather than having them rate it.

Also, you must keep in mind that Web analytics alone cannot paint the full picture. Just because it took someone longer to complete a task doesn’t mean it was harder to complete. They could just have been more interested in the content. Without asking participants, you don’t really know for sure. You must be careful not to make hasty assumptions that are based on just the quantitative data you’ve collected.

Conducting Unmoderated Usability Tests

Creating and administering an unmoderated usability study is similar to the process of creating and administering an online survey, but with the additional steps of a traditional usability study, as follows:

  • Define the study. Decide what tasks you are going to ask participants to perform, the order of the tasks, and what follow-up questions you want to ask them about their experience. Unfortunately, since you are not observing the tests, you can't ask probing or follow-up questions on the fly, depending on what participants do. However, some unmoderated usability testing tools let you structure tests to ask probing questions after users perform specific interactions with a user interface.
  • Recruit participants. You can choose to do the recruiting yourself or hire a recruiter. As I mentioned earlier, some unmoderated testing tools offer you the options of either intercepting users on your live Web site or recruiting them from the tool developers’ own panels of participants—which are pools of test participants they’ve recruited in advance. You should be careful when choosing participants from such panels as your representative users. Who participates in your test is just as important for an unmoderated usability test as it is for a moderated test. Your team will base important design decisions on the data you obtain, so participants should be real or prospective users of a product.
  • Launch your test and send email invitations. Typically, an unmoderated test should be only 15–30 minutes in duration—comprising approximately 3–5 tasks—because the dropout rate tends to increase if a test takes longer.
  • Analyze your results. Most unmoderated testing tools offer live, real-time reporting during tests.
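The study-definition step above amounts to authoring a structured script of tasks and follow-up questions. A minimal sketch in plain data form, with entirely hypothetical field names, might look like this; real tools each have their own authoring interface:

```python
# Hypothetical study definition for an unmoderated test, sketched as plain
# data. Field names are illustrative only.
study = {
    "title": "Smartphone shopping study",
    "max_minutes": 20,  # keep tests to roughly 15-30 minutes to limit dropout
    "tasks": [
        {
            "prompt": ("Using the Web site, find out which smartphones "
                       "are available on Verizon Wireless."),
            # Follow-up questions make a scavenger-hunt task meaningful
            "followups": [
                "Where can you purchase these phones locally?",
                "Was anything confusing about finding this information?",
            ],
        },
    ],
}

# Sanity checks on the definition before launching
assert study["max_minutes"] <= 30
assert all(task["followups"] for task in study["tasks"])
print(f"{len(study['tasks'])} task(s), {study['max_minutes']} minutes max")
```

Note that the follow-up questions are open ended, in line with the advice earlier in this article to prefer open-ended questions over satisfaction ratings.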

Benefits and Drawbacks

Before choosing to conduct unmoderated usability tests, it’s best to take a look at their benefits and drawbacks in comparison to traditional moderated testing.

Benefits of unmoderated usability testing include the following:

  • You can test hundreds of people simultaneously—while keeping them in their own natural environment.
  • You can test multiple Web sites simultaneously—for example, competitor Web sites, different brands, or Web sites for different countries.
  • You can test at a reduced cost—depending on the tool you use. There are definitely unmoderated usability testing tools that have ridiculously high prices, but some recent tools are very affordable, which can make unmoderated usability testing a less expensive option. (See my list of unmoderated usability testing tools.) Also, the participant honorariums for unmoderated tests are typically a lot lower.
  • Doing unmoderated usability testing is a great way of planting the seed of UCD methodologies and introducing usability testing into a company, using limited resources and budget—assuming you can use one of the less expensive testing tools.
  • There are fewer logistics to manage, with no need to set up testing schedules, set up and moderate individual test sessions, or worry about no-shows and getting last-minute replacements.

Drawbacks of unmoderated usability testing include the following:

  • Nothing beats watching participants in real time and being able to ask probing questions about what they are doing as it’s happening—and you’ll miss out on this opportunity.
  • Some participants may be interested only in earning the honorarium you’ve provided as an incentive. So, rather than taking the time to really perform each task and provide feedback, they’ll just click through the tasks without much thought. Luckily, you can filter such participants out of your findings by looking at their time on task or open-ended feedback. Depending on the capabilities of your chosen testing tool, this task can either be time consuming or quite painless.
  • You cannot conduct interview-based tasks. Participants who are passionate about the tasks they are performing interact with a user interface differently from those who are just doing what they are told.
  • Web analytics can mislead you by giving a wrong impression of a user's experience. Also, what participants report on surveys can differ greatly from what they actually do. You can't rely solely on rankings and satisfaction ratings to create an accurate picture of what your users actually need and want. Therefore, you should always include qualitative research questions in your unmoderated studies and analyze the self-reported feedback. If necessary, follow up with participants after a study to discuss their feedback.
  • It’s possible for participants to think they’ve successfully completed a task when they haven’t. To move on to the next task, participants must be able to decide whether they’ve completed their current task. For this reason, you need to develop straightforward tasks that have well-defined end states.
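One of the drawbacks above, participants who click through the tasks just to earn the honorarium, can often be screened out mechanically using time on task and open-ended feedback. The sketch below assumes hypothetical record fields and an arbitrary time threshold that you would tune for your own tasks:

```python
# Screen out participants who appear to have clicked through without
# engaging: implausibly short times on every task, or no open-ended
# feedback anywhere in the study.

MIN_SECONDS = 10  # assumed threshold; tune per task

participants = [
    {"id": "P1", "times": [48, 95, 30], "comments": ["Hard to find the menu"]},
    {"id": "P2", "times": [3, 4, 2],    "comments": []},  # likely click-through
    {"id": "P3", "times": [65, 40, 55], "comments": ["Checkout was easy"]},
]

def looks_engaged(p):
    # Flag anyone whose fastest task is below the threshold or who left
    # no substantive comments at all
    return min(p["times"]) >= MIN_SECONDS and any(c.strip() for c in p["comments"])

kept = [p for p in participants if looks_engaged(p)]
print([p["id"] for p in kept])  # ['P1', 'P3']
```

Treat a screen like this as a first pass only; borderline participants deserve a manual look at their actual responses before you discard their data.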

When Should You Conduct an Unmoderated Test?

Have you ever presented findings from a moderated usability test only to receive pushback on the data, because only 5–10 people participated in your study? As usability professionals, we know when our data is actionable. (How many times do you need to see someone fail to complete a task before you know it's a problem?) But, sometimes, you need greater numbers to give stakeholders the warm-and-fuzzy feeling they need to make million-dollar design decisions.

Unmoderated usability testing can yield an enormous amount of data and feedback from participants, but you should not use it as a replacement for moderated usability testing. Unmoderated testing is best when you use it in conjunction with moderated testing. Use your large samples from unmoderated testing to help put big numbers behind some key findings from your initial moderated research.

Keep this in mind:

  • Moderated testing is still much better suited for multifaceted products or complex tasks that don’t have a structured sequence of steps.
  • Unmoderated testing is most effective when you have very specific questions about how people use a user interface for relatively simple and straightforward tasks.

Overview of Some Unmoderated Testing Tools

Please note that this is not a complete list of all available unmoderated testing tools. New unmoderated testing tools are constantly appearing, so I urge you to use this list only as a starting point in your process of finding the best tool for your needs. Also, the pricing and feature list for each tool is current only as of this article’s date of publication. Since pricing and feature sets can change, you should visit these companies’ Web sites for the most current information about their offerings.

Keynote WebEffective

Self-service option: Yes

Software download required: Yes

Recruiting options: Intercept users, use their panel, or do it yourself

Pricing: $$$$

UserZoom

Self-service option: Yes

Software download required: Yes, but there is an optional version, without clickstream tracking, that doesn’t require a download.

Recruiting options: Intercept users, use their panel, or do it yourself

Pricing: $$$$

RelevantView

This tool includes the ability to conduct card sorts and Chalkmark-like tests—see Chalkmark.

Self-service option: Yes

Software download required: No

Recruiting options: Use their panel or do it yourself

Pricing: $$$$

Webnographer

Self-service option: No

Software download required: Yes

Recruiting options: Intercept users, do it yourself, or they’ll recruit for you

Pricing: $$$$

Note—No screenshots were available.

Morae Autopilot

This is an in-person, unmoderated testing tool. It does not provide the ability to conduct unmoderated, remote testing.

Self-service option: Yes

Software download required: Morae Recorder must be installed on the computer running the test.

Recruiting options: Do it yourself

Pricing: $$

Note—This pricing is for a one-time purchase of unlimited usage of Autopilot, and you are also buying the entire Morae package, which you can use for moderated, as well as unmoderated testing.

Loop11

This low-cost option offers many of the same benefits and features as the higher-priced tools.

Self-service option: Yes

Software download required: No

Recruiting options: Do it yourself

Pricing: $

OpenHallway

This tool also records on-screen interactions and audio.

Self-service option: Yes

Software download required: No

Recruiting options: Do it yourself

Pricing: $

UserTesting.com

This tool also records on-screen interactions and audio.

Self-service option: Yes

Software download required: No

Recruiting options: Use their panel; do-it-yourself recruiting is available as a custom option.

Pricing: $

EasyUsability.com

Self-service option: Yes

Software download required: No

Recruiting options: Panel only

Pricing: $

Usabilla

This tool uses task elicitation to collect opinions and feedback and also to find out where people first click to complete a task, on either a static image or a live Web page, specified by its Web address, or URL.

Self-service option: Yes

Software download required: No

Recruiting options: Do it yourself

Pricing: $

Chalkmark

This tool uses task elicitation to find out where people first click a static image to complete a task.

Self-service option: Yes

Software download required: No

Recruiting options: Do it yourself

Pricing: $

Treejack

This tool uses task elicitation to find out what link people first click in an information architecture to find information.

Self-service option: Yes

Software download required: No

Recruiting options: Do it yourself

Pricing:

Resources

de la Nuez, Alfonso. “An Attainable Goal: Quantifying Usability and User Experience.” (For subscribers only.) User Experience Magazine, Volume 7, Issue 3, 2008. Retrieved September 10, 2009.

—— and Kim Oslob. “What’s the Real Value Behind Unmoderated Remote User Testing?” UserZoom Blog, September 9, 2009. Retrieved January 15, 2010.

Farnsworth, Carol. “Getting Your Money Back: The ROI of Remote Unmoderated User Research.” (For subscribers only.) User Experience Magazine, Volume 7, Issue 3, 2008. Retrieved September 10, 2009.

Mach, Sabrina. “Is All Remote Usability Testing The Same?” FeraLabs Blog, February 24, 2009. Retrieved September 10, 2009.

Tulathimutte, Tony. “Read Chapter One of Remote Research!” Rosenfeld Media, January 26, 2009. Retrieved September 2, 2009.

Tullis, Tom. “Automated Usability Testing: A Case Study.” (For subscribers only.) User Experience Magazine, Volume 7, Issue 3, 2008. Retrieved September 10, 2009.

Founding Principal at Usable Interface

Kingston, New Hampshire, USA

As the founding principal of Usable Interface, a consulting company specializing in product usability and user-centered design, Kyle Soucy has created intuitive user interfaces for a variety of products and Web sites. Her clients range from pharmaceutical giants like Pfizer to publishing powerhouses like McGraw-Hill. Kyle is the Founder and Past President of the New Hampshire Chapter of the Usability Professionals’ Association (NH UPA), has served as the Chair of PhillyCHI, the Philadelphia Chapter of the Association for Computing Machinery’s Special Interest Group on Computer-Human Interaction (ACM SIGCHI), and is the local UXnet Ambassador for New Hampshire. She is passionate about the continued growth of the usability and user experience design community.
