Evaluating the Usability of Search Forms Using Eyetracking: A Practical Approach

By Matteo Penzo

Published: January 23, 2006

“The usability of forms is often massively important to the overall usability of a Web site.”

In this article, I’ll present findings from eyetracking tests we did to evaluate the usability of search forms on Web sites. Today, forms are the primary—often the only—way users have of sending data to Web sites. Web 2.0 makes extensive use of forms. For example, on Flickr™, Del.icio.us, and Writeboard™—which, by the way, I used when writing this article—users provide all of their tags, comments, and other information using forms. Users submit queries to search engines using forms. Ecommerce sites also rely heavily on forms that let visitors find and purchase products. (I’ve never browsed for books on Amazon®. I always search for them.)

So, the usability of forms is often massively important to the overall usability of a Web site. That’s why we decided to subject some of these forms to a quick round of eyetracking tests and analyze the resulting data to better understand what makes Web forms usable—or unusable.

We conducted these evaluations in the Consultechnology eyetracking lab. Magda Giacintucci assisted me in conducting the tests and setting up the lab. Three different groups of users participated in the tests. We classified the users by their level of expertise in using the Internet—rookie, intermediate, and pro. In the pro group, I included people from my team—from both the programming and user experience groups. I’d like to stress that our aim was to do these tests quickly and simply, in order to gain practical knowledge that would help us improve the design of forms, rather than to do scientific analysis for an academic paper.

How We Tested

We held three different rounds of test sessions. In the sessions, each individual user evaluated the user interfaces we were testing in random order. During the first round of testing, we tested the usability of search forms. Each session started with a training task, a dry run in which the user performed the same task she would repeat throughout the session, using a form that was well known to all the users: Google™ search. The user then performed this task using other search forms: locate the search box, then search for eye. The task ended when the user clicked the submit button.

Patterns in the Test Results

“Strong patterns emerged in the test results. These patterns arose from the way users’ different levels of expertise affect the way they look at forms.”

Even though some unanticipated results came to our attention during the tests, strong patterns emerged in the test results. These patterns arose from the way users’ different levels of expertise affect the way they look at forms. While rookies repeatedly scanned the form—both up and down and left to right—pro users looked directly at what they knew were the tools they needed to complete their task.

Gaze plots showed the very different behaviors of rookies and pros when using search forms, as follows:

  • As shown in Figure 1, the rookie user looked at the input field, then the submit button, then the label; then looked again at the input field, the submit button, and the label. Looping repeatedly through this pattern, even when typing the search string, she continued to look at all the form elements to assure herself she was doing the right thing.
  • The pro user’s first fixation was directly on the input field. She didn’t look at anything else until she’d typed the text string and was ready to submit the search. Then, a long saccade moved the user’s focus to the submit button—which the user had perceived peripherally, but never looked at before—and, typically, pro users activated the submit button using the Enter key. In one case, we had the longest saccade ever recorded in our tests: nearly 600 pixels wide!

Figure 1—Gaze plot showing the use of Google search by a rookie user

Rookies needed reassurance that their actions were correct, so while typing search strings, they continually checked whether the input field they were using was actually the search field. Though we first noticed this pattern while analyzing the use of search forms, we found that it was also characteristic of novice users’ behavior during more complex tasks.

Practical advice to UX designers—Clearly label input fields. Doing so won’t bother pro users, but is a great help to your novice users.
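
As a rough illustration of this advice, here’s a minimal sketch in plain DOM scripting (TypeScript). The IDs, names, and wording are my own illustrative assumptions, not markup from any of the sites we tested; the point is simply that the input field carries an explicit, programmatically associated label.

```typescript
// Minimal sketch of a clearly labeled search field.
// All IDs, names, and wording are illustrative assumptions.
const label = document.createElement("label");
label.htmlFor = "site-search";            // ties the visible text to the field
label.textContent = "Search this site:";  // explicit purpose, visible to novices

const input = document.createElement("input");
input.type = "text";
input.id = "site-search";
input.name = "q";

const submit = document.createElement("button");
submit.type = "submit";
submit.textContent = "Search";            // a labeled button, not just an icon

const form = document.createElement("form");
form.append(label, input, submit);
document.body.appendChild(form);
```

Pro users can ignore the label entirely, so it costs them nothing, while rookies get the reassurance they look for while typing.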

Testing Search Forms

We began our tests by observing and analyzing users’ behavior when using search on these popular sites: Google, Amazon, eBay, Flickr, and useit.com.

Google

“We used Google during our training tests. This training test ensured that users could complete the subsequent tasks without any hesitation because of confusion about the task itself.”

As I mentioned previously, we used Google during our training tests. This training test ensured that users could complete the subsequent tasks without any hesitation because of confusion about the task itself. During this very first test, the pattern I described previously emerged, highlighting the way rookie users interact with search: continually jumping between the input field and the submit button, as many as three times. There really are no intermediate users of Google, and the behavior of all pro users was characteristic of the way they’d interact with a well-known form. They took just one look to acquire the visual target—the search input field—typed the search string, then pressed Enter to submit the search—almost without looking at the submit button.

Amazon and eBay

On Amazon.com, a not-so-great search form design is compounded by the presence of the A9 search form, which distracts users and causes confusion. As a result, it took rookies a relatively long time—more than 1.6 seconds—to fully comprehend the search form.

On both Amazon and eBay, there is a drop-down list very near the search input field, which users can use to limit searches to specific categories of items. We found that users always looked at these lists first and for the longest time. Figure 2 is a heat map that shows the amount of time all rookies collectively spent looking at each interface element in the search form on eBay. All users seemed to perceive the list as the most important part of the form—probably because it’s clearly an interactive element that allows a more complex interaction.

Figure 2—Heat map showing the cumulative time rookies spent looking at elements in the search form on eBay

When a search form is on a page with a lot of content, as in these cases, a clear and prominent label for the input field helps users to find and recognize the form and reduces the number of repeated fixations they need to reassure themselves that they’re using the correct input field. On eBay, the submit button—which is the element in the search form that is furthest to the right—provides the only label. This obliges users to perform frequent fixations on that area of the screen, then return to the input field, which increases the cognitive workload of performing the task.

After their initial scan, pro users never looked at the label or the drop-down list again—even though their early fixations on the drop-down list did add to their cognitive load. Once they had typed the search string, they fixated directly on the submit button. Figure 3 is a heat map showing the cumulative time all pros spent looking at each interface element in the search form on Amazon.

Figure 3—Heat map showing the cumulative time pros spent looking at elements in the search form on Amazon

Flickr

Though the search form is definitely not a prominent feature of the Flickr Web site, I wanted to check how it would perform. I thought this form would perform better than the others, because it is very compact, with a label just above a narrow input field and a clearly labeled submit button. Figure 4 shows the time all users spent looking for, then focusing on, the elements in the search form on Flickr.

Figure 4—Heat map showing the cumulative time all users spent looking at elements in the search form on Flickr

In fact, for all three groups of users, this form performed better than any of the other search forms we evaluated during this round of testing. Because of the compactness of the whole form, it took less than one second to visually navigate it. Plus, in looking at the clearly labeled input field, rookie users were reassured of the form’s purpose.

useit.com

Jakob Nielsen has written a lot about form usability—mainly, but not only, in regard to e-commerce sites. So, I thought it would be interesting to test the search form on his useit.com site—which has no label for its input field.

Neither rookie nor intermediate users noticed the form immediately, so we might conclude that the de facto standard of placing a search form at the upper right of a Web page didn’t work well for those test subjects. Also, as shown in Figure 5, the absence of a label forced the rookie users to double-check the submit button on the far right, in order to reassure themselves of the form’s purpose. However, pro users, who are probably the target users for Dr. Nielsen’s site, immediately—at their third fixation on the page—noticed the form and typed their search string.

Figure 5—Scan path showing a rookie user’s use of the useit.com search form

Guidelines for Search Forms

Though this was not an in-depth study on search form usability, we were able to identify a few interesting design guidelines for search forms:

  • Form labels help rookie and intermediate users, who look for such labels when trying to locate a site’s search interface. These users expect input field labels to be at the left of the search form. Pro users don’t need and won’t use these labels, which are simply invisible to them.
  • Drop-down lists are very eye-catching form elements. You should always consider very carefully whether you should include a drop-down list in a search form. Use a drop-down list only if no alternative element would serve its purpose as well. Maintain adequate distance between the drop-down list and other elements in the search form. In general, if you want to create a simple search form that is easy to use for even novice users, avoid using drop-down lists in the form, because they tend to cognitively overload users.
  • Compactness makes search forms easier to peruse. Therefore, make search forms as compact as possible. Most users visually navigate a form broadly before they can understand its scope, so the smaller the area of a page over which they’re forced to navigate, the better. In our tests, the site search form that performed best was that on Flickr. Its search form is very compact, with a label placed over the input field, so users need look at only one place on the page. Thus, our test results suggest that the ideal location for the label of an input field in a search form is left aligned, immediately above the field (see the sketch after these guidelines).
  • Consistent placement of search forms in a standard location on Web pages overcomes some of the problems of unlabeled input fields. If you don’t want to label the input field, at least place your search form in a discrete area of the page where it’s simple for users to understand its purpose. If you do so, your form will serve not only pro users, who usually don’t look at labels, but also rookies, who would otherwise rely heavily on such labels to perform their tasks.

“Form labels help rookie and intermediate users, who look for such labels when trying to locate a site’s search interface.”
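
To make the first three guidelines concrete, here’s a hedged sketch of a compact, Flickr-like search form, again in plain DOM scripting (TypeScript): a left-aligned label sits immediately above a narrow input field, the submit button is clearly labeled, and there is no drop-down list. The widths, IDs, and wording are my own assumptions for illustration, not measurements of Flickr’s actual form.

```typescript
// Sketch of a compact search form in the spirit of the Flickr example:
// left-aligned label directly above a narrow field, labeled submit button,
// and no category drop-down. Sizes and copy are illustrative assumptions.
function compactSearchForm(): HTMLFormElement {
  const form = document.createElement("form");
  form.style.width = "220px";       // keep the whole form visually compact

  const label = document.createElement("label");
  label.htmlFor = "q";
  label.textContent = "Search";     // short, explicit label
  label.style.display = "block";    // puts the label on its own line,
  label.style.textAlign = "left";   // left aligned, immediately above the field

  const input = document.createElement("input");
  input.type = "text";
  input.id = "q";
  input.name = "q";
  input.size = 15;                  // narrow input field

  const submit = document.createElement("button");
  submit.type = "submit";
  submit.textContent = "Search";    // clearly labeled submit button

  form.append(label, input, submit);
  return form;
}

document.body.appendChild(compactSearchForm());
```

Because everything the user needs sits within one small, predictable area, both the rookies’ reassurance loop and the pros’ single target acquisition stay short.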

In this article, I have just begun my investigation of form usability, an important and interesting area for usability evaluation. To continue that investigation, I’m preparing another series of eyetracking tests, in which I’ll test label placement in forms. What is the best position for labels? What is the most usable alignment? How prominent should form labels be? Stay tuned to UXmatters for the next article in my series on eyetracking, which will cover my analysis of the results of these tests!

23 Comments

I’d like to see findings on forms that have the labels directly inside the form field itself.

Hi Kingpixel,

I chose NOT to test labels inside the input field itself this time, because I’ve planned those tests for my next article on label positioning optimization.

If you want my opinion, I’d say that this could be the best solution, because it provides compactness and clear labeling at once. (That’s why I’ve chosen a similar solution for the Flashability site search form.)
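
For what it’s worth, here is a minimal, hedged sketch of that idea in plain DOM scripting (TypeScript): the label text is shown inside the field itself and cleared when the field receives focus. The wording and behavior are my own illustrative assumptions, not the design we’ll actually test.

```typescript
// Hedged sketch of a label displayed inside the input field itself:
// the field is prefilled with its label text, which clears on focus.
// The wording is an illustrative assumption.
const field = document.createElement("input");
field.type = "text";
field.name = "q";
field.value = "Search this site";   // label text shown inside the empty field
field.addEventListener("focus", () => {
  if (field.value === "Search this site") {
    field.value = "";               // clear the label once the user starts typing
  }
});
document.body.appendChild(field);
```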

My next article will hopefully clear up this question.

Very good read! Thanks for sharing your findings. I just posted a comparison between traditional search interfaces (like the ones in this article) and live search interfaces (autocomplete, like Google Suggest).

Any chance you’ll repeat your study with live search?

Post available at http://justaddwater.dk/2006/01/26/live-search-explained/

Jesper,

Thanks for your appreciation and feedback. I noticed your post a couple of days ago and really was intrigued by its subject.

I think that it could be really interesting to do an eyetracking test on whether autocomplete improves the “typing experience.” By watching people using browsers, I’ve found that a load of them hardly notice they have URL autocompletion. :-)

I’ll probably propose this subject to Pabini for a future article. The next one is already set up for label testing.

Hi Matt - I’m the graduate assistant for the Department of Computer-Human Interaction at the Rochester Institute of Technology, and we have done very similar usability studies using eyetracking. I even had a chance to present at the European Conference on Eye Movements (ECEM) held in Bern, Switzerland this year. I just had a quick question with respect to the setup of your experiment. Could you explain why you decided to show users how to perform the task first? I am confused, because this seems like it would be leading the user to act “unnaturally”. Wouldn’t the results have been more accurate if users were simply given a real-life task to perform and you then tracked their eye movements to see if they would even have used the search box to begin with? I am sure many of these sites had alternative navigational options. Thanks Matt :-)

Hey Chad,

I’d like to clarify that we NEVER showed or taught the user how to perform the task. (I agree with you that this could really have messed up the results.) As I’ve written in the article, we used Google just as a dry run, to be sure the user completely understood the task and was sufficiently confident with it, so that we could clearly observe her cognitive behaviors during the subsequent tasks.

Coming to your second point: We weren’t testing the usability of the search engines and navigation systems of those sites. In that specific case, I would have probably done something like what you suggested. We were instead testing the search form itself. You have probably noticed that I have evaluated neither the usability of the form position on the page nor the usability of specific form elements.

Hope this further clarifies things. I’d be really interested in seeing the paper you presented. If you like, you can send it to me using my site’s contact page.

Hey Matt - Thank you for clarifying and keep up the good work :-) It will probably be another month or so before my thesis is done, but I will keep you in mind upon its completion. It is probably going to push 100+ pages, so I am not sure if sending it through your contact page is the best idea, but maybe I can just send you the summary. Thanks, Matt.

Hi Matt

Great article. It’s nice to see eyetracking data to confirm a point that I’ve been teaching for years: If users are expecting to type something on a page, then they look first for the box to type into, and that’s the most important thing on the page for them.

It was also very useful to find out, from the Flickr example, how much they look for the box to type into even if that’s not their main reason for visiting the page.

Congratulations.

Hey Caroline — Thanks for your comment. You might want to check out my forthcoming article on forms design, soon to be published on UXmatters. It covers label placement and styling.

Ciao!

Hello Matt,

You mentioned that you had three groups of participants in your study. How many participants did you test overall and how many were in each group? I didn’t notice that in your article.

Thanks, Chauncey

Hey Matteo - You might be interested in taking a look at Etre’s five days / five heat maps study.

Thanks for the suggestion, Simon. I’m following Etre’s work with great interest.

Chauncey—I perform kind of quick-and-dirty tests for my UXmatters articles—3 groups, with 4 people in each.

Are your results, methods, and raw data available in a research paper somewhere? I would like to read it and include it in some form usability research I am conducting.

Susan, unfortunately no research paper is available. But please feel free to use my data, citing its source, and/or to contact me directly.

Excellent article. Thanks!

With regard to Jakob Nielsen’s Web site, it’s worth noting that as useit.com is a reference site for information on Web design and usability, you would generally expect only expert pro users to be using the site.

I assume your research found that pro users were comfortable using the search on this site.

Though, as you have shown, it served as an excellent test for investigating the difficulties novice and intermediate users face when there isn’t a label for the search.

I’m curious to know if anyone has done an eyetracking study for drop-down or pop-up navigation menus and where specific menu items—such as Contact Us, FAQ, and so on—should be placed in the list—top, bottom, or middle. I happen to prefer the important items be placed at the top and bottom of the drop-down menu, but would like to see a study on this.

I think that although this post was written over a year ago, it still has relevance today.

The only difference is that Web users today tend to be a little more Web savvy.

MarketingSherpa has done quite a bit of eyetracking as well, in its ecommerce benchmark guides, and I think the two sources work well together.

Very helpful! Thanks for sharing!

Nice article about search forms. I will check that.

Thank you for the article, very informative.

Yes, this one is the article I was searching for. This shows a lot about human behavior when using search engines. Excellent article. Thanks!

I’m curious about the study and results. What search terms did participants use, and where did you get these search terms? I’d like to know how the study relates to real-world search versus the exercises you describe.
