Insights on Switching, Centering, and Gestures for Touchscreens

Mobile Matters

Designing for every screen

A column by Steven Hoober
September 2, 2014

While I’ve discovered many things in the last few years about how users work with touchscreen devices, the one thing I’m really sure about is how much we do not yet understand. Touch devices are still fairly new. We’re still developing patterns for interactions and are only now beginning to learn how users understand and employ their touchscreen devices.

Since my first research into how users really hold and touch their phones came out over a year and a half ago, I’ve continued to build on that early work and explore the human side of mobile touch interactions. The next logical step was to attempt to actually understand users’ motivations and to determine whether I could draw relationships between different types of actions or contexts and user interactions.


My Research Methodology

In cooperation with my friends at ZIPPGUN, I designed and built a mobile app that lets me see users working with their phones and tablets in a variety of different ways. I observed 31 participants using the app, about two-thirds of them on handsets and the rest on tablets.

I recorded all interactions from the users’ point of view and encoded the recordings later to note their method of holding the device, their method of touching it, and some other features based on their interactions. I recorded the direction and length of scrolling gestures, tap position, and the accuracy of selection.
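To give a sense of what this encoding involved, here is a minimal sketch, in Python, of the kind of record one might log for each observed interaction. The field names and categories are illustrative shorthand, not a formal schema from the study.

    from dataclasses import dataclass
    from enum import Enum

    class Hold(Enum):
        # Holding styles observed in my earlier research.
        ONE_HANDED = "one-handed"    # held and touched with the same hand's thumb
        CRADLED = "cradled"          # held in one hand, touched with the other
        TWO_HANDED = "two-handed"    # held in both hands, touched with both thumbs
        ON_SURFACE = "on a surface"  # typical for tablets on desks or stands

    @dataclass
    class Interaction:
        task: str          # for example, "select from list" or "type on keyboard"
        hold: Hold         # how the participant was holding the device
        touched_with: str  # "thumb", "finger", or "stylus"
        x: float           # tap or gesture start position, in pixels
        y: float
        scroll_dx: float   # gesture direction and length; zero for a plain tap
        scroll_dy: float
        accurate: bool     # whether the tap landed on the intended target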

I performed handset tests mostly through carnival intercepts, which are just like mall intercepts, but at a carnival. The users wore video-recording glasses that I’d modified to make them more suitable for recording natural handset use. Figure 1 shows an example frame of recorded video while it’s being processed. I did remote, unmoderated testing of tablet use in the home, with time donated from UserTesting.com.

Figure 1—Encoding data from video, using on-screen measurement tools

While this research involved far fewer participants than my earlier research, it was a very different study. Encoding this kind of observational data is much more labor-intensive, so recording and analyzing just these 31 sessions took almost 100 hours. The data that I’ll present in this column correlates nicely with the data from my earlier studies and with reviews of research by others, and it served to expand, explain, or confirm my previous findings.

I did gather some additional information beyond what I’ll discuss here, but either it turned out not to be useful in the end, its results were vague or contradictory, or fully understanding the findings would simply require more analysis or additional research.

Switching

The first focus of my tests was switching behavior, in the hope that I’d be able to assign touch and hold styles to specific contexts. The most interesting thing to me was how little I noticed this behavior during the test. People seemed to hold their phone exactly as I had observed in my earlier test, with a lot of one-handed use, for example.

But when I actually analyzed the data, something new became apparent. As you can see in Figure 2, there’s hardly any one-handed use for actually touching the screen.

Figure 2—Observed rates of touching the device in various ways, for several on-screen tasks

If you accept that users are comfortable moving their hands around—and I do—this all makes perfect sense and gives us a lot more insight into the overall rates of holding the phone in various ways. People seem to carry the phone around a lot in one hand and scroll a bit with their thumbs, but they move to cradling the phone or using their other hand for serious scrolling and selection.

For most activities, users cradle or hold the phone with one hand while tapping with the other. A little over 41% of the time, people type on virtual keyboards with two hands—mostly two thumbs. A few people also use two thumbs for everyday tasks, switching back and forth freely to tap or scroll with whichever thumb is available.

I’ve excluded tablet use from this information because tablet users typically use their devices on surfaces or on stands and may use their fingers, pen styluses, or occasionally, their thumbs.

Centering

We already know from previous research that people are more accurate when touching the middle of a mobile device’s screen—and this holds for pretty much any screen size and whatever way they hold their phone or tablet. They also seem to know this subconsciously—or perhaps it’s tied to their preference for reading in the middle—so they are more confident when interacting at the center of the screen, but slow down to tap targets at the corners or edges.

This is absolutely the most important variable in touch accuracy: not environmental conditions, not familiarity with touchscreens, not anything else. It’s simply a matter of the position on the screen that users are trying to tap.

However, until now, there hasn’t been much data on how we use tablets. So, as part of this latest round of research, I gathered my own data and was able to confirm that these same levels of pointing accuracy apply to 7-, 8-, and 10-inch tablets as well. Figure 3 shows the actual touch positions and the R95 circle for users attempting to select a Menu button on a tablet. Each dot is a tap, and the circle denotes the region within which 95% of taps are expected to fall.

Figure 3—Example of tablet accuracy
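If you want to derive a comparable R95 measure from your own tap logs, the arithmetic is simple. Here is a minimal sketch in Python, assuming each tap is recorded as an offset from the center of the intended target; it returns the radius of the circle expected to contain 95% of taps.

    import math

    def r95(taps):
        """Radius of the target-centered circle containing 95% of taps.

        taps: list of (dx, dy) offsets from the target's center,
        in pixels or millimeters.
        """
        distances = sorted(math.hypot(dx, dy) for dx, dy in taps)
        # 95th-percentile distance, using the nearest-rank method.
        index = max(0, math.ceil(0.95 * len(distances)) - 1)
        return distances[index]

    # Example: ten taps aimed at a Menu button.
    offsets = [(1, 2), (-3, 1), (0, 0), (5, -2), (2, 2),
               (-1, -4), (3, 3), (0, 6), (-2, 1), (4, 0)]
    print(r95(offsets))  # 6.0 for this sample

Note that some analyses center the circle on the centroid of the taps rather than on the target itself; which you choose depends on whether you want to measure scatter alone or offset from the target as well.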

I was also able to reconfirm participants’ preference for center accuracy. Figure 4 is a heatmap showing the positions that participants actually tapped when selecting items from a full-screen scrolling list. When users can choose where to touch a screen—as with a scrolling list—they naturally move the content to the position they prefer, and that position puts most taps in about the center two-thirds of the screen.

This data is also surprisingly similar regardless of device class. The heatmap in Figure 4 includes the selections that participants made on tablets, as well as on phones, normalized to fit in the visualization. Regardless of the device type, size, or orientation, all taps fall within the same portion of the screen.

Figure 4—Users prefer to touch the center two-thirds of the screen
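Normalizing taps from different devices into one heatmap is straightforward: express each tap as a fraction of its own screen before aggregating. A minimal sketch, assuming each log entry carries the tap position and the screen size on which it was recorded:

    import numpy as np

    def normalized_heatmap(taps, bins=20):
        """Aggregate taps from mixed devices into one unit-square heatmap.

        taps: iterable of (x, y, screen_width, screen_height) tuples.
        Returns a bins x bins array of tap counts.
        """
        # A phone tap and a tablet tap become comparable once each is
        # expressed as a fraction of its own screen's dimensions.
        xs = [x / w for x, y, w, h in taps]
        ys = [y / h for x, y, w, h in taps]
        counts, _, _ = np.histogram2d(xs, ys, bins=bins, range=[[0, 1], [0, 1]])
        return counts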

Looking at the user interface of something like Twitter, you might have thought that the key controls for actions and input belong at the top and bottom of the viewport. However, the primary content and interactive area should, in fact, be in the middle of the screen. All of these content-centric tools already accommodate the user’s primary behavior of viewing and tapping the center of the viewport; any other functions are secondary options.

These findings all fall neatly under the assumption that I have been making about users’ comfort with switching the way they’re holding a device. As you’ve seen, even when people need to move their hand or stretch to reach the center of the screen, they will very often do so, not because we’ve made them, but because they choose to.

If we look at the data really closely, we might detect a slight preference for left-side tapping. This is subtle, but it definitely exists, and I believe it has much to do with Western languages reading left to right—meaning most words align to the left side of the screen. Several studies have indicated that people aim for visible targets such as words and icons rather than the space around them. Keep this preference for tapping the words in lists in mind for a minute.

Gesturing

In some touch contexts, I did discover a little more useful information about how people interact with their screens. One of the more useful findings concerns how people gesture. Scrolling is probably the next-most important action after tapping. In Figure 5, you can see a slightly normalized heatmap showing where people gesture on a screen to scroll. The three images in Figure 5 show three distinct areas.

Figure 5—Where people gesture on a screen to scroll

Why those three distinct areas? They have to do with the type of content that is on the screen. The image on the left shows all of the data that I gathered about scrolling short content in a dialog box; the images in the center and on the right show data about scrolling long content in a full-page scrolling list. In the center image, the list items contain very brief information, so there are large blank areas in the middle of the screen, and we still see that users prefer to touch the center. In the image on the right, where longer list items occupied much of the screen width, users did most of their scrolling at the far right. Even left-handed users were more inclined to avoid touching the content and actually reached across the screen to do so.
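To replicate this part of the analysis on your own gesture logs, you can bucket each scroll gesture’s starting position into horizontal thirds of the screen. A minimal sketch, assuming gestures are logged as a starting x position plus the screen width:

    from collections import Counter

    def scroll_zone(x, screen_width):
        """Classify a scroll gesture's start as left, center, or right."""
        fraction = x / screen_width
        if fraction < 1 / 3:
            return "left"
        if fraction < 2 / 3:
            return "center"
        return "right"

    # Tally gesture starts on a 1,080-pixel-wide screen. A list whose
    # items fill the width should skew heavily toward "right".
    starts = [900, 1020, 520, 980, 610, 1050, 470, 990]
    print(Counter(scroll_zone(x, 1080) for x in starts))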

Since I presented this data at conferences, some people have shared their observations about Arabic- and Hebrew-language apps that they’ve built, which show similar behavior, just reversed: in these right-to-left languages, people use the blank spaces to the left for gesturing—and now they know why.

It seems that users are not always confident about using scrolling gestures in areas where there are items—either because they worry that they’ll accidentally interact with them or because they want to see the content. When a page is completely full of content, leaving no room anywhere on the screen to touch where there is no content, people choose to scroll at the right side. And, yes, this behavior also varies a bit with device size. On tablets, content might take up relatively less of the larger screen, so there’s more empty space to touch.

You might think that users would stick more to the edges on tablets because the devices are bigger. But even though people can’t easily reach the center of the screen, they’re still inclined to tap the center. And even when space is available to scroll without covering the content, people move their finger, thumb, or stylus way over to the right—regardless of whether it’s a reach or requires completely repositioning their hand on the device.

Summary

We can’t design good touchscreen interfaces without really understanding how users work with their devices. Here’s a summary of my research findings.

Switching

People switch the way they hold and touch their phones a lot—even more than I had previously expected. The way that’s convenient for carrying your phone is not the same as the way you hold it to tap it, which is probably different again from the way you hold it to type.

Assume that people move their hands around a lot. Design things to work well at whatever angle a user touches a device and with content covering all likely areas of the screen. Test your app or site on different devices and in different contexts to help expose all of the ways people hold their devices.

Centering

Even though centering touches seems to be subconscious—or maybe a learned behavior—users prefer to touch the center of the screen and will do so whenever you give them a choice of where to touch.

Think hard about what your primary information is. Place that key content and the actions in the middle half to two-thirds of the screen. Put other options and secondary actions along the top and bottom of the screen.
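As a rough rule of thumb, you can compute that preferred region for any viewport. Here is a minimal sketch that returns the centered portion of the screen, assuming a two-thirds fraction in each dimension; the fraction is a starting point to tune, not a standard.

    def primary_region(width, height, fraction=2 / 3):
        """Bounds (left, top, right, bottom) of a centered region
        covering the given fraction of each screen dimension."""
        margin_x = width * (1 - fraction) / 2
        margin_y = height * (1 - fraction) / 2
        return (margin_x, margin_y, width - margin_x, height - margin_y)

    # For a 360 x 640 dp phone viewport: place key content and actions
    # inside these bounds; push secondary options to the top and bottom.
    print(primary_region(360, 640))  # (60.0, ~106.7, 300.0, ~533.3)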

Gesturing

All other things being equal, people want to touch and look at the center of the screen.

Think about what people have to tap and the areas they can use to scroll. Whitespace may be really important in giving your users the confidence to use gestures. Don’t overfill the screen. Instead, give users enough empty space for scrolling and swiping.

References

Hoober, Steven. “Common Misconceptions About Touch.” UXmatters, March 18, 2013. Retrieved August 18, 2014.

Hoober, Steven. “How Do Users Really Hold Mobile Devices?” UXmatters, February 18, 2013. Retrieved August 18, 2014.

Hoober, Steven, and Patti Shank. “Making mLearning Usable: How We Use Mobile Devices.” The eLearning Guild, April 2014.

Hoober, Steven. “Design for Fingers and Thumbs Instead of Touch.” UXmatters, November 11, 2013. Retrieved August 18, 2014.

President of 4ourth Mobile

Mission, Kansas, USA

Steven Hoober

For his entire 15-year design career, Steven has been documenting design process. He started designing for mobile full time in 2007 when he joined Little Springs Design. Steven’s publications include Designing by Drawing: A Practical Guide to Creating Usable Interactive Design, the O’Reilly book Designing Mobile Interfaces, and an extensive Web site providing mobile design resources to support his book. Steven has led projects on security, account management, content distribution, and communications services for numerous products, in domains ranging from construction supplies to hospital record-keeping. His mobile work has included the design of browsers, ereaders, search, Near Field Communication (NFC), mobile banking, data communications, location services, and operating system overlays. Steven spent eight years with the US mobile operator Sprint and has also worked with AT&T, Qualcomm, Samsung, Skyfire, Bitstream, VivoTech, The Weather Channel, Bank Midwest, IGLTA, Lowe’s, and Hallmark Cards. He runs his own interactive design studio at 4ourth Mobile.
