I was reminded of this just the other day. There’s a new application for Android that can simulate a transparent display, making the world beyond your phone the backdrop, always in motion. What struck me—aside from the fact that its augmented reality-like technology is apparently not unique—was that the reviews discuss it as a safety feature.
“...a new Android app that makes your screen, well, transparent. As a result, you can use most functions of your smartphone while being aware of objects and other people in front of you.”
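To be clear, there is no transparent hardware involved in apps like this: the effect is typically just a live feed from the phone’s rear camera rendered behind the normal UI. Here is a minimal, hypothetical sketch of that technique on Android, using the CameraX library; the activity, layout, and view names are my own inventions for illustration, not the reviewed app’s actual code, and a real app would also have to request the camera permission.

import android.os.Bundle
import androidx.appcompat.app.AppCompatActivity
import androidx.camera.core.CameraSelector
import androidx.camera.core.Preview
import androidx.camera.lifecycle.ProcessCameraProvider
import androidx.camera.view.PreviewView
import androidx.core.content.ContextCompat

class TransparentBackdropActivity : AppCompatActivity() {
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        // Hypothetical layout: a full-screen PreviewView with the app's own
        // widgets layered above it on semi-transparent backgrounds.
        setContentView(R.layout.activity_transparent)

        val backdrop = findViewById<PreviewView>(R.id.camera_backdrop)
        val providerFuture = ProcessCameraProvider.getInstance(this)
        providerFuture.addListener({
            val provider = providerFuture.get()
            val preview = Preview.Builder().build().apply {
                setSurfaceProvider(backdrop.surfaceProvider)
            }
            // Stream the rear camera into the backdrop view, so the world
            // behind the phone appears to show through the screen.
            provider.unbindAll()
            provider.bindToLifecycle(this, CameraSelector.DEFAULT_BACK_CAMERA, preview)
        }, ContextCompat.getMainExecutor(this))
    }
}

Even in a sketch like this, the phone is still an opaque object you have to hold and look at; the camera feed just makes it feel less so.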
Naturally, no one actually says this now means you can text safely in your car, but they nevertheless seem to make the argument: as long as your eyes stay on the road and you keep both hands on the wheel, everything will be fine. Well, it’s just not true. And this isn’t limited to calling or texting while driving. Augmented reality, head-up displays, telepresence, and interactions in any distracting context all have similar perceptual and behavioral benefits and pitfalls.
This fake transparent screen is another hint that we’re moving steadily toward truly ambient computing and toward new kinds of user interfaces: actual transparent displays, ubiquitous gesture sensing, and other more unusual and unpredictable interactions. If we want these kinds of user experiences to be both safe and productive—instead of banishing computers from everywhere, not just voice and text from our cars—these systems need to work within every environment and, in fact, work with those environments. This is not an entirely new frontier, though. Similar devices are already in use in government and industry—and more are being tested in laboratories. Some specialized devices have been in use for decades, are in their third or fourth generation, and have taught us many lessons.
So, where do we find information about how these devices work? Well, if you read articles about user experience, you presumably live and work in a design community, so start by asking your coworkers, friends in other design organizations, and anyone you know who actually uses such devices.
A Story About Getting Things Totally Wrong
I started really thinking about this topic—that is, about writing up the benefits and pitfalls of such devices—at dinner a couple of months ago. I was talking with some people from an interactive agency about the design challenges and the difficulties of understanding context. Knowing they had done some government work, I steered the conversation toward specific interactions and learned about a test they had done with a telepresent control platform they were working on.
The agency guys did most of their design work at their office—often surmising how end-user soldiers might work with it—but periodically they got permission to go to a nearby fort and observe first-time use, demonstrate features, and more or less perform an ethnographic study of the product. I say “more or less” because things kept going so badly that they would have to intervene. There were tasks the soldiers had to do besides robot driving, so the designers couldn’t just let the operators mess with the equipment forever. Plus, when guns and big robots that could theoretically squash you were involved, there were safety issues with just letting the soldiers freely figure it out on their own.
One intervention they recalled was when they simply gave the eyepiece and controller to a soldier, with minimal instruction, and told him to turn the robot to face the other way, so they could go down the road. He turned his body around and was surprised that this didn’t do anything. Instead, you have to press buttons on a hand controller to turn the robot, so they had to tell him this after a short delay. There were a number of issues with task failures and response times—even once the operators had become familiar with the system. This was consistently frustrating to the designers, since they could not understand why this was happening. The soldiers finally invited them to try it themselves, so one of them put on some loaner armor and a helmet and mounted the eyepiece and associated gear, then followed the team through the exercise.
He described this experience to me, saying, “Clearing rooms is hard.” They had completely failed to understand how many tasks the soldiers were routinely undertaking at once, how visual those tasks were, and how much of a cognitive load they were under. They immediately realized that displaying vehicle information over the unrelated surroundings didn’t make a bit of sense to users. When they went back to their office, they kept these lessons in mind and redesigned the equipment to be less obtrusive and easy to dismiss, so users could focus on real-world tasks.
These designers realized that they had been applying design solutions from other, less-challenging contexts, and using their intuition to determine how the context change would influence people’s ability to use the equipment. They finally learned the reality, but only after building code and putting the solution in front of people.
The Mission or the Equipment?
I followed up this conversation with some brief questions to soldiers, Marines, and airmen I know who operate UAVs (Unmanned Aerial Vehicles), RPVs (Remotely Piloted Vehicles), and drones in the field. Regardless of the device—whether winged, hovering, or wheeled—their control interfaces broke down into two categories: what you might call head-up and head-down.
Pretty much all of these devices require a laptop or tablet PC of some sort as the control unit, which, at least for now, soldiers have to carry around with them. The older, less sexy devices tend to rely on that PC as the whole interface—or at least for the display of information, even when a soldier uses a wand or other controller to issue commands. Ideally, this PC is mounted to the operator’s armor, so they can hinge it down and use it as a sort of table that’s attached to them. There is no need to sit down to use it, but when using it, the operator is head-down, looking at nothing but the screen.
The other type of device includes some flavor of see-through, head-up display that attaches to the operator’s helmet, glasses, or goggles. Display symbology—or even images and video—appears in the operator’s peripheral vision, and these displays are designed to let the operator see right past them. The operator can then do his other job at the same time—for example, walking down a narrow trail at night without falling down.
Clearly, the head-up version is better, right? Sort of. Basically, it’s a wash. “In a tactical environment, outside any COP (Combat Outpost) or FOB (Forward Operations Base), the user must be protected due to tunnel vision.” The operators were assigned a bodyguard, whose job was to watch for threats the operator might miss and to remind him to stow the computer or shift his gaze to whatever was most important.
The best device they used was the head-mounted HUD (Head-Up Display) that was part of the Land Warrior package. That was not because it was in front of the operator’s eye per se, but because it was hinged, so operators could easily flip it up and down. When flipped up, it was off and out of the operator’s line of vision, no longer a distraction. Flipping it down instantly turned it on and displayed relevant information about position, radios, and so on. “I would take a knee, view the information, and then be back in the fight within a few seconds.”
As the designers whose story I told earlier had found, all of these devices probably look great on paper, work well in a cubicle, or even work well in a demonstration. But once operators get into the field and start doing multiple activities, the limitations of multitasking start to become obvious. Even users who are not trained in design and analysis can figure this out pretty easily after using poorly designed systems, and they’ll start to reject the Buck Rogers solution. One combat leader summarized this: “A key lesson learned through the use of all of this equipment is that the mission must drive the development of equipment, not the other way around.”