In fact, in 1968, using the term user experience would have been met with confusion; the term human factors was far more familiar then. But while terms like user experience, customer experience, human factors, and interaction design hint at a semantic mess, they also illustrate the richness of the fields on which user experience can draw and through which it can develop. (There is a lovely Venn diagram of the fields that contribute to or make up user experience on Visual.ly, but even more telling are the comments pointing out the many areas the diagram has missed.)
At their core, computers are tools that we use to expand our cognitive abilities. For example, Doug Engelbart was inspired to develop computers as tools for analyzing and displaying information to collaboratively solve important problems. At one level, user experience is about designing tools that focus users’ attention on elements of interest, providing them with the right level of information to make decisions.
In this column, we’ll look at one possible direction in which interacting with computers could go: augmented cognition. Research in this area started around 2000 as part of a $70 million DARPA-funded program that brought together researchers in academia and industry with members of the armed forces. The goal of augmented cognition research is:
“To create revolutionary human-computer interactions that capitalize on recent advances in the fields of neuroscience, cognitive science, and computer science. Augmented cognition can be distinguished from its predecessors by the focus on the real-time cognitive state of the user, as assessed through modern neuroscientific tools.”
So, essentially, augmented cognition is about understanding the state of a user’s brain and using that understanding to manage the user’s interaction with a computer. For example, if a user were receiving too much visual information to process it effectively, the system might deliver an urgent message as an audio alert instead, ensuring that the user responds to the pressing matter. In this way, the user avoids becoming overloaded with information and is in a better position to act appropriately.
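To make that idea concrete, here is a minimal Python sketch of such a mitigation strategy. Everything in it, from the CognitiveState fields to the threshold, is hypothetical; a real augmented-cognition system would derive these workload estimates from neurophysiological sensors rather than hard-coded numbers.

```python
from dataclasses import dataclass

@dataclass
class CognitiveState:
    """A toy snapshot of estimated mental workload, each value in 0.0-1.0.

    A real augmented-cognition system would derive these estimates from
    EEG or other physiological sensors; here they are plain numbers.
    """
    visual_load: float    # how saturated the visual channel is
    auditory_load: float  # how saturated the auditory channel is

# Hypothetical cutoff above which we treat the visual channel as overloaded.
VISUAL_OVERLOAD_THRESHOLD = 0.8

def deliver_alert(message: str, state: CognitiveState) -> str:
    """Route an urgent alert to whichever sensory channel has spare capacity."""
    if (state.visual_load >= VISUAL_OVERLOAD_THRESHOLD
            and state.auditory_load < state.visual_load):
        # The user's eyes are busy: fall back to an audio cue.
        return f"AUDIO ALERT: {message}"
    return f"ON-SCREEN ALERT: {message}"

if __name__ == "__main__":
    busy_eyes = CognitiveState(visual_load=0.9, auditory_load=0.2)
    print(deliver_alert("Fuel level critical", busy_eyes))
```

The interesting design decision is exactly the one the paragraph above describes: the system does not suppress the information, it reroutes it to a channel the user still has capacity to attend to.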
Our first step is to understand the state of a user’s brain. When we talk about monitoring brain activity, we generally picture people with their heads wired up to a huge amount of equipment. Indeed, Samsung is using a fairly elaborate headset to develop a device that enables people to operate a computer through brain signals.
While sophisticated systems are in use in research environments, simpler devices are available for home use. On my desk, I have a lightweight headset called the MindWave, from a company called NeuroSky, which I can use to monitor my brainwave signals. It’s reasonably comfortable to wear, and, though I confess to being a geek, it’s really fun to use, providing data of sufficiently good quality to let me interact with a range of applications. As the technology becomes smaller and less obtrusive, we’ll be able to introduce these devices into more fields.
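For those curious about what interacting with such a headset looks like in code, here is a short Python sketch that reads the MindWave’s attention and meditation values through NeuroSky’s ThinkGear Connector, the small service in the SDK that exposes headset data as JSON over a local socket. The port number, handshake message, and field names reflect my reading of the ThinkGear Socket Protocol, so treat them as assumptions to verify against the SDK documentation that ships with your headset.

```python
import json
import socket

# NeuroSky's ThinkGear Connector service normally listens on
# localhost:13854 and, once asked for JSON output, streams one JSON
# object per line. Port, handshake, and field names below are my
# reading of the ThinkGear Socket Protocol; verify against the SDK docs.
HOST, PORT = "127.0.0.1", 13854

def watch_esense(samples: int = 10) -> None:
    """Print a handful of attention/meditation readings from the headset."""
    with socket.create_connection((HOST, PORT)) as sock:
        # Ask the connector for parsed JSON rather than the raw EEG stream.
        sock.sendall(json.dumps(
            {"enableRawOutput": False, "format": "Json"}
        ).encode("ascii"))

        buffer = b""
        while samples > 0:
            chunk = sock.recv(4096)
            if not chunk:
                break  # the connector closed the connection
            # Each JSON object is terminated by \r; normalize delimiters
            # and keep any trailing partial message in the buffer.
            *lines, buffer = (buffer + chunk).replace(b"\r", b"\n").split(b"\n")
            for line in lines:
                try:
                    msg = json.loads(line)
                except ValueError:
                    continue  # skip blank or partial lines
                esense = msg.get("eSense")
                if esense:  # present only when signal quality is good enough
                    print(f"attention={esense['attention']:3d}  "
                          f"meditation={esense['meditation']:3d}")
                    samples -= 1

if __name__ == "__main__":
    watch_esense()
```

The attention and meditation values are coarse 0–100 summaries that the headset computes from the raw EEG, which is exactly the level of signal a simple adaptive interface might key off.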