Isbister and her colleagues are investigating how specific movements trigger certain feelings, and thus how gesture-based devices such as the Kinect and Nintendo's Wii can essentially use your body to hack into your brain. A better understanding of which motions trigger which emotions could help make gesture-based computer interfaces more enjoyable.
"You might not want to make flat-palm gestures, for instance, as if you were touching a screen—that might not be especially ergonomic and could cause strain if you hold that pose for too long," Isbister says. "You might want something more like the curved wrists and circular, flowing movements you see in tai chi. We want to develop a vocabulary that really takes advantage of the technology."
The broad, sweeping motions used with the Kinect and similar technologies would likely not suit tasks requiring subtlety, such as typing, or any long-term activity that tires your arms if you keep them raised too long. "The keyboard and mouse is much better for long-term, fine-grained work," says M.I.T. Media Lab computer scientist Aaron Zinman, who worked on the Depthjs hack of Kinect to surf the Web. The Kinect also needs a lot of space, requiring that users stand about two to two and a half meters away from screens, Isbister notes, which could also lead to privacy issues at the office.
"It is best not to think about traditional typing and mouse applications being directly controlled by the Kinect, but rather to think of new possibilities for which the keyboard and mouse were not well suited," Zinman adds. "When we think of group collaboration or pulling in real-time physical world objects into a digital setting—architectural models—these are examples where we want digital tools to aid us that go beyond a single person sitting at a computer."
In the office one might imagine using gestures to sort and rummage through mountains of files "like you would organize work spaces in your house," Isbister says. "It could prove useful for all those endless tabs you have in Web browsers, or how you can have lots of windows on your screen that are very confusing and make you lose your sense of place in the space you are arranging for yourself." These systems could also work together with methods for visualizing data to help users hunt through arcane databases for vital details or interesting trends.
The gesture-technology frontier
The office is not the only place where the Kinect and related technologies might go. At CES, Oslo-based Elliptic Labs debuted a Kinect-like touchless interface for iPads and other tablet computers that uses ultrasound to scan people's movements, while Brussels-based joint venture Softkinetic–Optrima, in partnership with Rotterdam-based Metrological Media Innovations, unveiled a gesture-based remote control for televisions.
Microsoft's Kipman says the company's focus with the Kinect is on games and entertainment. Still, "it's easy to imagine the many ways Kinect could be used," he notes. "Microsoft has deep investments in natural user interface. It is part of the company's long-term strategy."
If gestures do become a regular part of everyday computing, "it's hard to anticipate what the secondary effects might be until after you see this technology deployed to a million or more people," Isbister says. "When people first started talking on their phones through headsets and earpieces, it looked like all these people were talking to themselves. It could be that with gesture technology, the guy flailing around on the street is just talking on his cell phone."