Eye tracking—a technology that uses infrared light to monitor eye movements—has been around for decades. Today scientists at many research labs track gaze, blinks and fluctuating pupil size to study the relationship between eye patterns and our health. In recent years, however, the technology has seeped into consumer products.
Google recently filed patents that industry experts speculate could incorporate eye tracking into Google Glass, the company’s head-mounted smart computer. Many people are concerned about the privacy implications of the front-facing cameras in Glass, which could record snapshots and video of throngs of oblivious bystanders every day. But only a few researchers have pondered what Google could find out about users based on their gaze—and whether we should be worried about a potential invasion of privacy.
There may not be a clear verdict yet, but Scientific American MIND has brought together two experts to share their views: human–computer interaction researcher Mélodie Vidal, a PhD candidate at Lancaster University in England, and vision scientist Michael Dorr, an international junior research group leader at the Technical University of Munich in Germany.
[An edited transcript of the conversation follows.]
Scientific American MIND: How does head-mounted eye tracking work?
Vidal: There is generally an infrared light next to a camera directed at your eye, and it projects this light onto your eyes. You cannot see the light but the camera can. It can tell where the pupil is and can track how the eye moves—meaning the different sequence of blinks and fixations—and where it is looking. There’s usually also a scene camera that is looking outwards. You can associate the user's fixations with certain markers in the scene to map what the user sees.
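Vidal's "sequence of blinks and fixations" is typically recovered from the raw gaze signal by segmenting it into fixations (the eye holding still) and saccades (jumps between them). A minimal sketch of the classic dispersion-threshold idea, with an illustrative gaze trace and made-up thresholds (none of this comes from the interview):

```python
# Dispersion-threshold fixation detection: group consecutive gaze samples
# into a fixation wherever they stay inside a small spatial window for a
# minimum duration. Thresholds and data here are illustrative assumptions.

def dispersion(points):
    """Spread of a set of (x, y) gaze points: x-range plus y-range."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return (max(xs) - min(xs)) + (max(ys) - min(ys))

def detect_fixations(samples, max_dispersion=30.0, min_samples=5):
    """samples: list of (x, y) gaze points at a fixed sampling rate.
    Returns (start_index, end_index) pairs, one per detected fixation."""
    fixations = []
    i, n = 0, len(samples)
    while i + min_samples <= n:
        if dispersion(samples[i:i + min_samples]) <= max_dispersion:
            # Grow the window while the points remain tightly clustered.
            j = i + min_samples
            while j < n and dispersion(samples[i:j + 1]) <= max_dispersion:
                j += 1
            fixations.append((i, j - 1))
            i = j
        else:
            i += 1  # no fixation starts here; slide the window forward
    return fixations

# A steady fixation, then a saccade to a second fixation.
trace = [(100, 100), (102, 101), (101, 99), (100, 102), (103, 100),
         (300, 200), (301, 202), (299, 201), (302, 200), (300, 199)]
print(detect_fixations(trace))  # → [(0, 4), (5, 9)]
```

Real trackers operate on noisier data at hundreds of samples per second and must also discard blinks, but the segmentation principle is the same.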
If eye tracking were integrated into Google Glass, what could a user’s gaze reveal about them?
Vidal: Controlled experiments have shown that eye tracking can tell whether someone recognizes a face, whether someone is tired or focused on their work, whether they are reading, whether they're in a social context or whether they're driving. I don't think it can reveal someone’s identity unless it could perform iris recognition, but, much like your browser history, it could infer sex or age. Current mobile eye trackers can only detect where you are looking and maybe the size of your pupil. Eye tracking can tell a lot from pupil size, but those findings all come from experiments in controlled conditions. In practice, pupil size is mostly affected by light, so I don't think you could tell much just by observing someone's pupil size in a daily setting.
Dorr: When you read, you have a very specific pattern of eye movements from left to right. If you are driving a car, you are using other eye movement patterns. […] The current Google Glass outward camera alone can tell a lot, and with integrated eye tracking, it can tell even more. It can tell where someone is looking, often by where the outside camera is pointing. On top of that, you can infer more specifically the details of a scene and what’s of interest. In principle, changes in pupil size can give you information about emotional states or emotional arousal.
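Dorr's point that reading has "a very specific pattern of eye movements from left to right" can be illustrated with a toy classifier: during reading, most jumps between fixations are small and rightward, punctuated by large leftward return sweeps at line ends. The threshold and traces below are illustrative assumptions, not a published method:

```python
# Toy reading detector: flag a gaze trace as "reading" if most horizontal
# jumps between successive fixations move rightward. The 60 percent ratio
# and the example traces are made-up illustrations.

def looks_like_reading(fixation_xs, rightward_ratio=0.6):
    """fixation_xs: horizontal positions of successive fixations."""
    jumps = [b - a for a, b in zip(fixation_xs, fixation_xs[1:])]
    if not jumps:
        return False
    rightward = sum(1 for j in jumps if j > 0)
    return rightward / len(jumps) >= rightward_ratio

# Two lines of text: steady rightward progress, one return sweep, more progress.
reading_trace = [10, 60, 110, 170, 230, 15, 70, 130, 190]
# Free scene viewing: fixations jump back and forth with no dominant direction.
scanning_trace = [200, 40, 310, 90, 250, 20]

print(looks_like_reading(reading_trace))   # → True
print(looks_like_reading(scanning_trace))  # → False
```

Research-grade classifiers use many more features (fixation durations, saccade amplitudes, vertical drift), but even this crude directional statistic separates the two behaviors in the example.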
What makes gaze unique? How is it a more informative representation of yourself than, say, the clothes you wear or the car you drive?
Vidal: It indicates our attention, whether we want it to or not. There are 30 years of research telling us all the different things we can learn from it. It gives insight into what someone might be interested in or what they might be thinking. There is never 100 percent accuracy but it's a good enough hint.
Dorr: Gaze tracking has been called the window into the soul. Gaze is tightly linked to what visual information you might be processing in detail. And once you have a head-mounted [outward facing] camera, you have a detailed representation of what people are looking at. You’d be able to find what they think is interesting in that scene.
What can a head-mounted eye tracker reveal about someone’s health?
Vidal: It depends on whether you have a database of people and their eye movements to compare against. It is possible to detect certain kinds of autism with eye tracking. Healthy people generally scan another person’s face in a triangular pattern between the eyes, nose and mouth. Some people with autism avoid this and look at the line between the skin and hair or fixate on an ear instead. Eye tracking can also detect certain kinds of schizophrenia. When following a moving target, our eyes generally move in a smooth pattern. Some people with schizophrenia cannot do this, and their eyes jump instead of moving smoothly. That's because the smooth-pursuit movement comes from the same part of the brain that is affected by schizophrenia. That's really cool. You can also detect certain early-onset forms of Alzheimer’s disease just by looking at the speed and movement of the eye. We are actively researching how we could bring this into people's homes, especially elderly people's homes, and monitor over time whether they are developing a mental disease.
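Vidal's smooth-pursuit observation can be made concrete: when tracking a target that moves at constant speed, a smooth trace has roughly constant sample-to-sample velocity, whereas "jumpy" pursuit shows up as large velocity spikes (catch-up saccades). The sketch below is a heavily simplified illustration with made-up data and a made-up threshold; it is not a clinical criterion of any kind:

```python
# Illustrative smoothness check for a pursuit trace. A velocity spike far
# above the mean suggests the eye jumped rather than gliding. The spike
# factor and both traces are invented for illustration only.

def pursuit_is_smooth(positions, spike_factor=3.0):
    """positions: 1-D gaze positions sampled while tracking a moving target.
    Returns True if no velocity sample exceeds spike_factor * mean velocity."""
    velocities = [abs(b - a) for a, b in zip(positions, positions[1:])]
    mean_v = sum(velocities) / len(velocities)
    return max(velocities) <= spike_factor * mean_v

smooth_trace = [0, 2, 4, 6, 8, 10, 12, 14]   # steady, gliding pursuit
jumpy_trace = [0, 0, 1, 9, 9, 10, 18, 18]    # stalls, then catch-up saccades

print(pursuit_is_smooth(smooth_trace))  # → True
print(pursuit_is_smooth(jumpy_trace))   # → False
```

Any real screening tool would compare against population norms on calibrated hardware, as Vidal notes; the point here is only that the jump-versus-glide distinction is measurable from the velocity signal.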
If Google integrated eye tracking into Glass, who could potentially have access to the data?
Vidal: It depends on whether the server is protected or public. I suppose it would be as private or protected as Apple's iCloud. There’s never 100 percent certainty that there won't be any leaks. But it's exactly the same as your Android phone: if we integrated eye tracking into, say, a mobile device, then an app would ask for permission to access your eye-tracking data. From there, I suppose it could do whatever it wants with it.
Dorr: It depends if they really do all the processing on the computer attached to Google Glass or if the data is sent to Google’s servers. I personally would feel uneasy with a live feed of my vision being stored somewhere. Maybe no one will look at it—because, who cares? But I personally wouldn’t be comfortable. It depends on the application scenario.
What could Google potentially do with this data?
Vidal: Because I'm biased from my field, the main thing I think of is interaction [how humans interact with computers]. But you can definitely imagine that they would tailor advertisements based on what you seem more interested in. They could count the number of times you look at an ad and maybe give the advertiser different kinds of monetization rewards depending on how long people look at their ads. You could use the eye data to play a game. Or monitor your mental health. The possibilities are pretty endless.
Is eye-tracking integration into a wearable device like Google Glass an invasion of privacy?
Vidal: It depends on what Google does with the data. I think they would have to be very open about what they are doing with it, but they are pretty open with this already. I think we need an open dialogue on eye-tracking data to educate people about what the computer can see and infer. But I think it's the same as privacy concerns about location or browser history. As long as you're aware that you are being tracked and that you have an option to delete everything once you are done, I think that wouldn't be a problem. It would just be one more way to profile the user and infer a little bit better who they are, what they are interested in and how better to target them with ads that might be of interest.
Dorr: I feel [Google Glass] is already an invasion of privacy without eye tracking. I don’t think eye tracking is adding much to it. I don’t like the idea of cameras pointing everywhere all the time.
How do you think people will feel about eye tracking becoming more mainstream as it is integrated into wearable technologies like Google Glass?
Vidal: I really think it depends on how we educate people about their gaze and what gaze trackers can actually do. It's easy to fear the technology—for example, whether infrared light is going to harm your eyes or not. It doesn't, but people instinctively shy away from it. If there is a good enough framework and laws surrounding eye data and its privacy, and if we can communicate that effectively to people, I don't think people will be too afraid of it. I also think people will suddenly become more aware of what their eyes are doing at every moment. Maybe there could be a social training aspect to this, where people suddenly realize that their eyes are not just sensors, but are actually communicating something.
Dorr: Some people have been reacting against it. Other people don’t seem to mind too much.
Do you think that eye-tracking technology will benefit the user?
Vidal: In terms of interaction, absolutely. I think it's mainly about making interfaces more natural and reacting faster to what the user is interested in doing and not having to go out of your way to click something. Instead you could just do everything with your eyes. It's fascinating to see what disabled people are suddenly empowered to do. There are a lot of eye-typing applications, for example, which allow people to have a voice again and express themselves. Or there are ideas to steer a wheelchair just with your eyes. That's extremely empowering.
Dorr: For research purposes—especially in helping people with multiple disabilities—it’s great. For everyday use, there aren’t any strong, compelling reasons why you’d need Google Glass or eye tracking at home or why it would make life easier. It's a fairly compelling case in theory, but in practice it hasn’t worked out very well and certainly hasn’t caught on in a widespread fashion. One big problem is the Midas touch: if you’re using your eyes to, say, select an item on the screen, it’s really difficult to tell whether you’re looking at the item simply to inspect it or looking at it in order to select it. That can be quite disturbing. The typical approach is that you have to stare at something for a second to activate it, and it’s annoying to have to stare at something for that long. There are a couple of mixed scenarios in which eye tracking could be used beneficially, but right now I don’t see the big benefit.
If you owned Google Glass, would you wear it every day?
Vidal: I don't think so. It's still too stigmatizing. I think Google was very brave to finally take this technology, which had been around for decades in research, actually turn it into a consumer product and educate people that wearable displays can be a good thing. It has opened up the path for other companies that are going to try to jump into this scene now. So all they have to work on now is the aesthetics—and the price, which deters a lot of people from using Google Glass. If it could just look like a normal pair of glasses, and if there were a killer application that is really useful, then I think wearable displays definitely have a future. But it needs a little more refining. And integrating eye tracking into it is definitely a good thing.
Dorr: No. I would not want to run around with Google Glass on my head.