You're in your office near the end of the day, preparing to head home. You log on to your computer, navigate the desktop, open a browser, sign in to your e-mail account. You read your latest messages and write a few yourself, then log out. As you're driving home, something about the car in the next lane distracts you, but a gentle alert reminds you to pay attention to the road. When you get to your living room, you turn on your video game console. You assume an identity and traverse the virtual landscape, evading some characters, blasting others. And from the time you sit down at your desk in the office until you make your final Xbox feint, you carry out most of these interactions without using your hands or even your voice but simply by moving your eyes.

Far from being science fiction, the technology to support such a seamless merging of our digital and physical lives already exists. It is the real-world spinoff from the burgeoning field of eye tracking. Loosely defined, eye tracking refers to any technology that can monitor the direction of our gaze and the behavior of our eyes, in the process generating data that give clues to our intentions. Interactions with devices equipped with eye-tracking sensors and software can seem intuitive and effortless, as if our gadgets were reading our minds.

Not so incidentally, as the technology advances, researchers are learning ever more about the workings of our eyes and unobservable aspects of the mind: our thoughts and mental focus and the pathways into our consciousness. Eye tracking can reveal whether we are processing the things in front of us or are mentally adrift, whether we recognize a face or have never encountered it before—or whether we did encounter it but then forgot. Our new understanding of eye movements is also spurring development in a host of industries, especially gaming, computers and health care. Marketers are eager to tap into our gaze patterns, too, with implications for privacy.

Researchers developed eye tracking primarily to learn about basic visual processing (say, how we meld independent streams from each eye into a single mental image). Clinicians were also interested in how eye movements relate to disorders involving vision problems, such as vertigo. Initially eye tracking was a matter of simple observation. The experimenter would sit across from a person and take notes about the behavior of the subject's eyes.

Early findings were surprising. Despite our subjective experience of vision as a smooth sweep across a stable landscape, the movement of our eyes is anything but steady. In most instances, our eyes stay relatively still for extremely short periods (usually around a third of a second), followed by rapid jerks until they alight on their next target. The short, still periods are known as fixations, and the quick jumps are called saccades.
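For readers who like to see the idea in code, the standard way to separate the two movement types is a simple velocity threshold. The sketch below assumes gaze samples arrive as (x, y) positions in degrees of visual angle at a known sampling rate; the function name and the 30-degree-per-second cutoff are illustrative choices, not any particular tracker's interface.

```python
# Minimal velocity-threshold (I-VT) classifier: samples whose
# point-to-point speed exceeds a cutoff are labeled saccades,
# the rest fixations.
from math import hypot

def classify(samples, rate_hz=1000, threshold_deg_per_s=30.0):
    """samples: list of (x_deg, y_deg) gaze positions."""
    dt = 1.0 / rate_hz
    labels = ["fixation"]  # the first sample has no velocity; default it
    for (x0, y0), (x1, y1) in zip(samples, samples[1:]):
        speed = hypot(x1 - x0, y1 - y0) / dt  # degrees per second
        labels.append("saccade" if speed > threshold_deg_per_s else "fixation")
    return labels
```

Anything faster than the eye can move smoothly is, by this rule, a saccade; the runs of samples left over are the fixations.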

The main reason for the jerky behavior is that our visual sweet spot is very small. Indeed, the part of the visual field that delivers a sharp image is about the size of a dime held at arm's length, with quality falling off sharply toward the periphery. So we move our eyes constantly to bring new pieces of information into central focus. That herky-jerky motion posed a puzzle for investigators: How, despite the constant movement, do we experience vision as stable?

Hence the quest for hardware that can track every movement of the eyes, no matter how fleet. Early in the 20th century psychologist Edmund Huey of the Western University of Pennsylvania (now called the University of Pittsburgh) created a device that could correlate eye movements with the words on a page as someone reads. It was a rather invasive apparatus, involving a plaster cup, worn on the eyeball, with a tiny hole through which the subject could see. A lever was attached to the eyecup and to the lever a pen, which made contact with a rotating drum as the participant read. To minimize irritation, the eyeball was anesthetized with cocaine, and the head was held in place using clamps and a bite bar. Other early contraptions used combinations of contact lenses, suction cups, embedded mirrors and magnetic field sensors to triangulate the focus of the viewer's attention.

Nowadays researchers rely on the way the cornea reflects light to chart the rotation of the eyeballs. In a typical experiment—say, to study reading or attention—a participant sits in front of a computer with her head on a chin rest. A small camera at the base of the computer zooms in on one (or both) of the eyes as diodes emit near-infrared light (people cannot perceive light at that wavelength, so they experience no discomfort). The light bounces back to the camera, and computer algorithms convert the reflection data into a real-time gaze path of the eye. By combining information about the pattern of corneal reflectance, the displacement of the pupil and the location of the computer in relation to the participant, tracking systems can tell precisely where the participant's gaze falls on the computer screen.
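In code, the heart of that conversion can be sketched in a few lines. The version below is a deliberately simplified, hypothetical take on the pupil-glint method: a linear mapping stands in for the higher-order models commercial systems use, and every name here is invented for illustration.

```python
# Map a pupil-glint vector to screen coordinates. The eye camera
# reports the pupil center and the corneal reflection ("glint") in
# image pixels; their difference rotates with the eye while staying
# fairly stable under small head shifts. The coefficients would come
# from a calibration fit (see the calibration sketch later on).

def gaze_point(pupil, glint, coeffs):
    """pupil, glint: (x, y) in camera pixels.
    coeffs: two rows of (a, b, c), one for screen x, one for screen y."""
    dx, dy = pupil[0] - glint[0], pupil[1] - glint[1]
    (ax, bx, cx), (ay, by, cy) = coeffs
    screen_x = ax * dx + bx * dy + cx  # real systems use higher-order
    screen_y = ay * dx + by * dy + cy  # polynomials and both eyes
    return screen_x, screen_y
```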

A greater challenge came when research moved beyond the laboratory. Spatial accuracy is critical in eye tracking because a minute error in measuring the orientation of someone's gaze will throw off any interpretation of what the person is seeing. Achieving that precision is harder in the outside world because head and body movements can interfere with measurements of gaze location. So researchers developed devices that superimposed information about the gaze onto the environment, such as a helmet topped by a camera that melded a recording of eye movements with a real-time video of the subject's field of vision. Today's wearable tracking devices take the form of lightweight goggles, but the principle remains the same: a small sensor tracks the dark spot of the pupil to pinpoint the direction of the gaze, and a tiny camera mounted directly between the eyes records the scene.
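The pupil-finding step, at least in caricature, is ordinary image processing. The sketch below, which assumes the OpenCV library (version 4) and a grayscale frame from the eye camera, simply treats the pupil as the largest very dark blob; real goggles refine this considerably.

```python
# Dark-pupil detection on one eye-camera frame: keep only the
# darkest pixels, then take the centroid of the largest blob.
import cv2

def pupil_center(gray_frame, dark_cutoff=40):
    _, mask = cv2.threshold(gray_frame, dark_cutoff, 255,
                            cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None  # a blink, or the eye left the frame
    blob = max(contours, key=cv2.contourArea)
    m = cv2.moments(blob)
    if m["m00"] == 0:
        return None
    return m["m10"] / m["m00"], m["m01"] / m["m00"]  # (x, y) in pixels
```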

Knowing What We See

Besides exposing the mechanics of vision, eye tracking can also help us understand the invisible elements of cognition: what we remember, how we feel, and what we are paying attention to, whether we are aware of it or not. For instance, eye movements can reveal when we are looking at something we have seen before, even if we have no memory of encountering it. In a 2012 study led by psychologist Deborah Hannula of the University of Wisconsin–Milwaukee, subjects were asked to memorize an image of a face. When shown a panel of faces that included the original, the subjects spent more time examining the image they had seen before than those they had not. If the images instead included a slightly manipulated version of the original face, the subjects still tended to identify the altered image as the real thing. Yet their eyes were not fooled. The subjects spent less time looking at the manipulated images than the original, suggesting that the eyes recognized them as fakes. These findings have important implications for interpretation of eyewitness testimony—say, in gauging whether someone looking through a book of mug shots has seen one of the faces before.

Patterns in eye movements can also give us insights into thinking and emotion. In a 2009 study led by Rachel Bannerman of the University of Aberdeen in Scotland, researchers used eye tracking to examine how people regard threats. They discovered that the subjects' eyes moved faster toward threatening faces and body postures than benign ones, suggesting that our oculomotor system is primed to detect imminent danger. Individuals who are scared or anxious also show a bias toward threatening objects and faces and have a harder time moving their eyes away from the threats than other people do. Reinforcing this finding, a 2014 study by Jonathon Shasteen and his colleagues at the University of Texas at Dallas found that people are quicker to focus on an angry face in a crowd of happy faces than they are on a happy face in a crowd of angry faces, suggesting that danger more than singularity is what draws the eye.

Our eyes are also markers of mental effort. Eckhard H. Hess, a pioneer of pupillometry (the measurement of pupil size) in the 1960s, found that the pupils of his participants dilated when they performed challenging multiplication problems, much as our pupils widen when we enter a dimly lit room. The pupils are ideal for objective measurement: unlike our eyeballs, which we can consciously direct by looking one way or another, we have no voluntary control over our pupils. Researchers hope that analysis of pupil measurements will help reveal when workers are overtaxed, especially those in high-stakes roles: air traffic controllers, baggage screeners, truck drivers and surgeons.
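A toy version of such a workload monitor, sketched below, would compare pupil diameter during a task with a resting baseline. The 10 percent threshold is invented, and a real system would first have to control for lighting, because, as Hess knew, pupils also widen in dim rooms.

```python
# Baseline-corrected pupil dilation as a crude mental-effort index.
def mental_effort_index(baseline_mm, task_mm):
    """baseline_mm, task_mm: pupil diameters in millimeters."""
    base = sum(baseline_mm) / len(baseline_mm)
    task = sum(task_mm) / len(task_mm)
    return (task - base) / base  # fractional dilation over baseline

effort = mental_effort_index([3.1, 3.0, 3.2], [3.6, 3.7, 3.5])
if effort > 0.10:  # more than 10 percent dilation: assumed overload
    print("possible cognitive overload:", round(effort, 2))
```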

Similar research can help the desk-bound pay attention to what they are doing, too. Psychologist Erik D. Reichle of the University of Southampton in England is working on a system that can let people know when they are zombie reading, that phenomenon by which we move our eyes over text for a while without taking in a word we are seeing. In a 2010 study, Reichle discovered that our eyes behave differently when we lose mental focus. If we are concentrating, our fixations tend to be shorter when we look at familiar words and longer when we look at less common ones. That variation is absent when we read mindlessly, even though our eyes are still hitting the mark. Now Reichle is trying to develop algorithms that can sift eye-tracking data and alert readers as soon as their attention wanders.
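One plausible way to turn that observation into an alarm, sketched here, is to watch the statistical coupling between fixation durations and word frequency: in attentive reading the two are negatively correlated, because rare words hold the eye longer, so a correlation drifting toward zero could trigger an alert. This illustrates the idea only; it is not Reichle's actual algorithm, and the cutoff is invented.

```python
# Flag "zombie reading" when fixation durations stop tracking
# word frequency. Requires Python 3.10+ for statistics.correlation.
from statistics import correlation

def is_zombie_reading(fixation_ms, log_word_freq, cutoff=-0.2):
    """fixation_ms: duration of each fixation; log_word_freq:
    log frequency of the word fixated. Attentive reading should
    give a clearly negative correlation; near zero is suspicious."""
    return correlation(fixation_ms, log_word_freq) > cutoff
```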

Practical Tracking

As we learn more about the relation between eye movements and the mind, eye-tracking technology is finding its way into real-world applications, especially in the control of digital devices, gaming and health care. Eye trackers can already replace the mouse for such tasks as clicking, zooming and scrolling. Users might click by staring at an icon for a period of time, zoom in or out by fixing on a location and pressing a controller key, and scroll by moving their eyes up or down.
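The dwell-based click, for instance, comes down to a little bookkeeping. The sketch below assumes some event loop delivers one on-screen gaze sample per frame; the class name and the 0.8-second dwell time are illustrative.

```python
# A gaze "button" that fires once when the eyes have rested on it
# long enough.
import time

class DwellButton:
    def __init__(self, rect, dwell_s=0.8):
        self.rect = rect       # (left, top, right, bottom) in pixels
        self.dwell_s = dwell_s
        self.enter_time = None

    def update(self, gaze_x, gaze_y):
        """Call once per frame; returns True on the frame the click fires."""
        left, top, right, bottom = self.rect
        inside = left <= gaze_x <= right and top <= gaze_y <= bottom
        if not inside:
            self.enter_time = None  # gaze left the button; reset the timer
            return False
        if self.enter_time is None:
            self.enter_time = time.monotonic()
        if time.monotonic() - self.enter_time >= self.dwell_s:
            self.enter_time = None  # re-arm so one dwell fires one click
            return True
        return False
```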

Adding an eye-tracking system to a computer or tablet is simple. The devices, which incorporate a light source and sensor, are small and sleek and adhere to the bottom of a monitor or the frame of a laptop or tablet, connecting through a USB port. They are relatively affordable, with models from companies such as the Eye Tribe and Tobii costing $99 to $139. Users install the relevant software and complete a quick calibration procedure (usually a training program that teaches the software the characteristics of the user's eyes). Current devices still require some technical setup, such as installing drivers, so they are not quite plug and play. Computers and tablets with built-in eye-tracking technology are expected to reach the market soon, including, according to rumors at press time, an Apple iPad Pro. Indeed, Apple filed a patent in 2013 for eye-tracking technology that would address the tendency of an image to fade from our perception if we stare at it too long.
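The calibration itself is conceptually simple: the user fixates a handful of dots at known screen positions while the tracker records its raw measurements, and a least-squares fit recovers the mapping between the two. The sketch below, assuming the pupil-glint vectors from the earlier sketch and the NumPy library, is one hypothetical way to do it.

```python
# Fit the linear gaze mapping from calibration data.
import numpy as np

def calibrate(vectors, screen_points):
    """vectors: n x 2 pupil-glint offsets recorded while the user
    fixated n dots; screen_points: n x 2 known dot positions."""
    v = np.asarray(vectors, dtype=float)
    design = np.column_stack([v, np.ones(len(v))])  # columns dx, dy, 1
    targets = np.asarray(screen_points, dtype=float)
    coeffs, *_ = np.linalg.lstsq(design, targets, rcond=None)
    return coeffs[:, 0], coeffs[:, 1]  # (ax, bx, cx), (ay, by, cy)
```

Five or nine dots are typical, which is why the procedure takes only a moment.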

Mobile devices that monitor eye movements are also making their way into the market. If you want a fumble-free shutter button on your iPhone, one app already lets you take photographs by winking. Google Glass will do the same. The Samsung Galaxy S4 and S5 phones already let users pause videos by looking away from the screen or turn pages on an e-book with a tilt of the head.

The industry most eager to adopt eye-tracking technology may be gaming. An eye-tracking version of a shooter game, for instance, would let players move their avatar around a virtual world by looking at the spot where they want to advance. Pressing a key might open a menu of weaponry; players would select items by blinking and attack by looking at the target and pressing the trigger key. According to a January 2014 online story in Gizmag by Jonathan Fincher, gamers who tested early versions of the technology said that using it was a little uncomfortable at first, with the urge to reach for the mouse especially hard to resist, but in the end they found that aiming and shooting with their eyes was faster and more accurate. Eventually players equipped with eye-tracking gear will likely have a speed advantage over those using standard mouse controls, and nothing drives a technology like an arms race.

Good Medicine

Eye tracking is also on its way to becoming an important tool in health care. Already the technology has streamlined the screening and diagnosis of a variety of disorders with visual components, and it will soon help people with disabilities navigate the world.

On the diagnostic frontier, eye tracking is particularly useful in detecting Parkinson's disease, schizophrenia, and a host of childhood maladies, including autism, attention-deficit hyperactivity disorder (ADHD) and dyslexia. People with these disorders have unique patterns of eye movements that simple computer tests can spot. In pioneering work at the University of Southern California, for example, neuroscientist Laurent Itti's lab devised algorithms that have helped identify people with Parkinson's with 90 percent accuracy and people with ADHD with nearly 80 percent accuracy. A 2014 study by psychologist Eva Nouzova of the University of Aberdeen and her colleagues reported progress in using eye tracking to diagnose major depression. And in work published in 2012 psychologist Philip Benson, also at Aberdeen, and his colleagues developed tests that can distinguish patients with schizophrenia from healthy controls with nearly perfect accuracy.

The schizophrenia test takes advantage of a peculiarity of eye movements: when we track a moving object, such as a ball flying through the air, we follow it smoothly, without saccades. The implication is that smooth pursuit uses different neural circuitry than activities such as reading. When people with schizophrenia try to follow moving objects, however, their eye movements are jerky. So to screen for schizophrenia, technicians ask subjects to follow a dot as it moves around a computer screen and flag anyone whose eyes show telltale saccades. (Benson's team won an award for its research and will use the prize money to bring the program to market.)
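A bare-bones screening check in the same spirit, sketched below, could simply count velocity spikes (catch-up saccades) while the subject follows the dot; healthy smooth pursuit at typical test speeds should produce few. The cutoff and sampling rate are invented, and Benson's actual measures are far more sophisticated.

```python
# Count catch-up saccades during a smooth-pursuit recording.
from math import hypot

def pursuit_saccade_count(gaze, rate_hz=500, cutoff_deg_s=40.0):
    """gaze: (x, y) positions in degrees while following the dot."""
    dt = 1.0 / rate_hz
    count = 0
    for (x0, y0), (x1, y1) in zip(gaze, gaze[1:]):
        if hypot(x1 - x0, y1 - y0) / dt > cutoff_deg_s:
            count += 1  # too fast for smooth pursuit; likely a saccade
    return count
```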

Beyond diagnostics, researchers are using eye tracking to help people with physical disabilities live independently. Individuals with neurological disorders and brain and spinal cord injuries often have limited ability to communicate. Computers equipped with gaze-interaction technology would let people use their eyes to open a browser, find their e-mail inbox and “type” by selecting words on the screen. For those who cannot talk, voice-output systems would play the text through speakers. For most, the systems would likely supplant so-called BCI (short for brain-computer interface) spellers, in which a person observes a grid of letters while wearing a cap studded with electrodes that can identify brain activity. To select a letter in the grid, the user must focus on it for several seconds. Eye tracking, in contrast, can detect the location of a viewer's gaze instantaneously.
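Gaze typing itself needs little more than a lookup from the gaze point to the key beneath it, with a dwell rule like the one sketched earlier confirming each selection. The layout below is hypothetical.

```python
# Translate an on-screen gaze position into the letter being viewed.
KEYS = ["ABCDEF", "GHIJKL", "MNOPQR", "STUVWX", "YZ_.,?"]  # 5 x 6 grid

def key_at(gaze_x, gaze_y, kb_rect):
    """kb_rect: (left, top, right, bottom) of the on-screen keyboard."""
    left, top, right, bottom = kb_rect
    if not (left <= gaze_x < right and top <= gaze_y < bottom):
        return None  # gaze is off the keyboard
    col = int((gaze_x - left) / (right - left) * len(KEYS[0]))
    row = int((gaze_y - top) / (bottom - top) * len(KEYS))
    return KEYS[row][col]
```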

Do You See What I See?

Like many new technologies, eye tracking raises a host of ethical and privacy concerns. In this increasingly data-driven age, we have cause to wonder who will have access to the kind of information our technology collects. Any number of people could be looking over our shoulders as we browse the Internet with an eye-tracking PC or drive a car with a tracker installed (such as Hyundai's HDC-14 concept car). Could advertisers get hold of this information? What about insurance companies or the police?

Currently advertisers use cookies to track the Web sites you visit so they can serve you ads for products that might interest you. When computers come with eye-tracking systems, these advertisers could use information about where you are looking on a page to tailor the ads even more. Some users might find the fine-tuning helpful, but imagine if pop-up ads moved around a page with your gaze or the video ads on YouTube “knew” when you were not watching them and paused until you looked at them again. These tricks are well within the scope of the technology, and clashes with consumers are bound to occur.

In 2012, for instance, Microsoft patented eye-tracking technology for its Kinect gaming devices that would let the company collect information about where users were looking on a screen while they played. The move raised worries that Microsoft would track which ads gamers looked at and for how long. The company got into hot water over privacy concerns the following year, amid rumors that it would sell Kinect data to marketers and use Kinect for targeted advertising. Microsoft was also planning to tailor ads to the mood of the user by running the images captured by the eye-tracking system through facial-expression analysis. Some Kinect users voiced concern that the device would be always on and always listening, like Big Brother. Microsoft responded with a series of statements in October 2013 assuring users that they would be able to turn off the device and its ad-tracking features and that the company would not collect the data unless the user wanted it to.

The use of eye tracking as a means of identification is evoking even more uneasiness. Researchers in the computer science department at Texas State University are testing biometric systems that can identify people from their unique eye-movement patterns as they read text or view a picture. In recent studies, the accuracy of eye tracking in identifying subjects was a little more than 70 percent. That rate is far below the accuracy of iris scans (90 to 99 percent) or fingerprints (up to 99 percent). As computing systems and tracking technologies develop, however, the gap will likely narrow. Even now the technology has clear benefits for home security and the protection of personal devices. For example, intruders would be locked out of your computer because the system would know from their eye movements that they were not the owner. The technology is also easier on the subject than iris scans, which require the user to hold still. The worry, however, is that an eye-tracking ID system is amenable to covert and invasive deployment.
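At its simplest, such a biometric reduces a recording to a few summary statistics and matches them against enrolled templates, as in the toy sketch below. The features, numbers and names are all invented here, and the Texas State systems are far more elaborate.

```python
# Identify a user by the nearest enrolled eye-movement "signature".
from math import dist

def identify(features, templates):
    """features: (mean fixation ms, mean saccade amplitude in degrees,
    saccades per second) for the unknown recording; templates maps
    each enrolled user to the same kind of tuple."""
    return min(templates, key=lambda user: dist(features, templates[user]))

templates = {"ana": (260.0, 4.1, 3.2), "ben": (310.0, 5.6, 2.5)}
print(identify((255.0, 4.3, 3.1), templates))  # prints "ana"
```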

No gadget has more potential for invasiveness and general spookiness than Google Glass, a wearable computer that projects images through a series of lenses onto the user's retina. Like most portable devices, it also carries an outward-facing camera. Although the current beta version of Glass does not have built-in eye tracking, Google has filed a patent to incorporate the technology into head-mounted devices. The patent, which covers the ability to track gaze and measure pupil size, suggests that Google plans to assess user engagement when people look at ads. Gaze tracking would tell Google what the users were seeing (ads, objects, people); pupillometry would measure their emotional response to these objects and to people in the environment. Armed with these data, Google could deploy a “pay per gaze” system in which advertisers paid the company for each look at one of their ads. The technology would work with anything in the user's field of vision, including billboards, magazines and other print media, as well as images displayed on Glass.

The ethical concerns are obvious: the device could potentially identify not only where people were when they were wearing it but also what, and whom, they encountered. Google's patent addresses privacy issues by making the data collection anonymous and letting users opt out of this form of tracking. Yet will these assurances hold if someone on the NSA watch list happens to pass before your gaze?

In the end, even when eye-tracking technology lets us control the devices that are tracking us, our sense of command may be illusory. If the eyes are the windows to our souls, we need to know who else is looking through them.