Imagine if every conversation you had was like speaking with someone in a foreign language that you only partially understood. Your conversations—to the extent they could be called that—would be filled with an exasperating combination of confusion, frustration and even embarrassment at being unable to comprehend many of the words and phrases that native speakers take for granted. That’s what it feels like for the nearly 8 percent of U.S. kindergartners who suffer from a developmental disorder called specific language impairment (SLI), except that instead of struggling with a foreign language they find it difficult to communicate verbally in any language.
Children with SLI—also called developmental language disorder—can hear just fine but have difficulty processing the meaning of spoken words. It takes them longer than other children to learn to speak. When they do start to form words and sentences they tend to leave off the grammatical endings of verbs that indicate past tense, and their words do not always come out in the right order. These difficulties affect their ability to read, and thereby their ability to learn in general. Researchers have struggled for years to understand the disorder, challenged by their communication barrier with the children they study. In recent years scientists have begun to realize that their best source of information about SLI is visual rather than verbal—a child’s gaze speaks volumes when words fail.
Eye-tracking technology has already proved itself in a number of research areas. Advertisers and publishers have long scrutinized eye movement to better understand which parts of an ad or article best capture a reader’s attention. Carmakers use similar data to study driver distraction, and software companies use eye-tracking technology to help them design games and virtual reality experiences. Researchers value eye tracking when studying kids with SLI and other language disorders because it offers a window into how their young minds work.
Technology for tracking eye movement has been around for decades and comes in a few different types, but the basic principles are the same. An LED or some other source of near-infrared (IR) light is shined on a person’s eyes. The light creates reflection patterns on the cornea, the transparent front layer of the eye, which are captured by digital cameras that detect the exact positions of the pupil, the eye’s central opening, and the iris, the ring of muscle surrounding it that controls how much light enters. Software uses these data to calculate where a person is looking. The newest eye trackers resemble Google’s abandoned Glass headset, with the IR lights and eye-tracking sensors built into the frame of a pair of eyeglasses. A more conventional setup consists of a long rectangular sensor bar containing lights and cameras placed in front of—or embedded in—a computer monitor; the bar measures where and how long a person gazes at the screen.
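The gaze calculation can be sketched in miniature. In the common pupil-center corneal-reflection (PCCR) approach, the software takes the vector from the IR glint on the cornea to the center of the pupil and maps it to screen coordinates using a calibration step in which the viewer fixates known on-screen targets. The per-axis linear mapping below is an illustrative simplification, not any vendor’s actual algorithm:

```python
# Toy sketch of pupil-center corneal-reflection (PCCR) gaze mapping.
# The per-axis linear calibration is an assumed simplification; real
# eye trackers use richer models and many calibration points.

def gaze_vector(pupil_center, glint_center):
    """Vector from the corneal glint to the pupil center, per axis."""
    return (pupil_center[0] - glint_center[0],
            pupil_center[1] - glint_center[1])

def calibrate_axis(v0, v1, s0, s1):
    """Fit screen = a * v + b on one axis from two calibration targets
    whose true screen positions (s0, s1) are known."""
    a = (s1 - s0) / (v1 - v0)
    b = s0 - a * v0
    return a, b

def estimate_gaze(calib_x, calib_y, pupil_center, glint_center):
    """Map a pupil-glint vector to (x, y) screen coordinates."""
    vx, vy = gaze_vector(pupil_center, glint_center)
    ax, bx = calib_x
    ay, by = calib_y
    return (ax * vx + bx, ay * vy + by)
```

In practice a viewer would look at several dots on the screen during calibration; afterward, each camera frame yields a pupil-glint vector that the fitted mapping converts to a gaze point.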
Activation and Inhibition
At his Midtown Manhattan office Richard Schwartz uses eye-tracking sensors and software from Sweden-based company Tobii to study the impact of SLI on children’s language and speech. “It’s a very underserved population in some ways,” says Schwartz, a professor of speech, language and hearing sciences at the City University of New York. “A lot of these kids get services to help them early in school but at a certain point they get lost because they may end up having reading problems or other learning disabilities, and people stop working on their language per se.”
Schwartz has used eye tracking for the past seven years to study children with SLI. In one type of test—known as a visual world paradigm—a child is shown four pictures on a computer screen hooked up to an eye tracker. For example, the pictures might include a cat, a bat, something else that starts with the letter “b” and a picture normally associated with a cat, such as milk. The computer speaks a word describing one of the pictures and then captures the child’s eye movement every three milliseconds until the child uses a computer mouse to click on one of the pictures. “Between the time the child hears the word and the time the child clicks on the picture, eye tracking allows us to look at how they are activating information unconsciously,” says Schwartz, who also uses eye tracking to study speech problems in kids with hearing loss who rely on cochlear implants.
The word “cat,” for example, should prompt at least a glance at the bat because the words sound similar, and at the milk because it is often associated with felines. When a person hears a word, many related words become active before the brain starts inhibiting, or tuning out, certain words to arrive at the correct answer, Schwartz says. “Children with SLI don’t necessarily activate all of the related words,” and this is reflected in their eye movements, he adds. “They get the answer right but their mental vocabulary isn’t connected to the same degree that it is in typically developing children—or adults for that matter. If there’s any activation at all of the competing pictures, it’s much slower than normal. That means that in some very important ways their underlying language system is deficient.”
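Analyses of this kind are typically summarized as the proportion of looks to the target picture and to each competitor in successive time windows after the word is spoken. A toy sketch of that binning step follows; the sample format and region labels here are illustrative assumptions, not Tobii’s actual output format:

```python
# Hypothetical visual-world-paradigm analysis: bin raw gaze samples by
# time and compute the proportion of looks to the target, a phonological
# competitor ("bat"), a semantic competitor ("milk") and an unrelated
# picture. Sample times and labels are invented for illustration.

from collections import Counter

def fixation_proportions(samples, bin_ms=50):
    """samples: list of (time_ms, region) pairs, one per gaze sample.
    Returns {bin_start_ms: {region: proportion of samples in that bin}}."""
    bins = {}
    for t, region in samples:
        bins.setdefault((t // bin_ms) * bin_ms, []).append(region)
    result = {}
    for start, regions in sorted(bins.items()):
        counts = Counter(regions)
        total = len(regions)
        result[start] = {r: counts[r] / total for r in counts}
    return result
```

Plotting these proportions over time is what lets researchers see whether, and how quickly, a competitor such as the bat becomes active before being inhibited.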
Life without Language
“Traditional ways we have of assessing kids can only tell you the end product of a very complicated process,” says Courtenay Norbury, a professor of developmental language and communication disorders at University College London who has used Tobii eye-tracking technology to study children with language development disorders, including SLI and autism. “The appeal of eye tracking is that it can sort of give you a window into the process itself. You can see how kids are processing language in real time.”
For all of its uses, eye tracking still has limitations. For starters, “you want to understand how kids are processing language when they’re interacting with other people as opposed to a computer screen,” Norbury says. “The visual world paradigm has been hugely useful but is still very far from what we do in everyday interactions.” Eye trackers also generate an enormous amount of data that researchers have to sift through to find meaningful information. “Capturing the eye movement is getting easier as the technology advances but the processing on the other end is really complicated and time consuming,” Norbury says. “There is still a bit more flexibility in interpretation [of the data] than I would like—different patterns of looking are easy enough to find but what that tells us about cognition is still somewhat uncertain.”
The exact cause of SLI is unknown but the most recent research suggests it is related more to genetics than a child’s environment. Between 50 and 70 percent of children with SLI have at least one other family member with the disorder, according to the U.S. Department of Health and Human Services. Treatment for SLI is limited. Teachers might use interactions between a child and puppets or other toys during which the child is asked to describe what they see in a way that elicits a particular verb tense—for example having the child describe what a puppet is doing to stimulate use of the third person. Another approach is to pair a child with SLI with a classmate whose language is developing normally and can serve as a role model.
Failure to diagnose and treat children with SLI can have very serious consequences as they mature. Some studies show children with language disorders are at a much greater risk for social, emotional and mental health problems as they become adolescents, Norbury says. They also have difficulty with attention, social anxiety and social phobias because they find verbal interactions so difficult. “You need language for jobs, for negotiating with people, for problem-solving—so people with language impairments are at much greater risk of not being able to get or maintain employment,” she says. “Language is something we take for granted so much that we overlook the impact it would have if we couldn’t do it.”