Boredom manifests itself in more than yawns and glazed eyes. Subtle body cues called noninstrumental movements—squirming, scratching, shifting—also give away a person's mental state. Like teachers and other public speakers, machines can now also pick up on these telltale signs of restlessness. A new study reveals that when computer users tune in to on-screen material, their fidgeting lessens—and algorithms can use that information to discern attentiveness in real time.

To measure engagement, psychobiologist Harry Witchel of Brighton and Sussex Medical School in England and his colleagues outfitted 27 participants with motion-tracking markers that a computer vision system could follow. The participants then read digital excerpts from a novel by Mark Haddon, The Curious Incident of the Dog in the Night-Time, and from the European Banking Authority's regulations. Based on motion in the head, torso and legs, the computer could tell when a person had mentally checked out. In fact, an analysis of the cumulative movements revealed that when people read from the novel, they fidgeted nearly 50 percent less than when reading the banking guidelines.

The system, described in Frontiers in Psychology, adds to a growing body of research on “affective-aware technology,” says Nadia Berthouze, a computer scientist at University College London. Once the program is perfected, Witchel thinks educators could use it to create digital lessons that recognize when a student's attention is fading and respond with strategies to reengage him or her. The system could also help researchers build robots that are more emotionally sensitive companions for humans.