
Texture Messaging: Breakthrough May Help Spinal Cord Patients Experience Tactile Sensations

In a first-ever experiment, primates move and feel objects on a computer screen using only their thoughts




When real brains operate in the real world, it's a two-way street. Electrical activity in the brain's motor cortex speeds down the spinal column to the part of the body to be moved, while tactile sensations from the skin simultaneously zip up the spinal cord and into the brain's somatosensory cortex. Most of us would have trouble doing the former without the latter: absent the feel of a floor beneath your feet, it's awfully difficult to walk properly, and lacking the tactile sensation of a coffee mug, your brain cannot tell how tightly your fingers should grasp it. Tremendous advances have been made in brain–machine interfaces, in which electrodes implanted first in monkey brains and now in those of quadriplegics and patients with "locked-in syndrome" translate motor cortex electrical activity into output that moves a prosthetic arm or computer cursor. Until now, however, these interfaces have addressed only half of our interaction with the world. A new study offers hope of expanding that capacity.

Scientists led by Miguel Nicolelis, professor of neurobiology at Duke University Medical Center, are reporting the first-ever demonstration in which a primate brain not only moves a "virtual body" (an avatar hand on a computer screen) but also receives electrical signals encoding the feel of virtual objects the avatar touches, and does so clearly enough to distinguish the objects by texture. The experiment involved two monkeys. If the technology works in people, it promises to make quite a difference to paralyzed patients. They would not only be able to walk and move their arms and hands, says Nicolelis, but also feel the texture of objects they hold or touch and sense the terrain they walk on. "You cannot produce motor behavior without tactile feedback from the environment," he says.

In the study, reported in the October 6 issue of Nature, Nicolelis and his colleagues implanted two sets of electrodes in the monkeys' brains. (Scientific American is part of Nature Publishing Group.) One set sensed the electrical activity of neurons in the motor cortex and translated it into signals that steered an avatar arm on a computer screen. Other electrodes, in the somatosensory cortex, delivered electrical feedback from the avatar hand. The monkeys were trained to move the avatar arm with their thoughts and touch three identical-looking circles on the screen. Touching an object triggered the transmission of high-frequency electrical signals meant to encode that virtual object's unique texture. The monkeys learned that touching only one of the three objects won them a reward (juice), so they were motivated to put their avatar hand on that one and not the other two. It took one monkey four tries and the other nine to learn how to select the desired object by texture alone. "This is the first time a brain has controlled a virtual arm that touches objects and receives signals that describe the texture of those objects," Nicolelis says.
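For readers who want a more concrete picture of that closed loop, here is a minimal, purely illustrative sketch in Python. It assumes a simple linear decoder, made-up object positions and arbitrary stimulation frequencies; none of these details come from the study itself, and the function names (such as stimulate_somatosensory_cortex) are hypothetical stand-ins for the researchers' actual recording and stimulation hardware.

```python
import numpy as np

# Schematic closed-loop "brain-machine-brain" interface, loosely modeled on the
# setup described above. All names, dimensions and parameters are illustrative
# assumptions, not the Duke team's actual decoder or stimulation settings.

N_MOTOR_UNITS = 50   # number of recorded motor-cortex neurons (assumed)
DT = 0.05            # update interval in seconds (assumed)

# A simple linear decoder: firing rates -> 2-D velocity of the avatar hand.
# Real systems fit these weights from training data (e.g., by regression).
decoder_weights = np.random.randn(2, N_MOTOR_UNITS) * 0.01

# The virtual objects look identical but each carries its own "texture": here,
# a distinct stimulation frequency delivered to somatosensory cortex on contact.
objects = [
    {"center": np.array([0.3, 0.4]),  "radius": 0.1, "stim_hz": 200},  # rewarded
    {"center": np.array([-0.3, 0.2]), "radius": 0.1, "stim_hz": 50},
    {"center": np.array([0.0, -0.4]), "radius": 0.1, "stim_hz": 0},    # no texture
]

def decode_velocity(firing_rates):
    """Map a vector of motor-cortex firing rates to avatar-hand velocity."""
    return decoder_weights @ firing_rates

def contacted_object(hand_pos):
    """Return the virtual object the avatar hand is touching, if any."""
    for obj in objects:
        if np.linalg.norm(hand_pos - obj["center"]) < obj["radius"]:
            return obj
    return None

def stimulate_somatosensory_cortex(freq_hz, duration_s=DT):
    """Hypothetical placeholder for a pulse train sent through feedback electrodes."""
    n_pulses = int(freq_hz * duration_s)
    print(f"stim: {n_pulses} pulses over {duration_s * 1000:.0f} ms")

# One pass of the loop: read neural activity, move the avatar, give tactile feedback.
hand_pos = np.zeros(2)
firing_rates = np.random.poisson(5, N_MOTOR_UNITS)  # stand-in for recorded spikes
hand_pos += decode_velocity(firing_rates) * DT
obj = contacted_object(hand_pos)
if obj is not None and obj["stim_hz"] > 0:
    stimulate_somatosensory_cortex(obj["stim_hz"])
```

In a real experiment this loop would run continuously, with the decoder trained on recorded arm movements and the stimulation parameters tuned so that different pulse patterns are reliably distinguishable; the sketch only shows how motor decoding and sensory feedback fit together in a single cycle.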

The field of brain–machine interfaces, or "neural prosthetics," has proceeded in fits and starts. The first paralyzed patient received a neural implant in 1996; the device translated the patient's thoughts into words spoken by a computer. A few other victims of accidents, stroke or locked-in syndrome have had electrodes implanted in their motor cortexes or attached to their scalps to translate electrical activity in the brain into an output that moves a prosthetic arm, computer cursor or other device. But not all of the devices have worked, and some have deteriorated after a few months.

That disappointing record is about to change. At the University of Pittsburgh, for instance, neuroscientists led by Andrew Schwartz have begun recruiting patients paralyzed by spinal cord injury: as with the Duke monkeys, they, too, will be able to "feel" the environment around them thanks to electrodes in the somatosensory cortex that receive information from a robot arm. "This will be essential for manipulating objects," says Schwartz, whose research is independent of Nicolelis's. "Giving a subject the sense of touch...would be a novel advance."

And it is well within reach, Nicolelis says. He is a founder of the Walk Again Project, an international collaboration whose goal is to develop the first brain–machine interface that will give paralyzed patients full mobility through a "wearable robot." Think Iron Man: a full-body, exoskeleton-like prosthetic controlled by neural implants that capture signals from the motor cortex to move the legs, hands, fingers and everything else, and studded with sensors that relay tactile information about the outside world to the somatosensory cortex. Buoyed by the advances so far, Nicolelis predicts that the device will be ready to debut in 2014; his team plans to unveil it at the opening game of soccer's World Cup in Brazil that June. "It's our moon shot," he says.
