I KNOW WHAT YOU'RE THINKING: University of Portsmouth researcher Paul Gnanayutham is working to create an inexpensive, easy-to-use interface that allows a computer to read, interpret and display thoughts and feelings based on eye movement, the use of face muscles and/or brain waves. Image courtesy of the University of Portsmouth
People with physically debilitating conditions such as amyotrophic lateral sclerosis (also known as Lou Gehrig's disease) and traumatic brain injuries often find themselves trapped inside their own bodies, unable to speak, gesture or otherwise communicate with the outside world. Scientists have shown they can create computer interfaces that sense, interpret and display a locked-in person's brain waves, eye movements or facial expressions, but the challenge has been to find cost-effective ways of harnessing this technology for consumer use.
Paul Gnanayutham, a computer scientist at the University of Portsmouth School of Computing in England, is attempting to do just that: he takes a generic, relatively inexpensive interface (software plus a sensor-laden headband), loads the software onto a laptop, and writes additional code to tailor the device to the specific needs and capabilities of individual patients.
Gnanayutham chose Brain Actuated Technologies, Inc.'s Cyberlink Interface, which costs about $2,000 per unit, as the core of his brain-to-computer interface kit. Probes on the Cyberlink headband detect minute electrical signals at the skin's surface (produced by brain activity and subtle muscle movements), using electrooculography (EOG) to sense eye movement, electromyography (EMG) to sense the twitching of forehead muscles, and electroencephalography (EEG) to sense brain waves.
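Because all three signal types arrive through the same surface electrodes, software has to work out which source is active before acting on the data. The sketch below is purely illustrative, not the Cyberlink software or Gnanayutham's code: the sampling rate and frequency bands are rough, textbook-style assumptions, and the function simply guesses the dominant source of a one-second window by comparing power in each band.

```python
import numpy as np

SAMPLE_RATE = 256  # Hz; a typical biosignal sampling rate (assumption)

def band_power(signal, low_hz, high_hz, rate=SAMPLE_RATE):
    """Estimate signal power in a frequency band via the FFT."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / rate)
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= low_hz) & (freqs < high_hz)
    return spectrum[mask].sum()

def classify_window(signal):
    """Guess which source dominates a one-second window of headband data.

    Rough, illustrative frequency ranges (not Cyberlink's actual method):
      EOG (eye movement)     ~ below 4 Hz
      EEG (brain waves)      ~ 4-30 Hz
      EMG (muscle twitches)  ~ above 30 Hz
    """
    powers = {
        "EOG": band_power(signal, 0.1, 4),
        "EEG": band_power(signal, 4, 30),
        "EMG": band_power(signal, 30, 100),
    }
    return max(powers, key=powers.get)

# Synthetic one-second window: a strong, slow eye-movement drift
# plus much weaker broadband noise standing in for EEG activity.
t = np.linspace(0, 1, SAMPLE_RATE, endpoint=False)
window = 200 * np.sin(2 * np.pi * 2 * t) + 0.2 * np.random.randn(SAMPLE_RATE)
print(classify_window(window))  # expected: "EOG"
```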
Gnanayutham has been searching for ways to improve brain and body computer interfaces since 2001. For the past three years he has worked with Jennifer George, a doctoral candidate at the University of Sunderland in England, focusing on accessibility for young children with severe motor impairments. As part of their research they have taught patients to control a computer cursor using facial muscles (frowning or relaxing the face to move the cursor up or down) and eye movement (looking left or right to move the cursor accordingly). EMG and EOG sensors proved to work best in this setting, he says, because muscle and eye-movement signals are roughly 1,000 times stronger than the brain waves measured via EEG.
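Mapped onto a screen, those four gestures become the cursor's four directions. Here is a minimal sketch of that mapping; the gesture labels and step size are invented for illustration, since the article does not describe the real system's recognizer or calibration.

```python
CURSOR_STEP = 5  # pixels per recognized gesture (illustrative value)

def move_cursor(x, y, gesture):
    """Translate a recognized facial/eye gesture into a cursor update.

    Gesture names are placeholders for whatever a classifier might emit:
      'frown' / 'relax'          -> cursor up / down  (EMG, forehead muscles)
      'look_left' / 'look_right' -> cursor left / right (EOG, eye movement)
    """
    moves = {
        "frown": (0, -CURSOR_STEP),   # screen y grows downward
        "relax": (0, +CURSOR_STEP),
        "look_left": (-CURSOR_STEP, 0),
        "look_right": (+CURSOR_STEP, 0),
    }
    dx, dy = moves.get(gesture, (0, 0))  # unrecognized gestures do nothing
    return x + dx, y + dy

# Example: start at screen centre and apply a short gesture sequence.
pos = (512, 384)
for g in ["frown", "frown", "look_right", "relax"]:
    pos = move_cursor(*pos, g)
print(pos)  # (517, 379)
```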
Gnanayutham's interest in helping severely disabled patients dates to a 2000 trip he took with a church group to London's Royal Hospital for Neuro-disability. There he met a 24-year-old man who could communicate only through eye movement, primarily blinking at his nurse to indicate "yes" or "no" in response to her questions. That eye movement may have saved the man's life: before his family and the hospital staff realized he could control his eyes, they believed he was in a vegetative state and had made the heart-wrenching decision to disconnect his feeding tube. The man's nurse stopped the procedure when she noticed the patient was moving his eyes in a way that seemed to be communicating with her, Gnanayutham says, adding, "I thought I could do more for people like him."