One weekend about 10 years ago, when she was a nurse at a hospital in Cologne, Bettina Sorger volunteered to help the intensive care unit staff. One of her patients was still recovering from anesthesia after a surgery in which doctors removed a brain tumor. He was not talking, and he did not seem to move much. While Sorger was making his bed, the man reached up and put his hands around her neck. Another nurse helped her break away from his grip. Sorger returned to business as usual; she was used to unusual behavior in patients still groggy from anesthesia. Surely the man did not know what he was doing. One week later Sorger ran into the same patient, who promptly apologized. She was shocked—she did not realize he had been aware of his actions, let alone that he remembered them and felt remorse.

Through her experiences caring for people who cannot speak or move but retain some level of consciousness—patients who have suffered brain injuries, for example—as well as through her subsequent doctoral research, Sorger became increasingly devoted to the possibility of helping such patients communicate. Now she and a team of researchers from the Netherlands have published a study about a brain–machine interface that could help paralyzed patients spell out answers to questions with their thoughts alone. Although the study is only a proof of concept, the new program is a promising complement to a growing collection of similar technologies.

In the new study, six healthy adults learned to answer questions by selecting letters on a computer screen with their minds. Most volunteers learned to communicate in this way after a single one-hour training session. While lying inside a functional magnetic resonance imaging (fMRI) scanner, which measures changes in blood flow in the brain, volunteers stared at a computer screen displaying a table of three rows, nine columns and 27 squares. The squares contained the 26 letters of the alphabet and a blank space for separating words. Each row of letters was paired with one of three mental tasks: a motor imagery task, such as tracing stars or flowers in one's mind; a mental calculation task, in which participants rehearsed multiplication tables; and an inner speech task, during which participants silently recited a poem or prayer. Different blocks of letters were highlighted on the screen at different times. To choose a particular letter, participants waited for the computer screen to highlight that letter and performed the mental task associated with that letter's row for as long as the letter was highlighted.
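The grid described above can be sketched in code. This is a minimal sketch that assumes the 27 symbols simply fill the grid row by row; the article does not say which letters actually occupy which row:

```python
# Sketch of the 3 x 9 spelling table: 26 letters plus a blank space.
# The row-by-row letter layout is an assumption for illustration; the
# study's actual arrangement is not specified in the article.

TASKS = ["motor_imagery", "mental_calculation", "inner_speech"]

# 27 symbols: A-Z plus the word-separating blank space.
SYMBOLS = [chr(ord("A") + i) for i in range(26)] + [" "]

# table[row] holds the nine symbols paired with TASKS[row].
table = [SYMBOLS[row * 9:(row + 1) * 9] for row in range(3)]
task_for_row = dict(zip(TASKS, table))
```

Under this hypothetical layout, choosing the letter T would mean waiting for T's block to be highlighted and then performing the inner speech task.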

The computer program could not read the volunteers' thoughts, but it could distinguish between the different kinds of brain activity associated with the three different mental tasks, as well as measure how long the volunteers performed a particular task. Thanks to some clever programming, the system associated a unique pattern and duration of brain activity with each of the 26 letters of the alphabet, as well as with the blank space. For example, if a participant's brain showed activity associated with the mental calculation task for 10 seconds after a 20-second delay, the program interpreted their thought as an O, but if the same activity lasted for 30 seconds after the same delay, the program selected Q. The new study appears online June 28 in Current Biology.
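One way to picture that decoding step is as a lookup: the classifier's output (which task was performed, when it began, how long it lasted) indexes into the grid. The sketch below is hypothetical; the 10-second slot length, the end-of-task column rule, and the letter layout are all illustrative assumptions, not the study's actual parameters:

```python
# Hypothetical decoder mapping (classified task, onset delay, duration)
# to a letter. Slot length, column rule, and letter layout are
# illustrative assumptions, not the parameters used in the study.

TASKS = ["motor_imagery", "mental_calculation", "inner_speech"]
SYMBOLS = [chr(ord("A") + i) for i in range(26)] + [" "]  # 27 squares

def decode(task: str, onset_delay_s: float, duration_s: float,
           slot_s: float = 10.0) -> str:
    """Row comes from the task; column from when the task ended."""
    row = TASKS.index(task)
    col = int((onset_delay_s + duration_s) // slot_s) - 1
    return SYMBOLS[row * 9 + col % 9]  # nine columns per row

# With the same onset, a longer task selects a later column:
early = decode("motor_imagery", 20, 10)  # column 2
late = decode("motor_imagery", 20, 30)   # column 4
```

Here a task that starts 20 seconds in and lasts 10 seconds decodes to a different letter than one lasting 30 seconds, mirroring the article's point that both the pattern and the duration of brain activity carry information.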

Below are examples of the conversations volunteers had by spelling out answers to questions with their thoughts. Although the software did not always interpret the volunteers' brain activity correctly, it achieved a commendable 82 percent accuracy rate, and even with errors the intended answers remained intelligible:

Where did you spend your most recent vacation?
Computer's interpretation: INDCNERCA
Intended response: INDONESIA

What did you like most in Indonesia?
Computer's interpretation: TEKPLES
Intended response: TEMPLES

Which movie did you watch last?
Computer's interpretation: TOPFUN
Intended response: TOP GUN

The table on the computer screen need not be restricted to letters. One could fill the 27 squares with, for example, icons of foods and drinks, so that patients could choose what to eat. The squares could also contain commonly used words so that patients could string together simple sentences.

This is not the first time researchers have developed a spelling device for the paralyzed. Niels Birbaumer of the University of Tübingen has created a "thought translation device" that allows paralyzed patients to spell words and choose pictograms. The device interprets electrical activity in the patients' brains, rather than changes in blood flow, with an electroencephalogram (EEG), a net of electrodes placed on the scalp. Sorger says one advantage of the fMRI-based device is that it can be used with patients whose skulls are too severely damaged for EEG. But fMRI has disadvantages, too: it requires patients to lie still inside a giant scanner, which is not exactly the most portable piece of equipment, and so far it is slower than EEG-based technology. On average it took 50 seconds for volunteers in Sorger's study to select a letter.

Presumably, paralyzed patients could communicate more quickly if they could control a cursor with their thoughts and click letters on a virtual keyboard. John Donoghue of Brown University and his colleagues taught one paralyzed man to open e-mail and play Pong by moving a cursor with his mind. Researchers have also created brain–computer interfaces that allow paralyzed patients to type one or two words a minute on a computer screen with their thoughts, as well as devices that convert thoughts into vowel sounds spoken by a voice synthesizer.

Sorger, now a researcher at Maastricht University in the Netherlands, sees the computer program she and her colleagues developed as a potential supplement to such technologies—one that she hopes to improve in the coming years. "Even if one person benefits I would be very happy," she says.