The ability of brains to control inanimate objects, such as computer cursors, robotic arms, and wheelchairs, has seen significant progress in the last decade. A case in point is the recent success in Andrew Schwartz's lab at the University of Pittsburgh, where macaque monkeys fed themselves using a robotic arm controlled only by their thoughts.
Even commercial companies are now using brain-computer interfaces (or BCIs). Products like Mattel's MindFlex or the Star Wars Force Trainer allow players to move a ball with thoughts alone. And the consumer product Zeo tracks your brain waves while you sleep in order to diagnose restless nights.
But it's important to note that very different tools are being used in the lab versus the marketplace. In Schwartz's lab, an electrode array placed beneath the skull of the macaque can detect spikes from single neurons. The pattern of neurons firing is then translated into commands that a computer can understand.
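One classic way this translation works is population-vector decoding: each neuron fires most strongly for movements in its own preferred direction, and the decoder sums those preferences, weighted by firing rate, to recover the intended direction. The sketch below is a minimal illustration of that idea, not the lab's actual pipeline; the four neurons, their preferred directions, and the firing rates are all invented for the example.

```python
import math

# Hypothetical preferred directions (radians) for four tuned neurons:
# right, up, left, down. Real arrays record from many more cells.
PREFERRED = [0.0, math.pi / 2, math.pi, 3 * math.pi / 2]

def population_vector(rates):
    """Decode a 2-D movement direction from per-neuron firing rates (Hz).

    Each neuron 'votes' for its preferred direction, weighted by how
    strongly it fires; the vector sum gives the decoded angle.
    """
    x = sum(r * math.cos(p) for r, p in zip(rates, PREFERRED))
    y = sum(r * math.sin(p) for r, p in zip(rates, PREFERRED))
    return math.atan2(y, x)

# The neuron tuned to "up" fires hardest, so the decoded angle
# comes out near pi/2.
angle = population_vector([5.0, 40.0, 5.0, 10.0])
```

In a real system this decoded direction would be recomputed many times per second and fed to the robotic arm's controller as a velocity command.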
The commercial products, however, cannot be so invasive. These companies use an electroencephalography (or EEG) cap that sits on top of your head and reads your overall brain state. Here the results are fairly crude. We can detect whether someone is calm, angry, excited, or distracted, and we can use those brain states to activate switches, like moving a ball forward and back. But if we want to go beyond binary on/off activation, we need to get deeper into the brain.
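The kind of on/off switching these toys perform can be sketched very simply: estimate a single "focus" number from the relative power in the EEG frequency bands, then compare it to a threshold. This is a toy illustration under assumed numbers, not any vendor's actual algorithm; the band powers, the 0.4 threshold, and the function names are all made up for the example.

```python
def beta_ratio(band_powers):
    """Fraction of total EEG power in the beta band, a rough 'focus' proxy."""
    total = sum(band_powers.values())
    return band_powers["beta"] / total if total else 0.0

def fan_command(band_powers, threshold=0.4):
    """Map a single brain-state estimate to a binary switch.

    This is all an EEG toy really does: one coarse number in,
    one on/off command out.
    """
    return "forward" if beta_ratio(band_powers) >= threshold else "back"

# Invented band-power readings for a focused vs. a relaxed player.
focused = {"alpha": 2.0, "beta": 6.0, "theta": 2.0}   # beta ratio 0.6
relaxed = {"alpha": 6.0, "beta": 2.0, "theta": 2.0}   # beta ratio 0.2
```

Note how little information survives: the whole brain is collapsed into one scalar crossing one threshold, which is exactly why these devices top out at binary control.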
To do anything more complex with an EEG cap is like trying to distinguish the cello in an orchestra from outside of Lincoln Center.
To put this into perspective: the electrodes placed under the skull, tapping into our grey matter to move robotic arms or surf the Web, are not only inside Lincoln Center but right smack in the front row, directly monitoring every string bowed on that same cello. And it is this sort of extreme detail that we are probably going to need to do any complex task with thoughts alone.