To handle the avalanche of data from these sensors, we are also moving ahead with a new generation of custom-designed neurochips. Implanted in a patient's skull along with the microelectrodes, they will extract the raw motor commands needed to manipulate a whole-body exoskeleton.
Of course, the signals detected from the brain will then need to be broadcast to the prosthetic limbs. Recently Tim Hanson, who had just completed his Ph.D. at Duke, built a 128-channel wireless recording system, equipped with sensors and chips, that can be encased in the cranium and is capable of broadcasting recorded brain waves to a remote receiver. The first version of these neurochips is now being tested successfully in monkeys. Indeed, we have recently witnessed the first monkey to operate a brain-machine interface around the clock using wireless transmission of brain signals. We filed in July with the Brazilian government for permission to use this technology in humans.
For our future soccer ball kicker, the data from the recording systems will be relayed wirelessly to a small computer processing unit contained in a backpack. Multiple digital processors there will run software algorithms that translate motor signals into digital commands for the actuators, the moving parts distributed across the joints of the robotic suit that adjust the positioning of the exoskeleton's artificial limbs.
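The translation step described here can be sketched in very rough outline as a linear decoder fitted to recorded limb movements, a simple approach long used in brain-machine interface research. Everything below, including the function names, the dimensions, and the least-squares fit, is an illustrative assumption rather than the project's actual, far more sophisticated software:

```python
import numpy as np

def fit_linear_decoder(firing_rates, joint_velocities):
    """Least-squares map from binned firing rates to joint velocities.

    firing_rates:     (n_samples, n_neurons) spike counts per time bin
    joint_velocities: (n_samples, n_joints)  recorded limb kinematics
    """
    X = np.hstack([firing_rates, np.ones((len(firing_rates), 1))])  # bias term
    W, *_ = np.linalg.lstsq(X, joint_velocities, rcond=None)
    return W

def decode(firing_rates, W):
    """Turn new neural activity into joint-velocity (actuator) commands."""
    X = np.hstack([firing_rates, np.ones((len(firing_rates), 1))])
    return X @ W

# Train on simulated spike counts (200 neurons, 6 joints), then decode
# one new time bin into a set of actuator commands.
rng = np.random.default_rng(0)
rates = rng.poisson(5.0, size=(1000, 200)).astype(float)
true_map = rng.normal(size=(200, 6))
vels = rates @ true_map
W = fit_linear_decoder(rates, vels)
commands = decode(rates[:1], W)
```

In a real system this decode step would run continuously, one time bin at a time, feeding the resulting commands to the actuator controllers.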
Force of Brainpower
The commands will permit the exoskeleton wearer to take one step and then another, slow down or speed up, bend over or climb a set of stairs. Some low-level adjustments to the positioning of the prosthetic hardware will be handled directly by the exoskeleton's electromechanical circuits without any neural input. The space suit–like garment will remain flexible but still furnish structural support to its wearer, a surrogate for the human spinal cord. By taking full advantage of this interplay between brain-derived control signals and the electronic reflexes supplied by the actuators, we hope that our brain-machine interface will carry the World Cup kicker along by sheer force of willpower.
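The division of labor just described, in which brain-derived commands set the target posture while fast local "reflexes" handle the fine corrections without neural input, can be sketched as a simple shared-control loop. The proportional-control law, the gains, and the function names here are illustrative assumptions, not the exoskeleton's actual electromechanical circuitry:

```python
def reflex_adjustment(joint_angle, target_angle, gain=0.5):
    """Low-level corrective nudge toward the commanded posture,
    computed locally with no neural input."""
    return gain * (target_angle - joint_angle)

def step_exoskeleton(joint_angles, brain_targets, dt=0.01):
    """One tick of the fast local loop: each joint drifts toward the
    slower, brain-derived target it was last given."""
    return [angle + reflex_adjustment(angle, target) * dt
            for angle, target in zip(joint_angles, brain_targets)]

angles = [0.0, 0.1, -0.2]    # current joint angles (radians)
targets = [0.3, 0.0, 0.0]    # decoded, brain-derived targets
for _ in range(1000):        # local loop runs many ticks per neural update
    angles = step_exoskeleton(angles, targets)
```

The point of the sketch is the timescale split: the neural targets change slowly, while the local loop runs many times per second to keep the joints tracking them.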
The kicker will not only move but also feel the ground underneath. The exoskeleton will replicate a sense of touch and balance by incorporating microscopic sensors that both detect the amount of force generated by a particular movement and convey that information from the suit back to the brain. The kicker should be able to feel that a toe has come in contact with the ball.
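One way to picture this feedback path is a function that converts a raw force reading from a toe sensor into a normalized feedback intensity. The detection threshold, the force range, and the logarithmic compression below (loosely modeled on Weber–Fechner scaling) are all assumptions for illustration, not the project's published encoding scheme:

```python
import math

def encode_touch(force_newtons, threshold=0.05, max_force=50.0):
    """Map a toe-sensor force reading to a feedback intensity in [0, 1].

    threshold: forces below this are treated as no contact (assumed value)
    max_force: readings are clipped here before encoding (assumed value)
    """
    if force_newtons <= threshold:
        return 0.0  # below detection threshold: no feedback signal
    clipped = min(force_newtons, max_force)
    # Logarithmic compression: light touches still register clearly,
    # while strong forces do not saturate the feedback channel.
    return math.log(clipped / threshold) / math.log(max_force / threshold)

light_touch = encode_touch(0.5)   # a gentle toe-to-ball contact
firm_press = encode_touch(20.0)   # full weight on the foot
```

A light contact thus produces a small but clearly nonzero signal, which is what would let the kicker feel the toe meeting the ball.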
Our decade-long experience with brain-machine interfaces suggests that as soon as the kicker starts interacting with this exoskeleton, the brain will begin incorporating the robotic body as a true extension of his or her own body image. With training, the accumulated experience of continuously feeling contact with the ground and sensing the position of the robotic legs should enable movement with fluid steps across a soccer pitch or down any sidewalk. All phases of this project require continuous and rigorous testing in animal experiments before we begin testing in humans. In addition, all procedures must pass muster with regulatory agencies in Brazil, the U.S. and Europe to ensure proper scientific and ethical review. Even with all the uncertainties involved and the short time remaining before its first public demonstration, the simple idea of reaching for such a major milestone has galvanized Brazilian society's interest in science in ways rarely seen before.
The opening kickoff of the World Cup—or a similar event, say, the 2016 Olympic and Paralympic Games in Rio de Janeiro, if we miss the first deadline for any reason—will be more than just a one-time stunt. A hint of what may be possible with this technology can be gleaned from a two-part experiment already completed with monkeys. As a prelude, back in 2007, our research team at Duke trained rhesus monkeys to walk upright on a treadmill while the electrical activity of more than 200 cortical neurons was recorded simultaneously. Meanwhile Gordon Cheng, then at ATR Intelligent Robotics and Communication Laboratories in Kyoto, built an extremely fast Internet communication protocol that allowed us to stream this neuronal data directly to Kyoto, where it fed the electronic controllers of CB1, a humanoid robot. In the first half of this across-the-globe experiment, Cheng and my group at Duke showed that the same software algorithms developed previously for translating thoughts into control of robotic arms could also translate the patterns of neural activity involved in bipedal locomotion into commands that made two mechanical legs walk.