By Geoff Brumfiel of Nature magazine
Today IBM unveiled a new "cognitive computing" microchip that, according to the company, emulates some of the brain's abilities. The chip is the latest development in an ongoing program by the Defense Advanced Research Projects Agency (DARPA), based in Arlington, Va., to develop systems that can analyze complex information. Nature takes a peek under the hood of the new chip.
Does this use some new kind of technology?
No. The chip unveiled today uses the workhorse of everyday electronics, complementary metal-oxide-semiconductor (CMOS) technology. Each core contains 256 clusters of transistors and thousands of random access memory (RAM) elements. Compared with modern electronics, its total computing power is tiny. "It's a worm-scale chip," says Dharmendra Modha, the manager of IBM's cognitive computing program.
So what's new then?
The difference is the way the transistors and memory are wired together. In a conventional computer, the computational elements are mostly in the central processing unit, while the RAM sits off to one side. In the cognitive chips, the computational elements and RAM are wired together.
How is this like a brain?
The theory is that the computational components act as "neurons," while the RAM units act as the "synapses," which connect the neurons together. In a real brain, neurons receive electrical pulses from synapses until a sufficient voltage builds up across their membrane. The neuron then discharges, sending signals to other neurons via the synapses.
In the cognitive chip, a pattern of signals from the RAM can cause a computational element to carry out a simple operation. The result goes to another RAM synapse, which can send signals to other computational neurons. In this way, the chip is "inspired" by the brain's architecture, Modha says.
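The firing behavior described above resembles the textbook "integrate-and-fire" neuron model. The sketch below is a simplified illustration of that general idea, not IBM's actual chip logic; the function name and all parameter values are invented for the example.

```python
# Illustrative sketch only: a simple leaky integrate-and-fire neuron,
# not IBM's chip design. Parameters (threshold, leak) are made up.

def simulate_neuron(input_pulses, threshold=1.0, leak=0.9):
    """Accumulate incoming pulses; emit a spike (1) when the membrane
    potential crosses the threshold, then reset, as described above."""
    potential = 0.0
    spikes = []
    for pulse in input_pulses:
        potential = potential * leak + pulse  # leak a little, then integrate
        if potential >= threshold:
            spikes.append(1)   # neuron fires, signaling downstream neurons
            potential = 0.0    # discharge resets the membrane potential
        else:
            spikes.append(0)   # below threshold: no output this step
    return spikes

print(simulate_neuron([0.5, 0.5, 0.5]))
```

Feeding the neuron three moderate pulses shows the buildup-and-discharge cycle: the first two pulses only raise the potential, and the third pushes it over the threshold.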
What's the advantage of building chips like these?
The main benefit is reduced power consumption. Because the memory and computation are intermingled, less energy is wasted shuffling electrons back and forth. The new chips have the potential to be orders of magnitude more efficient than a conventional computer, according to Rajit Manohar, an electrical and computer engineer at Cornell University in Ithaca, N.Y., and a member of the DARPA collaboration.
In terms of speed, the chips are expected to be particularly good at crunching certain kinds of problems, such as pattern recognition, but they may not match a regular computer on other tasks.
What comes next?
The success of these first chips has led DARPA to award $21 million for further development. Meanwhile, IBM researchers are developing algorithms adept at speech recognition and problem solving. Their Watson supercomputer, for example, which was victorious over human competitors on the American quiz show Jeopardy! (see "Quiz-playing computer system could revolutionize research"), currently requires a room full of power-hungry processors. Manohar hopes that low-power cognitive chips could do the job just as well. "Our goal is basically to meet in the middle," he says.
With additional reporting by Mitch Waldrop.
This article is reproduced with permission from the magazine Nature. The article was first published on August 18, 2011.