If you watched football or the final game of the World Series yesterday, you may have noticed the following: When the announcers were speaking on camera, it seemed as though the sound of their voices was coming from their mouths. But when the commentary occurred off-screen as the game action was shown, it was quite apparent that the TV speakers were the actual source of the endless color-commentary babble.

This processing phenomenon, in which a visual cue affects how one perceives an auditory stimulus—ventriloquism is another example—may be explained by new research that pinpointed neurons in a primitive brain area that respond to both visual and auditory information. This area, the inferior colliculus, a midbrain region less than half an inch in diameter, is a way station for nearly all auditory signals as they travel from the ear to the cortex (the brain's central processing area).

"It's important if you're going to be integrating visual and auditory information that they be on a level playing field, so both are encoded the same way," says Jennifer Groh, an associate professor at Duke University's Center for Cognitive Neuroscience and a co-author of the new work published in Proceedings of the National Academy of Sciences USA. "It's important for the auditory pathway to know where the eye is pointed."

Groh and her colleagues implanted electrodes in the brains of three monkeys, targeting 180 individual neurons (or nerve cells) in the inferior colliculus. The animals were placed in a dark chamber where a light-emitting diode (LED) would switch on in one of several predetermined locations. After the monkeys fixated on the light for a fraction of a second, a short clip of white noise would play from speakers in the chamber.

When the researchers examined the time-stamped activity of the individual neurons, they observed that each monkey had a neural response in its inferior colliculus when the LED turned on. In addition, two of the three animals showed activity in the auditory structure as they moved their eyes toward the light. In all, the scientists report that more than 67 percent of the neurons monitored (121 of the 180) showed statistically significant responses to the visual stimulus.

"The implication is that it's possible that perception involves more interaction between the sensory pathways than we expected and, because they are happening in low-level areas, they may be more automatic," Groh says. She adds that some cells responded quickly to the light, whereas others showed a slower buildup of activity. She speculates that the quicker-acting cells process the sensory information itself, whereas the slower ones may encode a reward response (a secondary function of the inferior colliculus).

Christoph Kayser, a research scientist at the Max Planck Institute for Biological Cybernetics in Tübingen, Germany, calls the new work "stunning." "Results like these suggest that the brain does not try to keep the information provided by the different sensory organs as isolated as possible, but rather that an early mixing of sensory information seems to be the rule," he says. "All this can best be interpreted when seeing the brain as being faced with a flood of sensory information that must be co-registered and merged into a coherent percept."