The discovery of mirror neurons in the brains of macaques in the early 1990s sent shockwaves through the neuroscience community. Mirror neurons are cells that fire both when a monkey performs a certain task and when it observes another individual performing that same task. With the identification of networks of similarly behaving cells in humans, there was much speculation over the role such neurons might play in phenomena such as imitation, language acquisition, observational learning, empathy, and theory of mind.
Several research groups have observed the activity of mirror neuron networks indirectly in humans through the use of functional magnetic resonance imaging (fMRI). This technology allows scientists to correlate changes in blood flow in specific brain areas with particular behaviors or mental operations. Experiments using fMRI have demonstrated that there is more activation in the human mirror system when people observe movements with which they are familiar; for instance, experienced dancers showed greater mirror-network activation when they viewed steps from their own repertoire than when they viewed moves from a different style of dance.
Studies of the human mirror system have also revealed that it can be activated by the sounds of actions alone, in the absence of any visual cues. While evidence along these lines suggests that hearing, as well as vision, can activate mirror neurons, it remains unclear whether aurally presented stimuli simply evoke visual imagery that then recruits the mirror system. Nor did these studies address whether a functional visual system is a necessary prerequisite for the development of the mirror system.
Emiliano Ricciardi, Pietro Pietrini and colleagues at the University of Pisa tackle this issue directly in their recent paper in the Journal of Neuroscience. They performed fMRI scans of healthy sighted subjects and of congenitally blind subjects who had never had any visual experience, to determine whether a functional mirror system develops normally in the absence of vision. Scans were run while the subjects listened to the sounds of several common hand-executed actions (such as cutting paper with scissors and hammering a nail) and, as a control, environmental sounds (such as a rainstorm). Subjects were also asked to pantomime with their hands the same set of motor actions that they heard while in the fMRI scanner. Sighted subjects completed an additional visual version of the task, in which they watched movies of the hand-executed actions and then performed the motor pantomime.
Ricciardi, Pietrini and colleagues report similar patterns of neural activation in congenitally blind individuals listening to familiar actions and in sighted individuals both listening to and watching the same actions. Compared with environmental sounds, the action sounds elicited brain activity in premotor, temporal and parietal cortex, primarily in the left hemisphere. All subjects showed increased neural activity in motor, somatosensory and premotor cortex on both sides of the brain when performing the pantomimed actions. The region of overlap between the brain areas active while listening to actions and those active while pantomiming them was identified as the mirror system; in this case, it was a cortical network that included premotor, temporal and parietal regions in the left hemisphere. Listening to environmental sounds did not activate the mirror system in either sighted or blind subjects. And in both groups, neural mirror activity increased in response to familiar action sounds as opposed to unfamiliar ones.
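To make the idea of "overlap" concrete, the sketch below shows a conjunction analysis in its simplest form: a voxel counts as part of the candidate mirror system only if it is active both while listening to actions and while pantomiming them. This is purely an illustration using made-up data and an assumed statistical threshold, not the authors' actual analysis pipeline.

```python
# Illustrative sketch only, not the authors' pipeline: defining a candidate
# "mirror system" as the conjunction (overlap) of two activation maps.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical voxel-wise statistical maps (e.g., t-values) on the same grid.
listen_map = rng.normal(size=(64, 64, 36))     # listening to action sounds vs. environmental sounds
pantomime_map = rng.normal(size=(64, 64, 36))  # pantomiming actions vs. rest

threshold = 3.0  # assumed significance cutoff, chosen only for illustration

# A voxel joins the overlap only if it exceeds the threshold in BOTH conditions.
overlap_mask = (listen_map > threshold) & (pantomime_map > threshold)

print(f"Voxels active in both conditions: {int(overlap_mask.sum())}")
```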
The results of this study demonstrate that visual experience is not necessary for the development and function of the mirror system. Congenitally blind subjects showed mirror-network activation in response to action sounds in the same brain areas that were active in response to both visual and auditory stimuli in sighted individuals. The authors conclude that the human mirror system can develop without visual input and is able to process information about actions that comes from other sensory modalities as well.
In essence, when blind people hear the actions of others, they use the same network of cortical brain areas that sighted people use when they observe such actions. This fits with what we already know about how some regions of the brain are recruited for different uses in blind people. For example, congenitally blind individuals rely on areas of the visual cortex to acquire information about an object’s shape and movement through other senses, such as touch and hearing. As Ricciardi, Pietrini and colleagues point out, the recruitment of visual brain areas for nonvisual recognition in congenitally blind individuals indicates that neither visual experience nor visual imagery is required to form an abstract representation of objects.
The activation of a mirror system by aural stimuli in congenitally blind subjects supports and extends such findings by demonstrating that neither visual experience nor visually-based imagery is necessary to form a representation of the actions of others. Since individuals with no visual experience still learn from and imitate others, studies such as this help explain how they can use their other sensory modalities to interact with the world. The human mirror system functions effectively in people blind from birth, indicating that it is capable of interpreting nonvisual sensory information to acquire knowledge about others.