[Below is the original script. But a few changes may have been made during the recording of this audio podcast.]
It makes sense to us that the movement of our face helps in the production of words. We move our mouth to create words. But does it follow, then, that feeling a certain muscle movement in our face helps us hear words?
Scientists used a robotic device to stretch and manipulate the skin of subjects’ faces as they heard words along a computer-generated continuum between “head” and “had.” They specifically stretched the skin to match how it would move when subjects themselves said either the word “head” or the word “had.”
If the robot stretched the skin upward, words sounded more like “head”; when it stretched the skin downward, words sounded more like “had.”
Now of course we don’t, unless we are lip-syncing, move our lips to silently form the words someone else says. But what this research reveals is that the brain uses “motor movement images”—a representation in our motor cortex of how we ourselves form speech—when we try to understand what someone is saying.
It seems there is a neural connection between motor movement and our understanding of auditory information. According to this research, there is a link between our motor cortex, tactile sensations, and how we hear—or feel—a word.