You might assume that listening and speaking are separate processes, but growing evidence suggests the two are inextricably linked. The latest piece of the puzzle comes from a study of babies with teething toys. The findings support a theory that our perception of speech depends on brain areas that control mouth movements.

Alison Bruderer, a cognitive scientist at the University of British Columbia, gave common teething toys that immobilized the tongue to six-month-old infants who had not yet begun to talk. Then she played recordings of an English d sound and a Hindi d sound, which is made by moving the tongue farther back on the palate. Babies of this age from any culture reliably notice the difference between these sounds, as indicated by the increased attention they pay when the sound changes. When the babies were using the teethers, however, they did not appear to notice the difference. The results suggest that even babies who have not yet started speaking use their tongue to help them understand speech.

Evidence in adults also suggests a link between tongue movement and speech perception. Neurolinguist William Katz of the University of Texas at Dallas used real-time three-dimensional images to show native English speakers the position of their tongue as they tried to make a made-up sound not found in any known language. Participants were more likely to make the sound correctly when they had this visual feedback. Katz says the technology might help people improve pronunciation in a second language or even relearn speech after a stroke.

Both studies align with the motor theory of speech perception: our perception of speech sounds depends in some way on knowing how we would position our lips, teeth and tongue if we were making those sounds ourselves. Yet it is unclear to what degree we rely on that information. The strongest version of the theory holds that the sound itself does not matter; it is just another clue to how a speaker is moving his or her mouth. What makes us understand is mentally mirroring the speaker's movements, not auditory recognition.

“I would be a little more moderate than that,” Bruderer says. She is curious whether the ability to move our lips and tongue is critical to learning speech—longer-term studies would be needed to determine if the differences in perception she found could cause a delay in language acquisition. She also hopes to investigate whether oral malformations such as cleft palate and tongue-tie affect speech perception.