The ability to link language to the world around us is a crowning feature of our species. For very young infants, it is not yet about learning the meaning of words like “cat” or “dog.” Rather, the acoustic signals in speech help foster infants' fundamental cognitive capacities, including the formation of categories of objects, such as cats or dogs. 

The sounds that activate this key step in development can come not just from human language but also from vocalizations made by nonhuman primates. A new study shows that babies do not use just any natural sound to build cognition, however. While primate calls and human language pass the test, birdsongs do not.

“By tracing the link from language to cognition and how it’s built up with babies’ experiences with objects in the world, we get to see what are the components of this quintessential human ability to go beyond the here and now,” says Sandra Waxman, a developmental scientist at Northwestern University and senior author of the findings, which were published today in PLOS ONE. “Asking how broad that earliest link is helps to answer questions about our evolutionary legacy.”

By three or four months of age, infants can categorize objects—from toys and food to pets and people—based on commonalities those objects share. This ability is boosted if the objects are presented while the infants are listening to language.

The new findings build on previous work Waxman and her colleagues conducted about which sounds outside of the realm of human speech support infants’ ability to categorize objects. In past studies, they found that sequences of pure tones and backward speech do not help infants under six months of age to categorize objects, whereas listening to vocalizations from nonhuman primates—specifically, lemurs—does. “The lemur finding rocked my little world, because it showed that the template was broader than human linguistic signals alone,” Waxman says. “It told us there’s something inborn in babies in selecting signals that will have positive downstream effects to support cognition.”

Rather than incrementally investigate the breadth of sounds that babies respond to by testing various nonprimate mammals, the researchers made a bigger leap, to a nonmammalian species. “We wanted to really find a boundary condition,” Waxman says. She and her colleagues selected zebra finches because they are among the most studied birds and because the species’ song is about the same length and frequency as primate vocalizations.

Using the same categorization task as in past studies, the researchers showed 23 three- to four-month-olds eight images representing one of two categories—dinosaurs or fish—while simultaneously playing a zebra finch song. Next, in silence, they showed the infants two new images—one within the same category they previously viewed and one within the other category. Based on the previous work, the researchers knew that if the birdsong boosted the infants’ categorization, they would distinguish between the two test images. But in analyzing the infants’ gaze, the team found no difference in the amount of time the babies spent looking at either the familiar image or the new one. This indicated that the zebra finch song did not facilitate object categorization.

“This new study builds on an elegant and highly impactful line of research,” says Jenny Saffran, a professor of psychology at the University of Wisconsin–Madison, who was not involved in the research. “The fact that speech and lemur calls impact infant category learning—but not birdsong—suggests that sounds with particular rhythmic organization, perhaps related to the structure of primate vocalizations, are privileged for infants in ways that other sounds are not, potentially facilitating human language learning.”

The researchers’ null result, finding no connection between birdsong and infant cognition, is important for building “a comprehensive picture of infant behavior, including what infants can and cannot do at a certain age and under certain experimental conditions,” adds Chiara Santolin, a developmental psychologist at Pompeu Fabra University in Barcelona, who was also not involved in the work. “This study, along with other research, addresses a crucial issue in developmental psychology: What is the connection between human language and cognition?”

Waxman and her colleagues are planning several lines of follow-up research to home in on the answers, including testing infants’ responses to vocalizations from nonprimate mammals; applying machine-learning techniques to try to identify which acoustic features contribute to an auditory signal making an impression; and using electroencephalographic caps to see what is happening in infants’ brains while they listen to various sounds.

“That the brain responds differently is self-evident, now that we have the data,” Waxman says. “But what is underneath that’s directing this is not.”