Late one evening in 1990 at the Ketambe Research Station in Indonesia’s Gunung Leuser National Park, I sat transcribing notes by the light of a kerosene lamp in my hut on the banks of the Alas River. Something was bothering me. I had come to gather data for my dissertation, documenting what and how the monkeys and apes there ate. The idea was to relate those observations to the sizes, shapes and wear patterns of their teeth. Long-tailed macaques have large incisors and blunt molars—teeth built for eating fruit, according to the received wisdom. But the ones I had been tracking for the past four days seemed to eat nothing but young leaves. I realized then that relations between tooth form and function are more complicated than the textbooks suggest and that the sizes and shapes of an animal’s teeth do not dictate what it eats. This might sound like an esoteric revelation, but it has key implications for understanding how animals—including humans—evolved.

I am a paleontologist, and I earn a living reconstructing the behaviors of extinct species from their fossilized remains. Specifically, I work to discern how animals in the past obtained food from their surroundings and thus how environmental change triggers evolution. That year at Ketambe shaped my way of thinking about primates and the larger community of life that surrounds them. I began to see the biosphere—the part of our planet that harbors life—as a giant buffet of sorts. Animals belly up to the sneeze guard with plates in hand to pick from items available in a given place, at a given time. Each species’ place in the forest, and in nature, is defined by the choices it makes.

Teeth play a role in food choice—you need the right utensils. But I learned at Ketambe that availability is even more important. The macaques ate leaves because that is what nature laid out on the biospheric buffet at that time and place. Their diet changed over the course of the year as leaves unfurled, flowers bloomed and fruits ripened with the passing seasons. I began to imagine how changes in food availability over centuries, millennia or longer could affect what a species eats.

Most paleontologists are not used to thinking about life in the past this way. Our field has a long tradition of inferring function from form by assuming that nature selects the best tools for whatever job an organism has to do. If diet could simply be read from form, however, those fruit-toothed macaques would not have been eating leaves. But how can we detect food choices in the fossil record?

I have spent decades doing exactly that by studying microscopic wear patterns on fossil teeth, including those of a number of human ancestors. Other researchers have analyzed the chemical signatures of food in fossil teeth for dietary clues. These “foodprints,” as I call them, reveal the kinds of foods individuals actually ate and have given us a much richer picture of the past than tooth shape alone. Together with insights from the paleoenvironmental record, these findings have allowed us to test some leading hypotheses about the impact of climate change on human evolution. The results refine the classic explanation for how our branch of the human family tree succeeded where others did not.

Liem’s Paradox

Observations of living animals have revealed numerous creatures that eat foods other than the ones to which they are adapted. While I was at Ketambe, Melissa Remis, now at Purdue University, was gathering diet data on gorillas at Bai Hokou, a lowland rain forest site in the Central African Republic’s Dzanga-Ndoki National Park. At that time, most researchers thought that gorillas were dietary specialists that ate stems, leaves and the pith of nonwoody plants such as wild celery. Pioneering gorilla researcher Dian Fossey and others had shown as much in the high-altitude cloud forests of the Virunga Mountains in Uganda and Rwanda. It made sense. Gorillas have very specialized teeth and guts—sharp-crested molars well suited to shearing tough plant parts and a massive hindgut to host microorganisms that help to digest cellulose in fibrous foods. Besides, there was little else to eat at those elevations.

The Virunga mountain gorillas were actually a small, marginal population of just a few hundred individuals living in an extreme habitat, however. What about the 200,000 gorillas living 1,000 miles to the west in the lowland rain forests of the Congo Basin? The gorillas at Bai Hokou told a different story. They seemed to prefer soft, sugary fruits. In fact, Remis saw gorillas walk half a mile or more, right past edible leaves and stems, to get to a fruiting tree. Fibrous foods seemed to dominate their diet only when favored fruits were unavailable. But western lowland gorillas were skittish compared with their cousins in the Virunga Mountains, limiting the amount of data Remis could collect. Some researchers questioned whether gorillas could actually prefer fruits, given their teeth and guts.

GRAY-CHEEKED MANGABEYS have flat, thickly enameled molars that appear to be specialized for crushing hard foods. But they fall back on these foods only when the soft fruits and leaves they prefer are unavailable. Credit: Alain Houle 

There is an old joke: “What do you feed a 400-pound gorilla? Anything it wants.” How can we know what a gorilla wants to eat? After Remis returned home from Bai Hokou, she went to the San Francisco Zoo to ask the gorillas themselves. She offered the captive apes a variety of foods, from sweet mango to bitter tamarind, sour lemon and, of course, tough celery. The zoo gorillas clearly preferred sugary, fleshy fruits to tough, fibrous foods, regardless of what their teeth and guts suggested they should eat. This finding confirmed that although gorillas are adapted to the most mechanically and chemically challenging foods they have to eat, these are not their favored foods. Perhaps, then, gorillas in the Virunga Mountains eat tough, fibrous foods year-round not because they prefer them but because they can—and must, given the limited options on the biospheric buffet at such high elevations. Indeed, nearby mountain gorillas that live at lower altitudes prefer to eat fruit when it is available.

A preference for foods other than those to which one is adapted is common enough in the animal kingdom to merit a term for the phenomenon: Liem’s paradox. The late Karel Liem of Harvard University observed the paradox first in 1980 in Minckley’s cichlid, a freshwater fish endemic to the valley of Cuatro Ciénegas in northern Mexico. One form of this fish has flat, pebblelike teeth in its throat that are seemingly perfectly suited for cracking hard-shelled snails. Yet members of this group swim right past those snails when softer foods are available. Why would an animal evolve teeth specialized for less preferred, rarely eaten items? So long as the hard-object specialization does not preclude consumption of softer foods, it can leave an animal more options when it needs them. The paradox, then, is not so much that individuals avoid the foods to which they are adapted but that specialized anatomy can lead to a more generalized diet.

Other primates exemplify Liem’s paradox, including the gray-cheeked mangabey monkeys of Uganda’s Kibale National Park. Mangabeys have flat, thickly enameled molars that seem to be specialized for crushing hard, brittle foods. But day after day, month after month, even year after year, Joanna Lambert, now at the University of Colorado Boulder, watched them eat soft, fleshy fruits and young leaves, just like the thinner-toothed red-tailed guenon monkeys that lived alongside them. Then, in the summer of 1997, everything changed. The forest was reeling from an especially severe drought brought on by an El Niño event. Fruits were scarce, leaves were wilting and the monkeys were hungry. The mangabeys ate more bark and hard seeds, but the guenons did not. The mangabeys’ specialized teeth and jaws allowed them to fall back on mechanically challenging foods. Even if such adaptations are needed only once or twice in a generation, that can be just what the animals require to get through the lean times.

Specialized anatomy can also relate to preferred foods, though. Sooty mangabeys in Ivory Coast’s Taï National Park, for instance, have thick tooth enamel and strong jaws, and they actually prefer hard foods. Much of their foraging time is devoted to scouring the forest floor for seeds of the Sacoglottis tree, which have casings that resemble peach pits. Scott McGraw of Ohio State University argues that this practice allows them to avoid competing for food with the 10 other primate species that live alongside them. Just as gorillas vary in how often they eat mechanically challenging foods, some mangabeys eat them all the time, and others do so only on rare occasions.

Examples such as these show that primate food choice is complex and depends not just on teeth but also on availability, competition and personal preference. Tooth form can tell us something about what an animal in the past was capable of eating and the most challenging foods its ancestors had to contend with. But for insights into food choices among options that were available on the biospheric buffet, we need foodprints.

Dental microwear, the microscopic scratches and pits that form on a tooth’s surface as the result of its use, is a commonly studied type of foodprint. Species that tend to shear or slice tough foods, such as grass-grazing antelopes or meat-eating cheetahs, get long, parallel scratches as opposing teeth slide past one another and abrasives between them are dragged along. Species that crush hard foods, such as nut-eating Taï mangabeys or bone-crunching hyenas, tend to have cratered microwear surfaces, covered in pits of various sizes and shapes.

Because those marks typically wear away and are overwritten in a matter of days, we can learn something about the variety, and perhaps even the proportions, of foods eaten if we consider teeth of individuals sampled at different times and places. The microwear patterns of Kibale mangabeys typically resemble those of soft-fruit eaters, with wispy scratches and fine pits, although a few specimens are more heavily pitted. The teeth of mangabeys from Taï, in contrast, have much more cratered surfaces on average. Despite similar tooth form in the two species, foodprints distinguish them as predicted, based on observations of their diets.
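To make the logic of these comparisons concrete, here is a minimal sketch of how one might summarize and sort microwear surfaces by the two signals described above, pitting versus parallel scratching. The feature names echo the kinds of texture measures used in this field, but the values, thresholds and function below are hypothetical illustrations, not the published analysis.

```python
# Toy illustration, not the published microwear-texture method: sort surfaces
# by two made-up summary features, "pitting" (crushing hard foods) and
# "scratching" (shearing tough foods). Values and thresholds are hypothetical.

from dataclasses import dataclass

@dataclass
class MicrowearSurface:
    specimen: str
    pitting: float      # higher when the surface is cratered with pits
    scratching: float   # higher when long, parallel scratches dominate

def wear_signal(s: MicrowearSurface) -> str:
    """Assign a coarse dietary signal to one surface (illustrative rule only)."""
    if s.pitting >= 2.0:
        return "hard-object crushing (heavily pitted)"
    if s.scratching >= 2.0:
        return "tough-food shearing (parallel scratches)"
    return "soft fruits/leaves (faint, mixed wear)"

# Because microwear turns over in days, each specimen is only a snapshot;
# sampling many individuals from different times and places is what reveals
# the variety, and perhaps the proportions, of foods eaten.
sample = [
    MicrowearSurface("Kibale mangabey A", pitting=0.8, scratching=1.1),
    MicrowearSurface("Kibale mangabey B", pitting=2.6, scratching=0.7),  # a rare pitted outlier
    MicrowearSurface("Tai mangabey A", pitting=3.1, scratching=0.5),
]
for s in sample:
    print(f"{s.specimen}: {wear_signal(s)}")
```

The point of the sketch is that the spread of signals across many specimens, not any single tooth, is what separates a generalist population with occasional hard-object feeding from one that crushes hard foods routinely.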

Ancient Menus

With microwear patterns from living animals whose dietary habits are known from firsthand observation to guide us, scientists can use microwear on fossil teeth to infer what extinct species ate on a daily basis and gain insight into their food choices. To that end, my colleagues and I have put a lot of effort into analyzing the microwear of human fossils. Our work has generated surprising results.

The human family tree has many branches. Today Homo sapiens is the only human species alive, but once upon a time, multiple human species, or hominins, shared the planet. Why our lineage survived when others went extinct is an enduring question. My own foray into this mystery began when I set out to study the diet of members of one of these extinct branches, a group of species belonging to the genus Paranthropus. Paranthropus lived in eastern and southern Africa between about 2.7 million and 1.2 million years ago, during the late Pliocene and Pleistocene epochs. None of its species gave rise to us; rather, they were evolutionary experiments that walked alongside our own early ancestors. Paranthropus had big, flat, thick-enameled premolars and molars, heavy jaws, and the telltale bony ridges and scars that come from having massive, powerful chewing muscles. These traits are clearly dietary specializations for extreme chewing, so these species seemed to be ideal candidates for microwear analysis. If my collaborators and I could not figure out what they ate, then we had little hope of reconstructing diets of other fossil hominins with less distinctive jaws and teeth.

Paleoanthropologist John Robinson was the first to try to reconstruct the diet of Paranthropus, back in 1954. Robinson believed that the large, flat and thickly enameled premolars and molars of Paranthropus robustus from South Africa had evolved for grinding plant parts, such as shoots and leaves, berries and tough wild fruits. Chipping on those teeth suggested to him that P. robustus ate grit-laden roots and bulbs. The late Phillip Tobias of the University of the Witwatersrand, Johannesburg, saw things differently, arguing in the 1960s that the chips occurred during consumption of hard foods rather than gritty ones. At the time, Tobias was describing a new species of Paranthropus from East Africa, Paranthropus boisei. On first seeing its skull, he is famously reported to have said, “I have never seen a more remarkable set of nutcrackers.”

The idea of a hominin that specialized in nut cracking was born. Paranthropus stood in sharp contrast to early Homo fossils found in the same sedimentary deposits, with their daintier teeth and jaws, larger brains and an emerging stone tool kit for processing food. Researchers came up with a tidy explanation for the differences, dubbed the savanna hypothesis. As grasslands began to spread across Africa, our ancestors came to an evolutionary fork in the road. Paranthropus went one way, evolving to specialize on hard, dry savanna plant parts, such as seeds and roots. Early Homo went in another direction, becoming increasingly versatile, with a more flexible diet that included meat. That dietary flexibility is why we are here today and Paranthropus is gone, according to the theory. It was a compelling story, and early microwear studies by Frederick Grine of Stony Brook University in the 1980s showed that the teeth of P. robustus do have more microwear pits than those of its own predecessors, seemingly confirming that this cousin of ours specialized in hard, brittle foods.

But in 2005, when my then-postdoctoral fellow Rob Scott and I looked again at P. robustus microwear using newer technology, another part of the story began to emerge. Yes, P. robustus specimens had more pitted, complex microwear surfaces on average, but some of the specimens we studied had less pitted, simpler textures. In fact, microwear in P. robustus varied a lot, suggesting that while some ate hard foods in the days before they died, others did not. To put it another way, the specialized anatomy of P. robustus did not mean it was a dietary specialist. This was not a new idea. A year earlier David Strait, now at Washington University in St. Louis, and Bernard Wood of George Washington University had speculated, based largely on indirect evidence, that Paranthropus may well have been an ecological generalist with a flexible diet. But our work provided direct evidence for Liem's paradox among the hominins.

A bigger surprise came in 2008, when my colleagues and I looked at the microwear textures of P. boisei. This was Tobias's nutcracker, the species with the largest teeth, heaviest jaws and thickest enamel of all the hominins. I expected P. boisei's teeth to have microwear akin to that of the sooty mangabey, cratered like the surface of the moon. They did not. Surface after surface had wispy scratches running every which way. Not only were these critters not hard-object specialists, but their microwear showed no sign at all of hard foods. The nutcracker hypothesis seemed to fall like a house of cards in a stiff wind. So what was P. boisei eating with those big, flat teeth? The answer would have to wait for another set of foodprints: carbon isotope ratios.

Distinctive chemical signatures of foods that provide the raw materials used to build the body are sometimes preserved in teeth. Like microwear, these chemical clues can be read and decoded. For example, compared with trees and bushes, tropical grasses have a higher proportion of carbon atoms with seven neutrons rather than the usual six; the teeth of animals that eat tropical grasses have predictably more “heavy” carbon as a result.
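In the standard notation geochemists use, this heavy-to-light carbon ratio is reported as a delta value, the deviation of the sample from a reference standard in parts per thousand:

$$\delta^{13}\mathrm{C} = \left( \frac{\left(^{13}\mathrm{C}/^{12}\mathrm{C}\right)_{\text{sample}}}{\left(^{13}\mathrm{C}/^{12}\mathrm{C}\right)_{\text{standard}}} - 1 \right) \times 1000\ \text{‰}$$

Trees, bushes and other so-called C3 plants typically have values around −27‰, whereas tropical grasses and sedges, which use the C4 photosynthetic pathway, sit closer to −12‰; tooth enamel records these differences with a roughly constant offset, which is what makes the ratio a readable foodprint. (These are rough averages; the exact values vary from place to place.)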

Carbon isotope ratios of P. robustus teeth indicate a diet dominated by tree and bush products but with a hearty helping of tropical grasses or sedges. This finding is consistent with a broad-based diet. But P. boisei shows a very different pattern, with carbon isotope ratios suggesting that grasses or sedges made up at least three quarters of its diet.
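Estimates like "three quarters of the diet" come from the kind of arithmetic sketched below: a simple two-end-member mixing model between tree-and-bush (C3) foods and grass-and-sedge (C4) foods. The end-member values, enamel offset and enamel readings in this sketch are rough, commonly cited approximations chosen for illustration, not the exact calibration behind the published figures.

```python
# Illustrative two-end-member isotope mixing model. All numbers here are
# rough approximations for illustration, not the published calibration.

C3_DIET = -27.0       # approx. mean d13C (permil) of trees, bushes and shrubs
C4_DIET = -12.5       # approx. mean d13C (permil) of tropical grasses and sedges
ENAMEL_OFFSET = 14.0  # approx. enrichment of tooth enamel relative to diet

def fraction_c4(d13c_enamel: float) -> float:
    """Estimate the fraction of grass/sedge (C4) foods from an enamel d13C value."""
    d13c_diet = d13c_enamel - ENAMEL_OFFSET           # back-calculate the dietary value
    f = (d13c_diet - C3_DIET) / (C4_DIET - C3_DIET)   # linear mixing between end members
    return min(max(f, 0.0), 1.0)                      # clamp to the physically possible range

# Hypothetical enamel values in the general range reported for these species
print(f"Enamel at -1.5 permil (P. boisei-like): ~{fraction_c4(-1.5):.0%} C4 foods")
print(f"Enamel at -7.5 permil (P. robustus-like): ~{fraction_c4(-7.5):.0%} C4 foods")
```

Real analyses account for local plant values and the animal's physiology, but the underlying logic is this simple proportion between two dietary end members.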

This result came as a surprise to many paleoanthropologists. A cowlike hominin? Surely no self-respecting member of our family tree would earn its living eating grass! But it made sense to me. These species debuted just as grasslands were spreading across eastern and southern Africa, and the biospheric buffet table was becoming covered in turf. If P. boisei was grinding grass or sedge products with its big, flat teeth and powerful jaws rather than crushing hard, brittle foods, that should leave exactly the microwear texture pattern my colleagues and I found. Such a diet would also explain why P. boisei wore down its molars so quickly.

You would never know it by just looking at the shapes of their huge, flat teeth, but foodprints suggest that the two Paranthropus species used their specialized anatomy in different and unexpected ways. Like the Kibale mangabeys, P. robustus seems to have had a generalized diet that included some hard objects. But for P. boisei, the relation between teeth and diet seems to have been very different from anything we see in primates today. Big, flat teeth are far from ideal for shredding grass, but one works with what one has. And so long as a grinding platform is better than what hominins had before, it would be selected for even if it is not optimal for the task at hand.

Microwear of our direct ancestors—those in the Homo genus—points to a decidedly different dietary strategy. My colleagues and I have looked at two early species: the more “primitive” Homo habilis, a smaller-brained hominin that retained some features related to life in the trees, and Homo erectus, a larger-brained hominin committed to the ground. Our samples are small because microwear requires pristine teeth, and there are just not that many of them. But they show an interesting pattern. Compared with Australopithecus afarensis, its putative ancestor, and P. boisei, which lived alongside it, H. habilis has a somewhat broader range of microwear textures, from complex pitted surfaces to simple scratched ones. The finding hints that H. habilis ate a wider range of foods than either its predecessors or its contemporaries. Its successor H. erectus has even more variable microwear textures, perhaps suggesting a broader diet still.

These results fit neatly with a leading model of how climate change shaped human evolution that has superseded the savanna hypothesis. Work on climate data from deep ocean cores in the mid-1990s by the late geologist Nicholas Shackleton showed there was more to the story of climate change than the savanna hypothesis supposed. Conditions did become cooler and drier over the long term, but there were also short-term climate swings, and those swings became more and more intense over the course of human evolution.

Whereas Paranthropus boisei (left) specialized in eating grasses or sedges, its contemporary Homo habilis (right) appears to have had a broader diet. Credit: John R. Foster/Science Source

Rick Potts of the Smithsonian Institution reasoned that this unstable climate pattern should favor more versatile species, including hominins—an idea that became known as the variability selection hypothesis. Pleistocene Africa was no place to be a picky eater. For Potts, it was not so much the spread of savanna grasses but the need for flexibility that drove human evolution. In this light, Homo’s larger brain and stone tools for processing a variety of foods make sense. They would have allowed our ancestors to survive increasingly intense environmental swings and to keep up as nature more quickly swapped items on and off the biospheric buffet. The increasing variation in microwear complexity from A. afarensis to H. habilis to H. erectus just might be direct evidence of variability selection.

Potts’s idea has held up pretty well in the two decades since he first presented it, although others have built on it, and new details have emerged about how changes in Earth’s landscapes and in its orbit around the sun have combined to create the conditions under which humans evolved. For example, in 2009 Mark Maslin of University College London and Martin Trauth of the University of Potsdam in Germany suggested that climate swings filled and emptied the spreading lakes in eastern Africa, disrupting life in the rift basins. This flux may have led to fragmentation and dispersal of hominin populations, fueling human evolution. The ability to pursue a more variable diet would have aided survival in such turbulent times.

Appetite and Evolution

Although the available evidence allows us to paint a plausible picture of how early hominins adapted to their changing world, we can do so only with the broadest of brushstrokes. The biggest challenge to understanding how climate change drives evolution is matching specific climate events in the past to changes in the fossil record.

Local environments react to global and even regional climate change in different ways, and our fossil record is simply not complete enough to tell exactly where and when particular species appeared and disappeared. We can be off by 1,000 miles and 100,000 years or more. We might be able to tie the extinction or evolution of a given species to a massive, catastrophic event in Earth’s history, such as the asteroid impact on the Yucatán Peninsula that killed off the dinosaurs 66 million years ago. But the climate-related events we associate with human evolution are very different—repeated cycles of cool-dry conditions followed by warm-wet ones. The fact that hominins were probably flexible species capable of adjusting to a broad range of habitats and the foods available within them further obscures the picture. Our best shot at understanding how hominins responded to changing environments thus lies in the more recent past, in places that are exceptionally well studied.

Research published by Sireen El Zaatari of the University of Tübingen in Germany, Kristin Krueger of Loyola University Chicago and their colleagues over the past two years shows how this approach might work. Their studies of the microwear of Neandertals and the anatomically modern humans that supplanted them in Eurasia allow us to revisit the long-standing mystery of this replacement from a fresh perspective. Neandertals ruled Europe and western Asia between about 400,000 and 40,000 years ago. Then they were gone. Paleoanthropologists have been debating what happened and why for more than a century, and even today there is little consensus.

Although popular science often tells a tale of brutish Neandertals living in near-glacial conditions, swaddled in animal hides and gorging lustfully on mammoth and woolly rhinoceros meat, it was not always like that. Neandertals inhabited a wide range of habitats, from cold, dry steppes to warmer, wetter woodlands, and conditions varied over time and space. Recent studies of their molars show that Neandertals living in more wooded or mixed settings had complex pitted microwear, suggesting that they ate more hard, brittle and perhaps abrasive plant foods. Neandertals that dwelled on the open steppes, in contrast, have less complex molar microwear, which El Zaatari and her colleagues argue reflects a less variable diet composed primarily of soft meat. Krueger, for her part, found differences in incisor microwear between the two groups; she thinks the differences stem from the steppe Neandertals having used their incisors to aid in processing animal hides and the forest Neandertals having eaten a greater variety of foods. Intriguingly, these differences hold whether one considers earlier Neandertals or later ones. It seems that Neandertals were flexible feeders with diets that tracked habitat and the foods available in each.

The pattern is different, though, for anatomically modern people living in Europe during the last ice age. There is not much difference in molar microwear between those from open habitats and those who occupied habitats containing a mix of open and wooded vegetation, whether one considers earlier or later individuals. Perhaps early modern humans were better able to acquire their preferred foods than Neandertals were when faced with environmental change.

Food for Thought

Studies of early human diets bear on what people today should eat to be healthy—though perhaps not in the manner popularly envisioned. “Paleolithic diet” gurus argue that we should eat the kinds of foods our ancestors evolved to eat. Many chronic degenerative diseases have been linked to a mismatch between our diets and the fuels our bodies were “designed” to burn, they contend. And it certainly cannot hurt to remind ourselves every now and again that our distant forebears did not eat corn dogs or milkshakes.

That does not mean that we should look to follow a specific Paleolithic diet, however. Foodprints teach us that early hominin diets varied over time and space and that we most likely evolved to be flexible eaters, driven by ever-changing climates, habitats and food availability. In other words, there was no single ancestral human diet for us to replicate. Dietary versatility allowed our ancestors to spread across the planet and find something to eat on all of Earth’s myriad biospheric buffets. It was the key to our evolutionary success.