During the summer of 1963, when I was six years old, my family traveled from our home in Philadelphia to Los Angeles to visit my maternal relatives. I already knew my grandmother well: she helped my mother care for my twin brothers, who were only 18 months my junior, and me. When she was not with us, my grandmother lived with her mother, whom I met that summer for the first time. I come from a long-lived family. My grandmother was born in 1895 and her mother in the 1860s; both lived almost 100 years. We stayed with the two matriarchs for several weeks. Through their stories, I learned about my roots and where I belonged in a social network spanning four generations. Their reminiscences personally connected me to life at the end of the Civil War and the Reconstruction era and to the challenges my ancestors faced and the ways they persevered.
My story is not unique. Elders play critical roles in human societies around the globe, conveying wisdom and providing social and economic support for the families of their children and larger kin groups. In our modern era, people routinely live long enough to become grandparents. But this was not always the case. When did grandparents become prevalent, and how did their ubiquity affect human evolution?
Research my colleagues and I have been conducting indicates that grandparent-aged individuals became common relatively recently in human prehistory and that this change came at about the same time as cultural shifts toward distinctly modern behaviors—including a dependence on sophisticated symbol-based communication of the kind that underpins art and language. These findings suggest that living to an older age had profound effects on the population sizes, social interactions and genetics of early modern human groups and may explain why they were more successful than archaic humans, such as the Neandertals.
Live fast, die young
The first step in figuring out when grandparents became a fixture in society is assessing the typical age breakdown of past populations—what percentage were children, adults of childbearing age and parents of those younger adults. Reconstructing the demography of ancient populations is tricky business, however. For one thing, whole populations are never preserved in the fossil record. Rather, paleontologists tend to recover fragments of individuals. For another, early humans did not necessarily mature at the same rate as modern humans. In fact, maturation rates differ even among contemporary human populations. But a handful of sites have yielded high enough numbers of human fossils in the same layers of sediment that scientists can confidently assess the age at death of the remains—which is key to understanding the makeup of a prehistoric group.
A rock-shelter located in the town of Krapina in Croatia, about 40 kilometers northwest of the city of Zagreb, is one such site. More than a century ago Croatian paleontologist Dragutin Gorjanović-Kramberger excavated and described the fragmentary remains of perhaps as many as 70 Neandertal individuals there, most of which came from a layer dated to about 130,000 years ago. The large number of fossils found close to one another, the apparently rapid accumulation of the sediments at the site and the fact that some of the remains share distinctive, genetically determined features all indicate that the Krapina bones approximate the remains of a single population of Neandertals. As often happens in the fossil record, the best-preserved remains at Krapina are teeth because the high mineral content of teeth protects them from degradation. Fortunately, teeth are also one of the best skeletal elements for determining age at death, which is achieved by analyzing surface wear and age-related changes in their internal structure.
In 1979, before I began my research into the evolution of grandparents, Milford H. Wolpoff of the University of Michigan published a paper, based on dental remains, that assessed how old the Krapina Neandertals were when they died. Molar teeth erupt sequentially. Using one of the fastest eruption schedules observed in modern-day humans as a guide, Wolpoff estimated that the first, second and third molars of Neandertals erupted at ages that rounded to six, 12 and 15, respectively. Wear from chewing accumulates at a steady pace over a lifetime, so when the second molar emerges, the first already has six years of wear on it, and when the third emerges, the second has three years of wear.
Working backward, one can infer, for instance, that a first molar with 15 years of wear on it belonged to a 21-year-old Neandertal, a second molar with 15 years of wear on it belonged to a 27-year-old and a third molar with 15 years of wear on it belonged to a 30-year-old. (These estimates have an uncertainty of plus or minus one year.) This wear-based seriation method for determining age at death, adapted from a technique developed by dental researcher A.E.W. Miles in 1963, works best on samples with large numbers of juveniles, which Krapina has in abundance. The method loses accuracy when applied to the teeth of elderly individuals, whose tooth crowns can be too worn to evaluate reliably and in some cases may even be totally eroded.
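The arithmetic behind this wear-based seriation can be sketched in a few lines of code. The eruption ages and worked examples below come straight from the figures quoted above; the function itself is an illustrative simplification, not Wolpoff's actual procedure.

```python
# Approximate Neandertal molar eruption ages (years), rounded from
# one of the fastest eruption schedules seen in modern humans:
# first molar ~6, second ~12, third ~15.
ERUPTION_AGE = {1: 6, 2: 12, 3: 15}

def age_at_death(molar: int, wear_years: float) -> float:
    """Estimate age at death from a single molar's accumulated wear.

    Assumes wear accrues at a steady rate from eruption onward, so
    age = eruption age + years of wear (uncertainty roughly ±1 year).
    """
    return ERUPTION_AGE[molar] + wear_years

# The worked examples from the text: 15 years of wear on each molar.
print(age_at_death(1, 15))  # first molar  -> 21
print(age_at_death(2, 15))  # second molar -> 27
print(age_at_death(3, 15))  # third molar  -> 30
```

The same logic explains why the method falters for elderly individuals: once a crown is worn flat, `wear_years` can no longer be read from the tooth surface at all.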
Wolpoff's work indicated that the Krapina Neandertals died young. In 2005, a few years after I began researching the evolution of longevity, I decided to take another look at this sample using a novel approach. I wanted to make sure that we were not missing older individuals as a result of the inherent limitations of wear-based seriation. With Jakov Radovčić of the Croatian Natural History Museum in Zagreb, Steven A. Goldstein, Jeffrey A. Meganck and Dana L. Begun, then all at Michigan, and undergraduate students from Central Michigan University, I worked to develop a new nondestructive method—using high-resolution three-dimensional microcomputed tomography (μCT)—to reassess how old the Krapina individuals were when they died. Specifically, we looked at the degree of development of a type of tissue within the tooth called secondary dentin; the volume of secondary dentin increases with age and provides a way to assess how old an individual was at death when the tooth crown is too worn to be a good indicator.
Our initial findings, supplemented with scans provided by the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, corroborated Wolpoff's results and validated the wear-based seriation method: the Krapina Neandertals had remarkably high mortality rates; no one there survived past age 30. (This is not to say that Neandertals never lived beyond 30. A few individuals from other sites were around 40 when they died.)
By today's standards, the Krapina death pattern is unimaginable. After all, for most people age 30 is the prime of life. And hunter-gatherers lived beyond 30 in the recent past. Yet the Krapina Neandertals are not unique among early humans. The few other human fossil localities with large numbers of individuals preserved, such as the approximately 600,000-year-old Sima de los Huesos site in Atapuerca, Spain, show similar patterns. The Sima de los Huesos people had very high levels of juvenile and young adult mortality, with no one surviving past 35 and very few living even that long. It is possible that catastrophic events or the particular conditions under which the remains became fossilized somehow selected against the preservation of older individuals at these sites. But the broad surveys of the human fossil record that my colleagues and I have conducted indicate that dying young was the rule, not the exception. To paraphrase words attributed to British philosopher Thomas Hobbes, prehistoric life really was nasty, brutish and short.
Rise of the grandparents
This new μCT approach has the potential to provide a high-resolution picture of the ages of older individuals in other fossil human populations. About a decade ago, before we hit on this technique, Sang-Hee Lee of the University of California, Riverside, and I were ready to start looking for evidence of changes in longevity over the course of human evolution. We turned to the best approach available at the time: wear-based seriation.
We faced a daunting challenge, though. Most human fossils do not come from sites, such as Krapina, that preserve so many individuals that the remains can be considered reflective of their larger populations. And the smaller the number of contemporaneous individuals found at a site, the more difficult it is to reliably estimate how old members were when they died because of the statistical uncertainties associated with small samples.
But we realized that we could get at the question of when grandparents started becoming common in another way. Instead of asking how long individuals lived, we asked how many of them lived to be old. That is, rather than focusing on absolute ages, we calculated relative ages and asked what proportion of adults survived to the age at which one could first become a grandparent. Our objective was to evaluate changes over evolutionary time in the ratio of older to younger adults—the so-called OY ratio. Among primates, including humans up until very recently, the third molar erupts at about the same time that an individual becomes an adult and reaches reproductive age. Based on data from Neandertals and contemporary hunter-gatherer populations, we inferred that fossil humans got their third molars and had their first child at around age 15. And we considered double that age to mark the beginning of grandparenthood—just as some women today can potentially give birth at age 15 and those women can become grandmothers when their own children reach age 15 and reproduce.
For our purposes, then, any archaic individual judged to be 30 years old or more qualified as an older adult—one old enough to have become a grandparent. But the beauty of the OY ratio approach is that regardless of whether maturation occurred at 10, 15 or 20 years, the number of older and younger individuals in a sample would be unaffected because the start of older adulthood would change accordingly. And because we were only looking to place the fossils in these two broad categories, we could include large numbers of smaller fossil samples in our analysis without worrying about uncertainties in absolute ages.
We calculated the OY ratios for four large aggregates of fossil samples totaling 768 individuals spanning a period of three million years. One aggregate comprised later australopithecines, who lived in Africa from three million to 1.5 million years ago. Another aggregate consisted of early members of our genus, Homo, from around the globe who lived between two million and 500,000 years ago. The third group was the European Neandertals from 130,000 to 30,000 years ago. And the last consisted of modern Europeans from the early Upper Paleolithic period, who lived between about 30,000 and 20,000 years ago and left behind sophisticated cultural remains.
Although we expected to find increases in longevity over time, we were unprepared for how striking our results would turn out to be. We observed a small trend of increased longevity over time among all samples, but the difference between earlier humans and the modern humans of the Upper Paleolithic was a dramatic fivefold increase in the OY ratio. Thus, for every 10 young adult Neandertals who died between the ages of 15 and 30, there were only four older adults who survived past age 30; in contrast, for every 10 young adults in the European Upper Paleolithic death distribution, there were 20 potential grandparents. Wondering whether the higher numbers of burials at Upper Paleolithic sites might account for the high number of older adults in that sample, we reanalyzed our Upper Paleolithic sample, using only those remains that had not been buried. But we got similar results. The conclusion was inescapable: adult survivorship soared very late in human evolution.
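The OY-ratio comparison itself is simple bookkeeping. A minimal sketch, using the death-distribution counts quoted above (the threshold of 30 years being the doubled age of third-molar eruption):

```python
def oy_ratio(older_adults: int, younger_adults: int) -> float:
    """Ratio of older adults (those who survived past the grandparental
    threshold, ~30 years) to younger adults (those who died between
    roughly 15 and 30)."""
    return older_adults / younger_adults

# Counts per 10 young adult deaths, from the distributions above.
neandertal = oy_ratio(4, 10)          # 0.4
upper_paleolithic = oy_ratio(20, 10)  # 2.0
print(upper_paleolithic / neandertal)  # roughly fivefold increase
```

Because only the two broad age categories enter the calculation, the ratio is insensitive to uncertainty in absolute ages, which is what let us pool many small fossil samples.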
Biology or culture?
Now that Lee and I had established that the number of potential grandparents surged at some point in the evolution of anatomically modern humans, we had another question on our hands: What was it that brought about this change? There were two possibilities. Either longevity was part of the constellation of genetically controlled traits that biologically distinguished anatomically modern humans from their predecessors, or it did not come along with the emergence of modern anatomy and was instead the result of a later shift in behavior. Anatomically modern humans did not burst onto the evolutionary scene making the art and advanced weaponry that define Upper Paleolithic culture. They originated long before those Upper Paleolithic Europeans, more than 100,000 years ago, and for most of that time they and their anatomically archaic contemporaries, the Neandertals, used the same, simpler Middle Paleolithic technology. (Members of both groups appear to have dabbled in making art and sophisticated weapons before the Upper Paleolithic, but these traditions were ephemeral compared with the ubiquitous and enduring ones that characterize that later period.) Although our study indicated that a large increase in grandparents was unique to anatomically modern humans, it alone could not distinguish between the biological explanation and the cultural one, because the modern humans we looked at were both anatomically and behaviorally modern. Could we trace longevity back to earlier anatomically modern humans who were not yet behaviorally modern?
To address this question, Lee and I analyzed Middle Paleolithic humans from sites in western Asia dating to between about 110,000 and 40,000 years ago. Our sample included both Neandertals and modern humans, all associated with the same comparatively simple artifacts. This approach allowed us to compare the OY ratios of two biologically distinct groups (many scholars consider them to be separate species) who lived in the same region and had the same cultural complexity. We found that the Neandertals and modern humans from western Asia had statistically identical OY ratios, ruling out the possibility that a biological shift accounted for the increase in adult survivorship seen in Upper Paleolithic Europeans. Both western Asian groups had roughly even proportions of older and younger adults, putting their OY ratios between those of the Neandertals and early modern humans from Europe.
Compared with European Neandertals, a much larger proportion of western Asian Neandertals (and modern humans) lived to be grandparents. This is not unexpected—the more temperate environment of western Asia would have been far easier to survive in than the harsh ecological conditions of Ice Age Europe. Yet if the more temperate environment of western Asia accounts for the elevated adult survivorship seen in the Middle Paleolithic populations there, the longevity of Upper Paleolithic Europeans is even more impressive. Despite living in much harsher conditions, the Upper Paleolithic Europeans had an OY ratio more than double that of the Middle Paleolithic modern humans.
We do not know exactly what those Upper Paleolithic Europeans started doing culturally that allowed so many more of them to live to older age. But there can be no doubt that this increased adult survivorship itself had far-reaching effects. As Kristen Hawkes of the University of Utah, Hillard Kaplan of the University of New Mexico and others have shown in their studies of several modern-day hunter-gatherer groups, grandparents routinely contribute economic and social resources to their descendants, increasing both the number of offspring their children can have and the survivorship of their grandchildren. Grandparents also reinforce complex social connections—as my grandmother did in telling stories of ancestors that linked me to other relatives in my generation.
Elders transmit other kinds of cultural knowledge, too—from environmental (what kinds of plants are poisonous or where to find water during a drought, for example) to technological (how to weave a basket or knap a stone knife, perhaps). Multigenerational families have more members to hammer home important lessons. Thus, longevity presumably fostered the intergenerational accumulation and transfer of information that encouraged the formation of intricate kinship systems and other social networks.
Increases in longevity would also have translated into increases in population size by adding an age group that was not there in the past and that was still fertile. And large populations are major drivers of new behaviors. In 2009 Adam Powell, then at University College London, and his colleagues published a paper in Science showing that population density figures importantly in the maintenance of cultural complexity. They and many other researchers argue that larger populations promoted the development of extensive trade networks, complex systems of cooperation, and material expressions of individual and group identity (jewelry, body paint, and so on). Viewed in that light, the hallmark features of the Upper Paleolithic look as though they might well have been consequences of swelling population size.
Growing population size would have affected our forebears another way, too: by accelerating the pace of evolution. As John Hawks of the University of Wisconsin–Madison has emphasized, more people mean more mutations and more opportunities for advantageous mutations to sweep through populations as their members reproduce. This trend may have had an even more striking effect on recent humans than on Upper Paleolithic ones, an effect compounded by the dramatic population growth that accompanied the domestication of plants 10,000 years ago.
The relation between adult survivorship and the emergence of sophisticated new cultural traditions was almost certainly a positive feedback process. Initially a by-product of some sort of cultural change, longevity became a prerequisite for the complex behaviors that signal modernity. These innovations in turn promoted the importance and survivorship of older adults, which led to the population expansions that had such profound cultural and genetic effects on our predecessors. Older and wiser, indeed.