I have loved archaeology since middle school and have spent many vacations dragging my wife and kids around the world visiting ancient ruins—from the Anasazi kivas of the American Southwest to the “lost cities” of Machu Picchu and Petra to the big-headed Moai statues towering over Easter Island. Somewhere along the way, medical school and a neurology residency derailed my affair with the subject. But even now I sometimes imagine myself as a brain archaeologist—delicately picking through preserved specimens, cataloguing biological artifacts and trying to align my findings with people's unique histories. I am lucky to have had plenty of opportunity to indulge this daydream. At the Rush Alzheimer's Disease Center in Chicago, where I am director, about 100 scientists are searching for ways to treat and prevent a range of common neurodegenerative disorders. For nearly a quarter of a century I have led two longitudinal investigations—the Religious Orders Study and the Rush Memory and Aging Project—which have enrolled more than 3,350 older adults across the U.S. Our volunteers enter these studies, dementia-free, anywhere from their mid-50s to their 100s and, remarkably, agree to hours of testing each year. They undergo comprehensive physical examinations, detailed interviews, cognitive testing, blood draws and, in some cases, brain scans. Most important, all of them donate their brain after death to our research. The resulting collection fills various cabinets and two “freezer farms”—maintained at −112 degrees Fahrenheit and protected by backup and alarm systems—covering about 4,000 square feet.

To date, we have conducted tens of thousands of clinical evaluations and more than 1,400 autopsies, generating an unprecedented set of data that we share with researchers around the world. Like archaeologists in the field, we sift through the remains in our care in hopes of understanding why some people stay sharp into their second century while others begin to lose their faculties as early as their 60s. We link risk factors and lifestyle choices to cognitive function and the biological footprints of disease. It is time-consuming work—the ultimate test of delayed gratification. You might think the more actual damage we find in the brain, the more cognitive challenges its owner experienced—and this is generally true. But not always. Sometimes, given two people with comparable amounts of brain injury, only one of them will have suffered ill effects.

In fact, it is rare to grow old with a completely healthy brain. Virtually every brain we examine exhibits at least some of the neuron-killing tangles associated with Alzheimer's disease, by far the most common cause of dementia. In about half, we find the scars of a previous stroke, big or small. And in almost a fifth, we discover so-called Lewy bodies—abnormal protein clumps that are the mark of Parkinson's disease and Lewy body dementia. But when we trace these laboratory finds back to each individual's records, we can account for only about half of the cognitive changes we measured on tests of memory, processing speed, and the like. Put another way: the pathology in someone's brain postmortem only partially tells us how well it functioned in the years leading up to the person's death.
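
For readers who want to see what “account for” means here: it is, at bottom, a statement about variance explained. The sketch below shows the general shape of such a calculation; the file name and column names are hypothetical stand-ins, not our actual data or analysis code.

    # Minimal sketch (not the actual Rush pipeline): regress a cognitive outcome
    # on postmortem pathology measures and ask how much variance they explain.
    # The file and column names below are hypothetical placeholders.
    import pandas as pd
    from sklearn.linear_model import LinearRegression

    df = pd.read_csv("autopsy_cohort.csv")  # hypothetical data file
    X = df[["amyloid_plaques", "tau_tangles", "infarcts", "lewy_bodies"]]
    y = df["global_cognition_last_visit"]

    model = LinearRegression().fit(X, y)
    r_squared = model.score(X, y)  # fraction of cognitive variance explained
    print(f"Pathology explains about {r_squared:.0%} of cognitive differences")

An R-squared near 0.5 in a model of this general kind is the statistical counterpart of accounting for only about half of the cognitive changes we measure.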

The big question, of course, is why some people develop symptoms of Alzheimer's dementia and others do not. To a certain extent, genetics comes into play; some people are unlucky enough to inherit high-risk genes associated with the disease. But investigators working with our data have also identified many key lifestyle factors that shape our brain's health into old age. Some—such as a healthy diet—probably help to slow the buildup of toxic materials that can cripple memory and critical thinking. For instance, Rush epidemiologist Martha Clare Morris has found that the so-called MIND diet—which is rich in berries, vegetables, whole grains and nuts—dramatically lowers the risk of developing Alzheimer's. She is now conducting a clinical trial of this diet.

But other life choices seem to actually bolster the brain's ability to cope with the disease, helping it compensate for any loss of mental firing power. In particular, we have found that the more engaged our volunteers stay throughout life—physically, socially and intellectually—the more resilient they are to dementia at its end.

We are beginning to understand exactly where this resilience comes from in some individuals, raising the hope that we might be able to prevent Alzheimer's in many more—or at least delay its onset so that death comes first. From the dawn of humankind until roughly half a century ago, death did typically come first; most of us did not live long enough to worry about neurodegenerative diseases. As life spans have lengthened, however, Alzheimer's has become increasingly prevalent and now affects more than five million Americans older than 65, or roughly one in nine. Diagnoses are forecast to triple by 2050. Our research suggests that we might be able to avert, or at least blunt, this looming crisis. Indeed, there are things we can all do—from childhood to our retirement years—to make our brain less vulnerable to the ravages of aging and disease.

Laying Foundations

Alzheimer's was not always such an urgent matter. My grandmother was born in October 1906, when people had more reason to worry about communicable diseases than age-related ones. A month after her birth, neuropathologist Alois Alzheimer presented a novel case of dementia to a meeting of his colleagues, who were so unimpressed that they did not ask a single question. The patient, a middle-aged woman named Auguste Deter, had not had syphilis, then considered a major cause of dementia. Instead Alzheimer attributed her symptoms to the distinctive hard plaques he had observed at autopsy between the nerve cells of her brain and to the odd tangles of fibers within the cells.

Today we know that these classic features are accumulations of malfunctioning proteins—mostly misfolded fragments of beta-amyloid in the plaques and abnormal tau in the tangles. For several decades after Alzheimer's discovery, though, the disease and its mysterious pathology remained largely forgotten. Then, between 1968 and 1970, neuropathologist Sir Bernard Tomlinson and his colleagues at Newcastle University in England ran a series of elegant studies that led to an important insight: older people without dementia often had plaques and tangles in their brain, too. Those with dementia just had more—and also suffered more strokes. The findings suggested that Alzheimer's disease might be far more common than anyone had realized.

Evidence for this began to accumulate. In April 1976 neurologist Robert Katzman, then at the Albert Einstein College of Medicine, penned a landmark editorial in the American Medical Association's Archives of Neurology in which he declared Alzheimer's disease a “major killer.” The gates had been opened, and money, at first a trickle, began to flow into labs across the country. Between 1984 and 1991 the nascent National Institute on Aging funded 29 dedicated research centers, including our own. From the start, our primary interest was how to prevent Alzheimer's. Such efforts were in their infancy, but we hoped to take an original approach. Rather than limiting our investigation to the connection between potential risk factors and Alzheimer's dementia, as others were doing, we decided also to take into consideration the physical changes associated with aging and disease in the brain itself.

One big challenge was getting our hands on enough brains, especially from people without dementia. It is relatively easy to get organ donations from patients brought to an Alzheimer's clinic by concerned family members. Obtaining brains from healthy older people—who will also need to agree to multiple examinations before death—proves far more difficult. But we knew that nonsymptomatic people were a vital part of the puzzle. In a revealing 1988 study, Katzman performed autopsies on 137 former residents of a nursing facility, roughly half of whom had previously received an Alzheimer's diagnosis. Among the other half, though, he spotted 10 with significant Alzheimer's-related damage in the brain—who were also among the top-scoring residents on tests of cognitive performance. This group, Katzman noted, had higher brain weights and more neurons than the others with similar amounts of pathology. He proposed that maybe these people just had more brain to lose—an idea that sparked our interest in what is now referred to as neural or cognitive reserve.

For around two decades Bennett (1) has led two large longitudinal investigations into Alzheimer's disease. All participants donate their brains after death. The collection, stored in specialized freezers (2), is yielding important clues about how to prevent the disease. Credit: Todd Winters

How many more people like that were there? Could anyone bank this kind of mind-saving surplus? We planned our investigation to find out, taking inspiration from the Nun Study founded in 1986 by epidemiologist David Snowdon, now retired from the University of Kentucky. The Nun Study tracked nearly 700 members of the School Sisters of Notre Dame older than 75—a high percentage of whom donated their brain after death. Our plan was to complement, not copy, the Nun Study. With the help of the Chicago Archdiocese and the late Sister Katie McHugh, we networked with Catholic orders across the country. By 1993 we had secured funding to launch the Religious Orders Study, requiring organ donation for all participants at sign-up. Four years later we received additional funding to start the Rush Memory and Aging Project to study lay retirees.

We deliberately designed our experiment to be free from as many assumptions about aging and Alzheimer's as possible. For instance, there are no inclusion or exclusion criteria other than being old enough and agreeing to organ donation. We ask our participants not only about their diet, sleep and exercise—widely known to affect health and aging—but also about their education, musical training, foreign-language skills, personality, social activities, traumatic experiences, socioeconomic status as children, and more. We analyze how all these variables relate to changes in the brain and symptoms of dementia, ignoring conventional diagnostic labels. We track how people's cognition changes, sometimes improving, but all too often declining. And we note the pace: some individuals run through the disease's course quickly, whereas others decline slowly or not at all. Our key question: How do you get into that latter group?

Your Brain Fights Back

My grandmother lived to be nearly 100 and liked to tell me repeatedly: “Aging isn't for sissies!” Given my professional focus on aging and dementia, she did not have to tell me twice. Clinically, Alzheimer's can be devastating. Over time it robs people of their memories, use of language, attention and independence. I often compare the deepening memory issues to losing pages from a chronological photo album of your life from back to front—with childhood memories the last to go. Ultimately sufferers lose the ability to function on any meaningful level. Mercifully, perhaps, many die from other conditions long before reaching the end stages of the disease. The good news is that as the disease unfolds, the brain fights back. Like all other systems in the body, the brain does not sit idly by, a mere bystander. In fact, it is the most plastic and adaptable of all our organs (which is how you learn in the first place). This plasticity appears to be a large part of what constitutes our reservoir of resilience or cognitive reserve.

To better understand it, we scrutinize the brains of people who seemed to have real cognitive staying power—or declined only slowly—despite the presence of plaques, evidence of stroke or other damage. Like Katzman, we find that such individuals tend to have more neurons—specifically in the locus coeruleus, a blue-tinted region in the brain stem normally involved in our stress and panic responses. The finding makes sense: most Alzheimer's patients eventually lose up to 70 percent of the neurons there. Working with psychiatrist William Honer of the University of British Columbia, we also discovered that slow decliners typically have higher amounts of specific proteins, such as vesicle-associated membrane protein, complexin-I and complexin-II, which help to relay messages across the synapses, or gaps, between brain cells.

Using our samples, neuroscientist Bruce A. Yankner of Harvard University discovered yet another protein that helps to actively preserve our mental abilities. Levels of this protein, called repressor element 1–silencing transcription factor, or REST, are highest in the brains of elderly people who live into their 90s and 100s. Perhaps not surprisingly, Yankner found in animal studies that REST protects neurons from death caused by oxidative stress or beta-amyloid, among other threats. His research shows that better cognition correlates with high levels of REST in the cortex and hippocampus, areas that are normally vulnerable in Alzheimer's. And when the investigators disabled REST in mice, the animals began to show signs of Alzheimer's-like neurodegeneration. Neurologist Aron Buchman in our group at Rush found that higher gene expression of brain-derived neurotrophic factor (BDNF), a protein well known to be involved in neuronal activity and synaptic plasticity, was associated with a slower rate of cognitive decline. Higher BDNF expression also blunted the deleterious impact of Alzheimer's pathology: for a given amount of brain pathology, cognition declined less in people expressing more BDNF, which suggests that the brain ramps up BDNF expression in response to the damage.
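
In statistical terms, this kind of protection is a moderation, or interaction, effect: pathology predicts steeper decline, but how much steeper depends on BDNF expression. A bare-bones sketch of such a model follows; the data file and variable names are invented placeholders, and the real analyses behind these findings are considerably more elaborate.

    # Sketch of an interaction (moderation) model in the spirit of the BDNF
    # finding described above. File and variable names are hypothetical.
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("bdnf_cohort.csv")  # hypothetical data file

    # decline_rate: annual change in global cognition (more negative = faster decline)
    # pathology:    composite measure of Alzheimer's pathology at autopsy
    # bdnf:         BDNF gene expression level
    model = smf.ols("decline_rate ~ pathology * bdnf", data=df).fit()
    print(model.summary())

    # A positive coefficient on the pathology:bdnf interaction term would mean
    # that each unit of pathology costs less cognition when BDNF is higher.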

We and other researchers continue to search for additional biochemical factors that help save our mind as we age—plus other mechanisms that cause it harm. Recently neurologist Julie Schneider in our group at Rush discovered that more than half of the brains in our collection contain abnormal clumps of the protein TDP-43, previously linked to frontotemporal dementia and amyotrophic lateral sclerosis (Lou Gehrig's disease). Nearly 10 percent also bear scar tissue and a major loss of neurons in the hippocampus, a structure critical to memory formation.

Others have observed signs of chronic inflammation in the brains of Alzheimer's patients, possibly supporting theories that tie the disease to infections such as human cytomegalovirus, which psychologist Lisa Barnes in our group confirmed is associated with loss of cognition. And in collaboration with neurologist Steven Arnold, now at Massachusetts General Hospital, we found evidence of an association between Alzheimer's and abnormal insulin signaling in the brain.

This biological complexity has important implications for how we think about the treatment and prevention of this disease. With so many variables involved and likely many more to be discovered, it is not surprising that many risk factors for Alzheimer's dementia are not actually related to Alzheimer's pathology. Working with neurologist Philip De Jager of Brigham and Women's Hospital, we recently examined how more than 25 genomic variants linked to Alzheimer's dementia relate to several different types of abnormalities in the brain. We found that a few correlated with Alzheimer's pathology, but some were associated with other causes of dementia such as stroke, Lewy bodies and scarring in the hippocampus.

This complexity also means that it is exceedingly challenging—if not impossible—to single out meaningful targets for drug therapies. And given the imperfect correlation between brain pathology and cognitive performance, any interventions aimed at these biological processes would not necessarily have a large effect on symptoms. In fact, drug development for treating Alzheimer's has been slow and marked mostly by disappointment.

Building Cognitive Reserve

As researchers continue to untangle the intricate web of disease mechanisms, it makes sense to focus on preventing Alzheimer's in the first place—to apply what we know about strengthening our brain to withstand the hits that come with aging. In our work, we have homed in on a variety of experiences, from childhood through old age, that can help us shore up cognitive reserve. Perhaps one of the most critical early steps toward ensuring better brain health is education—and not just formal schooling but other kinds of learning as well. Cognitive psychologist Fergus Craik and his colleagues have estimated that, on average, bilingualism delays the onset of dementia by about four years. And neuropsychologist Robert Wilson in our group at Rush has found that training in a second language, as well as in music—another form of language—correlates with a slower rate of cognitive decline. Had I only continued with those violin lessons!

That said, the relation between education and cognitive decline is complicated, as statistician Lei Yu in our group at Rush has discovered. In general, cognitive decline does not occur at a steady pace; it begins at one rate, and then, after a certain point, it accelerates. More education shifts this so-called change point later in life, maybe because more learning builds more brainpower to fall back on. Those with fewer years of formal education tend to have lower baseline abilities to start with and hit the change point sooner. Before the change point, both groups lose cognitive skills at roughly the same rate. Interestingly, even though those with more education typically start their descent at a later age, once they reach the change point they decline much faster. Biostatistician Charles Hall of the Albert Einstein College of Medicine has also identified this pattern in analyzing data from the Einstein Aging Study, an investigation of aging in the brain that has tracked a group of Bronx, N.Y., residents for more than three decades.
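
One way to picture a change point is as two straight lines joined where decline accelerates: a long, gentle slope followed by a steep one. The toy example below simply draws two such hypothetical trajectories; every number in it is invented for illustration and is not an estimate from our data or from the Einstein Aging Study.

    # Illustrative piecewise-linear ("change point") trajectories of cognition.
    # All numbers are invented for illustration, not estimates from any study.
    import numpy as np
    import matplotlib.pyplot as plt

    def trajectory(age, change_point, slow_slope, fast_slope, start):
        """Cognition declines at slow_slope before the change point and at the
        steeper fast_slope afterward; start is the baseline level at age 65."""
        pre = np.minimum(age, change_point) - age[0]
        post = np.maximum(age - change_point, 0.0)
        return start + slow_slope * pre + fast_slope * post

    age = np.linspace(65, 95, 200)
    low_ed = trajectory(age, change_point=78, slow_slope=-0.02, fast_slope=-0.15, start=-0.3)
    high_ed = trajectory(age, change_point=84, slow_slope=-0.02, fast_slope=-0.25, start=0.3)

    plt.plot(age, low_ed, label="less education: earlier change point")
    plt.plot(age, high_ed, label="more education: later but steeper decline")
    plt.xlabel("Age (years)")
    plt.ylabel("Global cognition (arbitrary units)")
    plt.legend()
    plt.show()

The two curves share the same gentle slope at first; the better-educated curve starts higher and bends later but falls faster once it does, which is the pattern described above.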

This precipitous downturn among the highly educated supports a theory called compression of morbidity, which James Fries, a professor of medicine at Stanford University, first put forth in 1980. Fries's basic hypothesis is that it is possible and desirable to delay the onset of diseases and compress the number of years someone spends ill and disabled at the end of life. For a disease such as Alzheimer's, being able to compress morbidity is hugely valuable, both emotionally and economically. This disease takes a terrible toll on both patients and family members, who are often put in the role of caregiver, and hiring outside help is costly. Thus, any measure that can give someone even one more year of independent living translates into benefits not only for that person but also for his or her family and the economy.

Among our participants, the more education people have, the shorter, on average, their period of suffering. This trend helps explain a 1995 report by Yaakov Stern of Columbia University, which found that among patients with Alzheimer's dementia, the risk of death was greater for those with more education, presumably because their disease, though it began later, ran its course more quickly.

Education is not directly related to any neuropathology or protective neurobiology measured to date. Instead it seems to mute the effects of advancing disease on people's cognitive skills. The more damage someone has in the brain, the more protection he or she is afforded by extra years of education. This holds true at very high levels of education, as seen in our data, as well as at very low levels of education, as demonstrated by neuropathologist Jose Farfel of the University of São Paulo in Brazil.

Mining the Golden Years

If you don't play the violin or speak another language, don't fret. Early educational experiences are not your only shot at building cognitive reserve. We have also found factors later in life that can buy more years of healthy living. Among them is something commonly called purpose in life, a measure of well-being that refers to our psychological tendency to derive meaning from life's experiences and to have clear intentions and goals.

Neuropsychologist Patricia Boyle in our group at Rush measured this trait in more than 900 participants in the Rush Memory and Aging Project, the majority in their 70s, 80s and 90s, using a scale based on the work of psychologist Carol D. Ryff of the University of Wisconsin–Madison. During up to seven years of follow-up, we discovered that those who scored higher on purpose in life were 2.4 times more likely to have avoided an Alzheimer's diagnosis, compared with those with lower scores. Relatively higher scores were also associated with slower rates of cognitive decline. In a similar analysis, Wilson found that higher levels of conscientiousness—one of the classic “big five” personality traits, characterized by organization, self-discipline, dependability and a drive to achieve—also offered some protection: participants in the Religious Orders Study scoring in the 90th percentile in conscientiousness had an 89 percent reduction in risk for developing Alzheimer's.

Psychological traits aside, other studies show that the size of our social networks can affect how quickly Alzheimer's pathology encroaches on our cognition: among our participants, those with larger social networks are better able to postpone some of the worst symptoms. By social networks, we do not mean Facebook friends or Twitter followers but rather close relatives and friends with whom you can discuss private matters. Our first thought was that perhaps people with large social networks engage in more cognitive, physical and social activities, but controlling for these variables does not affect the association. Instead the protection from larger social networks might reflect in part the type of individuals who can form them. Simply put, they may have better developed people skills—and so a greater reserve of social cognition—to fall back on.

Physician Laura Fratiglioni, a professor at Sweden's Karolinska Institute, was the first to describe the link between social networks and Alzheimer's, in 2000. She based her findings on data from the Kungsholmen Project, a longitudinal population-based study on aging and dementia in Stockholm. Interestingly, she also measured how satisfied people were with their social contacts and found that frequent but unsatisfying interactions with one's children increase dementia risk. (It reminds me of an old Sam Levenson joke: “Insanity is hereditary—you get it from your children!”)

All humor aside, Wilson in our group examined negative social interactions in a 2015 study that followed 529 of our participants. All were symptom-free at the start, and in keeping with Fratiglioni's finding, after an average of nearly five years those who reported more neglect and rejection were more likely to show signs of cognitive impairment.

The central theme behind all these influences is positive engagement. We and many others have found that increases in cognitive, physical and social activities are all associated with a reduced risk of Alzheimer's dementia. Buchman in our Rush group went so far as to periodically place actigraph units (similar to pedometers) on the wrists of nearly 1,000 participants to measure their physical movements—capturing not just formal exercise but any activity, like playing cards or cooking. His results showed that those in the bottom 10 percent of intensity—the people who moved the least—were more than twice as likely to later be diagnosed with Alzheimer's, compared with the most mobile in the study. The implied lesson for us all: keep moving.

Another way to think about engaging with the world is to actually get out there. Epidemiologist Bryan James in our Rush group tested something referred to as life space among nearly 1,300 of our volunteers, none of whom showed signs of dementia at the start of the study. He and his colleagues measured the subjects' range during the previous week: Had they left their bedroom, front porch or yard? Had they ventured out of their neighborhood? Or had they made it farther afield and out of town? After about four years they found that those most constricted to their homes were twice as likely to develop Alzheimer's, compared with those with the largest life spaces—controlling for cognitive, physical and social activity. Is it the motivation to get up and go, or is it what you do when you get there?

We hope that in the years to come, as our collection expands and our means to study it grow ever more sophisticated, we will find many more clues to age-proofing our brain. When I used to visit my grandmother—we called her GG for “great grandmother”—at her retirement facility, she would always ask me, “So, David, still working on the Alzheimer's?”

“Yes, GG,” I would answer. “Still examining old brains trying to figure out what protects us from memory loss.”

She always followed with, “Find anything?”

“Sure,” I'd say, “a little.”

Then she'd lean over, point to a few people in her facility and whisper, “You better hurry up!”

How right she was.


How Did Marge Stay Sharp?

I first met Marjorie Mason Heffernan in January 2003, when I began recruiting participants for the Memory and Aging Project at a retirement community, now called Presence Bethlehem Woods, in La Grange Park, Ill., a 40-minute drive west of Rush University. I am not sure what took us so long to start recruiting there; it is right next door to the Sisters of St. Joseph, our very first Religious Orders site, where we had been testing study participants for a decade.

Roughly a month after she signed up for our study, Marge—as she was known to friends and family—came in for her baseline evaluation. During the first week of March, I sat down to review the results with her. At 79 years old, she was doing great. On the Mini-Mental State Examination (MMSE), the most widely used test of overall cognitive abilities, she had scored a perfect 30. In fact, she performed extremely well on nearly all of the 21 cognitive tests we gave her.

Over the course of seven years Marge proved to be an energetic study participant. She enrolled in a number of substudies—including a brain-imaging study and a behavioral economics and decision-making study. We evaluated her cognition eight times, and she scored a perfect 30 on the MMSE at nearly every visit, falling just short of 30 once at age 80 and dipping to 28 once at age 84. At the end of 2010, Marge died peacefully at home, at age 87, comforted by her son and two nieces.

Like all our study participants, Marge had generously donated her brain for research. At autopsy, it weighed 1,246 grams, pretty much average for women. She had mild, widespread tissue loss, which is typical of Alzheimer's and other neurodegenerative diseases but can also be seen in healthy older brains. Under the microscope, her brain had enough beta-amyloid plaques and tau tangles to meet the pathological criteria for Alzheimer's. There were no signs of infarcts (areas of dead tissue that can indicate stroke) or Lewy bodies (marks of Parkinson's disease and Lewy body dementia). In short, the findings were consistent with moderate Alzheimer's, which raised the question: Why was Marge's cognition so good?

The answer might be found in her life story, which featured many of the factors that our studies indicate can boost cognitive reserve and hold dementia at bay. For one thing, she was well educated—having attended school for 22 years, a lot for a woman born in 1923. Her younger sister, Betty Borman, who joined our study after Marge's death, later relayed that both she and Marge graduated from Chicago Teachers College in the 1940s.

From the data we collected, I knew that Marge was cognitively and socially active. Betty later described her sister as a voracious reader, who could get through a book in a day. She told me that Marge founded a book club and that she and her late husband were involved in a local theater company. Marge also maintained a positive attitude, despite many adversities: she buried two of her three sons and two husbands.

 

Tests of Marge's personality and well-being backed up Betty's description. She had scored high on “purpose in life” and conscientiousness and low on neuroticism, anxiety, depressive symptoms and harm avoidance, a trait that encompasses shyness, excessive worrying and pessimism. Despite having a bad back, Marge was no homebody and scored the maximum possible in “life space”—a measure of one's geographical range—on our survey.

It is interesting to contrast Marge with another of our female participants, Mary.* She also enrolled at age 79 and, like Marge, completed eight annual clinical evaluations before her death at age 87. Mary's MMSE score at baseline was a solid 28 but declined to half that at her final evaluation. She was diagnosed with mild cognitive impairment at age 81, dementia at 84 and Alzheimer's at 85.

At autopsy, Mary's brain weighed 1,088 grams, much smaller than Marge's. And unlike Marge's brain, hers showed scarring from three small infarcts, although she had no history of strokes. But like Marge, Mary had mild tissue loss and enough damage to meet the pathological criteria for Alzheimer's. She actually had less beta-amyloid and fewer tangles than Marge did.

Despite having less Alzheimer's pathology than Marge, Mary suffered from a progressive loss of cognition, resulting in an inability to care for herself by the time of her death. Yes, she had a couple of small infarcts and some beta-amyloid in her blood vessels, and there may have been genetic differences that made Mary more vulnerable. But again, we found clues to her cognitive decline in her life story: Mary had 10 fewer years of education than Marge, having graduated only from high school. She scored low on measures of cognitive activity, purpose in life and life space. And she scored very high on harm avoidance, anxiety, neuroticism and depressive symptoms.

All efforts to develop therapies to prevent Alzheimer's have so far failed, but the comparison of these two women brings into focus the potential protective effects of life habits—ranging from early education to late-life social engagement. Marge and Mary had similar levels of Alzheimer's-related damage, yet their brains functioned at very different levels during their final years of life. —D.A.B.

*Not her real name.


Building a Better Brain as We Age

Based on the results of scores of studies, here are 10 things you can do to reduce the risk of cognitive decline and Alzheimer's dementia:

  1. Pick your parents well! Make sure you get good genes, a good education, a second language and music lessons. Avoid emotional neglect.
  2. Engage in regular cognitive and physical activity.
  3. Strengthen and maintain social ties.
  4. Get out and explore new things.
  5. Chillax and be happy.
  6. Avoid people who are downers, especially close family members!
  7. Be conscientious and diligent.
  8. Spend time engaged in activities that are meaningful and goal-directed.
  9. Be heart-healthy: what's good for the heart is good for the brain.
  10. Eat a MIND diet, with fresh fruit and vegetables and fish.
  11. (For fans of This Is Spinal Tap, our list goes to 11.) Be lucky!