Imagine yourself dead. What picture comes to mind? Your funeral with a casket surrounded by family and friends? Complete darkness and void? In either case, you are still conscious and observing the scene. In reality, you can no more envision what it is like to be dead than you can visualize yourself before you were born. Death is cognitively nonexistent, and yet we know it is real because every one of the 100 billion people who lived before us is gone. As Christopher Hitchens told an audience I was in shortly before his death, “I’m dying, but so are all of you.” Reality check.
In his book Immortality: The Quest to Live Forever and How It Drives Civilization (Crown, 2012), British philosopher and Financial Times essayist Stephen Cave calls this the Mortality Paradox. “Death therefore presents itself as both inevitable and impossible,” Cave suggests. We see it all around us, and yet “it involves the end of consciousness, and we cannot consciously simulate what it is like to not be conscious.”
The attempt to resolve the paradox has led to four immortality narratives. (1) Staying alive: “Like all living systems, we strive to avoid death. The dream of doing so forever—physically, in this world—is the most basic of immortality narratives.” (2) Resurrection: “The belief that, although we must physically die, nonetheless we can physically rise again with the bodies we knew in life.” (3) Soul: the “dream of surviving as some kind of spiritual entity.” (4) Legacy: “More indirect ways of extending ourselves into the future,” such as glory, reputation, historical impact or children.
All four fail to deliver everlasting life. Science is nowhere near reengineering the body to stay alive beyond 120 years. Both religious and scientific forms of resurrecting your body succumb to the Transformation Problem (how could you be reassembled just as you were and yet this time be invulnerable to disease and death?) and the Duplication Problem (how would duplicates be different from twins?). “Even if DigiGod made a perfect copy of you at the end of time,” Cave conjectures, “it would be exactly that: a copy, an entirely new person who just happened to have the same memories and beliefs as you.” The soul hypothesis has been slain by neuroscience, which shows that the mind (the consciousness, memory and personality patterns that constitute “you”) cannot exist without the brain. When the brain dies of injury, stroke or a dementia such as Alzheimer’s, the mind dies with it. No brain, no mind; no body, no soul.
That leaves us with the legacy narrative, about which Woody Allen quipped: “I don’t want to achieve immortality through my work; I want to achieve it by not dying.” Nevertheless, Cave argues that legacy is the driving force behind works of art, music, literature, science, culture, architecture and other artifacts of civilization. How? Through something called terror management theory. Awareness of one’s mortality focuses the mind on creating and producing, staving off the terror of confronting the mortality paradox, a terror that would otherwise, in the words of the theory’s proponents, psychologists Sheldon Solomon, Jeff Greenberg and Tom Pyszczynski, reduce people to “twitching blobs of biological protoplasm completely perfused with anxiety and unable to effectively respond to the demands of their immediate surroundings.”
Maybe, but human behavior is multivariate in its causes, and fear of death is only one of many drivers of creativity and productivity. A baser evolutionary driver is sexual selection, in which organisms from bowerbirds to brainy bohemians engage in the creative production of magnificent works with the express purpose of attracting mates—from blue-bedecked bowerbird bowers to big-brained orchestral music, epic poems, stirring literature and even scientific discoveries. As evolutionary psychologist Geoffrey Miller argues in The Mating Mind (Anchor, 2001), those who do so most effectively leave behind more offspring and thus pass on their creative genes to future generations. As Hitchens once told me, mastering the pen and the podium means never having to dine or sleep alone.