Unsigned and undated, inventory number 779 hangs behind thick glass in the Louvre's brilliantly lit Salle des États. A few minutes after the stroke of nine each morning, except for Tuesdays when the museum remains closed, Parisians and tourists, art lovers and curiosity seekers begin flooding into the room. As their hushed voices blend into a steady hivelike hum, some crane for the best view; others stretch their arms urgently upward, clicking cell-phone cameras. Most, however, tilt forward, a look of rapt wonder on their faces, as they study one of humanity's most celebrated creations: the Mona Lisa, by Leonardo da Vinci.
Completed in the early 16th century, the Mona Lisa possesses a mysterious, otherworldly beauty quite unlike any portrait that came before it. To produce such a painting, Leonardo developed a new artistic technique he called sfumato, or “smoke.” Over a period of several years he applied translucent glazes in delicate films—some no more than the thickness of a red blood cell—to the painting, most likely with the sensitive tip of his finger. Gradually stacking as many as 30 of these films one on top of another, Leonardo subtly softened lines and color gradations until it seemed as if the entire composition lay behind a veil of smoke.
The Mona Lisa is clearly a work of inventive genius, a masterpiece that stands alongside the music of Mozart, the jewels of Fabergé, the choreography of Martha Graham, and other such classics. But these renowned works are only the grandest manifestations of a trait that has long seemed part of our human hardwiring: the ability to create something new and desirable, the knack of continually improving designs and technologies—from the latest zero-emissions cars made in Japan to the sleekly engineered Falcon 9 rockets from SpaceX. Modern humans, says Christopher S. Henshilwood, an archaeologist at the Universities of Bergen in Norway and the Witwatersrand, Johannesburg, in South Africa, “are inventors of note. We advance and experiment with technology constantly.”
Just how we came by this seemingly infinite capacity to create is the subject of intense scientific study: we were not always such whirlwinds of invention. Although our human lineage emerged in Africa around six million years ago, early family members left behind little visible record of innovation for nearly 3.4 million years, suggesting that they obtained plant and animal foods by hand, with tools such as digging or jabbing sticks that did not preserve. Then, at some point, wandering hominins started flaking water-worn cobblestones with hammerstones to produce cutting tools. That was an act of astonishing ingenuity, to be sure, but a long plateau followed—during which very little seems to have happened on the creativity front. Our early ancestors apparently knapped the same style of handheld, multipurpose hand ax for 1.6 million years, with only minor tweaks to the template. “Those tools are really kind of stereotypical,” says Sally McBrearty, an archaeologist at the University of Connecticut.
So when did the human mind begin churning with new ideas for technology and art? Until recently, most researchers pointed to the start of the Upper Paleolithic period 40,000 years ago, when Homo sapiens embarked on what seemed a sudden, wondrous invention spree in Europe: fashioning shell-bead necklaces, adorning cave walls with geometric signs and paintings of Ice Age animals, and carving and knapping a wide variety of new stone and bone tools. The finds prompted a popular theory proposing that a random genetic mutation at around that time had spurred a sudden leap in human cognition, igniting a creative “big bang.”
New evidence, however, has cast grave doubt on the mutation theory. Over the past decade or so archaeologists have uncovered far older evidence of art and advanced technology, suggesting that the human capacity to cook up new ideas evolved much earlier than previously thought—even before the emergence of H. sapiens 200,000 years ago. Yet although our capacity for creativity sparked early on, it then smoldered for millennia before finally catching fire in our species in Africa and Europe. The evidence seems to indicate that our power of innovation did not burst into existence fully formed late in our evolutionary history but rather gained steam over hundreds of thousands of years, fueled by a complex mix of biological and social factors.
Exactly when did humankind begin thinking outside the box, and what factors converged to ultimately fan our brilliant creative fire? Understanding this scenario requires following a detective story composed of several strands of evidence, starting with the one showing that the biological roots of our creativity date back much further than scientists once thought.
Mother of Invention
Archaeologists have long viewed the use of symbols as the single most important indicator of modern human cognition, in large part because it attests to a capacity for language—a hallmark human trait. Thus, the geometric signs and the spectacular cave art of the Upper Paleolithic clearly signal the presence of people who thought as we do. But more recently, experts have begun searching for hints of other kinds of modern behavior and its antecedents in the archaeological record—and coming up with fascinating clues.
Archaeologist Lyn Wadley of the University of the Witwatersrand, Johannesburg, has spent much of her career studying ancient cognition, research that led her in the 1990s to open excavations at Sibudu Cave, some 40 kilometers north of Durban, South Africa. There, about five years ago, she and her team discovered a layer of strange, white, fibrous plant material. To Wadley, the pale, brittle mash looked like ancient bedding—rushes and other plants that later people often scattered on the ground for sitting and sleeping on. But the layer could also have formed from wind-borne leaf litter. The only way to tell one from the other was to encase the entire layer in a protective plaster jacket and take it back to the laboratory. “It took us three weeks to make all that plaster,” Wadley recounts, “and I was really grumpy the whole time. I kept wondering, ‘Am I wasting three weeks in the field?’”
But Wadley's gamble paid off richly. In 2011 she and her colleagues reported in Science that Sibudu's occupants selected leaves from just one of many woody species in the area to make bedding 77,000 years ago—nearly 50,000 years earlier than previously reported examples. What most surprised Wadley, however, was the occupants' sophisticated knowledge of the local vegetation. Analysis showed that the chosen leaves came from Cryptocarya woodii, a tree containing traces of natural insecticides and larvicides effective against the mosquitoes that carry deadly disease today. “And that's very handy to have in your bedding, particularly if you live near a river,” Wadley observes.
The creative minds at Sibudu did not stop there, however. They most likely devised snares to capture small antelopes, whose remains litter the site, and crafted bows and arrows to bring down more dangerous prey, judging from the sizes, shapes and wear patterns of several stone points from the cave. Moreover, Sibudu's hunters concocted various valuable new chemical compounds. By shooting a high-energy beam of charged particles at dark residues on stone points from the cave, Wadley's team detected multi-ingredient glues that once fastened the points to wood hafts. She and her colleagues then set about experimentally replicating these adhesives, mixing ocher particles of different sizes with plant gums and heating the mixtures over wood fires. Publishing the results in Science, the team concluded that Sibudu's occupants were very likely “competent chemists, alchemists and pyrotechnologists” by 70,000 years ago.
Elsewhere in southern Africa, researchers have recently turned up traces of many other early inventions. The hunter-gatherers who inhabited Blombos Cave between 100,000 and 72,000 years ago, for example, engraved patterns on chunks of ocher; fashioned bone awls, perhaps for tailoring hide clothing; adorned themselves with strands of shimmering shell beads; and created an artists' studio where they ground red ocher and stored it in the earliest known containers, made from abalone shells. Farther west, at the site of Pinnacle Point, people engineered the stone they worked with 164,000 years ago, heating a low-grade, local rock known as silcrete over a controlled fire to transform it into a lustrous, easily knappable material. “We are seeing behaviors that we didn't even dream about 10 years ago,” Henshilwood remarks.
Moreover, technological ingenuity was not the sole preserve of modern humans: other hominins possessed a creative streak, too. In northern Italy a research team headed by University of Florence archaeologist Paul Peter Anthony Mazza discovered that our near kin, the Neandertals, who first emerged in Europe some 300,000 years ago, concocted a birch bark–tar glue to fasten stone flakes to wood handles, fabricating hafted tools some 200,000 years ago. Likewise, a study published in 2012 in Science concluded that stone points from the site of Kathu Pan 1 in South Africa once formed the lethal tips of 500,000-year-old spears, presumably belonging to Homo heidelbergensis, the last common ancestor of Neandertals and H. sapiens. And at Wonderwerk Cave in South Africa, an ancient layer containing plant ash and bits of burned bone suggests that an even earlier hominin, Homo erectus, learned to kindle fires for warmth and protection from predators as early as one million years ago.
Even our very distant ancestors were capable on occasion of coining new ideas. At two sites near the Kada Gona River in Ethiopia, a team led by paleoanthropologist Sileshi Semaw of Indiana University Bloomington found stone tools—2.6-million-year-old choppers knapped by Australopithecus garhi or one of its contemporaries, likely for stripping meat from animal carcasses. Such tools look crude to us, a far cry from the smartphones, laptops and tablets of today. “But when the world consisted solely of naturally formed objects, the capacity to imagine something and turn it into a reality may well have seemed almost magical,” wrote cognitive scientist Liane Gabora of the University of British Columbia and psychologist Scott Barry Kaufman, now at the University of Pennsylvania, in a chapter appearing in The Cambridge Handbook of Creativity (Cambridge University Press, 2010).
Cognition and Creation
Yet impressive as these early flashes of creativity are, the great disparity in the depth and breadth of innovation between modern humans and our distant forebears demands an explanation. What changes in the brain set our kind apart from our predecessors? By poring over three-dimensional scans of ancient hominin braincases and by examining the brains of our nearest living evolutionary kin—chimpanzees and bonobos, whose ancestors branched off from our lineage some six million years ago—researchers are beginning to unlock this puzzle. Their data show just how extensively human gray matter evolved over time.
Generally speaking, natural selection favored large brains in humans. Whereas our australopithecine kin possessed an estimated mean cranial capacity of 450 cubic centimeters, roughly that of some chimpanzees, H. erectus more than doubled that capacity by 1.6 million years ago, with a mean of 930 cubic centimeters. And by 100,000 years ago H. sapiens had a mean capacity of 1,330 cubic centimeters. Inside this spacious braincase, an estimated 100 billion neurons processed information and transmitted it along nearly 165,000 kilometers of myelinated nerve fibers and across some 0.15 quadrillion synapses. “And if you look at what this correlates with in the archaeological record,” says Dean Falk, a paleoneurologist at Florida State University, “there does seem to be an association between brain size and technology or intellectual productivity.”
But size was not the only major change over time. At the University of California, San Diego, biological anthropologist Katerina Semendeferi studies a part of the brain known as the prefrontal cortex, which appears to orchestrate thought and action to accomplish goals. Examining this region in modern humans and in both chimpanzees and bonobos, Semendeferi and her colleagues discovered that several key subareas underwent a major reorganization during hominin evolution. Brodmann area 10, for example—which is implicated in bringing plans to fruition and organizing sensory input—nearly doubled in volume after chimpanzees and bonobos branched off from our human lineage. Moreover, the horizontal spaces between neurons in this subarea widened by nearly 50 percent, creating more room for axons and dendrites. “This means that you can have more complicated connections and ones that go farther away, so you can get more complex and more synthetic communication between neurons,” Falk comments.
Pinpointing just how a bigger, reorganized brain spurred creativity is a tricky business. But Gabora thinks that psychological studies of creative people today supply a key clue. Such individuals are excellent woolgatherers, she explains. When tackling a problem, they first let their minds wander, allowing one memory or thought to spontaneously conjure up another. This free association encourages analogies and gives rise to thoughts that break out of the box. Then, as these individuals settle on a vague idea for a solution, they switch to a more analytic mode of thought. “They zero in on only the most relevant properties,” Gabora says, and they start refining an idea to make it workable.
In all likelihood, Gabora notes, a bigger brain led to a greater ability to free-associate. More stimuli could be encoded in a brain made up of many billions of neurons. In addition, more neurons could participate in the encoding of a particular episode, leading to a finer-grained memory and more potential routes for associating one stimulus with another. Imagine, Gabora says, that a hominin brushes against a spiny shrub and sharp thorns tear its flesh. An australopithecine might encode this episode very simply—as a minor pain and as an identifiable feature of the shrub. But H. erectus, with its larger assembly of neurons, could conceivably encode many aspects of the episode. Then, when this hominin begins hunting, its need to kill prey might activate all memory locations encoding torn flesh, bringing to mind the encounter with the sharp pointed thorns. That memory, in turn, could inspire a fresh idea for a weapon: a spear with a sharp pointed tip.
But large-brained hominins could not afford to linger too long in an associative state in which one thing immediately reminded them of a flood of other things, both important and inconsequential. Their survival depended mostly on analytic thought—the default mode. So our ancestors had to develop a way of switching smoothly from one mode to another by subtly altering concentrations of dopamine and other neurotransmitters.
Gabora now hypothesizes that H. sapiens needed tens of thousands of years to fine-tune this mechanism before they could reap the full creative benefit of their large brains, and she and her students are testing these ideas on an artificial neural network. Through a computer model, they simulate the brain's ability to switch between the analytic and associative modes to see how such switching could help someone break out of a cognitive rut and see things in a new way. “Just having more neurons isn't enough,” Gabora asserts. “You have to be able to make use of all that extra gray matter.” Once that final piece of the biological puzzle fell into place—perhaps a little more than 100,000 years ago—the ancestral mind was a virtual tinderbox, awaiting the right social circumstances to burst into flame.
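To make the two modes concrete, here is a minimal toy sketch in Python. It is not Gabora's actual model; the memory vectors, the `activation` function and the "focus" parameter (a hypothetical stand-in for neurotransmitter-driven mode switching) are all assumptions made for illustration. With low focus, activation spreads thinly across many loosely related memories, as in associative thought; with high focus, it concentrates on the single best match, as in analytic thought.

```python
# Toy illustration only: memories are random feature vectors, and a cue
# activates them in proportion to similarity. A single "focus" parameter
# sharpens or flattens that activation, standing in (hypothetically) for
# the mode switch described in the text.
import numpy as np

rng = np.random.default_rng(0)

n_memories, n_features = 50, 8
memories = rng.normal(size=(n_memories, n_features))   # stored episodes
cue = rng.normal(size=n_features)                      # current problem or stimulus

def activation(cue, memories, focus):
    """Softmax over cue-memory similarity.
    Low focus -> flat, associative activation touching many memories.
    High focus -> sharp, analytic activation on the best match."""
    similarity = memories @ cue
    weights = np.exp(focus * (similarity - similarity.max()))
    return weights / weights.sum()

for focus, label in [(0.2, "associative"), (5.0, "analytic")]:
    act = activation(cue, memories, focus)
    # count memories receiving a non-negligible share of activation
    recruited = int((act > 1.0 / (2 * n_memories)).sum())
    print(f"{label:>12} mode: {recruited} memories recruited, "
          f"top share = {act.max():.2f}")
```

Run as written, the sketch recruits dozens of memories in the low-focus setting and essentially one in the high-focus setting, a crude analogue of drifting free association versus zeroing in on the most relevant properties.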
Building on Brilliance
In the autumn of 1987 two researchers, both then at the University of Zurich—Christophe and Hedwige Boesch—observed a behavior they had never seen before in a group of chimpanzees foraging for food in Tai National Park in Ivory Coast. Near a ground nest belonging to a species of driver ants, a female stopped and picked up a twig. She dipped one end into the loose soil covering the nest's entrance and waited for the colony's soldier ants to attack. When the dark swarm had advanced nearly 10 centimeters up the twig, the female chimpanzee plucked it from the nest and deftly rolled it toward her mouth, snacking on the ants. She then repeated the process until she had eaten her fill.
Chimpanzees are highly adept at using a wide range of tools—cracking open nuts with stones, sponging up water from tree hollows with leaves and unearthing nutritious plant roots with digging sticks. But they seem unable to build on this knowledge or to craft ever more advanced technology. “Chimps can show other chimps how to hunt termites,” Henshilwood says, “but they don't improve on it, they don't say, ‘Let's do it with a different kind of probe’—they just do the same thing over and over.” Modern humans, in contrast, suffer from no such limitations. Indeed, we daily take the ideas of others and put our own twist on them, adding one modification after another, until we end up with something new and very complex. No one individual, for example, came up with all the intricate technology embedded in a laptop computer: such technological achievements arise from the creative insights of generations of inventors.
Anthropologists call this knack of ours cultural ratcheting. It requires, first and foremost, the ability to pass on knowledge from one individual to another or from one generation to the next, until someone comes along with an idea for an improvement.
A 2012 study published in Science by Lewis Dean, a behavioral primatologist at the University of St. Andrews in Scotland, and four colleagues revealed why human beings can do this and chimpanzees and capuchin monkeys cannot. Dean and his team designed an experimental puzzle box with three sequential, incrementally difficult levels and presented it to groups of chimps in Texas, capuchin monkeys in France and nursery schoolchildren in England. Only one of the 55 nonhuman primates—a chimp—reached the highest level after more than 30 hours of trying. The children, however, fared far better. Unlike the groups of monkeys, the children worked collaboratively—talking among themselves, offering encouragement and showing one another the right way to do things. After two and a half hours, 15 of the 35 children had reached level three.
Equipped with these social skills and cognitive abilities, our ancestors could readily transmit knowledge to others—a key prerequisite for cultural ratcheting. Yet something else was needed to propel the ratcheting process and push H. sapiens to new creative heights in Africa some 90,000 to 60,000 years ago and in Europe 40,000 years ago. Mark Thomas, an evolutionary geneticist at University College London, thinks this push came from demography. His premise is simple. The larger a hunter-gatherer group is, the greater the chances are that one member will dream up an idea that could advance a technology. Moreover, individuals in a large group who frequently rubbed shoulders with neighbors had a better chance of learning a new innovation than those in small, isolated groups. “It's not how smart you are,” Thomas says. “It's how well connected you are.”
To test these ideas, Thomas and two colleagues developed a computer model to simulate the effects of demography on the ratcheting process. With genetic data from modern Europeans, the team estimated the size of modern human populations in Europe at the beginning of the Upper Paleolithic, when evidence of human creativity started to spike, and calculated the population density. Then the researchers examined African populations over time, simulating their growth and patterns of migratory activity. Their model showed that African populations reached the same density as the early Upper Paleolithic Europeans around 101,000 years ago, just before innovation began to take off in sub-Saharan regions, according to the archaeological record. It also showed that large social networks actively spur human creativity.
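The gist of that premise can be sketched in a toy simulation, shown below in Python. This is not the model Thomas's team actually ran; the band sizes, contact rates, innovation probability and the `simulate` function are illustrative assumptions. Each generation, every member of a band has a small chance of improving the best technique the band knows, and bands that contact more neighbors pick up one another's advances, so larger, better-connected populations should ratchet faster.

```python
# Toy illustration only (not Thomas's model): cultural ratcheting in bands of
# hunter-gatherers. More members means more chances to innovate; more contacts
# means innovations spread instead of being lost.
import random

def simulate(n_bands, band_size, contacts_per_gen, generations=500,
             p_innovate=0.001, seed=1):
    random.seed(seed)
    skill = [1.0] * n_bands                     # best technique each band holds
    for _ in range(generations):
        for b in range(n_bands):
            # each member independently may improve the band's best technique
            for _ in range(band_size):
                if random.random() < p_innovate:
                    skill[b] *= 1.05
            # learn from a few randomly contacted neighboring bands
            for _ in range(contacts_per_gen):
                other = random.randrange(n_bands)
                skill[b] = max(skill[b], skill[other])
    return sum(skill) / n_bands

small_isolated = simulate(n_bands=10, band_size=20, contacts_per_gen=0)
large_connected = simulate(n_bands=10, band_size=100, contacts_per_gen=3)
print(f"mean skill, small isolated bands:  {small_isolated:.2f}")
print(f"mean skill, large connected bands: {large_connected:.2f}")
```

Under these assumed parameters, the larger, better-connected population accumulates markedly higher skill, echoing Thomas's point that connectedness, not just individual smarts, drives the ratchet.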
Archaeological evidence published in 2012 in Nature sheds light on the tech renaissance that followed the rise of population density in southern Africa. Some 71,000 years ago at Pinnacle Point, H. sapiens devised and passed down to others a complex technological recipe to make lightweight stone blades for projectile weapons—cooking silcrete to a specific temperature to improve its flaking qualities, knapping the finished material into blades little more than a couple of centimeters long, and mounting them on wood or bone shafts with homemade glue. “Like viruses,” note archaeologists Fiona Coward of Bournemouth University and Matt Grove of the University of Liverpool in England in a paper published in 2011 in PaleoAnthropology, “cultural innovations need very particular social conditions to spread—most notably… large connected populations who can ‘infect’ one another.”
Which brings us to the jostling, teeming, intimately linked world we live in today. Never before have humans crowded together in such massive cities, accessing vast realms of knowledge with a click of the keyboard and sharing new concepts, new blueprints and designs across the sprawling social networks of the World Wide Web. And never before has the pace of innovation accelerated so dramatically, filling our lives with new fashions, new electronics, new cars, new music, new architecture.
Half a millennium after Leonardo da Vinci conceived of his most celebrated work, we marvel at his inventive genius—a genius built on the countless ideas and inventions of a lineage of artists stretching back into the Paleolithic past. And even now a new crop of artists gaze at the Mona Lisa with an eye to turning it into something fresh and dazzlingly creative. The human chain of invention remains unbroken, and in our superbly connected world, our singular talent to create races on ahead of us.