We are always telling stories about the world, the universe, ourselves. It helps to make sense of things. But sometimes, through familiarity or neglect, we get lost. We forget where a story really starts, losing sight of where it’s headed. What is biodiversity? Are electric cars new? Even the well-worn tale of human origins is missing a key chapter: how a small band of hunter-gatherers survived a climate disaster, becoming ancestors of us all. Here we provide the surprising origins of some strange and familiar things.
All In The Family
What persuaded the male hominid to stick around after mating?
From the standpoint of biology, males have nothing to do after copulation. “It’s literally wham-bam thank-you-ma’am,” says Kermyt G. Anderson, an anthropologist at the University of Oklahoma–Norman and co-author of Fatherhood: Evolution and Human Paternal Behavior.
What made the first father stick around afterward? He was needed. At some point in the six million years since the human lineage split from chimpanzees, babies got to be too expensive, in terms of care, for a single mother to raise. A chimp can feed itself at age four, but humans come out of the womb essentially premature and remain dependent on their parents for many years longer. Hunters in Amazonian tribes cannot survive on their own until age 18, according to anthropologist Hillard Kaplan of the University of New Mexico–Albuquerque. Their skills peak in their 30s—not unlike income profiles of modern men and women.
Oddly enough, bird families also tend to have stay-at-home dads. In more than 90 percent of bird species, both parents share the care of their young. This arrangement probably began, at least for most birds, when males started staying around the nests to protect helpless babies from predators. “A flightless bird sitting on a nest is a very vulnerable creature,” explains evolutionary biologist Richard O. Prum of Yale University.
Some birds, though, might have inherited their particular form of fatherhood from dinosaurs. Male theropods, close relatives of birds, seem to have done all the nest building, just as male ostriches do today. That doesn’t mean everything was on the up and up.
A female ostrich will lay an egg in the nest of her mate, but usually a different male fertilizes it. “There’s a loose relationship,” Prum says, “between paternal care and paternity.”
—Brendan Borrell
Cheese Story
Swiss dairy farmers created an American institution
These days most Swiss cheese consumed in the U.S. is made in Ohio, but our palates—and ham sandwiches—ultimately have that tiny European country to thank. More specifically, the cheese, which only Americans refer to by its generic name, owes much of its success to the Alpine climate and terrain. Swiss cheese is so easy to slice and keeps for such long periods because Swiss farmers of yore had so much trouble selling the product during the brutal winter months.
Hard, mild cheeses similar to the Swiss cheese we know today were first produced in Switzerland and surrounding areas more than 2,000 years ago, according to food historian Andrew Dalby. Because it was difficult for farmers to traverse the mountains in the winters to sell their wares, they may have opted against soft, fresh versions in favor of hard ones, which “securely keep for a good long time,” he says.
Those hard Swiss cheeses also had other redeeming characteristics, including a mild, nutty flavor and a useful texture for cooking, which gave them broad appeal. The American Swiss cheese industry got its start in 1845, after 27 Swiss families immigrated to Wisconsin. The characteristic holes—cheese makers call them “eyes”—arise from inconsistent pressing during production and have historically been a sign of imperfection. “You can read medieval or early modern descriptions of cheese making in which you are specifically instructed to avoid this,” Dalby says. But now “it has become almost a trademark.”
—Melinda Wenner Moyer
Electronic Pathogens
The first computer virus spawned an arms race in software
Malware, the menagerie of malicious software that includes Trojan horses and worms, first made its appearance in the early 1970s, before personal computers had entered the public consciousness. A self-replicating program called Creeper infected the ARPANET, the forerunner of the Internet. This virus was not malicious—it simply printed on a screen, “I’m the creeper, catch me if you can!”—but it triggered the first antivirus program, Reaper, which removed it.
Viruses went public in a big way with the proliferation of the personal computer during the 1980s. The first PC virus, Elk Cloner, appeared in 1982 and infected early Apple II computers. In 1986 a virus called Brain emerged on PCs that booted up with Microsoft’s disk operating system, spreading via floppy disks.
—Mike May
Before Mickey Mouse
The inspiration for today’s animated pictures began long ago with dreams and toys
Each time a photon hits light receptors on the retina, it triggers a Rube-Goldbergian chemical reaction that takes tens of milliseconds to reset. We don’t notice this interruption—our brains smooth it over into an apparently fluid stream of visual information—but the delay provided just the opening animators like Walt Disney needed.
Animators, of course, were not the first to notice this perceptual quirk, often called persistence of vision. Aristotle found that when he stared at the sun, the burned-in image faded away slowly. Roman poet Titus Lucretius Carus described a dream in which a sequence of images presented rapidly before him produced the illusion of motion. By then the Chinese had invented the chao hua chih kuan (“the pipe that makes fantasies appear”), a cylindrical contraption that, when spun in the wind, displayed a succession of images. It gave “an impression of movement of animals or men,” writes Joseph Needham in Science and Civilisation in China.
In the 19th century Europeans developed their own animated pictures in the form of spinning disks and zoetropes featuring sequential drawings visible through a slit, says Donald Crafton of the University of Notre Dame, author of Before Mickey. The first animated film, Fantasmagorie, came out in 1908, depicting the decapitation of a clown and other slapstick in a series of 700 drawings, which took two minutes to show. It was a visual tour de force, though choppy by today’s exacting standards.
Science didn’t catch up to the animators until 1912, when Max Wertheimer, in Experimental Studies on the Seeing of Motion, revealed that it takes 25 frames per second to fool the human eye. It’s a good thing people don’t have the vision of fruit flies, which need more than 200 frames per second to succumb to the illusion of motion.
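A quick back-of-the-envelope calculation (ours, not Wertheimer’s) ties the numbers in this story together: at 25 frames per second, each image occupies the screen for

$$\frac{1}{25\ \text{frames per second}} = 0.04\ \text{s} = 40\ \text{ms},$$

comfortably within the tens of milliseconds the retina needs to reset, whereas a fruit fly’s threshold of more than 200 frames per second demands gaps shorter than

$$\frac{1}{200\ \text{frames per second}} = 5\ \text{ms}.$$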
—Brendan Borrell
Is Sex Really Necessary?
Most living things do it, but nobody knows why
Approximately two billion years ago a pair of single-celled organisms made a terrible mistake—they had sex. We’re still living with the consequences. Sexual reproduction is the preferred method for an overwhelming portion of the planet’s species, and yet from the standpoint of evolution it leaves much to be desired. Finding and wooing a prospective mate takes time and energy that could be better spent directly on one’s offspring. And having sex is not necessarily the best way for a species to attain Darwinian fitness. If the evolutionary goal of each individual is to get as many genes into the next generation as possible, it would be simpler and easier to just make a clone.
The truth is, nobody really knows why people—and other animals, plants and fungi—prefer sex to, say, budding. Stephen C. Stearns, an evolutionary biologist at Yale University, says scientists now actively discuss more than 40 different theories on why sex is so popular. Each has its shortcomings, but the current front-runner seems to be the Red Queen hypothesis. It gets its name from a race in Lewis Carroll’s Through the Looking Glass. Just as Alice has to keep running to stay in the same place, organisms have to keep changing their genetic makeup to stay one step ahead of parasites. Sexual reproduction allows them to shuffle their genetic deck with each generation.
That’s not to say that sex is forever. When it comes to reproduction, evolution is a two-way street. When resources and mates are scarce, many kinds of animals have been known to revert to reproducing asexually. In May 2006 Flora, a Komodo dragon living in an English zoo, laid 11 eggs, even though she had had no contact with males. Virgin births are the norm for the flower-pot snake, a female-only creature that has spread throughout the world, one individual at a time. Mammals, including humans, appear to have been denied the cloning option, however. Our lives seem fated to include plenty of sex, in good times and in bad.
—Brendan Borrell
On the Parasite’s Trail
Scientists have traced malaria to its first human victims a mere 10,000 years ago
For more than a century researchers have been trying to figure out how malaria first arose in humans. The question is urgent, because more than two million people die every year from Plasmodium, the malaria parasite, and understanding its origins might one day lend clues to its complex biology. A piece of the puzzle fell into place in September 2009, when a team of researchers discovered that the main strain that infects human beings—P. falciparum—evolved from another version of the parasite, P. reichenowi, which currently infects chimpanzees. And it happened a mere 10,000 years ago—a moment in evolutionary terms.
The finding rests on a molecular comparison of the genomes of the two parasites. Stephen Rich, an evolutionary geneticist at the University of Massachusetts Amherst, and his colleagues measured the diversity of the genomes, a rough proxy for age (genetic variation accumulates over time). Reichenowi’s genome can be 20 times more diverse than falciparum’s, which suggests that reichenowi is the much older lineage. “It seems that malaria has been in chimps as long as they’ve been chimps,” Rich says.
Following the trail back to the origin of reichenowi is a more complicated problem, not least because malaria is so widespread. “In terrestrial vertebrates, we find it virtually everywhere we look,” Rich observes. “We’re only getting started.”
—Mike May
Snap, Crackle, Bang
The ancient Chinese invented fireworks to scare off 10-foot-tall mountain men
That raucous rite of summer—the fireworks display—may have started as a scholarly tradition in ancient China. Before the Chinese got around to inventing paper in the second century A.D., scribes, using a stylus, would etch ideograms on the rounded surface of green bamboo stalks. The medium served as a way of recording transactions and stories. As the stalks dried over the fire, air pockets in them would often burst with a loud cracking noise.
The noise, of course, gradually became the whole point of the exercise. The classic I Ching, or Book of Changes, explains how the cracks and pops succeeded in scaring off the Shan Shan, 10-foot-tall mountain men. Later, the Chinese spiced things up by adding gunpowder to the stalks.
The first fireworks display didn’t take place until the 12th century rolled around. In 1267 English philosopher Roger Bacon wrote about “that toy of children” and the “horrible sound” it produces, which “exceeds the roar of sharp thunder.” Those sharp bangs evoke nothing so much as, yes, another Fourth of July celebration.
—Mike May
Thorny Fence
The invention of barbed wire was a huge commercial success—and the subject of furious legal battles
At some point in the history of civilization, shepherding gave way to farming. That created a need for some way of keeping cows and pigs from wandering freely through the meadows. The fence was born. Wooden fences were among the earliest, but they are expensive and time-consuming to build. By 1870 smooth cable was easy to get hold of and came into wide use on ranches. Cattle would rub their backs on the wire, and sometimes one would slip through. Eventually the herds caught on.
That got Michael Kelly, an inventor from New York City, wondering how he might make the wire less comfortable as a bovine back scratcher. He got the idea to twist bits of sharp pointed wire onto ordinary cable, and in 1868 he patented his “thorny fence.” It was a big success—and a magnet for lawsuits. “Almost overnight it developed into a source of wealth and furious litigation colored by impassioned charges and countercharges of patent infringement and greed,” says historian Robert T. Clifton.
Joseph Glidden of DeKalb, Ill., also hit legal snags over an improved wire that used two strands to lock the barbs in place. In 1892 his case went before the U.S. Supreme Court, which ruled in his favor, making him the undisputed father of an invention that more than any other marked the closing of the West’s open range.
—Mike May
Scrubs
A rise in maternity ward deaths led one physician to discover the importance of hand washing
In the mid-1840s Hungarian physician Ignaz Semmelweis saw with alarm that 15 percent of the new mothers at Vienna General Hospital, where he worked, were dying of an illness called puerperal fever. Semmelweis was desperate to prevent these deaths, but he didn’t know how. As he pondered the problem, he learned that his friend, forensic pathologist Jakob Kolletschka, had died from what sounded like the same illness. It happened only a few days after a student accidentally pricked Kolletschka with a scalpel that had been used to dissect a cadaver.
The news gave Semmelweis pause. Medical students at his hospital would routinely go right from the morgue to the maternity ward without ever washing their hands. Were they carrying an infection to the mothers? Was that why they were dying? Could hand washing help?
To test his dirty-hands hypothesis, Semmelweis made his students wash their hands in a mixture of water and chlorine (soap and water did not eliminate the cadaver smell). Deaths from the fever in the maternity ward quickly dropped by roughly 90 percent. Hand washing became standard procedure at Semmelweis’s hospital.
It took 40 years for the policy to take hold widely. Even today hospital workers don’t follow it as consistently as they should. According to an ongoing study from the Maryland Health Quality and Cost Council, 90 percent of staff wash their hands when someone is looking, but only 40 percent do when alone.
—Mike May
Moral Animal
A sense of right and wrong starts with innate brain circuitry
The roots of modern morality have long been a point of contention among psychologists, philosophers and neuroscientists. Do our ethical foundations arise from our relatively recent ability to reason or from our ancient emotions? Studies have recently lent support to the notion that we owe much of our sense of right and wrong to our animal ancestors.
The idea that morality comes before reason is supported by primate studies. A chimpanzee, for instance, will sometimes drown trying to save its peers and will refuse food if taking it means that others get hurt. That’s not to say chimps are morally sophisticated beings, but “it’s not as if morality and our moral rules are just a pure invention of the religious or philosophical mind,” explains Frans de Waal, a primatologist and psychologist at Emory University. De Waal’s work suggests that our morality is an outgrowth of our ancestors’ social tendencies, an indication that it is at least in part an evolved trait (an idea Charles Darwin shared). Dogs, too, seem to have a keen sense of “wild justice,” says Marc Bekoff, a professor emeritus at the University of Colorado at Boulder, who has observed a rough morality among dogs at play. “Animals know right from wrong,” he notes.
If morality is innate rather than learned, then it should have left biological traces. Studies suggest that moral decisions involve certain parts of the brain associated with prosocial tendencies and emotional regulation, such as the ventromedial prefrontal cortex. In brain scans, this region lights up when subjects choose to donate money to charity, and those with damage to this region make unexpected moral judgments. Some ethical dilemmas also activate brain regions involved in rational decision making, such as one called the anterior cingulate cortex—a finding that implies that higher-order brain functions may also contribute to our morality, even if it’s rooted in emotions.
Ultimately, de Waal says, we need to thank our evolutionary ancestors for far more than just bestial urges. “When humans kill each other or commit genocide, we say we’re acting like animals,” he says. But “you can see the same sort of thing with regard to our positive behavior.”
—Melinda Wenner Moyer
Urban Bug
Packed living conditions made the influenza virus a leading public health threat
Hippocrates described the symptoms of the flu some 2,400 years ago. But the influenza virus didn’t become a true menace until the rise of stable, densely populated settlements and the growth of animal husbandry. This crowding of people and their animals furnished the virus with ample opportunities to jump from one species to another, acquiring deadly attributes along the way.
The first influenza pandemics were recorded during the 1500s. The one that occurred in 1580 traced a path that epidemiologists today would recognize: it began in Asia during the summer and then spread to Africa, Europe and America over the next six months. Another big epidemic hit in 1789, the year that George Washington took office, “before modern means of rapid travel were available and when a man could go no faster than his horse could gallop,” wrote virologist and epidemiologist Richard E. Shope in 1958. Even so, he said, it “spread like wildfire.”
Shope knew influenza well: in 1931 he became the first scientist to transmit the virus between two animals, by transferring mucus from one pig’s nose to another’s. Because Shope had filtered bacteria from the mucus beforehand, his experiment suggested, for the first time, that the flu was caused by a virus. Two years later a group of U.K. scientists became the first to isolate a human form of the virus, from a sick ferret.
—Melinda Wenner Moyer
Former Life of the Electric Car
A century ago taxicabs had batteries, not gas guzzlers, under their hoods
In what may have been the first attempt at an electric car, Scottish inventor Robert Anderson built a “crude electric carriage” in the mid- to late 1830s. It didn’t get far. For one thing, its battery wasn’t good enough. (Today’s green car engineers can sympathize.) It also faced stiff competition from steam-powered cars.
When rechargeable batteries started to appear in the mid-1800s, electric vehicles got a fillip. In 1897 the Electric Carriage and Wagon Company in Philadelphia assembled a fleet of electric-powered taxis for New York City. By 1902 the Pope Manufacturing Company in Hartford, Conn., had built around 900 electric vehicles, most of which were used as cabs. That same year Studebaker, which had gotten its start in horse-drawn wagons, entered the car market in Indiana with an electric model. Through the early 1900s electric vehicles ran more smoothly and quietly than their gas-guzzling, internal-combustion-engine-powered rivals.
Where the electric car stumbled was range—it couldn’t go far between charges. By 1920 gas-powered cars had emerged as the clear winners, a twist of history that engineers are now working furiously to undo.
—Mike May
The First Humvee
Wheeled vehicles may have first arisen as a tool of war
Sir C. Leonard Woolley’s 1922 excavation of the Royal Cemetery of Ur—a Sumerian site located in modern-day Iraq—was, by early 20th-century standards, a major media event. Thomas Edward Lawrence, a.k.a. Lawrence of Arabia, who had achieved fame for his dashing exploits during the Arab Revolt several years earlier, helped to organize the expedition. British mystery writer Agatha Christie paid a visit to the site and penned Murder in Mesopotamia as a tribute (she would later marry Woolley’s assistant). All this fuss over a box with a picture of a wheel on it.
It wasn’t just any box, of course. It was the Standard of Ur, a 4,600-year-old container, the size of a shoebox, encrusted in lapis lazuli. Most important, it featured an illustration of ancient warfare that included the oldest uncontested image of the wheel in transportation. A series of images depicted tanklike carriages, each with four solid wheels braced to their axles and a team of horses propelling them forward. The wheeled carriages clearly provided their riders with better protection against ambush than the poor foot soldiers, who are shown squirming to avoid the horses’ hooves.
This ancient Humvee wasn’t the only way early engineers deployed wheels. The Sumerians, Egyptians and Chinese all used wheels for spinning pots, and Egyptians moved massive stones with log rollers. Wheels were slow to catch on for ordinary transport, however, because they weren’t useful on the sandy soils of the world’s trade routes, says Richard Olson, a historian and author. Camels remained the all-terrain vehicle of choice for another 2,000 years or so.
Wheeled vehicles didn’t take off until the advent of roads. The Egyptians built extensive dirt roads and paved some of them with sandstone, limestone and even a surfacing of petrified wood. As far back as 3,500 years ago they were fashioning metal wheels with six spokes, and from the Middle East to Russia, agile two-wheeled chariots became all the rage.
—Brendan Borrell
Gravity’s Tug
The first black holes are almost as old as the universe itself
The idea that a black hole could possibly exist came from an English rector, John Michell. In 1783 he calculated that the force of gravity exerted by a massive star could prevent light from escaping its surface. Michell’s work was largely forgotten for 200 years. In 1971 astrophysicists noticed flickering x-rays coming from the constellation Cygnus, 6,000 light-years away: the radiation indicated that a black hole was circling a star. As with any black hole, it formed as a star ran out of fuel and collapsed in on itself. If the sun were to somehow become a black hole, it would be just a few miles across, trapping light in the warped space that enfolds it. For Earth to become a black hole, it would have to be squeezed down to the size of a marble.
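Those sizes are set by the Schwarzschild radius, the general-relativistic heir to Michell’s calculation. Plugging standard textbook values of the gravitational constant G, the speed of light c and each body’s mass M into the formula gives (a rough check added here, not a figure from the article):

$$r_s = \frac{2GM}{c^2} \quad\Rightarrow\quad r_{s,\text{sun}} \approx 3.0\ \text{km}\ (\text{about } 1.8\ \text{miles}), \qquad r_{s,\text{Earth}} \approx 8.9\ \text{mm}.$$

Doubling those radii gives a collapsed sun spanning a few miles and a collapsed Earth not quite two centimeters wide, about the size of a marble.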
The first black holes in the universe arose nearly 14 billion years ago, contends Abraham Loeb, an astrophysicist at Harvard University. At that time, gas began to condense into clouds that fragmented into massive stars with roughly 100 times the mass of the sun, which in turn collapsed into black holes. Fortunately, the spinning of early galaxies limited the growth of the black holes at their cores, allowing stars to form.
Physicists have now begun to make something akin to black holes on Earth. Chinese researchers built concentric cylinders that mimic a black hole, bending microwave radiation in on itself as it passes from outer to inner surfaces. And a real black hole could still improbably pop out of the Large Hadron Collider near Geneva.
—Brendan Borrell
Rainbow Cells
Biodiversity was the first step toward complex life
Witnesses were absent for the comings and goings of the first life some four billion years ago, but scientists are pretty sure the typical Earth creature in those days consisted of no more than a single cell. That doesn’t mean the planet was a dull sea of sameness. Single-celled creatures may have acquired genetic diversity early on.
Here’s why. When cells divide, mistakes have a way of creeping into genetic material. Variants that enhance a cell’s ability to survive and reproduce become more common over successive generations. This basic fact of evolution applied to the early Earth. “Variation is necessary for there to be evolution by natural selection in the first place,” explains Andrew Hamilton, a philosopher of science at Arizona State University. “Biodiversity originated at the point that there was variation on which selection could operate.”
Today we think of biodiversity in terms of multicellular life, but animals didn’t arrive until relatively recently (about 540 million years ago), and flowering plants came later still. Although some evidence suggests that having a wide variety of species makes an ecosystem more stable, the jury is still out. It is cold comfort to know that even the worst catastrophe would still preserve some biodiversity, if only in the form of the lowly cell.
—Melinda Wenner Moyer
Zero
How nothing became something
Nobody knew how much we needed nothing until we had a number for it. Without zero, negative and imaginary numbers would have no meaning, and it would be impossible to solve quadratic equations, a mainstay of applied math. Without zero to act as a placeholder to distinguish, say, 10 from 100, all but the simplest arithmetic requires an abacus or counting board. “If we didn’t have zero, our system of numbers would be incomplete,” says Charles Seife, author of Zero: The Biography of a Dangerous Idea. “It would really break down without zero.”
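To make the placeholder point concrete, here is a minimal sketch in Python (an illustration added here, not anything from Seife): a positional numeral is just a running sum in which each new digit promotes everything before it by the base, so a zero contributes nothing of its own yet keeps the other digits in their proper columns.

```python
# A minimal, illustrative sketch: why a placeholder digit matters in
# positional notation. Each digit's worth depends on where it sits; a zero
# adds nothing itself but keeps the other digits in the right places --
# the difference between 10 and 100.

def place_value(digits, base=10):
    """Interpret a list of digits, most significant first, in the given base."""
    total = 0
    for d in digits:
        total = total * base + d
    return total

print(place_value([1, 0]))              # 10
print(place_value([1, 0, 0]))           # 100
print(place_value([1, 0, 5]))           # 105 -- drop the zero and it becomes 15

# The Babylonians' base 60 works the same way:
print(place_value([1, 0, 30], base=60)) # 1*3600 + 0*60 + 30 = 3630
```

Drop the humble 0 and the remaining digits slide into the wrong columns, which is why, before zero, anything beyond the simplest arithmetic fell to the abacus or counting board.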
Zero arrived on the scene in two installments. Around 300 B.C., Babylonians developed a proto-zero—two slanted wedges pressed into clay tablets—that served as a placeholder in their funky sexagesimal, or base 60, number system. By the fifth century the concept of zero had migrated to India, where it would later make its symbolic entrance as a dot carved on a wall at the Chaturbhuja Temple in Gwalior. Then, like a pebble dropped into a puddle, the symbol for zero expanded to an “0” and became a number with properties all its own: an even number that is the average of –1 and 1. In 628 mathematician Brahmagupta pontificated on the frightening properties of zero: multiply anything by zero, and it, too, turns to nothing. Independently, Mayans in the Americas developed their own zero to assist in the study of astronomy.
Over time the expansion of the Islamic empire spread the Indian zero back to the Middle East and, eventually, to the Moors in Spain, where it became one of 10 Arabic numerals, as we refer to them today. European scholars clung to their Roman numerals. Zero’s official endorsement by the Western world came by way of Italian mathematician Fibonacci (Leonardo of Pisa), who included it in a textbook in 1202.
—Brendan Borrell
Noodling the Noodles
It took thousands of years to go from mush to spaghetti
Food historian Francine Segan asserts that pasta emerged more than 5,000 years ago when an enterprising chef happened upon the now seemingly obvious idea of mushing flour and water together to create something that looked surprisingly like lasagna. “It breaks my heart to tell you this,” says Segan, invoking her own Italian heritage, “but the first to make those noodles might have been the ancient Greeks. Lots of references in ancient Greek writing—even in 3,000 B.C.—talk about layers that sound a lot like lasagna.”
Spaghetti took longer but appears to have taken shape in Italy. A popular misconception has Marco Polo introducing Italians to pasta when he returned from China in 1295, but Italy already had pasta by then. “In Sicily there is a town called Trabia,” wrote Arab geographer Muhammad al-Idrisi in 1154. “In this town they made a food of flour in the form of strings.” He describes a true pasta industry: foodstuffs were first dried in the sun and then shipped by boat to other regions of Italy and even other countries.
A few hundred years later Leonardo da Vinci invented a machine that turned dough into edible strings. Technical glitches kept his pasta maker from achieving the mechanization of the industry he had hoped for. Still, the Italians succeeded in refining the art of pasta making, crafting more elaborate forms of mushed dough than anyone else.
—Mike May