Astronomers have found thousands of planets orbiting stars in the Milky Way, and 100 billion more stars in the galaxy presumably host planets of their own. Given the sheer number of worlds out there, it’s easy to imagine that some of them must be harboring sentient beings. After all, could Earth really be unique among so many planets?
The short answer is: yes.
Optimism about the possibilities of intelligent extraterrestrial life ignores what we know about how humans came to exist. We’re here because of a long chain of implausible coincidences. Many, many, many things had to go just right to give rise to our technological civilization.
This chain is so implausible, in fact, that there’s good reason to conclude that humans are most likely the only technological civilization in the galaxy. (Let’s leave aside the countless other galaxies in the cosmos because, as the saying goes, “In an infinite universe, anything’s possible.”)
Perfect Timing for a Rocky Planet
The coincidences begin with the manufacture of heavy elements, which include everything heavier than hydrogen and helium. After the big bang more than 13 billion years ago, the first stars were born in clouds of these two lightest elements. These stars couldn’t have had planets, because there was nothing to make planets from—no carbon, oxygen, silicon, iron, or any other heavy element.
(By the way, with utter disregard for chemical subtleties, astronomers call all elements heavier than hydrogen and helium “metals.”)
Metals are created inside stars and get spread through space when stars throw off material as they die, sometimes in spectacular supernova explosions. This material enriches interstellar clouds, such that each successive generation of stars made from the clouds ends up having a greater metallicity than the one before it.
When our sun came into being about 4.5 billion years ago, this enrichment had been going on for billions of years in our galactic neighborhood. Even so, the sun contains roughly 71 percent hydrogen, 27 percent helium, and just 2 percent metals. Its composition mirrors that of the cloud that made the solar system, so the rocky planets, including Earth, formed from only that tiny amount of elemental construction material.
Stars older than the sun have even lower metallicity, and therefore less chance of making rocky, Earth-like planets. Giant gaseous planets, such as Jupiter, are easier to form but less likely to host life.
All of this means that even if we aren’t the only technological civilization in the galaxy, we must be one of the first.
Location, Location, Location
Our place in the Milky Way is also special. Located in a thin disk of stars about 100,000 light-years across, our sun is roughly 27,000 light-years from the galactic center. That’s a little more than halfway to the outer rim.
By and large, stars closer to the center contain more metals, and there are more old stars there. This situation is typical of disk galaxies, which seem to have grown outward from the center. Having more metals sounds like a good thing for making rocky planets, but it may not be so good for life.
One reason for this extra metallicity is that the stars are packed more densely toward the center, so there are many supernovae. Exploding stars produce energetic radiation—X-rays and charged particles known as cosmic rays. This radiation is harmful to the planets of nearby stars. The galactic center is also home to a very large black hole, Sagittarius A* (pronounced “A-star”), which produces intense outbursts of radiation from time to time.
Then there’s the problem of even more energetic events called gamma-ray bursts. Using recent gravitational-wave studies, astronomers have learned that some of these explosions are caused by merging neutron stars. Observations of gamma-ray bursts in other galaxies show that they’re more common in the crowded inner regions of galaxies. A single burst could sterilize the core of the Milky Way, and statistics based on studies of other galaxies suggest that one occurs in ours every 1 million to 100 million years.
Farther from the center of the galaxy, all these catastrophic events have less impact. But out there, stars are sparser and metallicity is lower, so rocky planets are fewer, if they exist at all.
Taking all this into account, astronomers such as Charles H. Lineweaver of the Australian National University infer that there’s a “galactic habitable zone” extending from about 23,000 to 30,000 light-years from the galactic center. That band is only about 7,000 light-years wide—roughly 7 percent of the galaxy’s 100,000-light-year breadth—and it contains fewer than 5 percent of the galaxy’s stars because of the way they’re concentrated toward the core. This region still encompasses a lot of stars, but it rules out life around the majority of stars in the Milky Way.
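The arithmetic behind those percentages is easy to check. Here is a back-of-envelope sketch, using the rounded figures quoted above (23,000- and 30,000-light-year edges, a 100,000-light-year disk); it also shows why the zone’s share of the disk’s *area* is larger than its share of the radius:

```python
# Back-of-envelope check of the "galactic habitable zone" figures
# quoted in the text (all numbers are the article's, rounded).
inner_ly = 23_000   # inner edge of the habitable zone (light-years)
outer_ly = 30_000   # outer edge
disk_ly = 100_000   # full breadth (diameter) of the Milky Way's disk

band_ly = outer_ly - inner_ly        # the zone is 7,000 ly wide
frac_of_span = band_ly / disk_ly     # ~0.07, i.e. about 7 percent

# The band's share of the disk's area is larger, because area
# grows with the square of the radius.
radius_ly = disk_ly / 2
frac_of_area = (outer_ly**2 - inner_ly**2) / radius_ly**2   # ~0.15

print(f"{frac_of_span:.0%} of the span, {frac_of_area:.0%} of the area")
```

That the zone covers roughly 15 percent of the disk’s area yet holds fewer than 5 percent of its stars reflects the same point the text makes: the galaxy’s stars are heavily concentrated toward the core, outside the zone.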
Our sun is close to the middle of the habitable zone, but other astronomical idiosyncrasies distinguish the solar system. For example, there’s evidence that an orderly arrangement of planets in nearly circular orbits providing long-term stability is uncommon. Most planetary systems are chaotic places, lacking the calm that Earth has provided for life to evolve.
It’s Hard to Be “Earth-Like”
Astronomers have found around 50 “Earth-like” planets. Yet all the talk of these worlds obscures another critical distinction.
When they say “Earth-like,” all astronomers really mean is a rocky planet in the habitable zone that’s about the same size as ours. But by these criteria, the most Earth-like planet we know of is Venus—and you could never live there. The fact that we can live on Earth is the result of yet more fortuitous circumstances.
The two planets differ in several important ways. Venus has a thick crust, no sign of plate tectonics, and essentially no magnetic field. Earth has a thin, mobile crust where tectonic activity, especially around plate boundaries, brings material to the surface through volcanism.
Plate tectonics brings nutrients to the surface to replenish those depleted by the life-forms living there, and it’s crucial for recycling carbon and stabilizing the temperature over long periods of time. And this activity is particularly important for us as a technological species. Throughout Earth’s long history, it has carried ores up to where humans can mine them to provide the raw materials for our advanced civilization.
Earth also has a large metallic (in the everyday sense of the word) core that, coupled with its rapid rotation, produces a strong magnetic field to shield its surface from harmful cosmic radiation. Without this screen, our atmosphere would probably erode, and any living thing on the surface would get fried.
What’s more, all these attributes of our planet are directly related to our moon—another feature that Venus and many other so-called Earth-like planets lack.
Scientists’ best guess is that the moon formed early in the solar system’s history, when a Mars-size object struck the nascent Earth a glancing blow that caused both protoplanets to melt. The metallic material from the two objects settled into Earth’s center, and much of our planet’s original lighter rocky material splashed out to become the moon, leaving Earth with a thinner crust than before. Without that impact, Earth would be a sterile lump of rock like Venus, lacking a magnetic field and plate tectonics.
The presence of such a large moon has also acted as a stabilizer for our planet. Over the millennia, Earth has wobbled on its axis as it goes around the sun. But thanks to the gravitational influence of the moon, it can never topple far from the vertical, as seems to have happened with Mars.
It’s impossible to say how often such impacts occur to form double systems such as Earth and its moon. But clearly they’re rare, and without our satellite we would likely not be here.
An Unlikely Fusion of Cells
Once the Earth-moon system settled down, life emerged with almost indecent rapidity. Leaving aside controversial claims for evidence of even earlier creatures, scientists have found fossil remains of single-celled organisms in rocks 3.4 billion years old—just about a billion years younger than Earth itself.
At first, this sounds like good news for anyone hoping to find extraterrestrials. Surely, if life got started on Earth so soon, it could arise with equal ease on other planets? The snag is that although life started 3.4 billion years ago, it didn’t really do much for the next three billion years.
In fact, microbes that are essentially identical to those original bacterial cells still live on Earth today—arguably the most successful species in the history of life on our planet and a classic example of “If it ain’t broke, don’t fix it.”
These simple cells, known as prokaryotes, are little more than bags of jelly containing the basic molecules of life (such as DNA) but without the central nucleus and specialized structures like mitochondria, which use chemical reactions to generate the energy needed by the cells in your body.
The more complex cells, the stuff of animals and plants, are known as eukaryotes, and they’re all descended from a single merging of cells that occurred about 1.5 billion years ago. The merger involved two types of primordial single-celled organisms: bacteria and archaea.
Archaea are so named because they were once thought to be older than bacteria. However, the evidence now suggests that both forms emerged at about the same time, when life first appeared on Earth—meaning that however life got started, it actually emerged twice.
Once it was here, it went about its business largely unchanged for about two billion years. That business involved, among other things, “eating” other prokaryotes by engulfing them and using their raw materials.
Then came the dramatic turning point: An archaeon engulfed a bacterium but did not “digest” it. Instead, the bacterium became a resident of the new cell—the first eukaryote—and evolved to carry out specialized duties within it, leaving the rest of the host free to develop without worrying about where it got its energy. The cell then repeated the trick, becoming more complex.
The similarities between the cells of all advanced life-forms on Earth show that they’re descended from a single one-celled ancestor. As biologists are fond of saying, “At the level of a cell, there’s no difference between you and a mushroom.”
Of course, the trick might have happened more than once. But if it did, the other protoeukaryotes left no descendants (probably because they got eaten). It’s a measure of how unlikely such a single fusion of cells was that it took two billion years of evolution to occur.
Even then, not much happened for another billion years or so. Early eukaryotes got together to make multicellular organisms, but at first these were nothing more than flat, soft-bodied creatures with a structure resembling a quilt.
The proliferation of multicellular life-forms that led to the variety of life on Earth today kicked off only about 550 million years ago, in an outburst known as the Cambrian explosion. This was so spectacular that it’s still the most significant event in the fossil record. But nobody knows why it happened—or how likely it is to happen elsewhere.
Eventually that eruption of life produced a species capable of developing technology and wondering where it came from.
A Once Endangered Species
The progression from primitive to advanced species was not easy. The history of humanity is written in our genes, and in such detail that it’s possible to determine from DNA analysis not only where different populations came from but how many of them were around.
One of the surprising conclusions from this kind of analysis is that groups of chimpanzees living close to one another in central Africa differ more from one another genetically than humans living on opposite sides of the world do. This can only mean that we’re all descended from a tiny population of humans, possibly the survivors of some catastrophe (or catastrophes).
DNA evidence pinpoints two evolutionary bottlenecks in particular. A little more than 150,000 years ago, the human population was reduced to no more than a few thousand—perhaps only a few hundred—breeding pairs. And about 70,000 years ago, the entire human population fell to about 1,000.
Some researchers have questioned this interpretation of the evidence. If it is correct, though, all the billions of people now on Earth descend from a group so small that a species diminished to such numbers today would likely be regarded as endangered.
That our species survived—and even flourished, eventually growing to number more than seven billion and advancing into a technological society—is amazing. This outcome seems far from assured.
Alone in the Milky Way?
Putting everything together, what can we say?
Is life likely to exist elsewhere in the galaxy? Almost certainly yes, given the speed with which it appeared on Earth.
But is another technological civilization likely to exist today? Almost certainly no, given the chain of circumstances that led to our own existence.
These considerations suggest we are unique not just on our planet but in the whole Milky Way. And if our planet is so special, it becomes all the more important to preserve this unique world for ourselves, our descendants, and the many creatures that call Earth home.
Reference: Alone in the Milky Way. John Gribbin in Scientific American, Vol. 319, No. 3, pages 94–99; September 2018. doi:10.1038/scientificamerican0918-94
