In 2002 Seth Lloyd, a professor of quantum computing at the Massachusetts Institute of Technology, published a formula estimating the number of bits that could fit in the universe. A “bit” is a fundamental unit of information that represents the answer to a yes or no question. A computer stores bits in a transistor, but a bit can also be encoded in the state of a physical particle, such as the spin of an electron. Lloyd's formula exploited the physicality of information to estimate the rate at which physical systems can process and record information as a function of Planck's constant (an unimaginably tiny unit that is fundamental to quantum mechanics), the speed of light and the age of the universe. Lloyd concluded that our universe could fit a whopping 10^90 bits, or a trillion trillion trillion trillion trillion trillion trillion megabits.
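The two phrasings of Lloyd's estimate are equivalent, as a quick arithmetic sketch in Python confirms (the only numbers used are the powers of ten quoted above):

```python
# Lloyd's estimate: the universe can hold about 10^90 bits.
# The restatement is seven factors of a trillion (10^12 each)
# times one megabit (10^6 bits): 10^(7*12 + 6) = 10^90.
TRILLION = 10**12
MEGABIT_IN_BITS = 10**6

bits_in_universe = 10**90
restated = TRILLION**7 * MEGABIT_IN_BITS

print(bits_in_universe == restated)  # True: the two phrasings agree
```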

Lloyd developed his formula because his work on quantum computers, which use single atoms to encode information and perform computations, had him thinking about the universe in terms of bits that live in atoms. He performed a thought experiment, asking himself: What is the largest computer that could ever be built? The answer: one that would employ every atom in the universe. That computer could store 10^90 bits.

But the beauty of Lloyd's formula is that it can be used to estimate the information-storing capacity of any physical system, not just the universe. Recently I have drawn inspiration from Lloyd's formula while exploring the computational capacities of economies and societies. Lloyd's formula does not incorporate much of the social and economic complexity inherent in our economies, but it gives us rough estimates of the capacity of systems to store and process information. Think of Earth as a hard drive. According to Lloyd's formula, the planet can store up to 10^56 bits—roughly a trillion trillion trillion trillion gigabits. But is this planetary hard drive mostly empty or mostly full?

To answer that question, let us consider the work of Martin Hilbert and Priscilla López. In 2011 Hilbert and López, then at the University of Southern California and the Open University of Catalonia in Spain, respectively, published an estimate of the cultural information stored in our planet's texts, pictures and videos. They concluded that as of 2007, humans had stored 2 × 10^21 bits, or two trillion gigabits. But there is much more information in our planet than what is contained in cultural artifacts. Information is also embodied in human-designed objects, such as your car and your shoes, and in biological systems, such as your ribosomes, mitochondria and DNA. Indeed, it turns out that most of the information contained in Earth is stored in the form of biomass. Based on Lloyd's formula, I estimate that Earth contains roughly 10^44 bits. That figure might sound like a lot, but it is only a small fraction of the globe's capacity. If humans continued to generate 10^21 bits every year, it would still take much more than a trillion ages of the universe to fill our planetary hard drive.
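The closing claim is easy to verify with back-of-the-envelope arithmetic. Here is a sketch in Python; the only figure not quoted in the text is the age of the universe, taken to be roughly 1.4 × 10^10 years:

```python
EARTH_CAPACITY_BITS = 10**56    # Lloyd-style estimate of Earth's capacity
BITS_PER_YEAR = 10**21          # humanity's rough annual output of stored bits
AGE_OF_UNIVERSE_YEARS = 1.4e10  # about 13.8 billion years (assumed figure)

years_to_fill = EARTH_CAPACITY_BITS / BITS_PER_YEAR        # 10^35 years
ages_of_universe = years_to_fill / AGE_OF_UNIVERSE_YEARS   # ~7 x 10^24 ages

# Far more than a trillion (10^12) ages of the universe:
print(ages_of_universe > 10**12)  # True
```

At the current rate, filling the planetary hard drive would take on the order of ten trillion trillion ages of the universe, comfortably exceeding the "much more than a trillion ages" stated above.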

What these calculations tell us is that although Earth has an enormous capacity to store information, order is still rare. That insight, in turn, tells us a lot about how information is created and processed by the planet and the hurdles that could limit its growth in the future.

Our Computational Universe

The first thing the informational emptiness of our planet reveals is that information is hard to grow—difficult to make, tough to preserve and challenging to combine into new configurations. That deduction fits well with past observations and is explained by the universe's hostility toward the emergence of order. The second law of thermodynamics dictates that our universe has a natural tendency to average out, making order disappear. Heat flows from hot to cold, music vanishes as it travels through the air and the swirls in your latte quickly diffuse into milky clouds. This move from order to disorder is known as the growth of entropy.

Yet there are loopholes that allow pockets of order to emerge. Think of a biological cell, the human body or the man-made economy. These highly organized systems locally defy the increase of physical entropy that governs the universe as a whole. Information-rich systems can exist only as long as they “sweat” entropy at their seams, paying for their high levels of organization by expunging heat. “Entropy is the price of structure,” as Nobel Prize–winning chemist Ilya Prigogine once cleverly noted.

Order emerges or persists in our universe thanks to three tricks. The first, and perhaps most familiar, trick involves the flow of energy. Imagine a bathtub full of water: water molecules bounce off one another in random directions until you pull the plug. Once water starts racing down the drain, increasing the fluid's kinetic energy, a whirlpool emerges. In that whirlpool, order materializes; the molecules in the whirlpool have velocities similar in both magnitude and direction to those of neighboring molecules, and these correlations are the primitive origins of macroscopic information. To understand not only the formation but the endurance of order, we need the second trick: the existence of solids. Solid objects, such as DNA, preserve order for long periods. Without them, information would be too evanescent to last, recombine and grow.

But to explain the emergence of more complex forms of order (such as the information embodied in a city or economy) or the creation of order that gave rise to the life and societies of our planet, we need the third trick: the capacity of matter to compute. A tree, for example, is a computer that knows in which direction to grow its roots and leaves. Trees know when to turn genes on and off to fight parasites, when to sprout or shed their leaves and how to harvest carbon from the air via photosynthesis. As a computer, a tree begets order in the macrostructure of its branches and the microstructures of its cells. We often fail to acknowledge trees as computers, but the fact is that trees contribute to the growth of information in our planet because they compute.

Crystallized Imagination

Another thing we learn from thinking of our planet as a hard drive is more surprising: despite the forces arrayed against the emergence of order, information gradually grows. Earth's hard drive is fuller now than it was yesterday or a billion years ago. In part, it is fuller because of life's emergence: biomass contains a great deal of information. But the growth of order on Earth also stems from the production of cultural information.

To see why, let us compare the apples that grow on trees with the Apples we carry in our pockets and use to check our e-mail. For our purposes, we are interested in the origin of the physical order embodied in each object. The first apple embodies order that existed in the world prior to humans. There were apples before we had a name for apples, a price for apples or a market for apples. The second Apple is different because it is an object that existed first in someone's head and then, later, in the world. It is a solidified piece of order that emerged first as imagination. As we will see, objects of this kind are particularly special.

Species that can imagine objects and then create them have important advantages over other animals. The real yet imaginary objects that pervade our economy augment us because they give us access to the practical uses of knowledge and know-how embodied in the nervous systems of other people. Consider a tube of toothpaste. Most people use toothpaste every day, but they do not know how to synthesize sodium fluoride, the active ingredient in toothpaste. That lack of knowledge, however, does not exclude them from benefiting from the practical uses of the understanding needed to synthesize sodium fluoride. People make practical uses of others' knowledge through products—which are, in effect, solidified pieces of imagination. Products augment us, and markets make us not only richer but also wiser.

The problem is that creating products is not easy. It often requires collaboration among large numbers of people. To contribute to the growth of information, people need to form networks with the ability to compute products. We need networks because the computational capacities of systems, just like their information-storing capacities, are finite. Biological cells are finite computers that transcend their limitations through multicellularity. People are also limited, and we transcend our finite computational capacities by forming social and professional networks. Economies are distributed computers that run via the hardware we know as social networks. Ultimately it is this reembodiment of computation—from cells to people and to economies—that makes the growth of complex forms of information possible but also challenging.

Limits of Growth

The final thing this thought experiment tells us is that the ability of human networks to create information is severely constrained. Forget all the talk about big data: from a cosmic perspective, we are creating a surprisingly small amount of information (even though we burn enough energy in the process to have liberated the carbon that is warming up our planet).

Our information-creation capacity is limited in part because our ability to form networks of people is constrained by historical, institutional and technological factors. Language barriers, for instance, fracture our global society and limit our ability to connect humans born in distant parts of the globe. Technological forces can help reduce these barriers. The rise of air travel and long-distance communication has reduced the cost of distant interactions, allowing us to weave networks that are highly global and that increase our capacity to process information. Still, these technologies are no panacea, and our ability to process information collectively, while larger than in previous decades, remains small.

So how will the growth of information on Earth evolve in the coming centuries? An optimistic view is that the globalizing forces of technology and the fall of parochial institutions, such as patriotism and religion, will help erode historical differences that continue to inspire hate among people from different linguistic, ethnic, religious and national backgrounds. Meanwhile technological changes will deliver an age of hyperconnectivity. Electronics will evolve from portable to wearable to implantable, delivering new forms of mediated social interactions.

For millennia, our species' ability to create information has benefited from our ability to deposit information in our environment, whether by building stone axes or by writing epic poems. This talent has provided us with the augmentation and coordination needed for our computational capacity to increase. We are in the midst of a new revolution that has the potential to transform this dynamic and make it even more powerful. In this millennium, human and machine will merge through devices that will combine the biological computers housed between our ears and the digital machines that have emerged from our curious minds. The resulting hyperconnected society will present our species with some of the most challenging ethical problems in human history. We could lose aspects of our humanity that some of us consider essential: for example, we might cheat death. But this merger between our bodies and the information-processing machines our brains imagined might be the only way to push the growth of information forward. We were born from information, and now, increasingly, information is being born from us.