In 2002 Seth Lloyd, a professor of quantum computing at the Massachusetts Institute of Technology, published a formula estimating the number of bits that could fit in the universe. A “bit” is a fundamental unit of information that represents the answer to a yes or no question. A computer stores bits in transistors, but a bit can also be encoded in the state of a physical particle, such as the spin of an electron. Lloyd's formula exploited the physicality of information to estimate the rate at which physical systems can process and record information as a function of Planck's constant (an unimaginably tiny unit that is fundamental to quantum mechanics), the speed of light and the age of the universe. Lloyd concluded that our universe could fit a whopping 10^90 bits, or a trillion trillion trillion trillion trillion trillion trillion megabits.
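As a quick sanity check of the closing figure, the spelled-out number can be verified with a few lines of integer arithmetic: seven factors of a trillion (10^12 each) times one megabit (10^6 bits) should equal 10^90 bits. This sketch only checks the unit conversion in the text, not Lloyd's physics:

```python
# Verify that "a trillion trillion trillion trillion trillion trillion
# trillion megabits" (seven trillions times a megabit) equals 10^90 bits.
trillion = 10**12   # one trillion
megabit = 10**6     # bits per megabit

bits = trillion**7 * megabit  # 10^(12*7) * 10^6 = 10^90
print(bits == 10**90)         # confirms the conversion
```

Python's arbitrary-precision integers make this exact, with no floating-point rounding at these magnitudes.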