In the standard story, the computer’s evolution has been brisk and short. It starts with the giant machines warehoused in World War II–era laboratories. Microchips shrink them onto desktops, Moore’s Law predicts how powerful they will become, and Microsoft capitalizes on the software. Eventually small, inexpensive devices appear that can trade stocks and beam video around the world. That is one way to approach the history of computing—the history of solid-state electronics over the past 60 years.
But computing existed long before the transistor. Ancient astronomers developed ways to predict the motion of the heavenly bodies. The Greeks deduced the shape and size of Earth. Taxes were summed; distances mapped. Always, though, computing was a human pursuit. It was arithmetic, a skill like reading or writing that helped a person make sense of the world.