My grandfather, for some reason, wore a hat to meals. Some evenings, also hatted, he would play the fiddle. He was born in Ireland in 1874, and he lived to see, in his long life, satellites, computers, jet airplanes and the Apollo space program. He went from a world where illiterate people footed their way on dirt roads, where one-room schools had peat fires in the corner, where stories were told at night in shadows and candlelight, to a world of motor cars and electricity and telephones and radio and x-ray machines and television. He never left Ireland, although late in life he wanted to go to England in an airplane to experience flying. But in his lifetime, one lifetime, he witnessed all these birthings of technology.
It is young, this new technology. It is recent. It has come fast. So fast, in fact, that speed of evolution is regarded as a signature of technology itself. But how fast? How quickly does technology evolve? It is hard to clock something as ill-defined as technology's speed of evolution. But we can ask how fast we would have to speed up the natural, biological evolution of life on our planet to make it roughly match some particular technology's rate of change.
Let's imagine speeding up biological evolution in history by a factor of 10 million. This would mean that instead of life starting around 3,600 million years ago, in our fast-forwarded world the first, crude blue-green algae appear 360 years ago, about the year 1640. Multicellular organisms arise in Jane Austen's time, about 1810 or so, and the great Cambrian explosion that produced the ancestors of most of today's creatures happens in the early 1930s, the Depression era. Dinosaurs show up in the late 1960s, then lumber through the 1970s and into the 1980s. Birds and mammals appear in the mid-1970s but do not come fully into their own until the 1990s. Humankind emerges only in the past year or two, and as Homo sapiens only in the past month.
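The compression above is just a division: an event N years in the past lands at calendar year "present" minus N divided by 10 million. A minimal sketch of that arithmetic (the function name is mine, and taking the year 2000 as "now" is an assumption, inferred from the essay's statement that 3,600 million years compresses to the year 1640):

```python
# Map an event's real age onto the essay's fast-forwarded timeline:
# divide its age in years by the 10-million-fold speedup and count
# back from the assumed present.

SPEEDUP = 10_000_000
PRESENT = 2000  # assumed "now"; implied by 3,600 Myr -> about 1640

def fast_forward(years_ago: float) -> float:
    """Calendar year where an event lands on the compressed timeline."""
    return PRESENT - years_ago / SPEEDUP

# First life, roughly 3,600 million years ago -> about the year 1640
print(fast_forward(3_600_000_000))  # 1640.0
```

The same division reproduces the essay's other landmarks: an event 1.9 billion years back lands around 1810, and anything under a few million years old, such as humankind, falls within the last year or so of the compressed calendar.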
Now let's lay this alongside a technology whose speed we want to measure, calculating machinery, say. We'll put it on the same timeline, but evolving at its actual rate. Early calculating machines, abacuses, trail back, of course, into antiquity. But the modern era of mechanical devices starts in the years surrounding the 1640s, when the first addition, subtraction and multiplication machines of Wilhelm Schickard, Blaise Pascal and Gottfried Wilhelm Leibniz begin to appear. These were rudimentary, perhaps, but early computational life nonetheless. The first successful multicellular devices (machines that use multiple instructions) are the Jacquard looms of Jane Austen's time. Calculators and difference engines of varying ingenuity arise and vanish throughout the 1800s. But not until the 1930s, the Cambrian time on our parallel scale, is there a true explosion. It's then that calculating machines become electrical, the government goes statistical, and accounting becomes mechanized. The 1960s see the arrival of large mainframe computers, our parallel to the dinosaurs, and their dominance lasts through the 1970s and 1980s. Personal computers show up in the mid-1970s, like birds and mammals, but do not take hold until the late 1980s and early 1990s.
What then corresponds to humankind, evolution's most peculiar creation to date? My answer is the Internet or, more specifically, its offshoot, the World Wide Web. The Web? Well, what counts about the Web is not its technology. That's still primitive. What counts is that the Web provides access to the stored memories, the stored experiences of others. And that's what is also particular to humans: our ability not just to think and experience but to store our thoughts and experiences and share them with others as needed, in an interactive culture. What gives us power as humans is not our minds but the ability to share our minds, the ability to compute in parallel. And it's this sharing, this parallelism, that gives the Web its power. Like humans, the Web is new, although its roots are not. And its impact is barely two years old.