Editor's Note: This story, originally printed in the August 1999 issue of Scientific American, is being reproduced as CERN on March 13 celebrates the 20th anniversary of Tim Berners-Lee's original proposal for the World Wide Web.
Last year a few of us from the Laboratory for Computer Science at the Massachusetts Institute of Technology were flying to Taiwan. I had been trying for about three hours to make my new laptop work with one of those cards you plug in to download your calendar. But when the card software was happy, the operating system complained, and vice versa. Frustrated, I turned to Tim Berners-Lee sitting next to me, who graciously offered to assist. After an hour, though, the inventor of the Web admitted that the task was beyond his capabilities.
Next I asked Ronald Rivest, the co-inventor of RSA public-key cryptography, for his help. Exhibiting his wisdom, he politely declined. At this point, one of our youngest faculty members spoke up: “You guys are too old. Let me do it.” But he also gave up after an hour and a half. So I went back to my “expert” approach of typing random entries into the various wizards and lizards that kept popping up on the screen until, by sheer accident, I made it work ... three hours later.
Such an ordeal is typical and raises an important issue: for the first 40 years of computer science, we have been preoccupied with catering to what machines want. We design systems and subsystems individually and then throw them at the public, expecting people to make the different components work together. The image this approach evokes for me is that of designing a car in which the driver has to twist dozens of individual knobs to control the fuel mixture, spark advance and valve clearances, among other things—when all he wants to do is go from one place to another.
Doing More by Doing Less
We have done enough of this kind of design. It’s time we change our machine-oriented mind-set and invent the steering wheel, gas pedal and brakes for people of the Information Age. This idea brings me squarely to the goal of my vision for the near future: people should be able to use the new information technologies to do more by doing less. When I say “doing more by doing less,” I mean three things. First, we must bring new technologies into our lives, not vice versa. We will not accomplish more if we leave our current lives, don goggles and bodysuits, and enter some metallic, gigabyte-infested cyberspace. When the industrial revolution came, we didn’t go to motorspace. The motors came to us as refrigerators to store our food and cars to transport us. This kind of transition is exactly what I expect will happen with computers and communications: they will come into our lives, and their identities will become synonymous with the useful tasks they perform.
Second, new technologies must increase human productivity and ease of use. Imagine if I could pull out a handheld device and say, “Take us to Athens this weekend.” My computer would connect to the EasySabre airline reservation system and begin interacting with it, using the same commands that travel agents use. The machine would know that “us” is two people and that we like business class, aisle seats and so forth. It would negotiate with the airline computer for maybe 10 minutes, until it found an acceptable flight and confirmed it. I would have spent three seconds giving my order, whereas my electronic bulldozer—the handheld’s software—would have worked for 10 minutes, or 600 seconds. The human productivity improvement in this example is 600 divided by three, which is 200, or, in business terms, 20,000 percent.
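The productivity arithmetic above can be spelled out in a few lines of code (a sketch only; the variable names are mine, not the article's):

```python
# Human effort: 3 seconds to give the order.
# Machine effort it replaces: the handheld's software
# ("electronic bulldozer") negotiates for 10 minutes.
human_seconds = 3
machine_seconds = 10 * 60  # 600 seconds

speedup = machine_seconds / human_seconds  # 600 / 3 = 200
percent_improvement = speedup * 100        # 20,000 percent

print(f"{speedup:.0f}x, or {percent_improvement:,.0f} percent")
```

Running this prints "200x, or 20,000 percent", matching the figures in the text.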