From being part of a closely knit team that created the first "computer on a chip" 25 years ago to co-founding and shepherding Intel Corp. into a semiconductor powerhouse, Gordon E. Moore has remained a dominant figure in the development of the modern computer.
In 1965, Moore noted that the number of devices on a microchip (and hence the potential power of a computer) was doubling each year, and he projected that the trend would continue for 10 years. Thirty years later, that geometric growth, now canonized as "Moore's Law," remains the fundamental economic force driving the computer industry.
In September 1997, Moore took some time to speak with Scientific American's West Coast editor, W. Wayt Gibbs. In this second section of their four-part interview, they discuss Moore's Law and its implications.
PART 2: MOORE'S LAW
SCIENTIFIC AMERICAN: We've all heard of "Moore's Law": that computer power doubles each year. Let me get a little more detail. I've seen different dates for when you came upon this observation: 1964 and 1965.
GORDON MOORE: It was 1965. I published it in the 35th anniversary edition of Electronics magazine.
SA: Do you remember when you noticed the trend?
GM: When I was writing the article! The gist of my article was really that integrated circuit technology was going to make electronics cheap. I was trying to drive home the fact that this was the route to low-cost electronics--it wasn't at all clear that that was true. Most of the integrated circuits made so far had gone into fairly expensive machines, like Minuteman missiles or something. They were just starting to make commercial inroads.
I happened to plot the increase in complexity and saw that it was doubling every year, so I extrapolated that for 10 years. I extrapolated it from about 60 components to about 64,000 components on a chip--a pretty long extrapolation. I was just making the pitch that the cheapest way to buy a component 10 years later was going to be as one of these very complex chips. It tracked that curve better than I ever could have imagined.
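Moore's extrapolation is easy to check: roughly 60 components doubling every year for 10 years gives 60 x 2^10, which is about 61,000, in line with the 64,000 figure he cites. A minimal sketch of the arithmetic (the starting count of 60 is his stated approximation):

```python
# Moore's 1965 extrapolation: ~60 components, doubling each year for 10 years.
start_components = 60
years = 10
projected = start_components * 2 ** years
print(projected)  # 61440 -- close to the ~64,000 Moore cites
```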
SA: People make all sorts of long extrapolations today, and a lot of the time they only half believe them. Did you honestly believe that it would last 10 years?
GM: I didn't really have any feeling for the precision of it. Really, the precision wasn't even important for the argument I was trying to make. At the end of that time I dug back into it to see what had happened, and in 1975 I gave a paper updating the thing, and there I tried to be more precise as to what had contributed to the progress we had made. I predicted we were going to change from doubling every year to doubling every two years, which is kind of where we are now. I never said 18 months, in spite of that being quoted in the literature quite often!
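The difference between the doubling periods mentioned here is just a change in the implied annual growth rate: a doubling period of d years corresponds to an annual growth factor of 2^(1/d). A small sketch comparing the original one-year rate, Moore's 1975 revision, and the 18-month figure he disavows:

```python
def annual_growth(doubling_years):
    # Annual growth factor implied by a given doubling period (in years).
    return 2 ** (1 / doubling_years)

print(round(annual_growth(1.0), 2))  # 2.0  -- the original 1965 rate
print(round(annual_growth(2.0), 2))  # 1.41 -- the 1975 revision
print(round(annual_growth(1.5), 2))  # 1.59 -- the oft-quoted 18 months
```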
SA: Plotted on semi-log graph paper this does make a beautiful smooth line, but that smooths over many difficult engineering struggles that have occurred along the way, doesn't it?
GM: Yes, it sure does. In one respect it has become a self-fulfilling prophecy. People know they have to stay on that curve to remain competitive, so they put the effort in to make it happen.
In my view, this was the best thing I ever did for the Japanese semiconductor industry. Once they understood the progress of DRAMs--one, four, 16, 64 [megabit]--they could multiply by four as well as any of us. Then, for the first time, they really had a fix on where the industry was going.
Before that, the industry seemed to move in more or less random directions, which didn't work well in the Japanese top-down corporate culture. But once they had a road map of where the industry was going, they became very formidable competitors. And even now, people look at these curves at the Semiconductor Industry Association and essentially turn out road maps for staying exactly on the curves we have been on. They just try to get the industry to commit the resources to be there. So each of the individual participants tries to get ahead of that curve.