In 1965, Moore noted that the number of devices on a microchip (and hence the potential power of a computer) was doubling each year, and he projected that the trend would continue for 10 years. More than thirty years later, that geometric growth, now canonized as "Moore's Law," remains the fundamental economic force driving the computer industry.
In September 1997, Moore took some time to speak with Scientific American's west coast editor, W. Wayt Gibbs. In this second section of their four-part interview, they discuss Moore's Law and its implications.
PART 2: MOORE'S LAW
SCIENTIFIC AMERICAN: We've all heard of "Moore's Law" that computer power doubles each year. Let me get a little more detail. I've seen different dates for when you came upon this observation, 1964 and 1965.
GORDON MOORE: It was 1965. I published it in the 35th anniversary edition of Electronics magazine.
SA: Do you remember when you noticed the trend?
GM: When I was writing the article! The gist of my article was really that integrated circuit technology is going to make electronics cheap. I was trying to drive home the fact that this was the route to low-cost electronics--it wasn't at all clear at the time that that was true. Most of the integrated circuits made so far had gone into fairly expensive machines, like Minuteman missiles or something. They were just starting to make commercial inroads.
I happened to plot the increase in complexity and saw that it was doubling every year, so I extrapolated that for 10 years. I extrapolated it from about 60 components to about 64,000 components on a chip--a pretty long extrapolation. I was just making the pitch that the cheapest way to buy a component 10 years later was going to be as one of these very complex chips. It tracked that curve better than I ever could have imagined.
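As a back-of-envelope check (mine, not the interview's), the extrapolation Moore describes--roughly 60 components doubling annually for 10 years--can be sketched as:

```python
def extrapolate(start: int, doublings: int) -> int:
    """Compound a component count through repeated doublings."""
    return start * 2 ** doublings

# Ten annual doublings from roughly 60 components:
final = extrapolate(60, 10)
print(final)  # 61440, close to the ~64,000 Moore cites
```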
SA: People make all sorts of long extrapolations today, and a lot of the time they only half believe them. Did you honestly believe that it would last 10 years?
GM: I didn't really have any feeling for the precision of it. Really, the precision wasn't even important for the argument I was trying to make. At the end of that time I dug back into what had happened, and in 1975 I gave a paper updating the thing; there I tried to be more precise about what had contributed to the progress we had made. I predicted we were going to change from doubling every year to doubling every two years, which is kind of where we are now. I never said 18 months, in spite of that being quoted in the literature quite often!
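To make the distinction Moore draws concrete (an illustrative calculation of mine, not figures from the interview), the doubling period matters enormously once it is compounded over a decade:

```python
def growth_factor(years: float, doubling_period: float) -> float:
    """Growth multiple after `years` of doubling every `doubling_period` years."""
    return 2 ** (years / doubling_period)

print(growth_factor(10, 1))    # 1024.0 -- doubling every year
print(growth_factor(10, 1.5))  # ~101.6 -- the oft-quoted 18 months
print(growth_factor(10, 2))    # 32.0   -- doubling every two years
```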
SA: Plotted on semi-log graph paper, this does make a beautiful smooth line, but that smooths over many difficult engineering struggles that have occurred along the way, doesn't it?
GM: Yes, it sure does. In one respect it has become a self-fulfilling prophecy. People know they have to stay on that curve to remain competitive, so they put the effort in to make it happen.
In my view, this was the best thing I ever did for the Japanese semiconductor industry. Once they understood the progress of DRAMs--one, four, 16, 64 [megabit]--they could multiply by four as well as any of us. Then, for the first time, they really had a fix on where the industry was going.
Before that, the industry seemed to move in more or less random directions, which didn't work well in the Japanese top-down corporate culture. But once they had a road map of where the industry was going, they became very formidable competitors. And even now, people look at these curves at the Semiconductor Industry Association and essentially turn out road maps for staying exactly on the curves we have been on. They just try to get the industry to commit the resources to be there. So each of the individual participants tries to get ahead of that curve.
SA: To consumers, this almost seems to be a law of nature; it just happens by some magic. They don't necessarily see the tremendous engineering efforts that have to go into knocking down obstacles each time.
GM: Yes, there is a phenomenal amount of R&D work involved in this. This year we'll spend about $2.5 billion on R&D; it was about $1.9 billion last year. And we represent only about 10 percent of total industry spending on research. So industry-wide it adds up to $20 to $30 billion a year in R&D. A big, big investment.
SA: Does it seem to be getting harder or easier to knock down these obstacles each time we move from one generation to the next?
GM: I get farther away from it each time, so it gets easier for me. Technically these are phenomenally challenging problems, and the things we used to do relatively casually now take teams of Ph.D.s to advance the technology by an equivalent amount. But the amazing thing is that we've been able to do this for almost 40 years without running up against a barrier that really stopped progress.
Eventually, we may run out of gas. We are subject to the fact that materials are made out of atoms, and things like that. We're getting down now into some places where the atomic nature of matter starts to be a concern.
SA: Let me run through some of the most commonly cited obstacles to continuing growth in chip density, and let me get your opinion on each. For instance, the cost of printing smaller features--switching to excimer lasers, say: is that a big problem?
GM: The overall cost of a modern fabrication facility keeps going up, but it hasn't proven to be the barrier people thought it would be. At one time, even talking of a billion-dollar factory would generate concern that only a couple of companies could build one. Heck, now we all build them.
A couple of them are over $2.5 billion. There's one in Albuquerque that's $2.7 billion total investment. We have another near Phoenix that is not full yet, but by the time it's full will cost more than $2.5 billion. It's absurd! They're continuing to go up, but there doesn't seem to be a shortage of capital that would impose a limit.
SA: But the margins are decreasing, right? Aren't the costs of the plants going up faster than the returns you can get from them?
GM: Not for us; our margins aren't decreasing in that respect. The memory chips, the DRAMs, do go through cyclical periods that depend on the relationship between demand and capacity. DRAM prices have come down 80 percent in the last year, and the makers aren't turning out five times as many chips as they were, so that clearly makes a huge difference in their profitability. Right now they're in one of these dips, but history can be depended on--it'll come back.
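A quick bit of arithmetic (my sketch, using hypothetical round numbers rather than figures from the interview) shows why an 80 percent price drop squeezes the DRAM makers: each chip earns a fifth of what it did, so revenue holds flat only if unit volume grows fivefold.

```python
# Hypothetical round numbers: price falls 80 percent.
old_price = 1.0
new_price = old_price * (1 - 0.80)       # each chip now earns one fifth
volume_multiple = old_price / new_price  # volume needed to hold revenue flat
print(round(volume_multiple, 1))  # 5.0
```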
There was a terrible dip in the '81 time period, a terrible one in the '84-'85 period, and then the market came back strongly after that. In '91-'92 there was a big drop, but it came back again, and we've been in another dip since last year.
It's the nature of the product that you've got a huge fixed investment in the plant--and more important, in the people who run the plant, all the engineers and staff in the place. Once you've got that and demand goes down, it's very tempting to look at the incremental cost of making one more memory chip. You are faced with the choice of shutting down the plant or running it and selling at prices that cover more than the incremental cost but less than the total cost. The industry has always opted to run it and sell the chips at lower prices.
And it's been a very elastic market. The memory in your PC keeps growing--the memory in everything keeps growing. So if you just keep running the plant and wait awhile, the elasticity of the market has always bailed the industry out. The common response is not to shut down plants but to run them, sell the chips for what you can get, and wait. The knee-jerk reaction of this industry is: if there is any problem, cut the price.