In the 1960s Gordon Moore made the empirical observation that the density of components on a chip was doubling roughly every 18 months. Over the past 40 years, Moore's law has continued to hold. These doublings in chip density explain why today's personal computers are as powerful as those that only governments and large corporations possessed just a couple of decades ago. But in 10 to 20 years each transistor will have shrunk to atomic size, and Moore's law, which is based on current silicon technology, is expected to end. This prospect drives the search for entirely new technologies, and one major candidate is a quantum computer--that is, a computer based on the principles of quantum mechanics. There is another motive for studying quantum computers: the functioning of such a device, which lies at the intersection of quantum mechanics, computer science and mathematics, has aroused great intellectual curiosity.
George Johnson, who writes about science for the New York Times, has set himself the task of deconstructing quantum computing at a level that readers of that newspaper--and this magazine--can understand. He has succeeded admirably. He explains the principles of quantum mechanics essential to quantum computing but tells no more than necessary. "We are operating here," he promises, "on a need-to-know basis."
This article was originally published with the title "The Next Big Thing?"