A computation, whether it is performed by electronic machinery, on an abacus or in a biological system such as the brain, is a physical process. It is subject to the same questions that apply to other physical processes: How much energy must be expended to perform a particular computation? How long must it take? How large must the computing device be? In other words, what are the physical limits of the process of computation?
So far it has been easier to ask these questions than to answer them. To the extent that we have found limits, they lie terribly far beyond the actual capabilities of modern technology. We cannot profess, therefore, to be guiding the technologist or the engineer. What we are doing is really more fundamental. We are looking for general laws that must govern all information processing, no matter how it is accomplished. Any limits we find must be based solely on fundamental physical principles, not on whatever technology we may currently be using.
There are precedents for this kind of fundamental examination. In the 1940's Claude E. Shannon of the Bell Telephone Laboratories found that there are limits on the amount of information that can be transmitted through a noisy channel; these limits apply no matter how the message is encoded into a signal. Shannon's work represents the birth of modern information science. Earlier, in the mid- and late 19th century, physicists attempting to determine the fundamental limits on the efficiency of steam engines had created the science of thermodynamics. In about 1960 one of us (Landauer) and John Swanson at IBM began attempting to apply the same type of analysis to the process of computing. Since the mid-1970's a growing number of other workers at other institutions have entered this field.
In our analysis of the physical limits of computation we use the term "information" in the technical sense of information theory. In this sense information is destroyed whenever two previously distinct situations become indistinguishable. In physical systems without friction, information can never be destroyed; whenever information is destroyed, some amount of energy must be dissipated (converted into heat). As an example, imagine two easily distinguishable physical situations, such as a rubber ball held either one meter or two meters off the ground. If the ball is dropped, it will bounce. If there is no friction and the ball is perfectly elastic, an observer will always be able to tell what state the ball started out in (that is, what its initial height was) because a ball dropped from two meters will bounce higher than a ball dropped from one meter.
If there is friction, however, the ball will dissipate a small amount of energy with each bounce, until it eventually stops bouncing and comes to rest on the ground. It will then be impossible to determine what the ball's initial state was; a ball dropped from two meters will be identical with a ball dropped from one meter. Information will have been lost as a result of energy dissipation.
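A few lines of code make the contrast concrete. The sketch below, written in Python purely for illustration (the article itself contains no code), tracks the peak height of each ball bounce by bounce; the coefficient of restitution of 0.7 used for the frictional case is an arbitrary assumed value, not a figure from the text.

```python
# Minimal sketch of the bouncing-ball example. A coefficient of restitution e
# means the rebound speed is e times the impact speed, so each bounce reaches
# e**2 times the previous peak height. e = 1.0 models the frictionless,
# perfectly elastic ball; e = 0.7 is an arbitrary stand-in for friction.

def peak_heights(h0, e, bounces):
    """Return the peak height reached after each of `bounces` bounces."""
    heights = []
    h = h0
    for _ in range(bounces):
        h *= e ** 2          # energy retained (or lost) at each impact
        heights.append(h)
    return heights

for e, label in [(1.0, "elastic (no friction)"), (0.7, "dissipative")]:
    one = peak_heights(1.0, e, 20)[-1]
    two = peak_heights(2.0, e, 20)[-1]
    print(f"{label}: after 20 bounces the peak heights are {one:.2e} m and {two:.2e} m")

# Elastic case: the heights stay at exactly 1 m and 2 m, so an observer can
# always recover the initial condition. Dissipative case: both heights fall
# below two micrometres; in any real setting they are indistinguishable, and
# the information about the starting height has been converted into heat.
```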
Here is another example of information destruction: the expression "2 + 2" contains more information than the expression "= 4." If all we know is that we have added two numbers to yield 4, then we do not know whether we have added 1 + 3, 2 + 2, 0 + 4 or some other pair of numbers. Since the output is implicit in the input, no computation ever generates information.
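To see the many-to-one character of addition explicitly, here is a brief Python sketch; the code and the restriction to non-negative integer addends are illustrative assumptions, not part of the original text.

```python
# Enumerate the pairs of non-negative integers that all produce the same
# output, "= 4". (The article's "some other pair of numbers" is broader,
# but restricting to non-negative integers keeps the list finite.)

target = 4
preimages = [(a, target - a) for a in range(target + 1)]
print(preimages)   # [(0, 4), (1, 3), (2, 2), (3, 1), (4, 0)]

# Five distinct inputs yield the single output 4. Knowing only the output,
# we cannot say which pair was added: the step from "2 + 2" to "4" discards
# information rather than creating it.
```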
In fact, computation as it is currently carried out depends on many operations that destroy information. The so-called AND gate is a device with two input lines, each of which may be set at 1 or 0, and one output line whose value depends on the values of the inputs. If both inputs are 1, the output will be 1. If either input is 0, or if both are, the output will be 0. Any time the gate's output is a 0 we lose information, because we do not know which of three possible states the input lines were in (0 and 1, 1 and 0, or 0 and 0). In fact, any logic gate that has more input than output lines inevitably discards information, because we cannot deduce the input from the output. Whenever we use such a "logically irreversible" gate, we dissipate energy into the environment. Erasing a bit of memory, another operation that is frequently used in computing, is also fundamentally dissipative; when we erase a bit, we lose all information about that bit's previous state.
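The AND gate's irreversibility can also be made explicit. The following Python sketch lists which inputs lead to each output and, under the added assumption that the four possible inputs are equally likely (an assumption not made in the text), quantifies how much information one AND operation discards.

```python
# Sketch of the AND gate's logical irreversibility. The gate is as described
# above; the uniform distribution over inputs, used to quantify the loss,
# is an assumption introduced only for this illustration.
from collections import defaultdict
from math import log2

preimages = defaultdict(list)
for a in (0, 1):
    for b in (0, 1):
        preimages[a & b].append((a, b))

print(dict(preimages))
# {0: [(0, 0), (0, 1), (1, 0)], 1: [(1, 1)]}
# An output of 1 pins down the input, but an output of 0 is compatible with
# three different inputs, so the gate cannot be run backward.

# With equally likely inputs, the two input lines carry 2 bits, while the
# output carries only -(1/4)*log2(1/4) - (3/4)*log2(3/4) bits:
h_out = -(1/4) * log2(1/4) - (3/4) * log2(3/4)
print(f"about {2 - h_out:.2f} bits are discarded per AND operation")
```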