# The Fundamental Physical Limits of Computation

What constraints govern the physical process of computing? Is a minimum amount of energy required, for example, per logic step? There seems to be no minimum, but some other questions are open

Are irreversible logic gates and erasures essential to computation? If they are, any computation we perform has to dissipate some minimum amount of energy.

As one of us (Bennett) showed in 1973, however, they are not essential. This conclusion has since been demonstrated in several models; the easiest of these to describe are based on so-called reversible logic elements such as the Fredkin gate, named for Edward Fredkin of the Massachusetts Institute of Technology. The Fredkin gate has three input lines and three outputs. The input on one line, which is called the control channel, is fed unchanged through the gate. If the control channel is set at 0, the input on the other two lines also passes through unchanged. If the control line is a 1, however, the outputs of the other two lines are switched: the input of one line becomes the output of the other and vice versa. The Fredkin gate does not discard any information; the input can always be deduced from the output.
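The gate's behavior can be sketched in a few lines of Python (an illustrative model, not part of the original article). Note that applying the gate twice restores the original inputs, which is exactly what makes it reversible:

```python
# Fredkin gate: the control line passes through unchanged; if the control
# is 1, the other two lines are swapped. The mapping is a bijection, so
# the input can always be deduced from the output.
def fredkin(control, a, b):
    if control == 1:
        a, b = b, a
    return control, a, b

# Reversibility check: the gate is its own inverse on all 8 input states.
for bits in [(c, a, b) for c in (0, 1) for a in (0, 1) for b in (0, 1)]:
    assert fredkin(*fredkin(*bits)) == bits
```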

Fredkin has shown that any logic device required in a computer can be implemented by an appropriate arrangement of Fredkin gates. To make the computation work, certain input lines of some of the Fredkin gates must be preset at particular values.
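As an illustration of one such arrangement (a standard construction, assumed here rather than taken from the article's figures): presetting a Fredkin gate's third input line to 0 makes its third output compute AND, with the other two outputs carried along as extra bits that preserve reversibility:

```python
def fredkin(control, a, b):
    if control == 1:
        a, b = b, a
    return control, a, b

# AND from a Fredkin gate: preset the third input line to 0.
# If x is 0, the 0 passes through; if x is 1, y is swapped onto the
# third line. Either way the third output equals x AND y.
def and_gate(x, y):
    _, _, out = fredkin(x, y, 0)
    return out

for x in (0, 1):
    for y in (0, 1):
        assert and_gate(x, y) == (x & y)
```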

Fredkin gates have more output lines than the gates they are made to simulate. In the process of computing, what seem to be "garbage bits," bits of information that have no apparent use, are therefore generated. These bits must somehow be cleared out of the computer if we are to use it again, but if we erase them, it will cost us all the energy dissipation we have been trying to avoid.

Actually these bits have a most important use. Once we have copied down the result of our computation, which will reside in the normal output bits, we simply run the computer in reverse. That is, we enter the "garbage bits" and output bits that were produced by the computer's normal operation as "input" into the "back end" of the computer. This is possible because each of the logic gates in the computer is itself reversible. Running the computer in reverse discards no information, and so it need not dissipate any energy. Eventually the computer will be left exactly as it was before the computation began. Hence it is possible to complete a "computing cycle": to run a computer and then to return it to its original state, without dissipating any energy.
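This uncomputing step can be sketched with a toy circuit model (the wire values and gate list below are hypothetical, chosen only for illustration). Because each Fredkin gate is its own inverse, running the gate list in reverse order returns the machine to its initial state:

```python
# Apply a Fredkin gate in place to three wires of a register:
# c is the control wire index; wires i and j swap when the control is 1.
def fredkin_wires(state, c, i, j):
    if state[c] == 1:
        state[i], state[j] = state[j], state[i]

# Run a circuit (a list of (control, i, j) gate specifications).
# Running it with reverse=True undoes the computation, gate by gate.
def run(state, circuit, reverse=False):
    gates = reversed(circuit) if reverse else circuit
    for c, i, j in gates:
        fredkin_wires(state, c, i, j)

wires = [1, 0, 1, 1, 0]                      # example initial register
circuit = [(0, 1, 2), (3, 0, 4), (2, 3, 1)]  # arbitrary example circuit
saved = wires[:]
run(wires, circuit)                  # forward computation
run(wires, circuit, reverse=True)    # uncompute: discards no information
assert wires == saved                # machine restored to initial state
```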

So far we have discussed a set of logic operations, not a physical device. It is not hard, however, to imagine a physical device that operates as a Fredkin gate. In this device the information channels are represented by pipes. A bit of information is represented by the presence or absence of a ball in a particular section of pipe; the presence of a ball signifies a 1 and the absence of a ball signifies a 0.

The control line is represented by a narrow segment of pipe that is split lengthwise down the middle. When a ball enters the split segment of pipe, it pushes the two halves of the pipe apart, actuating a switching device. The switching device channels any input balls that may be in the other two pipes: when a ball is present in the control line, any ball that enters an input pipe is automatically redirected to the other pipe. To ensure that the switch is closed when no control ball is present, there are springs that hold the two halves of the split pipe together. A ball entering the split pipe must expend energy when it compresses the springs, but this energy is not lost; it can be recovered when the control ball leaves the split pipe and the springs expand.

All the balls are linked together and pushed forward by one mechanism, so that they move in synchrony; otherwise we could not ensure that the various input and controlling balls would arrive at a logic gate together. In a sense the forward progress of the computation is really motion along a single degree of freedom, like the motion of two wheels rigidly attached to one axle. Once the computation is done we push all the balls backward, undoing all the operations and returning the computer to its initial state.

## Comments
1. jtdwyer 04:44 PM 6/1/11

In fact, a bit of data is not information, not even for logic gates. Data being operated on by processors are merely transient copies of data representing information - no functional information system destroys any required data, no matter how many times it's 'ANDed'.

Moreover, in equating information to bits and energy, physicists apparently aren't aware that data stored in most electronic memories ('chips') require continual energy refreshment or their 'information' will simply dissipate!

Physical memory of all types - from processor registers, caches, memory chips, device buffers, magnetic and even optical media - is continuously being released and reallocated, requiring that data be moved into newly assigned physical memory, overlaying any previously stored data. All this is accomplished without any unintentional loss of information, which is managed and interpreted by software, not logic circuits!

Did you ever try to load a current spreadsheet file into an old copy of the program? If so, you should understand the difference between data and information.

When the article states:
"Are irreversible logic gates and erasures essential to computation? If they are, any computation we perform has to dissipate some minimum amount of energy."
…it ignores the physical realities of actual information processes. For example, even if some method of instantaneous computation were devised, a computer system based on that method could only operate as fast as data could be delivered to it and results retrieved from it. Moreover, all data would be useless without properly interpretive software to finally represent it as information. Information management typically requires far more processing than computation of data, and the enormous migration of data required for processing takes far more time than the computations do.

IMO, the quest for instantaneous computing is much more dependent on the energy requirements and performance of information storage, retrieval and routing than logic gates.

2. Some guy 12:03 AM 6/2/11

The AND operation and any other irreversible operation does destroy information which costs energy. I think the point made in the article was if you could make a computer that doesn't lose information then you can compute without that expense. Logic gates are the basis of computing. Software is merely a convenience.
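This point can be made concrete (a small illustrative check, not from the article): AND maps three distinct input pairs to the same output, so the inputs cannot be recovered from the output alone.

```python
# AND is irreversible: the output 0 has three distinct preimages,
# so knowing only the output destroys information about the inputs.
preimages = {}
for x in (0, 1):
    for y in (0, 1):
        preimages.setdefault(x & y, []).append((x, y))

assert len(preimages[0]) == 3   # (0,0), (0,1), (1,0) all yield 0
assert len(preimages[1]) == 1   # only (1,1) yields 1
```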

3. jtdwyer in reply to Some guy 02:55 AM 6/2/11

The AND operation has two parameters: a source data field and a destination field. The source field is not altered by the operation. If the destination field is a copy of an original data field, not only is its data not destroyed, but the AND operation produces additional data.

By the author's definition of irreversible operations or erasures, only repeated operations on a single set of data could ever be performed, since no new data could ever be brought into storage and no results could be extracted - that would require erasure of previous data.

Logic gates do absolutely nothing unless instructed by software to operate on data provided by software - very convenient indeed!

4. deisner 06:08 PM 6/3/11

1. Every page of the original article contains detailed figures that add much to the authors' arguments. If you're going to publish the article (which is appreciated), please include all of it.

2. Would today's Scientific American publish an article of this technical sophistication and length (nine pages)?

3. Check out what SciAm covers looked like back in the 1980s, when this was published (including the July 1985 cover):

http://www.coverbrowser.com/covers/scientific-american/42

The covers are beautiful, informative, and depict real subjects of the science journalism contained in that issue.

Compare that to the current cover:

http://www.scientificamerican.com/media/cover/cover_2011-06.jpg

Or these recent gems:

http://www.scientificamerican.com/media/cover/cover_2008-02.jpg

http://www.scientificamerican.com/media/cover/cover_2008-03.jpg

http://www.scientificamerican.com/media/cover/cover_2010-03.jpg

Why do the covers have to be so freaking goofy? This is embarrassing. What happened to this once-great magazine?

5. jtdwyer in reply to deisner 08:51 PM 6/3/11

Really excellent comment! I dug back into my 'library' and found the original print issue.

I think a more direct comparison for the cover illustrations of current issues would be OMNI magazine of the 1970s. Unfortunately, much of the content of recent issues is also comparable to OMNI...

I hadn't even realized that the original figures had been omitted - they really do enhance the reader's ability to understand the article.

However, I still object to many of the premises of the article and, I think, of information theory, at least as adopted and incorporated into physics.

The article states:
"Here is another example of information destruction: the expression 2 + 2 contains more information than the expression = 4. If all we know is that we have added two numbers to yield 4, then we do not know whether we have added 1 + 3, 2 + 2, 0 + 4 or some other pair of numbers. Since the output is implicit in the input, no computation ever generates information."

Again, a programmer is not at the mercy of logic gates and can maintain the source variables for any 'destructive' operation. In that case the last statement is doubly incorrect: _all_ computations generate additional information and, if source data is maintained, no information is destroyed!

Lastly, the description of the first illustration on page 49 in the original issue states:
"The "logic gates" central to the design of a chip expend energy because they discard information."

Does it require more energy for a person or a chip to calculate that 2 + 2 = 4 than it does to calculate that 1 + 1 = 2? In the second case the original values can be derived, so, according to the article, no information is lost. I guess it's presumed that a person or a chip cannot remember the numbers that produced the sum of 4...

I don't see that any relationship has been established between data content or information and any energy requirements, except as some obtuse abstract misconception.

6. bernsten69 08:51 AM 6/7/11

I see the problem as thus:

Let me go with the ball system you describe. In order to perform a computation, some minimum amount of energy must be put into the system to move the balls forward. After the computation, in order to USE that information (i.e., the balls go into another logic gate, etc.), some amount of that initial energy must be transferred forward into whatever apparatus uses the result of the computation. The measurement of the result creates some minimum amount of entropy that makes it impossible for the system to completely reverse itself. A simple example would be some sort of light sensor at the end of your computation system that detects the presence of the ball. When the photons bounce off the ball, they impart a certain minimum amount of energy to it (possibly negative, if it pushes against its direction of motion). The state of the ball therefore becomes, to some extent, uncertain and more entropic. After the ball bounces against some immovable object that reflects 100% of its energy back toward the beginning of the machine, the ball(s) will arrive back at the starting location. However, the energy that they impart to the 'first mover' that caused the balls' initial motion is now uncertain (by whatever uncertainty the measurement created). Further, there is a non-zero probability that, due to the measurement interaction, the balls do not return to their original states.

Although, in theory, a self-contained, reversible computation machine is possible, in the physical universe, to actually observe (and use) the result of the computation, a certain amount of entropy is introduced into the system, which makes it impossible for the system to return to its original state. The minimum amount of energy necessary to perform a computation relates to the entropy imparted by the observation of the computation - a strange but thermodynamically sound conclusion.

Adding in a bit of quantum mechanics: until we observe the output of the computation, the computation machine's state exists in a superposition of all possible states. In other words, no computation (in the classical sense) is possible without an observation. Since observation imparts a certain non-zero entropy into the computation machine's state, no computation is possible without the expenditure of a certain minimum amount of energy, and no 'computation machine' is fully reversible in the physical universe.

7. rodovre 12:18 PM 6/13/11

Errata: Page 4, 4th Para. has been truncated.

8. wolfkiss in reply to jtdwyer 03:24 PM 6/13/11

Yes, you can save any data within any argument that is passed to some function. However, this "saving" of the inputs requires extra resources in the form of an array or data structure *before* those inputs are entered into the function. These extra resources are not abstract, because the execution of the "saving" and the function require the use of entirely separate groups of logic gates (or the same gates at different times). The former group duplicates inputs while the latter computes the programmed function. The function execution does in fact destroy the existence of its copy of the inputs. This is just a fact of Turing-equivalent discrete-state computation at the physical level. Logic gates don't "remember" unless additional logic gates are employed.

But it's even worse than that, because the system must expend even *more* physical resources in order to create a pointer between the inputs and the outputs in order to maintain any relevance between these discrete and independent states. The noiseless environment that is the genius of Turing equivalent computation comes at an entropic cost, and that cost is amplified when trying to preserve, and associate, past states with present states.

We forget this today because memory has become so relatively cheap. But in the good ol' days software engineering really was engineering because the ability to execute some code was very constrained by available time and memory.
