# The Fundamental Physical Limits of Computation

What constraints govern the physical process of computing? Is a minimum amount of energy required, for example, per logic step? There seems to be no minimum, but some other questions are open

If the entire assembly is immersed in an ideal viscous fluid, then the frictional forces that act on the balls will be proportional to their velocity; there will be no static friction. The frictional force will therefore be very weak if we are content to move the balls slowly. In any mechanical system the energy that must be expended to work against friction is equal to the product of the frictional force and the distance through which the system travels. (Hence the faster a swimmer travels between two points, the more energy he or she will expend, although the distance traveled is the same whether the swimmer is fast or slow.) If we move the balls through the Fredkin gates at a low speed, then the energy expended (the product of force and distance) will be very small, because the frictional force depends directly on the balls' speed. In fact, we can expend as little energy as we wish, simply by taking a long time to carry out the operation. There is thus no minimum amount of energy that must be expended in order to perform any given computation.
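The force-times-distance argument can be checked with a one-line model. This is an illustrative sketch (the drag coefficient `k` and distance `d` are arbitrary values of my own choosing, not from the article) of viscous drag, where the frictional force is proportional to speed, so the dissipated energy falls toward zero as the traversal time grows:

```python
# In an ideal viscous fluid the frictional force scales with speed: F = k * v.
# Over a fixed distance d traversed in time T, the speed is v = d / T, so the
# energy dissipated is E = F * d = k * d**2 / T, which vanishes as T grows.
k, d = 1.0, 1.0  # illustrative drag coefficient and travel distance

def dissipated_energy(T):
    v = d / T            # ball speed for a traversal taking time T
    return (k * v) * d   # energy = frictional force times distance
```

Taking ten times longer to move the balls cuts the frictional loss tenfold, which is the article's point: there is no floor on the energy cost, only a trade against speed.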

The energy lost to friction in this model will be very small if the machine is operated very slowly. Is it possible to design a more idealized machine that could compute without any friction? Or is friction essential to the computing process? Fredkin, together with Tommaso Toffoli and others at M.I.T., has shown that it is not.

They demonstrated that it is possible to do computation by firing ideal, frictionless billiard balls at one another. In the billiard-ball model perfect reflecting "mirrors," surfaces that redirect the balls' motion, are arranged in such a way that the movement of the balls across a table emulates the movement of bits of information through logic gates. As before, the presence of a ball in a particular part of the computer signifies a 1, whereas the absence of a ball signifies a 0. If two balls arrive simultaneously at a logic gate, they will collide and their paths will change; their new paths represent the output of the gate. Fredkin, Toffoli and others have described arrangements of mirrors that correspond to different types of logic gate, and they have shown that billiard-ball models can be built to simulate any logic element that is necessary for computing.
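The gate at the heart of this model can be made concrete in a few lines. The following is an illustrative sketch (function names are my own, not the authors' notation) of the Fredkin gate's controlled-swap rule, together with the standard construction of an AND gate from it by fixing one data line to 0:

```python
def fredkin(c, a, b):
    """Fredkin (controlled-swap) gate: if the control bit c is 1,
    the two data bits a and b are exchanged; otherwise all three
    bits pass through unchanged. Three bits in, three bits out."""
    return (c, b, a) if c == 1 else (c, a, b)

def and_via_fredkin(x, y):
    """AND from a Fredkin gate: feed x on the control line and y and a
    constant 0 on the data lines; the third output carries x AND y."""
    _, _, out = fredkin(x, y, 0)
    return out
```

Because the gate merely permutes its inputs, no bit is ever discarded: applying `fredkin` twice returns any triple to its original state, which is exactly the reversibility the billiard-ball mirrors exploit.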

To start the computation we fire a billiard ball into the computer wherever we wish to input a 1. The balls must enter the machine simultaneously. Since they are perfectly elastic, they do not lose energy when they collide; they will emerge from the computer with the same amount of kinetic energy we gave them at the beginning.

In operation a billiard-ball computer produces "garbage bits," just as a computer built of Fredkin gates does. After the computer has reached an answer we reflect the billiard balls back into it, undoing the computation. They will come out of the machine exactly where we sent them in, and at the same speed. The mechanism that launched them into the computer can then be used to absorb their kinetic energy. Once again we will have performed a computation and returned the computer to its initial state without dissipating energy.
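The "reflect the balls back in" step has a simple bit-level analogue. Below is a toy sketch (data layout and names are my own invention, not the article's): a machine state is a list of bits, a circuit is a list of `(control, a, b)` index triples, and because each controlled swap is its own inverse, running the gates in reverse order undoes the computation exactly:

```python
def fredkin_step(state, c, a, b):
    """Apply one controlled swap in place: if bit c of the state is 1,
    exchange the bits at indices a and b."""
    if state[c] == 1:
        state[a], state[b] = state[b], state[a]

def run(state, circuit):
    """Run the computation forward."""
    for gate in circuit:
        fredkin_step(state, *gate)

def unrun(state, circuit):
    """Undo the computation: each gate is its own inverse, so applying
    the gates in reverse order restores the initial state."""
    for gate in reversed(circuit):
        fredkin_step(state, *gate)
```

Running `run` and then `unrun` on any state returns it bit-for-bit to where it started, including the garbage bits, which is what lets the launching mechanism recover the balls' kinetic energy.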

The billiard-ball computer has one major flaw: it is extremely sensitive to slight errors. If a ball is aimed slightly incorrectly or if a mirror is tilted at a slightly wrong angle, the balls' trajectories will go astray. One or more balls will deviate from their intended paths, and in due course errors will combine to invalidate the entire computation. Even if perfectly elastic and frictionless billiard balls could be manufactured, the small amount of random thermal motion in the molecules they are made of would be enough to cause errors after a few dozen collisions.

Of course we could install some kind of corrective device that would return any errant billiard ball to its desired path, but then we would be obliterating information about the ball's earlier history. For example, we might be discarding information about the extent to which a mirror is tilted incorrectly. Discarding information, even to correct an error, can be done only in a system in which there is friction and loss of energy. Any correctional device must therefore dissipate some energy.

1. jtdwyer 04:44 PM 6/1/11

In fact, a bit of data is not information, not even for logic gates. Data being operated on by processors are merely transient copies of data representing information - no functional information system destroys any required data, no matter how many times it's 'ANDed'.

Moreover, in equating information to bits and energy, physicists apparently aren't aware that data stored in most electronic memories ('chips') require continual energy refreshment or their 'information' will simply dissipate!

Physical memory of all types, from processor registers, caches, memory chips and device buffers to magnetic and even optical media, is continuously being released and reallocated, requiring that data be moved into newly assigned physical memory, overlaying any previously stored data. All this is accomplished without any unintentional loss of information, which is managed and interpreted by software, not logic circuits!

Did you ever try to load a current spreadsheet file into an old copy of the program? If so, you should understand the difference between data and information.

When the article states:
"Are irreversible logic gates and erasures essential to computation? If they are, any computation we perform has to dissipate some minimum amount of energy."
…it ignores the physical realities of actual information processes. Just for example, even if some method of instantaneous computation were devised, a computer system based on that method could only operate as fast as data could be delivered to it for computation and retrieved from it. Moreover, all data would be useless without properly interpretive software to finally represent it as information. Information management typically requires far more processing than computation of data does, and the enormous migration of data required for processing takes far more time than the computations themselves.

IMO, the quest for instantaneous computing is much more dependent on the energy requirements and performance of information storage, retrieval and routing than logic gates.

2. Some guy 12:03 AM 6/2/11

The AND operation and any other irreversible operation does destroy information which costs energy. I think the point made in the article was if you could make a computer that doesn't lose information then you can compute without that expense. Logic gates are the basis of computing. Software is merely a convenience.
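The irreversibility claim is easy to verify directly. This is a quick illustrative enumeration (mine, not from the article) showing that AND maps four input pairs onto only two outputs, so the output alone cannot determine the inputs:

```python
from itertools import product

# Group the four possible input pairs by their AND output.
preimages = {}
for x, y in product((0, 1), repeat=2):
    preimages.setdefault(x & y, []).append((x, y))

# Three distinct input pairs all collapse onto output 0, so knowing only
# the output cannot tell us which inputs produced it: that distinction
# is the information the gate erases.
```

This many-to-one collapse is precisely what Landauer's argument prices in energy: an erased distinction must be pushed into the environment as heat.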

3. jtdwyer in reply to Some guy 02:55 AM 6/2/11

The AND operation has two parameters: a source data field and a destination field. The source field is not altered by the operation. If the destination field is a copy of an original data field, not only is its data not destroyed, but the AND operation produces additional data.

By the author's definition of irreversible operations or erasures, only repeated operations on a single set of data could ever be performed, since no new data could ever be brought into storage and no results could be extracted - that would require erasure of previous data.

Logic gates do absolutely nothing unless instructed by software to operate on data provided by software - very convenient indeed!

4. deisner 06:08 PM 6/3/11

1. Every page of the original article contains detailed figures that add much to the authors' arguments. If you're going to publish the article (which is appreciated), please include all of it.

2. Would today's Scientific American publish an article of this technical sophistication and length (nine pages)?

3. Check out what SciAm covers looked like back in the 1980s when this was published (including the July 1985 cover):

http://www.coverbrowser.com/covers/scientific-american/42

The covers are beautiful, informative, and depict real subjects of the science journalism contained in that issue.

Compare that to the current cover:

http://www.scientificamerican.com/media/cover/cover_2011-06.jpg

Or these recent gems:

http://www.scientificamerican.com/media/cover/cover_2008-02.jpg

http://www.scientificamerican.com/media/cover/cover_2008-03.jpg

http://www.scientificamerican.com/media/cover/cover_2010-03.jpg

Why do the covers have to be so freaking goofy? This is embarrassing. What happened to this once-great magazine?

5. jtdwyer in reply to deisner 08:51 PM 6/3/11

Really excellent comment! I dug back into my 'library' and found the original print issue.

I think a more direct comparison for the cover illustrations of current issues would be OMNI magazine of the 1970s. Unfortunately, much of the content of recent issues is also comparable to OMNI...

I hadn't even realized that the original figures had been omitted - they really do enhance the reader's ability to understand the article.

However, I still object to many of the premises of the article, and of information theory at least as it has been adopted and incorporated into physics.

The article states:
"Here is another example of information destruction: the expression 2 + 2 contains more information than the expression = 4. If all we know is that we have added two numbers to yield 4, then we do not know whether we have added 1 + 3, 2 + 2, 0 + 4 or some other pair of numbers. Since the output is implicit in the input, no computation ever generates information."

Again, a programmer is not at the mercy of logic gates and can maintain the source variables for any 'destructive' operation. In that case the last statement is doubly incorrect: _all_ computations generate additional information and, if the source data is maintained, no information is destroyed!

Lastly, the description of the first illustration on page 49 in the original issue states:
"The "logic gates" central to the design of a chip expend energy because they discard information."

Does it require more energy for a person or a chip to calculate that 2 + 2 = 4 than it does to calculate that 1 + 1 = 2? In the second case the original values can be derived, so, according to the article, no information is lost. I guess it's presumed that a person or a chip cannot remember the numbers that produced the sum of 4...

I don't see that any relationship has been established between data content or information and any energy requirements, except as some obtuse abstract misconception.

6. bernsten69 08:51 AM 6/7/11

I see the problem as thus:

Let me go with the ball-system you describe. In order to perform a computation, some minimum amount of energy must be put into the system to move the balls forward. After the computation, in order to USE that information (i.e., the balls go into another logic gate etc.), some amount of that initial energy must be transferred forward into whatever apparatus uses the result of the computation. The measurement of the result creates some minimum amount of entropy that makes it impossible for the system to completely reverse itself. A simple example would be some sort of light sensor at the end of your computation system that detects the presence of the ball. When the photons bounce off of the ball, they impart a certain minimum amount of energy to the ball (possibly negative if it pushes against its direction of motion). The state of the ball therefore becomes, to some extent, uncertain and more entropic. After the ball bounces against some immovable object that reflects 100% of its energy back toward the beginning part of the machine, the ball(s) will arrive back at the starting location. However, the energy that they impart to the 'first mover' that caused the ball's initial motion is now uncertain (by whatever uncertainty the measurement created). Further, there is a non-zero probability that, due to the measurement interaction, the balls do not return to their original states.

Although, in theory, a self-contained, reversible computation machine is possible, in the physical universe, to actually observe (and use) the result of the computation, a certain amount of entropy is introduced into the system which makes it impossible for the system to return to its original state. The minimum amount of energy necessary to perform a computation relates to the entropy imparted by the observation of the computation - A strange but thermodynamically sound conclusion.

Adding in a bit of Quantum Mechanics: until we observe the output of the computation, the computation machine exists in a superposition of all possible states. In other words, no computation (in the classical sense) is possible without an observation. Since observation imparts a certain non-zero entropy into the computation machine state, no computation is possible without the expenditure of a certain minimum amount of energy, and no 'computation machine' is fully reversible in the physical universe.

7. rodovre 12:18 PM 6/13/11

Erratum: Page 4, 4th paragraph has been truncated.

8. wolfkiss in reply to jtdwyer 03:24 PM 6/13/11

Yes, you can save any data within any argument that is passed to some function. However, this "saving" of the inputs requires extra resources in the form of an array or data structure *before* those inputs are entered into the function. These extra resources are not abstract, because the execution of the "saving" and the function require the use of entirely separate groups of logic gates (or the same gates at different times). The former group duplicates inputs while the latter computes the programmed function. The function execution does in fact destroy the existence of its copy of the inputs. This is just a fact of Turing-equivalent discrete-state computation at the physical level. Logic gates don't "remember" unless additional logic gates are employed.
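This accounting can be sketched concretely. The snippet below is illustrative (hypothetical function names of my own, not anyone's published code): preserving the inputs of an AND means carrying them forward on extra lines, widening the machine's state from one bit to three:

```python
def irreversible_and(x, y):
    # 2 bits in, 1 bit out: once the gate fires, x and y are gone.
    return x & y

def input_saving_and(x, y):
    # Keeping the inputs recoverable means copying them forward alongside
    # the result: 3 bits of state instead of 1, i.e. the extra physical
    # resources (additional gates or storage) described above.
    return (x, y, x & y)
```

The saved-input version is invertible, but only because the "remembering" has been paid for in state width; the plain AND is not, since `irreversible_and(0, 1)` and `irreversible_and(1, 0)` produce the same output.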

But it's even worse than that, because the system must expend even *more* physical resources in order to create a pointer between the inputs and the outputs in order to maintain any relevance between these discrete and independent states. The noiseless environment that is the genius of Turing equivalent computation comes at an entropic cost, and that cost is amplified when trying to preserve, and associate, past states with present states.

We forget this today because memory has become so relatively cheap. But in the good ol' days software engineering really was engineering because the ability to execute some code was very constrained by available time and memory.
