# The Fundamental Physical Limits of Computation

What constraints govern the physical process of computing? Is a minimum amount of energy required, for example, per logic step? There seems to be no minimum, but some other questions are open

In the laboratory the speed with which RNA polymerase acts can be varied by adjusting the concentrations of the reactants (as Judith Levin and Michael J. Chamberlin of the University of California at Berkeley have shown). As the concentrations are brought closer to equilibrium the enzyme works more slowly and dissipates less energy to copy a given section of DNA, because the ratio of forward to backward steps is smaller.

Although RNA polymerase merely copies information without processing it, it is relatively easy to imagine how a hypothetical chemical Turing machine might work. The tape is a single long backbone molecule to which two types of base, representing the binary 0 and 1, attach at periodic sites. A small additional molecule is attached to the 0 or 1 group at one site along the chain. The position of this additional molecule represents the position of the Turing machine's head. There are several different types of "head molecule," each type representing a different machine state.

The machine's transition rules are represented by enzymes. Each enzyme is capable of catalyzing one particular reaction. The way these enzymes work is best demonstrated by an example.

Suppose the head molecule is type A (indicating that the machine is in state A) and is attached to a 0 base. Also suppose the following transition rule applies: "When the head is in state A and reads a 0, change the 0 to a 1, change state to B and move to the right." A molecule of the enzyme representing this rule has a site that fits a type-A head molecule attached to a 0 base. It also has one site that fits a 1 base and one site that fits a B head (see illustration on opposite page).

To achieve the transition, the enzyme molecule first approaches the tape molecule at a location just to the right of the base on which the A head resides. Then it detaches from the tape both the head molecule and the 0 base to which the head was attached, putting in their place a 1 base. Next it attaches a B head to the base that is to the right of the 1 base it has just added to the tape. At this point the transition is complete. The head's original site is changed from a 0 to a 1, the head molecule is now a type B, and it is attached to the base that is one notch to the right of the previous head position.
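The transition just described can be modeled abstractly. Here is a minimal sketch (mine, not from the article) of the same rule expressed as a single Turing-machine step in Python; the function and variable names are my own illustration:

```python
# Model one transition of the chemical Turing machine described above.
def step(tape, head_pos, state, rules):
    """Apply one transition: rules maps (state, symbol) to
    (new_symbol, new_state, move), where move is +1 (right) or -1 (left)."""
    symbol = tape[head_pos]
    new_symbol, new_state, move = rules[(state, symbol)]
    tape[head_pos] = new_symbol          # the enzyme swaps the old base for the new one
    return head_pos + move, new_state    # ...and attaches the new head one notch over

# The rule from the text: in state A reading a 0, write a 1, enter state B, move right.
rules = {("A", 0): (1, "B", +1)}

tape = [0, 0, 1]
pos, state = step(tape, 0, "A", rules)
print(tape, pos, state)  # [1, 0, 1] 1 B
```

Note that, like the enzyme, the step overwrites the old symbol in place: nothing in the rule table records what the cell used to hold.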

During the operation of a Brownian Turing machine the tape would have to be immersed in a solution containing many enzyme molecules, as well as extra 0's, 1's, A's and B's. To drive the reaction forward there would have to be some other reaction that cleaned the enzyme molecules of detached heads and bases. The concentrations of the reactants that clean the enzyme molecules represent the force that drives the Turing machine forward. Again we can expend as little energy as we wish simply by driving the machine forward very slowly.

The enzymatic Turing machine would not be error-free. Occasionally a reaction that is not catalyzed by any enzyme might occur; for example, a 0 base could spontaneously detach itself from the backbone molecule and a 1 base could be attached in its place. Similar errors do indeed occur during RNA synthesis.

In principle it would be possible to eliminate errors by building a Brownian Turing machine out of rigid, frictionless clockwork. The clockwork Turing machine involves less idealization than the billiard-ball computer but more than the enzymatic Turing machine. On the one hand, its parts need not be manufactured to perfect tolerances, as the billiard balls would have to be; the parts fit loosely together, and the machine can operate even in the presence of a large amount of thermal noise. Still, its parts must be perfectly rigid and free of static friction, properties not found in any macroscopic body.

Because the machine's parts fit together loosely, they are held in place not by friction but by grooves or notches in neighboring parts. Although each part of the machine is free to jiggle a little, like the pieces of a well-worn wood puzzle, the machine as a whole can only follow one "computational path." That is, the machine's parts interlock in such a way that at any time the machine can make only two kinds of large-scale motion: the motion corresponding to a forward computational step and that corresponding to a backward step.

1. jtdwyer 04:44 PM 6/1/11

In fact, a bit of data is not information, not even for logic gates. Data being operated on by processors are merely transient copies of data representing information - no functional information system destroys any required data, no matter how many times it's 'ANDed'.

Moreover, in equating information to bits and energy, physicists apparently aren't aware that data stored in most electronic memories ('chips') require continual energy refreshment or their 'information' will simply dissipate!

Physical memory of all types, from processor registers, caches, memory chips, and device buffers to magnetic and even optical media, is continuously being released and reallocated, requiring that data be moved into newly assigned physical memory, overlaying any previously stored data. All this is accomplished without any unintentional loss of information, which is managed and interpreted by software, not logic circuits!

Did you ever try to load a current spreadsheet file into an old copy of the program? If so, you should understand the difference between data and information.

When the article states:
"Are irreversible logic gates and erasures essential to computation? If they are, any computation we perform has to dissipate some minimum amount of energy."
…it ignores the physical realities of actual information processes. Just for example, even if some method of instantaneous computation were devised, a computer system based on that method could only operate as fast as data could be delivered for computation and retrieved from it. That and all data would be useless without properly interpretive software to finally represent it as information. Information management typically requires far more processing than computation of data and the enormous migration of data required for processing requires far more time than computations do.

IMO, the quest for instantaneous computing is much more dependent on the energy requirements and performance of information storage, retrieval and routing than logic gates.

2. Some guy 12:03 AM 6/2/11

The AND operation, like any other irreversible operation, does destroy information, which costs energy. I think the point made in the article was that if you could make a computer that doesn't lose information, then you can compute without that expense. Logic gates are the basis of computing. Software is merely a convenience.
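This point about AND being logically irreversible can be made concrete. The following small illustration (my own, not from the thread) enumerates all four input pairs and groups them by output, showing that three distinct inputs collapse to the same result:

```python
# Why AND is logically irreversible: group every input pair by its output.
from itertools import product

outputs = {}
for a, b in product([0, 1], repeat=2):
    outputs.setdefault(a & b, []).append((a, b))

print(outputs[0])  # [(0, 0), (0, 1), (1, 0)] -- one output, three possible inputs
print(outputs[1])  # [(1, 1)] -- only this case lets you recover the inputs
```

Given only the output 0, there is no way to tell which of the three input pairs produced it; that many-to-one collapse is what "destroying information" means here.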

3. jtdwyer in reply to Some guy 02:55 AM 6/2/11

The AND operation has two parameters: a source data field and a destination field. The source field is not altered by the operation. If the destination field is a copy of an original data field, not only is its data not destroyed, but the AND operation produces additional data.

By the author's definition of irreversible operations or erasures, only repeated operations on a single set of data could ever be performed, since no new data could ever be brought into storage and no results could be extracted - that would require erasure of previous data.

Logic gates do absolutely nothing unless instructed by software to operate on data provided by software - very convenient indeed!

4. deisner 06:08 PM 6/3/11

1. Every page of the original article contains detailed figures that add much to the authors' arguments. If you're going to publish the article (which is appreciated), please include all of it.

2. Would today's Scientific American publish an article of this technical sophistication and length (nine pages)?

3. Check out what SciAm covers looked like back in the 1980's when this was published (including the July 1985 cover):

http://www.coverbrowser.com/covers/scientific-american/42

The covers are beautiful, informative, and depict real subjects of the science journalism contained in that issue.

Compare that to the current cover:

http://www.scientificamerican.com/media/cover/cover_2011-06.jpg

Or these recent gems:

http://www.scientificamerican.com/media/cover/cover_2008-02.jpg

http://www.scientificamerican.com/media/cover/cover_2008-03.jpg

http://www.scientificamerican.com/media/cover/cover_2010-03.jpg

Why do the covers have to be so freaking goofy? This is embarrassing. What happened to this once-great magazine?

5. jtdwyer in reply to deisner 08:51 PM 6/3/11

Really excellent comment! I dug back into my 'library' and found the original print issue.

I think a more direct comparison for the cover illustrations of current issues would be OMNI magazine of the 1970s. Unfortunately, much of the content of recent issues is also comparable to OMNI...

I hadn't even realized that the original figures had been omitted - they really do enhance the reader's ability to understand the article.

However, I still object to many of the premises of the article, and of information theory, at least as adopted and incorporated into physics.

The article states:
"Here is another example of information destruction: the expression 2 + 2 contains more information than the expression = 4. If all we know is that we have added two numbers to yield 4, then we do not know whether we have added 1 + 3, 2 + 2, 0 + 4 or some other pair of numbers. Since the output is implicit in the input, no computation ever generates information."

Again, a programmer is not at the mercy of logic gates and can maintain the source variables for any 'destructive' operation. In that case the last statement is doubly incorrect: _all_ computations generate additional information and, if source data is maintained no information is destroyed!
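The commenter's claim that a programmer can sidestep information loss by maintaining source variables can be sketched in code. The example below (mine, with hypothetical function names, not from the comment) contrasts addition that discards its inputs with a version that keeps one input alongside the sum, making the operation invertible at the cost of storing an extra number:

```python
# Addition that discards its inputs is many-to-one; keeping one input
# alongside the sum makes the map one-to-one (invertible).
def add_irreversible(a, b):
    return a + b                 # 1+3, 2+2, 0+4 all yield 4

def add_reversible(a, b):
    return (a, a + b)            # the pair (a, a+b) uniquely determines (a, b)

def undo(a, s):
    return (a, s - a)            # recover the original inputs from the pair

print(add_irreversible(2, 2))       # 4 -- the addends are lost
print(undo(*add_reversible(2, 2)))  # (2, 2) -- fully recoverable
```

Whether this refutes the article is another matter: the reversible version works precisely because it retains extra bits, which is the bookkeeping cost the article's argument is about.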

Lastly, the description of first illustration on page 49 in the original issue states:
"The "logic gates" central to the design of a chip expend energy because they discard information."

Does it require more energy for a person or a chip to calculate that 2 + 2 = 4 than it does to calculate that 1 + 1 = 2? In the second case the original values can be derived, so, according to the article, no information is lost. I guess it's presumed that a person or a chip cannot remember the numbers that produced the sum of 4...

I don't see that any relationship has been established between data content or information and any energy requirements, except as some obtuse abstract misconception.

6. bernsten69 08:51 AM 6/7/11

I see the problem as thus:

Let me go with the ball-system you describe. In order to perform a computation, some minimum amount of energy must be put into the system to move the balls forward. After the computation, in order to USE that information (i.e., the balls go into another logic gate, etc.), some amount of that initial energy must be transferred forward into whatever apparatus uses the result of the computation. The measurement of the result creates some minimum amount of entropy that makes it impossible for the system to completely reverse itself. A simple example would be some sort of light sensor at the end of your computation system that detects the presence of the ball. When the photons bounce off the ball, they impart a certain minimum amount of energy to it (possibly negative if it pushes against its direction of motion). The state of the ball therefore becomes, to some extent, uncertain and more entropic. After the ball bounces against some immovable object that reflects 100% of its energy back toward the beginning part of the machine, the ball(s) will arrive back at the starting location. However, the energy that they impart to the 'first mover' that caused the balls' initial motion is now uncertain (by whatever uncertainty the measurement created). Further, there is a non-zero probability that, due to the measurement interaction, the balls do not return to their original states.

Although, in theory, a self-contained, reversible computation machine is possible, in the physical universe, to actually observe (and use) the result of the computation, a certain amount of entropy is introduced into the system which makes it impossible for the system to return to its original state. The minimum amount of energy necessary to perform a computation relates to the entropy imparted by the observation of the computation - A strange but thermodynamically sound conclusion.

Adding in a bit of Quantum Mechanics: until we observe the output of the computation, the computation machine state exists in a superposition of all possible states. In other words, no computation (in the classical sense) is possible without an observation. Since observation imparts a certain non-zero entropy into the computation machine state, no computation is possible without the expenditure of a certain minimum amount of energy, and no 'computation machine' is fully reversible in the physical universe.

7. rodovre 12:18 PM 6/13/11

Errata: Page 4, 4th Para. has been truncated.

8. wolfkiss in reply to jtdwyer 03:24 PM 6/13/11

Yes, you can save any data within any argument that is passed to some function. However, this "saving" of the inputs requires extra resources in the form of an array or data structure *before* those inputs are entered into the function. These extra resources are not abstract, because the execution of the "saving" and the function require the use of entirely separate groups of logic gates (or the same gates at different times). The former group duplicates inputs while the latter computes the programmed function. The function execution does in fact destroy the existence of its copy of the inputs. This is just a fact of Turing-equivalent discrete state computation at the physical level. Logic gates don't "remember" unless additional logic gates are employed.
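The extra-resources point has a standard illustration in reversible-computing theory: the Toffoli (controlled-controlled-NOT) gate computes AND reversibly, but only by carrying all three wires forward instead of a single output bit. The sketch below is my own addition, not part of the comment:

```python
# A Toffoli gate computes AND reversibly -- at the cost of extra wires.
def and_gate(a, b):
    return a & b                       # 2 bits in, 1 bit out: irreversible

def toffoli(a, b, c):
    return (a, b, c ^ (a & b))         # 3 bits in, 3 bits out: a bijection

# With the third wire initialized to 0, it ends up holding AND(a, b),
# and the inputs survive alongside the result:
print(toffoli(1, 1, 0))  # (1, 1, 1)
print(toffoli(1, 0, 0))  # (1, 0, 0)
# The gate is its own inverse, so applying it twice restores the inputs:
print(toffoli(*toffoli(1, 1, 0)))  # (1, 1, 0)
```

The reversibility is bought with bookkeeping: every AND consumes a fresh zeroed wire, which is exactly the kind of physical resource the comment describes.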

But it's even worse than that, because the system must expend even *more* physical resources in order to create a pointer between the inputs and the outputs in order to maintain any relevance between these discrete and independent states. The noiseless environment that is the genius of Turing equivalent computation comes at an entropic cost, and that cost is amplified when trying to preserve, and associate, past states with present states.

We forget this today because memory has become so relatively cheap. But in the good ol' days software engineering really was engineering because the ability to execute some code was very constrained by available time and memory.
