Thanksgiving

This year we give thanks for an idea that establishes a direct connection between the concepts of “energy” and “information”: Landauer’s Principle. (We’ve previously given thanks for the Standard Model Lagrangian, Hubble’s Law, the Spin-Statistics Theorem, conservation of momentum, effective field theory, the error bar, and gauge symmetry.)

Landauer’s Principle states that irreversible loss of information — whether it’s erasing a notebook or swiping a computer disk — is necessarily accompanied by an increase in entropy. Charles Bennett puts it in relatively precise terms:

Any logically irreversible manipulation of information, such as the erasure of a bit or the merging of two computation paths, must be accompanied by a corresponding entropy increase in non-information bearing degrees of freedom of the information processing apparatus or its environment.

The principle captures the broad idea that “information is physical.” More specifically, it establishes a relationship between logically irreversible processes and the generation of heat. If you want to erase a single bit of information in a system at temperature T, says Landauer, you will generate an amount of heat equal to at least

$(\ln 2)k T,$

where k is Boltzmann’s constant.
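To get a feel for the scale, here is a quick back-of-the-envelope calculation (a sketch in Python; the function name is mine) of the minimum heat generated by erasing one bit at room temperature:

```python
import math

# Boltzmann's constant (CODATA value), in joules per kelvin
k = 1.380649e-23

def landauer_limit(T):
    """Minimum heat (in joules) generated by erasing one bit
    at absolute temperature T, per Landauer's Principle."""
    return math.log(2) * k * T

# At room temperature (about 300 K) the bound is tiny:
# roughly 2.9e-21 joules per erased bit.
print(landauer_limit(300.0))
```

Tiny as that number is, it scales linearly with temperature, which is why proposals for ultra-low-power reversible computing tend to involve cold hardware.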

This all might come across as a blur of buzzwords, so take a moment to appreciate what is going on. “Information” seems like a fairly abstract concept, even in a field like physics where you can’t swing a cat without hitting an abstract concept or two. We record data, take pictures, write things down, all the time — and we forget, or erase, or lose our notebooks all the time, too. Landauer’s Principle says there is a direct connection between these processes and the thermodynamic arrow of time, the increase in entropy throughout the universe. The information we possess is a precious, physical thing, and we are gradually losing it to the heat death of the cosmos under the irresistible pull of the Second Law.

The principle originated in attempts to understand Maxwell’s Demon. You’ll remember the plucky sprite who decreases the entropy of gas in a box by letting all the high-velocity molecules accumulate on one side and all the low-velocity ones on the other. Since Maxwell proposed the Demon, all right-thinking folks agreed that the entropy of the whole universe must somehow be increasing along the way, but it turned out to be really hard to pinpoint just where it was happening.

The answer is not, as many people supposed, in the act of the Demon observing the motion of the molecules; it’s possible to make such observations in a perfectly reversible (entropy-neutral) fashion. But the Demon has to somehow keep track of what its measurements have revealed. And unless it has an infinitely big notebook, it’s going to eventually have to erase some of its records about the outcomes of those measurements — and that’s the truly irreversible process. This was the insight of Rolf Landauer in the 1960’s, which led to his principle.

A 1982 paper by Bennett provides a nice illustration of the principle in action, based on Szilard’s Engine. Short version of the argument: imagine you have a cylinder with a single molecule in it, rattling back and forth. If you don’t know where it is, you can’t extract any energy from it. But if you measure the position of the molecule, you could quickly stick in a piston on the side where the molecule is not, then let the molecule bump into your piston and extract energy. The amount you get out is (ln 2)kT. You have “extracted work” from a system that was supposed to be at maximum entropy, in apparent violation of the Second Law. But it was important that you started in a “ready state,” not knowing where the molecule was — in a world governed by reversible laws, that’s a crucial step if you want your measurement to correspond reliably to the correct result. So to do this kind of thing repeatedly, you will have to return to that ready state — which means erasing information. That decreases your phase space, and therefore increases entropy, and generates heat. At the end of the day, that information erasure generates just as much entropy as was removed when you extracted work; the Second Law is perfectly safe.
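Bennett’s bookkeeping can be checked in one line: the work extracted by letting the one-molecule gas isothermally expand to double its volume, kT ln(V₂/V₁), exactly matches the Landauer cost of erasing the one recorded bit afterward. A minimal sketch (the function names and the 300 K bath are my choices, not Bennett’s):

```python
import math

k = 1.380649e-23  # Boltzmann's constant, J/K

def szilard_work(T, expansion_ratio=2.0):
    """Work extracted by isothermal expansion of a one-molecule gas:
    W = kT ln(V_final / V_initial). Doubling the volume gives kT ln 2."""
    return k * T * math.log(expansion_ratio)

def erasure_cost(T):
    """Minimum heat generated by resetting the one-bit memory
    to its ready state (Landauer's bound)."""
    return k * T * math.log(2)

T = 300.0  # heat-bath temperature in kelvin
# The work gained equals the cost of erasing the measurement record,
# so there is no net violation of the Second Law.
print(szilard_work(T), erasure_cost(T))
```

The balance is not a numerical coincidence: both terms are kT times the log of the same factor of two, one counted in volume, the other in memory states.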

The status of Landauer’s Principle is still a bit controversial in some circles — here’s a paper by John Norton setting out the skeptical case. But modern computers are running up against the physical limits on irreversible computation established by the Principle, and experiments seem to be verifying it. Even something as abstract as “information” is ultimately part of the world of physics.

This entry was posted in Science.

36 Responses to Thanksgiving

1. Gizelle Janine says:

Great post, Sean. Happy Thanksgiving!

2. Matt Lewis says:

Great post! I think there’s an error in the last sentence of paragraph 6? It should say that “…all right thinking folks agreed that the _entropy_ of the whole universe…”, right?

Happy thanksgiving!

3. Oscar says:

Nice Post.

Regarding the controversial status there is a nice recent paper (http://arxiv.org/abs/1306.4352) formulating Landauer’s principle in a minimal setting and providing a rigorous proof.

Cheers
Oscar

4. Daniel Arovas says:

I’m thankful for this blog.

5. Sean Carroll says:

Matt Lewis — totally correct, I’ll fix it.

6. Chris says:

I was focused so much on ISON this year I forgot about your yearly tradition. Something to cheer me up today.

7. T.E. Oakley says:

Dr. Carroll,
I have two questions in reference to your post on Landauer’s Principle:
(A). 1. If all information is PHYSICAL, and
2. The laws of physics are INFORMATION, then
3. The laws of physics are PHYSICAL.
Therefore, how can you get a PHYSICAL universe from “absolute nothing,” which by definition is META or NON-PHYSICAL? re: Dr. Lawrence Krauss, and Dr. Alexander Vilenkin?

(B). How can information be irreversibly LOGICALLY LOST?; does this not contradict the CONSERVATION LAW OF INFORMATION? Re: Black Hole information loss controversy, Dr. Stephen Hawking vs. Dr. Leonard Susskind.

8. If “information” is ultimately part of the world of physics and all things physical are information-theoretic in origin (John Wheeler), then it seems the physical world and the informational world are one and the same.

9. Brett says:

T.E. Oakley, I was about to say the same thing I always say when people ask your first question. Then I realized that 1.) nobody wants to hear it again, and 2.) I should just write a book about it after I graduate so that I can stop living on discarded pizza crusts.

10. John Duffield says:

I’m an IT guy with an interest in fundamental physics, and whilst I have no issue with energy being a physical thing, I just don’t feel the same way about information. I could use coins as binary bits, wherein heads=0 and tails=1. I could cover my desk with coins and arrange them into groups to emulate ascii. I could spell out some message, or record some information. But the pennies are physical things, IMHO the information is just a “pattern” of some aspect of those physical things.

11. I have a problem: I am afraid that erasing or swiping is false play as it requires the use /input of energy that is not generated endogenously.

12. stevenjohnson says:

The Wikipedia link says that Szilard argued that the physical process by which Maxwell’s demon acts itself creates entropy. The entropy of the box alone decreases but the total entropy of the Demon/box system does not. The Landauer principle arose because there are reversible, that is entropy-conserving, ways to separate the particles, whatever they may be. But even these processes require information destruction that increases entropy. The objection is that this is a consequence of the second law of thermodynamics, not an independent derivation. (I see no problem there, but I’m a materialist who thinks science corrigibly describes reality.)

So far, so good I hope. The first thing is, I don’t understand what information would need to be recorded, other than the average KE. I’ve got a guess, but more below. The second thing is, whatever continual stream of data must be recorded, I don’t understand why this isn’t deemed a physical process that is an intrinsic part of the demon/box system.

And my obtuseness extends even further. I imagine the box to be small as a bacterium. The same processes that cause Brownian motion by unequal impacts from particles will take place. As near as I can deduce, this will happen both within and without the box. I think the impacts in the high KE part of the box will cause an irregular variation in rotation and add to the Brownian vibration. If this is correct, is this an increase in entropy?

I’m sure the box is posited to be constructed of a material that will not deform under the different pressures. But the barrier must eventually be in thermal equilibrium with the two different compartments of the box, as a frying pan handle must have two different temperatures, one at the pan and one by your hand. I think the handle is not always in equilibrium and when it is, loss of heat to the air etc. plays a role. But that can’t be a part of the Maxwell’s demon setup. If we think of the different KEs and the positions of the particles in the barrier as information, do we think of the gradient from the cold side to the hot side as a loss of information due to increased uncertainty? Obviously this is my guess as to what Landauer could have meant.

13. Andy Johnson says:

Yes, it seems information is in a sense in the eye of the beholder. For something to be regarded as “information” it must be interpreted as such. Can any “information” be understood as simply objectively there? If not, there is an inherent subjectivity in regarding certain patterns as “information” whereas the world described by physical law is manifestly objective. In this light Landauer’s principle may provide another intriguing glimpse into the objective/subjective divide that we seem to come face to face with when thinking about what quantum mechanics means.

14. Neil says:

How does Landauer’s principle relate to the black hole information paradox, if at all? If matter falls into a black hole and information is lost to the rest of the universe, should there not be a compensating rise in heat (entropy)? Hawking radiation?

15. John Gregg says:

Barring quantum weirdness, there is no irreversible loss of information, is there? Can’t we always run the film backwards? It might not be convenient to do so (I’d rather not follow all those molecular collisions backwards to reconstruct the events of that morning fifty years ago and find out who really killed JFK) but it is always possible in principle.

I’ve also never really gotten my head around Maxwell’s demon. Even if erasing the information is such a big deal, can’t we stipulate that the demon’s information processing hardware is arbitrarily fine, dwarfed by the relative boulders that are the molecules in the chambers? Then the demon has decreased entropy massively in the form of the segregated molecules, compared to the tiny extent to which he has increased it by erasing or not erasing information in his own information processing machinery, and the thought experiment still goes through.

Finally, the single molecule in the piston – isn’t the second law a law of statistics, of aggregates? Haven’t you kind of exceeded the second law’s domain of applicability as soon as you restrict yourself to a single molecule? With only one molecule, can you even say that the system is in a state of maximum entropy?

I’ve always been suspicious about the ontological status of information. I think that sometimes people go too far with the highly suggestive correspondences between information and thermodynamics. Information, ones and zeros, are Platonic abstractions, like Euclidean points and planes.

The above notwithstanding, it’s not false modesty when I say that I don’t completely understand all the issues and arguments here.

-John Gregg

16. TMS says:

How does “loss” ( or “erasure”) of information in the context of Maxwell’s demon tie in with the claim by certain black hole warriors (Susskind, etc.) that information is never lost?

17. James Cross says:

In regard to the Toyobe experiment:

“Physicists in Japan have shown experimentally that a particle can be made to do work simply by receiving information, rather than energy.”

It seems to be something like a demonstration of Maxwell’s Demon.

http://physicsworld.com/cws/article/…rted-to-energy

My question is:

Can the conversion of information into energy work in reverse? In other words, can energy be converted to information? How would that work? Does it actually happen?

18. Ed says:

“But the Demon has to somehow keep track of what its measurements have revealed. And unless it has an infinitely big notebook, it’s going to eventually have to erase some of its records about the outcomes of those measurements — and that’s the truly irreversible process. ” – not sure that I follow this, must have missed something. The Demon only has to keep track of the average of the speed of the particles he sees and let the faster ones through and the slower ones are sent back. He doesn’t need an infinitely big notebook for that?

19. Latverian Diplomat says:

I was a little surprised not to see more of a link to quantum computing.

As I understand it, the only irreversible process required for a computation is storing the result. In particular, data can be copied reversibly as long as it is copied to a cell with a fixed initial state. Thus, to save the result one must clear a memory cell (from a random initial state to a known state such as 0), and this is the only process in the computation that must increase entropy. In theory, anyway; in practice any device of human manufacture, even a quantum computer, is going to increase entropy all over the place.

This brings up an interesting connection between memory and entropy increase that has implications for the arrow of time. We remember the past and not the future because the very act of memory formation increases entropy. A system that remembers an event added entropy to the universe to create that memory.

Some of this is discussed in an engaging and laymen-friendly way in this book:

Feynman Lectures on Computation

20. Hal Swyers says:

Just an analysis that reveals the tautological nature of Landauer’s principle.

21. BobC says:

Sean,

Thanks for your and Jennifer’s efforts to expand knowledge and awareness, to reduce ignorance and nonsense, by sharing the wonders of our natural world. And thanks for doing so with such great clarity, expressiveness, and care.

As Charles McCabe said, “Any clod can have the facts, but having opinions is an art.” Powerful expression is the key: Anyone can be factually correct but still lose an argument, or fail to be persuasive, by being unable to express themselves or by failing to understand their audience or the sources of differences of opinion or belief.

The most important first step is to start from a place of shared understanding or perspective or experience. And that, of all your and Jennifer’s combined talents, is the one I appreciate most. Be it anti-de Sitter space or atheism, you always start by first understanding your audience and engaging with them.

Engaging with passion, a passion that typically evokes shared curiosity. Be it passion for science, philosophy, people, places, or just life and living. (Speaking of life and living, I especially enjoyed Jennifer’s story describing the path leading to this image.)

I am very thankful for all that one of science’s true “power couples” has shared, and eagerly look forward to what the future brings. Jennifer’s blog led me to yours: Both are valuable gifts that keep on giving.

Best Wishes for the Holidays and the New Year,

– Bob Cunningham

22. John Duffield says:

Hal: that was an interesting read. I’m reminded of the -13.6eV hydrogen ground state binding energy, which has been likened to “a bigger box” for the electron wavefunction. Follow the link to atomic orbitals and note that electrons “exist as standing waves”.

23. Alex Pavellas says:

John Duffield –
I think your example with the arrangement of coins on a desk can be used as a good illustration of this principle.
Consider: Suppose you arrange the coins in such a way to spell out a word in ASCII or perform some binary computation or emulate the human genome. By my reckoning, you are right in saying that the information content has to do with the pattern, and depending on what patterns you are looking for, you might get different measures of information. But say you pick a particular measure for the information and stick with it (iirc this is referred to by physicists as “coarse graining.”)

So now some toddler comes over and knocks the thing over, simultaneously destroying the information and increasing the entropy.

24. John Duffield says:

Fair enough Alex, but I take my cue from relativity, wherein matter is made of energy and field energy is a state of space. Energy is fundamental, and at the fundamental level you cannot distinguish it from space. Information is the “pattern”, entropy is the “sameness” relating to available energy, no problem with that. But conservation of energy applies; it’s the one thing you can neither create nor destroy. And it’s energy that’s physical, as is matter, as are the coins. However the pattern of the coins isn’t something physical in itself, just as colour isn’t physical because it’s a quale. So the idea that information is in itself physical or fundamental leaves me cold I’m afraid.

25. John says:

I had read Brian Greene’s latest book that touched a little upon this topic. But it mentioned that a particle reflected in a box would contain some unknown amount of energy, and in their attempt to solve an equation for the total energy in the box, they only got infinite answers and threw out the theory. It mentioned something about this being due to “quantum jitters”. Then it came up as part of a problem in trying to solve for dark energy, and it was part of a problem from talks with people involving the findings for the cosmological constant.

I would be interested in your thoughts on this, and how you would calculate a particle being reflected in a box over time.

26. John Duffield says:

Error, post deleted

27. Entropy as I see it is an endogenous increase of either mass or energy.

28. LizR says:

According to quantum mechanics, information can’t be destroyed (General Relativity says it can, but I think most physicists would say that QM “trumps” GR on this – Hawking recently gave up on one of his famous bets on “the black hole information paradox” for example).

Hence according to QM, the concept of erasing information isn’t physically correct.

I might also mention that entropy is generally considered an emergent concept, rather than anything fundamental to the operation of the universe. It occurs because of the expansion of the universe, the way the components of the universe are arranged, and so on. The laws of physics are time-symmetric with only one known violation, which is generally thought to be unimportant in most physical processes, especially ones which are involved in the entropy gradient (although of course this may turn out to be wrong – perhaps neutral kaon decay had a significant impact in the first few seconds after the Big Bang).

29. Andy Johnson says:

@LizR, this is my sense of it too, but this deepens the puzzle as far as the ontological status of information goes. What you’re saying is that information as a fundamental quantity cannot be lost. Landauer’s principle relates information loss to entropy increase, a relationship between something fundamental and something emergent. The way we get an arrow of time (to “emerge”) from time-symmetric fundamental laws is by initial conditions, a past that is in a lower entropy state. How do we get heat generation from “fundamental” information loss? Or is information loss always only “emergent” in some sense?

30. Jesse Mazer says:

Does Landauer’s analysis allow for temporary reversals of entropy in isolated systems, in between periodic erasures of the demon’s memory? Of course in statistical mechanics you can always have random fluctuations that lower the entropy in an isolated system, but I’m talking about a scenario where the odds are strongly in favor of an entropy decrease over a given span of time, not just a statistical fluke. This would seem to violate the fluctuation theorem but perhaps there are restrictions on the types of systems that theorem is meant to apply to.

31. Count Iblis says:

The issue with “conservation of information” in QM is resolved as follows. To define a thermodynamic system rigorously, you need to do a coarse graining over microscopic degrees of freedom, leaving you with the parameters that you use to describe the system. Entropy is (expressed in the right units) precisely the amount of information that you lose in this coarse graining process.

Then given a thermodynamic state, the entropy gives you the logarithm of the number of microstates that will be mapped to the same macrostate. However, while all these microstates have the same macroscopic thermodynamic properties, that does not mean that the system can really be in any one of these states. E.g. consider a gas undergoing free expansion in a perfectly isolated box that will even prevent decoherence so that it evolves in a unitary way. Clearly, while the entropy increases in the usual way, the number of states the system really can be in must be identical to what it was before the gas expanded. These states will under time reversal evolve back to the unexpanded state. But macroscopically, these states are indistinguishable from “fictitious” states that the gas certainly cannot really be in; the vast majority of these states don’t evolve back under time reversal.

32. veeramohan says:

/We record data, take pictures, write things down, all the time — and we forget, or erase, or lose our notebooks all the time, too. Landauer’s Principle says there is a direct connection between these processes and the thermodynamic arrow of time, the increase in entropy throughout the universe./
“Time dilation” is a physical reality because, of the speed at which our body ages and the speed of light is constant – so “rest mass” is a relative order.

Time is merely a mathematical quantity (numerical order) of material change. In physical world, time is exclusively a mathematical quantity.

In 1926 Max Planck wrote an important paper on the basics of thermodynamics. He indicated the principle…
“The internal energy of a closed system is increased by an isochoric adiabatic process.”

This proves the expansion of space? If mass = energy (radioactive), if Speed of light is constant and the universe is expanding, so what is happening to time?

33. DEL says:

My first encounter with Sean’s blog took place a few years ago, with a post called “The Arrow of Time in Scientific American,” when the blog was still in “Cosmic Variance.” I got there by googling Sean’s name after the said SciAm article got me intrigued. And the reason for my special interest was IMAP (=Information & Meaning Are Physics.)

IMAP is a private, independent, research project of mine, in which I have engaged myself in recent years, unfortunately on and off. It is an interdisciplinary project, lying at the interface of physics, information theory, psychology and philosophy. IMAP aims high: it’s about generalizing the concept of information and, in particular, of the MEANING carried by it, across all of reality—from the dumbest piece of rock to man-made IT devices to the smartest human brain.

In IMAP, the term “information” (not the quantity of it) is not restricted to signals meaningful to minds, and information meaningful to a mind is not in a separate category from random signals transmitted over a communication channel or from microstates of a volume of gas. All signals may be attributed with meanings pertinent to the systems they interact with, and no system is considered transcendental on grounds of its presumed ability to “understand” the meaning of a signal (in contrast to just acting on it.)

My work on IMAP involved intensive reading and thinking about entropy—Clausius’, Boltzmann’s and Shannon’s—and so I found myself the first time in Sean’s blog (in which I deposited a comment concerning the unhelpfulness of characterizing entropy as a measure of disorder.) Landauer’s Principle and Bennett’s and Zurek’s papers were naturally part of that study, as they are instrumental for the unification of the above three entropy concepts.

I idly returned to visit Sean’s blog, in its present home, only recently, while organizing my Favorites folder. And I stayed on. What a coincidence: I didn’t have to wait long before this thanksgiving post appeared, to which subject I am particularly connected. It surely seems a blog for me, and I must have been lucky chancing upon it. So, here’s some thanksgiving to you, Sean.

34. Jim Williams says:

“Short version of the argument: imagine you have a “piston” with a single molecule in it, rattling back and forth.”
Shouldn’t that be “cylinder”? Pistons operate inside cylinders.
