The Biggest Ideas in the Universe | 20. Entropy and Information

You knew this one was coming, right? Why the past is different from the future, and why we seem to flow through time. Also a bit about how different groups of scientists use the idea of “information” in very different ways.

The Biggest Ideas in the Universe | 20. Entropy and Information

And here is the associated Q&A video:

The Biggest Ideas in the Universe | Q&A 20 - Entropy and Information

32 thoughts on “The Biggest Ideas in the Universe | 20. Entropy and Information”

  1. I’ll stick my neck out here. I am not a physicist; I’ll leave that to you. But entropy falls under the laws of thermodynamics, speaking of physical things and how they depend upon one another: gravity, relativity, or anything else we might attribute to natural law(s). Information, near as I can tell, is man-made; nothing natural about it. So information is, I think, cultural, customary, traditional, and so on. So maybe you were only keeping to your ‘preposterous universe’ theme in posting this notion. Or are you just having fun? The universe is preposterous. That is what makes it so much fun! A request: it would be helpful if this screen were in something larger than three-point type.

  2. John Alan Wiltshire

    Very good… but…

    Not a single mention of “disorder”, or of that long-standing example, often quoted in learned textbooks, about why a teenager’s bedroom becomes untidy. Have we finally dispensed with those notions? Also:

    Four definitions of entropy (including a clear affirmation that Shannon entropy is not the same thing as thermodynamic entropy), but no definition of “information”.

    What is wrong with this definition of information, which has nothing to do with “knowing” by a conscious entity:

    INFORMATION: If the arrangement of physical assembly A is, in some way, correlated with the arrangement of physical assembly B, then the fact that, in principle, something about the arrangement of A can be deduced by examination of B is more succinctly stated as “B contains information about A.”

  3. Joao Victor Sant Anna Silva

    Hi Dr. Sean! Thanks again for the video! I’ll try to stick to the theme of the video this time, hehehehehe! So, here we go:
    1. Please imagine a set of 10 billiard balls, numbered 1-10, of which 5 are blue and 5 are green.
    We start with all the balls lined up, ordered by their number.
    To a colorblind person, it’s a low-entropy state. To a dyslexic person, it’s a high-entropy state.
    If the dyslexic person starts ordering them by colour, to him the entropy would go down, but to the colorblind person it would go up.
    So, is entropy subjective? Could there, in principle, be some unknown property of the world according to which entropy always goes down?
    2. That part about matrices and quantum mechanics… I didn’t understand much, hehehehehe! I can’t even formulate a question, hahahahaha!
    Thanks!
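
    A minimal sketch, in Python, of the coarse-graining point raised in question 1 above (an illustrative toy, not anything from the video): the same row of balls is assigned different Boltzmann entropies depending on which features an observer can resolve.

        # Toy model: 8 numbered balls in a row; balls 1-4 are "blue", 5-8 are "green".
        # A microstate is the full ordering; a macrostate is whatever a given observer can distinguish.
        from itertools import permutations
        from math import log

        BLUE = set(range(1, 5))

        def macro_number(state):
            # An observer who reads numbers but not colours: how "number-sorted" the row looks.
            return sum(a < b for a, b in zip(state, state[1:]))

        def macro_colour(state):
            # An observer who sees colours but not numbers: how "colour-sorted" the row looks.
            return sum((a in BLUE) == (b in BLUE) for a, b in zip(state, state[1:]))

        def S(state, macro):
            # Boltzmann entropy S = log W, with W = number of microstates sharing this macrostate.
            target = macro(state)
            W = sum(1 for p in permutations(range(1, 9)) if macro(p) == target)
            return log(W)

        by_number = (1, 2, 3, 4, 5, 6, 7, 8)  # sorted by number (and, here, also by colour)
        by_colour = (3, 1, 4, 2, 7, 5, 8, 6)  # sorted by colour only; numbers scrambled within each colour

        for state in (by_number, by_colour):
            print(state, "S_number =", round(S(state, macro_number), 2),
                  "S_colour =", round(S(state, macro_colour), 2))

    Shuffling within the colours leaves S_colour unchanged while S_number jumps, which is the sense in which the entropy you assign depends on your coarse-graining; whether that makes it “subjective” is exactly the question being asked.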

  4. John Alan Wiltshire

    PaulD: “Information, near as I can tell, is man-made; nothing natural about it. So information is, I think, cultural, customary, traditional, and so on.”

    Geological strata are not “man-made”, tree rings are not “man-made”, ice cores are not “man-made”, yet they contain information about past events and conditions.

  5. You remarked a few times in these videos that they are not “lectures”
    but it seems to me that they would make a good basis for a nice textbook.
    You’ve also mentioned that you were writing a textbook, so I’m wondering
    if it will be structured along the lines of these videos?
    (I think I had the impression that the textbook would be more narrowly focused on quantum mechanics,
    but since these videos are not (yet?) a textbook, they don’t have an index with which I might try
    to find where I could have gotten that impression.)

    Regarding the ideas in this particular video,
    it seems like an obvious objective macroscopic observable of a system would be its energy,
    which has the natural consequence that when energy flows from a system with a high (dEntropy/dEnergy)^-1
    to a system with a low (dEntropy/dEnergy)^-1, the number of possible microstates increases.
    (So a monotonicity is a consequence of a conservation/symmetry and a boundary condition)
    That makes me wonder if using other conserved quantities for our macroscopic observable might
    lead to other notions analogous to temperature for those quantities.
    I don’t recall ever hearing anyone speak of dEntropy/dMomentum
    but maybe it appears somewhere in work on fluid dynamic dissipation.
    Or maybe there’s something that makes momentum fundamentally less useful than energy for the purpose of counting macro states?
    dEntropy/dCharge? dEntropy/dBaryon? That seems strange to me,
    but maybe there could be something in there related to osmosis.

    Information is conserved in QM, but entropy is not,
    which seems to say it’s not the Gibbs or Shannon entropy that’s meant in QM.
    Or is it information that’s interpreted differently?

    How do we know we are not Boltzmann’s brains that just think we’re part of a bigger fluctuation with a lower entropy past?
    We might as well assume that we are not Boltzmann’s brains, just as we might as well assume that we are not brains in vats,
    but our incentive to assume we are not seems like a Pascal’s wager argument.
    When I’m in a dream I can believe I remember a past that never really existed.
    Within the dream I usually act as if it is real until some discrepancy indicates that it is not,
    although the dream state can often induce enough cognitive instability to impair my ability to notice discrepancies.

    I looked up your arrow of time paper with Chen, but I may not be able to digest it before you
    post the Q&A video. Might you summarize it in the Q&A?

    If the universe started “from nothing”, there’s only one empty set, which means a very low entropy,
    so would we expect such a state to naturally evolve into more likely higher entropy states?
    The nothing state might be like the top of Norton’s dome.
    If it evolves in both time directions, with both calling the low entropy in the middle the past,
    could one side be matter and the other antimatter, and could that resolve the baryogenesis question?

    If entropy is unbounded, then any finite entropy would seem low compared to what it could evolve to;
    could that suggest something like Penrose’s conformal cyclic cosmology?

  6. It would seem, entropically, that energy radiates toward infinity, while order coalesces toward equilibrium. Conceptually, they are diametrically opposed?
    It seems we are stuck somewhere between everything cancelling out and everything fading out.

  7. My question is about Maxwell’s demon, and the relation between information and energy.
    Let’s suppose we have two separate, identically sized closed boxes, where the gas is modeled by some macroscopic balls. Every parameter of the balls is the same: their number, their sizes, their positions, their momenta. The only difference is that in one box the balls are made from a lighter material, and in the other from a heavier material.
    1.) Question: For each box the amount of information required to describe its state is obviously the same; only the masses differ. How do you reconcile the fact that with the same amount of information the Demon can do two different amounts of work, yet radiate out the same amount of heat when erasing its information, since in both examples the same amount of information was required to know/describe the system?
    2.) Question: Let’s suppose the Demon is constructed in two different ways in the two examples: in one case it is built from old-fashioned, low-efficiency electronics, storing information in capacitors/coils, and in the other from modern, more energy-efficient microchip components. Obviously, for the same number of memory-erasure operations the two cases will lose different amounts of heat to the environment?
    So what is the solid scientific relationship between entropy, information, and energy?
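
    For reference on the question above, the standard Szilard/Landauer bookkeeping (stated as background, not as the video’s answer) ties both sides of the demon’s ledger to temperatures rather than directly to particle masses:

        W_{\text{extracted}} \;\le\; k_B\, T_{\text{gas}} \ln 2 \quad \text{per bit of information acquired},
        \qquad
        Q_{\text{erasure}} \;\ge\; k_B\, T_{\text{memory}} \ln 2 \quad \text{per bit erased}.

    The masses of the balls enter only through the gas temperature; the erasure cost is set by the temperature of whatever environment the memory dumps its heat into.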

  8. William H Harnew

    Another wonderful talk. Thanks so much. I very much enjoyed the von Neumann entropy aspect, which I’d never heard about before.

    1. Doesn’t an expanding universe answer the recurrence hypothesis? Although I see “bouncing universes” are in vogue. Here’s a fun version of recurrence with your cat. 🙂
    http://gerdbreitenbach.de/arnold_cat/cat.html

    2. I’ve seen several posts about “information” being a human/cultural/semantic concept vs. a physics-reality (fields/particles etc.) concept. It seems to stem from confusion about the Shannon term (which, you point out, can lead to an opposite conclusion). Wittgenstein called this a “category error”, i.e. confusing things via language. Still, it seems to be a concept that serious physicists find useful. Starting with “It from Bit”…
    https://philpapers.org/archive/WHEIPQ.pdf

    3. I must confess to a fundamental confusion about black holes. I thought they were concentrated, collapsed stars. They only have mass, charge and angular momentum. So they would be “low entropy” like the early universe. But, they are high entropy?

    4. You didn’t explore the Bekenstein-Hawking entropy of a black hole as proportional to its surface area. I wonder what your thoughts are on that hypothesis.

    Again, many, many thanks from a lay person.

  9. My thermodynamics class taught that entropy is a state function, i.e., an objective property of the world. This comes from the two ways to add energy: either by work (no entropy change) or by heat (changing entropy), e.g. pushing a piston into a gas vs. random motions of a piston wall. Microscopically, how can atoms pushed by part of a wall ‘know’ whether that motion is work or heat? Their individual reactions to the wall can’t depend on knowing whether far-away atoms get the same push (work) or uncorrelated pushes (heat). So I expect the work vs. heat distinction is somewhat subjective, depending on what we can measure, as you mentioned when defining macrostates. Normally someone with much better measurement devices can only determine a few more degrees of freedom, so for practical purposes heat and work are clearly distinct. But as technology develops better nanoscale devices, could that change? Would someone with such devices see a much lower entropy, and be able to extract significantly more work from the system than a normal observer?

  10. I have two questions:
    1. As the Gibbs entropy is constant for a closed system (22:43) and the universe is a closed system, why should the entropy of the universe increase at all? I understood the argument that the past hypothesis allows us to assume a low-entropy past, and the entropy should tend to increase with the time evolution from this low-entropy state, but this seems to simply be at odds with the statement that dS/dt = 0. I can only presume that the logical step of the universe being closed is flawed somehow, but I don’t see why.

    2. At 52:05 you begin to argue for why the anthropic principle can not refute the recurrence objection, primarily by using the idea of Boltzmann brains. I didn’t really understand how this idea makes the anthropic principle fail in the first place, but nevertheless, can’t the past hypothesis give us a way to avoid the problem of an eternal past of Boltzmann brains forming at all? Surely the entire history of the universe is one in which the universe’s entropy has been increasing from a low value? Surely in this context, i.e. the observable one in which we even state the second law in the first place, there is no problem?

    Thanks for the great video as always!
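
    On question 1 above, the standard way the two statements are kept compatible (restated here as formulas, as I understand the distinction the video draws): the quantity that stays constant for a closed system is the fine-grained Gibbs entropy, while the second law is stated for the coarse-grained Boltzmann entropy, a different functional of the state.

        S_{\text{Gibbs}} = -k_B \int \rho \ln \rho \; d\Gamma, \qquad \frac{dS_{\text{Gibbs}}}{dt} = 0 \ \text{(Liouville evolution)},
        \qquad
        S_{\text{Boltzmann}} = k_B \ln W[\text{macrostate}], \ \text{which can, and generically does, increase.}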

  11. How can the idea of entropy always increasing be reconciled with the action of gravity, for instance in a dispersed gas cloud that is converted into a low-entropy star? Is gravity a force that lowers entropy in a system?

  12. Is there an agreed-upon way to calculate the entropy associated with gravitational collapse (into, say, a star, rather than a black hole)? If so, it would seem hard for cosmologists to argue with the low-entropy state of the big bang?
    Thanks!

  13. Hello everyone. I have two questions.
    1. How could the value of the entropy at the start of the universe determine its variation in a closed system nowadays?
    2. About the Boltzmann brain: maybe the simpler way to obtain a brain is to have a universe like ours, with life and evolution leading up to our brains? In my opinion, it is much harder, or more unlikely, to obtain a Boltzmann brain directly from scratch…

  14. So why do physicists consider the entropic arrow of time more fundamental than the causal arrow of time? It seems pretty obvious that if you take it as a fundamental feature of spacetime that it is causally partially ordered (as determined by light cones) then an entropic arrow will arise if entropy has any ability to grow. But I’ve read a number of popular science books (most recently Rovelli’s Order of Time) which have tried to convince me that it’s really the other way around. And every argument I’ve heard seemed to implicitly assume the causal arrow it was supposedly deriving.

    So my biggest objection that I haven’t heard addressed is — when you say as part of your explanation that we need an assumption that the universe began in a low entropy state — isn’t there an arrow of time implicit in that very statement? Just the fact that you need to know this thing about the past to be able to explain the present (no comparable claim about the future will do) implicitly identifies an arrow in time. Or I guess it identifies a particular point in time as special (the beginning), implicitly giving us an arrow (the direction from the beginning to now) and any explanation starting from this point seems to really explain that this arrow gives rise to the entropic arrow. Is there a nuance I’m missing here?

  15. I’ve heard entropy is not an observable in quantum mechanics, although classically it appears on the same footing as observables like the internal energy, volume, and particle number in thermodynamic equations. How does an experimentalist measure entropy quantum mechanically, and what kind of Born rule applies to such measurements?
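
    A minimal sketch of why entropy is not an observable in the Born-rule sense (assuming the entropy meant here is the von Neumann entropy of the talk): it is a nonlinear functional of the density matrix rather than the eigenvalue of a Hermitian operator, so in practice one reconstructs the state (tomography) and then computes the entropy from it.

        import numpy as np

        def von_neumann_entropy(rho):
            # S(rho) = -sum_i p_i ln p_i over the eigenvalues p_i of the density matrix rho.
            p = np.linalg.eigvalsh(rho)
            p = p[p > 1e-12]          # discard numerical zeros
            return float(-np.sum(p * np.log(p)))

        pure = np.array([[1.0, 0.0], [0.0, 0.0]])   # pure qubit state: S = 0
        mixed = np.eye(2) / 2                        # maximally mixed qubit: S = ln 2
        print(von_neumann_entropy(pure), von_neumann_entropy(mixed), np.log(2))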

  16. Nicholas Sullivan

    I have a few questions.
    1. Concerning the connection of life to entropy and information, a basic, fundamental feature of life is that it is an (imperfect) replicator of information. Considering Landauer’s principle, which states that the information erased by a process puts a lower bound on the entropy created elsewhere, can we have a similar principle for life? For example, is the amount of information created by a living process bounded by the amount of entropy it generates?
    2. According to the standard picture of quantum mechanics, information is conserved in any closed system because time evolution is described by unitary operators, which preserve the eigenvalues of the density matrix. If it is valid to look at the universe as a whole, this should still be true. How do we reconcile this with the fact that, according to the standard picture of cosmology, the expansion of the universe creates new space, and thus new field degrees of freedom? Can unitary evolution still describe this, and is information still conserved in this scenario? I suppose a theory of quantum gravity might be needed to answer this.
    Thanks for the great talk!
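
    The conservation statement at the start of point 2 above, written out (standard quantum mechanics, included only to make the premise explicit): a unitary map rearranges the density matrix without changing its spectrum, so the von Neumann entropy is untouched,

        \rho \;\to\; U \rho\, U^{\dagger}
        \quad\Longrightarrow\quad
        S\!\left(U \rho\, U^{\dagger}\right) = -\,\mathrm{Tr}\!\left[\rho \ln \rho\right] = S(\rho).

    How that bookkeeping extends to an expanding universe with new field degrees of freedom is the open part of the question.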

  17. Hi Sean,

    In https://arxiv.org/abs/hep-th/0212209, Goheer, Kleban, and Susskind write:

    “In the literature [on quantum de Sitter space] much stronger conditions have been assumed for the spectrum of [Hilbert space] H1. Banks and Fischler have conjectured that the Hilbert space of states H1 should be finite dimensional. This may be so, but it does not follow from the finiteness of the entropy. The Entropy is only equal to the dimensionality of the space of states when the temperature is infinite. Entropy can certainly be finite even though the Hilbert space of states is infinite dimensional. We are assuming only the weaker condition that the spectrum is discrete.”

    Can you explain what they mean here? What can it mean that the max entropy is finite if there are infinite microstates? If this claim makes sense, would finite entropy but infinite dimensional Hilbert space offer a way to affirm the black hole entropy bound without the danger to Lorentz invariance posed by finite dimensional Hilbert space?
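
    Not a gloss on the quoted passage itself, but the standard toy example behind “finite entropy on an infinite-dimensional Hilbert space” is a single harmonic oscillator in a thermal state: infinitely many energy eigenstates, yet the occupation probabilities fall off fast enough that the entropy sum converges,

        p_n = (1-q)\, q^{\,n}, \quad q = e^{-\hbar\omega/k_B T} < 1
        \quad\Longrightarrow\quad
        S = -k_B \sum_{n=0}^{\infty} p_n \ln p_n
          = -k_B\!\left[\ln(1-q) + \frac{q}{1-q}\ln q\right] < \infty.

    The entropy only approaches the (divergent) logarithm of the dimension in the infinite-temperature limit q → 1, which seems to be the point the quoted passage is making.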

  18. Brown and Susskind argue that “the kinetic entropy of the classical system is equivalent to the Kolmogorov complexity of the quantum Hamiltonian.” They name it the Second Law of Complexity.

    Not knowing about entanglement, or not being able to measure it, is one way of losing information about a quantum state as it evolves in time. You pointed that out at the end of your talk. Could you please be more explicit about that idea? Thanks for another great talk.

  19. I have to rephrase/clarify my question about Maxwell’s Demon and the relation between information and energy. After contemplating, I concluded that my second question should be dropped, since it can be answered: the dissipated heat is the sum of the heat coming from the information erasure and the heat loss of the components.
    However, the experimental setup of the first question can be kept, with two important fixes. One is that the two boxes are not merely “separated” but completely “distinct”; they are never meant to touch, interact, or mix their contents.
    The second rectification is that in both boxes the velocities of the balls/molecules are the same, not their momenta, since the whole point is that the masses of the balls/molecules are irrelevant to the Demon’s action; he selects them just by their speed (and of course the right direction). So, to summarize, the experimental setup is two completely distinct, ideal closed boxes, with some macroscopic balls inside as a model for a gas. Every parameter in the two boxes is the same: the size of the boxes, the number of balls/particles, their speeds, sizes and directions. The only difference is that in one box the balls/particles are made from a heavier material, and in the other from a lighter material.
    In this way, when the Demon does his work, this setup nicely matches the entropy-change formula (per bit of information):
    ΔS = k ln(2)   (the absence of temperature correctly reflects the mass-independence)
    And also, correctly, the amount of work I can extract through the Demon’s action is:
    E = kT ln(2)   (the presence of temperature correctly reflects the mass-dependence)
    where T, as I understand it, is the temperature in which the extracted work is expressed, and it very importantly depends on the mass of the balls/molecules!
    But as we know, with this setup the Demon’s action extracts energy from the box, which ultimately also extracts heat/energy from the environment, and that is prohibited by the Clausius-Kelvin statement of the second law of thermodynamics, which says that heat cannot be transferred from a colder object to a hotter one without investing energy.
    So to satisfy the Clausius condition, when the Demon erases his memory, he has to dissipate at least the same E = kT ln(2) of energy.
    But as we see, that energy depends on the mass and momentum of the particles, so for the same amount of information erasure the Demon will have to dissipate different amounts of heat.
    How does the Demon’s memory know what masses were involved in his action, when for him the masses of the particles are irrelevant and don’t count as information, since he checks only velocities and directions, as we already stated?
    Something is clearly very wrong here. What is it??
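
    One way to organize the bookkeeping in the comment above (standard Landauer/Szilard accounting, offered as a sketch rather than a resolution): what is counted per bit is entropy, which is mass-independent, while the associated heats each pick up the temperature of the subsystem involved,

        \Delta S_{\text{gas}} = -k_B \ln 2 \ \text{per sorted bit},
        \qquad
        \Delta S_{\text{memory's environment}} \ge +k_B \ln 2 \ \text{per erased bit}.

    The heats follow by multiplying each entropy change by that subsystem’s own temperature, so the extractable work scales with the gas temperature (and hence with the ball masses, at fixed speeds), while the erasure cost scales with the memory environment’s temperature. On this accounting the Demon’s records never need to “know” the masses.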

  20. douglas albrecht

    I was surprised to learn that the 2nd law, and therefore the arrow of time, depends on the assumption of a boundary condition of low entropy at the BB. I had thought either that the low past-entropy condition was a stronger principle, or that the second law could be derived without depending on it. I have listened to your podcast with Roger Penrose, and thought that he derived the low starting entropy (which not all cosmologists understood) by running the second law backwards. But now it all seems circular.

    Separate question. Coarse graining has always struck me as subjective and dependent on the way humans categorize things and find patterns. Is it a stronger principle?

  21. Super! One of the best videos in your series “The Biggest Ideas in the Universe”. The concepts of entropy and information-entropy are well explained.
    I also found it useful that you referred to your book “From Eternity to Here” where all these ideas are treated in chapters 8, 9 and 10. I should add that your book of 2010 is (still) one of the best books I’ve read in my lifetime (and I have read a lot).
    Rene Kail

  22. Entropy of the Universe increases with time. However, does entropy increase in a uniform manner?
    Time appears to pass in a uniform manner. Therefore, I think the implication that entropy increase is a causal factor in time passing is incorrect?

    Excellent talks. You have filled in many gaps in my education. I can now choose which areas to dip into
    – just for fun!

  23. Hey Sean, excellent talk, as usual – thank you so much and please keep going. A few direct questions:
    – If the early universe was so dense, why did it not form and stay a black hole? Is it because of inflation?
    – Or are we living inside a black hole? How does entropy behave inside a black hole? It seems like it should decrease with time, since everything inexorably goes towards the singularity.
    – Since the universe is a closed system, its entropy should be constant. Is expansion, or dark energy, making it unbounded and therefore “allowing” entropy to increase?

Comments are closed.
