Entropy and Complexity, Cause and Effect, Life and Time

Finally back from Scotland, where I gave a series of five talks for the Gifford Lectures in Glasgow. The final four, at least, were recorded, and should go up on the web at some point, but I’m not sure when.

Meanwhile, I had a very fun collaboration with Henry Reich, the wizard behind the Minute Physics videos. Henry and I have known each other for a while, and I previously joined forces with him to talk about dark energy and the arrow of time.

This time, we made a series of five videos (sponsored by Google and Audible.com) based on sections of The Big Picture. In particular, we focused on the thread connecting the arrow of time and entropy to such everyday notions as cause and effect and the appearance of complex structures, ending with the origin of life and how low-entropy energy from the Sun powers the biosphere here on Earth. Henry and I wrote the scripts together, based on the book; I read the narration, and of course he did the art.

Enjoy!

  1. Why Doesn’t Time Flow Backwards?
  2. Do Cause and Effect Really Exist?
  3. Where Does Complexity Come From?
  4. How Entropy Powers the Earth
  5. What Is the Purpose of Life?
This entry was posted in Big Picture, Science, Time. Bookmark the permalink.

27 Responses to Entropy and Complexity, Cause and Effect, Life and Time

  1. Draft of Short Story on ‘Causality’ is here:

    https://skepticalsciencereviews.wordpress.com/story-land/

    –Arthur Snyder SLAC

  2. Great explanations AND animations. Thanks!

  3. James Goetz says:

    Thank you very much for posting this 🙂

  4. Chris G says:

    Wonderful stuff Sean, thank you
    Chris G (UK)
    P.S. did you mean ‘thread’ rather than ‘tread’?

  5. James Rose says:

    Very nice, concise and crisp videos that, under current conventional models and comprehensions, “describe” relations and certain overarching processes of energy and systems behaviors. What I was hoping to hear~view — especially coming out of the Gifford Lectures series — is some offered model or models of the -mechanisms- for ‘how’ complex systems arise naturally and spontaneously into higher tiers of complexity and order — and opportunity. Those ideas were not evident here, nor for that matter, in much of the literature anywhere. Does Sean have any ideas on specifically ‘how’ higher-ordered complex systems arise and come into existence? Highly ordered systems are not ‘energy sinks’ – they localize different forms of organization and energy – on the face of it “against” the rule of entropy increase~maximization. Thanks for a reply – which I am anxious to get.

  6. Sean Carroll says:

    I did mean ‘thread,’ thanks.

  7. Katrin Boeke-Purkis says:

    This is a great supplement to Sean’s book “The Big Picture” (the first nine chapters) and will spark the scientist’s imagination. Looking forward to watching The Gifford Lectures when they are published. Physics, Chemistry, Biology and then the Metamorphosis of the Future….

  8. Tim Martin says:

    Great videos! The difference between entropy and complexity was very informative. After watching the videos I have a couple of questions, if anyone’s willing to answer.

    Video 2:
    I’m having trouble seeing how physics equations *don’t* have a concept of time built into them. If you take an equation and plug in some numbers, the equation tells you what will happen “next.” Isn’t that a concept of time?

    At 0:40 the video says that the current momentum and position of a particle determine both how it will move in the next second, and how it was moving in the previous second. But in the video they had to change the equation they were using in order to calculate the “next second” versus the “previous second” (they changed + vt to – vt). So again it looks like time is built in, and the equations themselves are not agnostic about time.

    Video 3:
    Based on the coffee and milk example, it seems like the “milk mixing into coffee” macrostate should have the lowest entropy of all 3 states named. If this macrostate is defined as “the specific swirls of milk in coffee that you see in this moment,” then surely there are relatively few microstates for this macrostate. If you switch many molecules around, you will change the swirls and thus change the macrostate. So why is this not the lowest entropy state?

  9. BobC says:

    Just started reading The Big Picture, and it’s so great to read an author whose warm voice rises from the page, who writes the way he speaks. (Or does Sean speak the way he writes? Whatever…)

    Having followed Sean for some time via this blog, his Twitter feed, his videos, the occasional public lecture, as well as having read his prior books, I look forward to seeing how he pulls it all together. As advertised, I’m expecting a magnum opus combining his philosophical and scientific perspectives in a single whole.

    My goal, at a meta-level, is to better comprehend Sean’s “lucid logic”, his ability to be broad yet brief, to sustain a pace through material that could easily become stultifying were it not for careful selection of points, then weaving them together without leaving the impression that the subject had been glossed over. A rare combination of sufficient content and context with smooth flow.

    For example, Sean’s treatment of Aristotle could have productively gone on for many more pages, at the risk of losing the main thread that brought us to Aristotle and that leads onward from him. Brief and breezy, though never to the point of being mere “name dropping” or “concept dropping”. Yet with abundant hints (mainly lots of proper nouns) that beg for a brief Google or Wikipedia segue.

    But doing searches requires some effort, and there’s no guarantee it will lead me to authoritative (or even readable) sources. The only thing lacking at this early point in the book are explicit links to references. The references are there, stealthily gathered at the back of the book, but they are huddled there devoid of contextual reference beyond the chapter number. In the name of the Gods of Short Attention Spans and Instant Gratification, I’d much rather see footnotes at the bottom of the page, and not have to occupy two fingers for place-tracking. Only longer referential discourses should be relegated to notes at the end of the book.

    There. That’s my only quibble so far. Nit-Picking Achievement Unlocked!

    [Edit: Harumph. Footnotes could easily disturb The Force of the Flow. Tough decision either way.]

    PS: Love the Minute Physics videos. Only pictures permit saying so much in so little time (at the nominal equivalence of 1K words each).

  10. BobC says:

    I agree with Tim Martin’s perspective on Time’s Arrow.

    If Time’s Arrow were solely based on entropy, making it an emergent phenomenon, I would not expect it to be uniform throughout the cosmos. Entropy varies across scale and location, and only has a universal average, not a universally fixed value established by law. What, then, of emergent Time?

    What about violations of CPT Symmetry? I don’t see how they could be due to entropy, or even linked to it (though that could be my ignorance showing). Could CPT violations be better ‘time candles’ than entropy?

    If not, why not?

  11. Torbjörn Larsson says:

    Lots of nuggets here!

    The part about organisms as an analogy to stars in the context of entropy reminds me of a recent paper that connects with the description of finding life at an intermediate state of universal entropy. The paper finds that the triple-alpha process, the main 12C producer of the universe since 8Be is unstable, would be rare among universes. A smaller change in universal parameters than the one needed to remove the triple-alpha process would make 8Be stable and result in easy carbon production. They find that habitable universes must “reside in a regime with intermediate properties”:

    “Although carbon is (most likely) necessary for a universe to be habitable, it is not sufficient. Universes favorable for the development of life require additional heavy elements, including oxygen, nitrogen, and many others. Although a full treatment of heavy element production for all possible universes is beyond the scope of this paper, we can outline some basic requirements: Hydrogen is a necessary ingredient, so it is important that big bang nucleosynthesis does not process all of the protons into heavier nuclei and that star formation is not overly efficient. The natural endpoint for stellar nucleosynthesis is to produce large quantities of the element with the highest binding energy per particle, i.e., iron (in our universe) or its analog (in other universes).

    As a result, nuclear processing in stars cannot be too efficient. In our universe, stars span a range of masses, from those that can barely burn hydrogen up to those that produce iron cores and explode as supernovae. This range of stellar masses results in a wide range of endpoints for nuclear reaction chains and is thus favorable for producing a diverse ensemble of heavy elements. Habitable universes thus reside in a regime with intermediate properties: Star formation must take place readily in order to produce the heavy elements, energy, and planets necessary for life, but cannot be so efficient that no hydrogen is left over for water. Stars must be able to synthesize the full distribution of heavy elements necessary for life, but cannot be so efficient that all nuclei become iron (or whatever nuclide has the highest binding energy in the given universe).”

    [ https://arxiv.org/pdf/1608.04690.pdf ]

    Though I would claim that the idea that the process of life is driven by the mechanism of eagerly dissipating energy (increasing entropy) is wrong. And I say that as a physicist, but now studying bioinformatics. It is easy to see that chemical reactions, i.e. the use of free energy, can be driven both by changes in energy content (enthalpy) and by changes in entropy.

    Life is indifferent to entropy. Modern life is a system in homeostasis, a cellular system in an out-of-equilibrium steady state regulated by negative and positive feedback. It is maintained that way because populations of genetic systems survive in that state. Ironically, the massive negative feedback of homeostasis relies on the genetic system’s positive feedback of exponential growth in replicating populations. Maybe we should call it “the signal of life” (since cells use positive feedback to promote signaling).

    This is essential, since we don’t see cellular bags of “more complicated chains of reactions” [video 5] populate the world outside of a local birthplace such as alkaline hydrothermal vents. Life emerged in a local place and not all over Earth because complicated chains of reactions couldn’t emerge in, and adapt to, a different environment. [That the LUCA had to emerge out of alkaline hydrothermal vents, because universal chemiosmosis could only evolve at the boundaries of such vents, was shown by Sojo et al. 2015. And that the LUCA indeed emerged in such vents, with vent methylation a deeply evolved feature of both its genetic and metabolic systems (and with the expected chemiosmosis), was shown by Martin et al. 2016.]

    Presumably it took the adaptation inherent in biological evolution to accomplish the jail break.

  12. Torbjörn Larsson says:

    @James Rose: I started out with biology this time, so despite interesting discussions on physics I’ll stick to this for a while.

    This is an important observation: “Highly ordered systems are not ‘energy sinks’ – they localize different forms of organization and energy – on the face of it ‘against’ the rule of entropy increase~maximization.”

    As for extant cellular life I wrote a long comment on why they seek steady state instead of equilibrium. Biological evolution doesn’t “know” that “cellular life shouldn’t do so”, it is inherent in adaptive replicators – differential fitness of genetic populations – to be “good enough” at surviving. Mind that the process also doesn’t “know” that life exists because death does. (I.e. adaptation through weeding, if nothing else because of using up available resource capacity.)

    The interesting problem is how metabolism and genetics co-evolved chemically to that state. Those who have looked at the growth of homeostasis (Dyson, Kauffman, Lancet et al., for starters) have constrained it to phase transitions of one kind or other [ http://prelude.bu.edu/publications/Segre_Lancet_Chemtracts_1999.pdf ]. But that is at odds with how biological evolution works. Speciation is gradual; there is no one time or individual you can point to and state that “branching happened here”. Why would the process characteristic of the emergence of life be different from the characteristic of the process of life?

    Presumably then there were many ways that life could emerge, and indeed emergence was swift and easy according to the geological record. Conversely, it may be hard to pin down the details of how life emerged here on Earth.

  13. Paul Torek says:

    Tim,

    It’s not that physics equations don’t have a concept of time, just that they don’t have a *preferred direction* of time (except for the 2nd law). Your point about x + vt versus x – vt has an exact mirror in Sean’s examples of the numbers 41, 42, 43. 43 is 42 *plus* 1, but 41 is 42 *minus* 1; this difference, however, shows us nothing about causality.

    Sean,

    I think that the discussion of causality could benefit from mentioning what we can control. We could, in principle, painstakingly take all the C-14 out of that pencil, but history would still record the atomic tests of the 1960s. Not that this would all fit in one five-minute video. This one was five minutes very well spent, by the way. Is there a fuller discussion of causality in The Big Picture?
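Tim’s x + vt versus x – vt worry can be made concrete with a toy sketch (my own illustration, not from the videos): the equation of motion itself has no built-in direction, because evolving a free particle forward and then re-running the same equation with the velocity flipped returns it exactly to where it started.

```python
# Toy illustration (not from the videos): free-particle Newtonian motion
# is time-reversal symmetric. The same formula x + v*t describes both
# directions; "going backward" is just evolving with the velocity flipped.

def evolve(x, v, t):
    """Free-particle motion: position after a duration t."""
    return x + v * t

x0, v0 = 2.0, 3.0          # initial position and velocity
x1 = evolve(x0, v0, 10.0)  # run "forward" for 10 seconds

# Flip the velocity (the analogue of swapping +vt for -vt) and evolve
# for the same duration with the *same* equation: the initial position
# is recovered exactly. Nothing in the dynamics prefers a direction.
x2 = evolve(x1, -v0, 10.0)
assert x2 == x0
```

The asymmetry Tim noticed lives in which sign *we* choose to plug in, not in the equation, which is Paul’s point about 41, 42, 43.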

  14. James Goetz says:

    Hi Sean,

    I listened to these excellent short productions, and I think we have a major difference if I correctly understood you. You implied that if we know the complete quantum state of a particle at any given point of time, then we can know the past and future quantum states of that particle.

    Please let me know if I am correctly interpreting you or if I am mistaken?

    For example, I suppose that any particle’s change from one quantum state to the next is probabilistic instead of deterministic. In this case, we can know the possible ranges of the particle’s past or future, but we cannot know the exact past or future.

  15. Thank You. Great and enjoyable videos and lectures.

  16. Sean Carroll says:

    James– The full quantum state evolves completely deterministically, if left by itself. Only the act of observation is indeterministic. Whether or not that indeterminism is real or only apparent depends on your favorite interpretation of quantum mechanics.
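Sean’s point that the full quantum state “evolves completely deterministically, if left by itself” can be checked numerically in a toy model (my own sketch, not from the videos): closed-system evolution is a unitary map, so applying the evolution operator and then its inverse recovers the initial state exactly.

```python
import numpy as np

# Toy sketch (my illustration, not from the videos): Schrodinger evolution
# of a closed system is unitary, hence deterministic and reversible.
# Build U = exp(-i H t) for a made-up 2x2 Hermitian "Hamiltonian" H,
# evolve a qubit state forward, then undo it with U's inverse (U dagger).

H = np.array([[1.0, 0.5],
              [0.5, -1.0]])            # Hermitian by construction
t = 2.0

# Matrix exponential via eigendecomposition (valid since H is Hermitian).
evals, evecs = np.linalg.eigh(H)
U = evecs @ np.diag(np.exp(-1j * evals * t)) @ evecs.conj().T

psi0 = np.array([1.0, 0.0], dtype=complex)  # initial state |0>
psi1 = U @ psi0                              # evolve "forward"
psi2 = U.conj().T @ psi1                     # evolve "backward"

assert np.allclose(psi2, psi0)  # the initial state is recovered exactly
```

The indeterminism James asks about only enters when a measurement is made; the unitary map above never loses information.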

  17. Magnema says:

    @BobC: The problem with that reasoning is that CPT-type time asymmetry has nothing to do with the time asymmetry we observe on a macroscopic scale. Unless you want to propose (since CPT symmetry holds, even if T, which is thus the same as CP, does not) that applying charge conjugation and parity reversal would cause time to proceed “backwards” as viewed from a macroscopic scale. I think that would be a very hard argument to make, at best, and almost certainly faulty.

    Thus, while we could retroactively claim a direction of time based on that T asymmetry (relative to CP, at least), had this T asymmetry been reversed with the entropy behavior the same (which seems entirely physically plausible), we would have claimed the opposite direction of time relative to that T asymmetry to be forward. Thus, while we can compare “forward in time” to CPT, we shouldn’t use that because it’s not how we actually define the direction of time, as that counterfactual argument shows.

    Compare this with attempting to define the meter. Step 1: We make a meter to be the size of some object which is “around human size” (order of magnitude, anyway). Step 2: Find an object we can measure precisely (say, although it’s no longer the current standard, the wavelength of a particular emission line of a certain kind of krypton). Step 3: Take that measurement, and round our previous definition of the meter slightly so that we can define it to be exactly some quantity in terms of that distance. Then, we get to the modern conception, defined in terms of something we can measure precisely.

    Note, in particular, that it would *not* make sense to now ask, “but why is a human practically a meter, if it’s defined in terms of krypton?” Although we have redefined the value of a meter to have a better definition, that doesn’t change the historical origins of the unit, and so doesn’t change the way we have to reason about it. Similarly with T-asymmetry: we can’t reason in that way because it’s not the way we originally decided which way was forward, and redefining doesn’t change the fact that our original definition was based on entropy, not on T-asymmetry, the two of which are, in principle, unrelated.

    (Granted, it’s a subtle issue, one that Sean makes sure to discuss in his longer Great Courses series, but that’s the essential part of the argument as I reformulate it with my own analogies.)

  18. BobC says:

    @Magnema: I’ve likely misstated what I was aiming toward.

    Perhaps entropy affects time’s direction, but it is not the source of time’s existence. Let’s briefly consider time to be a fundamental, rather than emergent, phenomenon.

    I’m thinking along the lines that Big Bang nucleosynthesis should show no favoritism between matter and anti-matter. Yet, for reasons not yet fully understood, we find ourselves in a matter-dominated universe. Could something similar have happened to time?

    What if time was initially (perhaps during the Big Bang’s Planck Era) a chaotic mix, and entropy helped smooth it out and give it a net direction?

    Then perhaps the CPT-level could be an indicator of/for ‘quantum time’, quite different from what we experience as ‘entropic time’.

    Or am I just being silly?

  19. KC Lee says:

    Proving this very commonly made statement, by necessity, assumes repeat measurements. Preferably many repeat measurements are done, on presumably (see below) the same quantum state, to lend confidence to the validity of that statement.

    But, largely independent of interpretation* (arguable I know), it is also generally understood that “Any measurement disturbs the system and generically leaves it in a different state from the one it was in before the measurement.” (Binney and Skinner “The Physics of Quantum Mechanics”, Oxford University Press 2014, p.10). Further explanations on the same point are found on p. 23 as well as elsewhere in the same textbook.

    If so, this rules out the very possibility of repeating any measurement on any given quantum state.

    *Whether one adopts the Copenhagen or the Many-worlds interpretation, the wavefunction either collapses or branches off to other worlds. Both render it not feasible to get hold of the quantum state, again, on which to repeat a measurement.

    How should one resolve this contradiction (other than “Have faith in the Schrodinger equation” for example)? An explanation will be much appreciated.

    Best,

    KC

  20. KC Lee says:

    Btw, my comment was referring to Sean’s statement, “The full quantum state evolves completely deterministically, if left by itself”. For some reason it failed to show up in the body of the text.

    KC

  21. Tom Brown says:

    Sean, these videos are really great! I really enjoyed them, and I sent them to everyone I know with the slightest interest.

  22. Simon Packer says:

    I like the snappy presentations. Probably suits my simple mind.

    The valid descriptive overlay of the part-disordered macrostates is the interesting bit for many of us. We might see life processes as having a local entropy range signature.

    Tim’s point on video 2 about time is I think down to the difference between real time and abstract or simulated time in the evolution of a t-symmetric equation. Real time insists on a direction. But you could say 42 needs 41 to happen, but not 43. Perhaps.

    Seems to me, with video 3, if you only use position/particle type in an entropy calculation, and use a discrete x,y,z spatial grid and fixed numbers/ratios of particles, then with like-particle equivalence, all microstates are equally likely. So if you fine grain to the limit, don’t use rotational or vertical symmetric equivalents, and so microstate = macrostate, all conditions are equally likely, whether up/down separated, maximally mixed or anything else. Entropy is constant. So entropy depends on the degree of coarse-graining and the method of defining equivalence. Here, maximally mixed is actually low entropy, high order. Complexity, defined as quantity of state definition information, peaks in the middle, though not in a simple way.

    I agree with BobC on time. If it were co-emergent from entropy, or caused by it, would we not see local variations in both rate and direction? Time seems boss, reference frame symmetries apart.
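Simon’s point that entropy depends on the coarse-graining, and Tim’s earlier question about the coffee-and-milk macrostates, can both be sketched with a toy counting model (my own illustration, not from the videos): with N two-state particles, the coarse-grained macrostate “k of them are milk” has Boltzmann entropy log W, where W is the binomial coefficient, while the finest graining (macrostate = microstate) assigns every configuration W = 1 and hence zero entropy.

```python
# Toy model (my own sketch, not from the videos): N sites, each holding
# "milk" or "coffee". Under the coarse-graining "how many sites are milk",
# the number of microstates in a macrostate is the binomial coefficient
# C(N, k), and the Boltzmann entropy is S = log W. The fully mixed
# macrostate (k = N/2) has the most microstates, hence the highest
# entropy; under the finest graining every macrostate is one microstate
# and S = log 1 = 0 for all of them, just as Simon says.

from math import comb, log

def entropy(N, k):
    """S = log W for the macrostate 'k of N sites are milk'."""
    return log(comb(N, k))

N = 100
assert entropy(N, 0) == 0.0                          # fully separated
assert entropy(N, N // 2) > entropy(N, 10) > entropy(N, 0)
```

On Tim’s video 3 question: the catch is that “the specific swirls you see right now” is a *microstate-like* description, so defining a macrostate that finely does give it low entropy; the video’s three macrostates are coarser descriptions, under which the partly mixed swirl sits between separated and fully mixed.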

  23. James Goetz says:

    Thank you Sean.
    Per my favorite model of QM, I suppose Ozawa (Found. Phys. 41, 592 [2011]; New Generat. Comput. 34, 125 [2016]) is going in the right direction.
    Also, I am still racking my brain while trying to figure out a definition or description of *observation* in the context of QM. Have you figured out a description of *observation* in the context of QM?

    If I may, I have one more question. For example, your presentation clearly explains why there is an arrow of time. But you previously proposed a multiverse model where the arrow of time is reversed in some of the regions (S. M. Carroll and J. Chen, Gen. Rel. Grav. 37, 1671 [2005]; S. M. Carroll, Nature 440, 1132 [2006]). I reject that those models cohere with the second law of thermodynamics. Do you still support that multiverse model?

    Peace,
    Jim

  24. Dave McLaughlin says:

    Sean: Finally back from Scotland, where I gave a series of five talks for the Gifford Lectures in Glasgow.

    Damn! I missed that. I look forward to seeing the videos.

    Dave (in Glasgow)

  25. Larry Fasnacht says:

    Dear Professor Carroll,

    I have been reading about time, in no small part due to your publications. I am a lay person, I have no degree, and am not as smart as you, or your readers. That is why I would like your help in showing me how wrong my understanding of time is. My limited understanding of Special Relativity is at the heart of my difficulties. It says that observers far away from each other, moving toward each other, will observe future events in the other’s timeline. Is that correct?

    If so, that means that the future exists before we experience it. That means, to me, that the future is already “written”. While we may experience time in a physiological way that makes it seem like we are moving from a past to an uncertain future, this is not correct.

    The other aspect I am having trouble with is probability. I have heard about the Drake equation to estimate the possibility of intelligent life on other planets. I personally dismissed this equation as having too many unknown variables for which no remotely reasonable guesses could be made. But then I got to thinking about what the probability would be of me sitting here, typing, and you there reading (I hope), this communication. If we were to go back in time a million years, what would you say is the number of events that would have to occur for us to be here now? What would the probability of each of those events be? And worse still, what would the probability be of each of those events occurring in exactly the correct sequence, precisely, perfectly, without any errors? My mind reels at the unlikelihood of my existence. Yet here I am. At least I think I am. The Drake equation pales in comparison.

    But if the universe (whatever that means) was somehow (not a question I allow myself to ask) formed “of a piece”, all at once, so to speak, wouldn’t that solve both of these conundrums?
    I would love to hear how I have gotten this all wrong; I’m mixed up and this can’t be right. The thought that everything I do, everything I think, or say, has already been written, is disturbing and has consequences that I find troubling. So please, tell me I’m wrong.

    Larry Fasnacht
    Omaha, Nebraska, USA