67 | Kate Jeffery on Entropy, Complexity, and Evolution

Our observable universe started out in a highly non-generic state, one of very low entropy, and disorderliness has been growing ever since. How, then, can we account for the appearance of complex systems such as organisms and biospheres? The answer is that very low-entropy states typically appear simple, and high-entropy states also appear simple, and complexity can emerge along the road in between. Today's podcast is more of a discussion than an interview, in which behavioral neuroscientist Kate Jeffery and I discuss how complexity emerges through cosmological and biological evolution. As someone on the biological side of things, Kate is especially interested in how complexity can build up and then catastrophically disappear, as in mass extinction events.

There were some audio-quality issues with the remote recording of this episode, but loyal listeners David Gennaro and Ben Cordell were able to help repair it. I think it sounds pretty good!

Support Mindscape on Patreon.

Kate Jeffery received her Ph.D. in behavioural neuroscience from the University of Edinburgh. She is currently a professor in the Department of Behavioural Neuroscience at University College London, where she is the founder and Director of the Institute of Behavioural Neuroscience.

13 thoughts on “67 | Kate Jeffery on Entropy, Complexity, and Evolution”

  1. She is amazing! What an intellect. I love this podcast. She asked the questions of you that I would have if I were smart enough to form them. Thank you for sharing this.

  2. Great discussion…I’m most interested in the philosophical implications of this, some of which you touched on at the end, the relationship between entropy and information. Also the implications for natural language and communication (also information), and how our (human, and thus contingent) identification and definition of macroscopic states may be analogous, in the arena of language, to the differentiation and observation of said states. Seems like all these processes are informational exchanges of some sort or another.

  3. I just listened to the beginning of this so far.

    I believe that the answer to Dr. Jeffery’s question — why a DNA strand of random nucleotides, or of degraded DNA, would have different entropy content from a DNA strand with genes containing the information to make a viable organism — is that entropy also applies to information content.
    I need to go back and look at this, but I took Melanie Mitchell’s Introduction to Complexity some years ago, and I think entropy applies because there are many more ways to send a signal with little or no information content than a signal with large information content. There are many more ways for a strand of DNA to be “nonsense,” or a series of random nucleotides, than there are ways for a DNA strand to contain information. So, I think that means that a DNA strand with little or no information content has higher entropy than a DNA strand with information. (I hope I have this right. I have learned much of what I know about entropy from you as well as from Melanie Mitchell, but I am always having to review which is high and which is low.)

  4. Right on. Another personal favorite, right up there with Kate’s and Michele’s and a couple of others. But then I’m a bioengineer, analytical and applied, interested in future and big picture issues, so I’m biased. lol

    Sean, still hoping here that one of these days, you’ll get around to expanding on your “poetic naturalism” and related ideas – even if it’s just along the lines of creative philosophizing or the like, as Sagan, for example, would do. I can’t help but think it points to a new kind of framework with infrastructure, along with new exciting, energizing inventions and innovations, one that’s science-informed, science-abiding, and science-respecting. Yay!

  5. Thank you, Sean Carroll and Kate Jeffery.
    Another spectacular episode!
    In my opinion there should be more “connection,” discussion, and proactivity among the various subfields of science — and philosophy of science!
    Just like in this episode!
    Thank you

  6. The concepts around entropy and the arrow of time fascinate me. “From Eternity to Here” is one of the best books I’ve read. What intrigues me the most is the use of the “past hypothesis” to overcome “Loschmidt’s paradox,” which, as stated, is merely a “turtles all the way down (to the big bang…)” kind of tautology.

  7. More than anyone else, Kate Jeffery made you “define your terms.” That’s what good philosophy is all about.

  8. Subjective entropy really teases and bothers me — could anyone talk about it?
    And I don’t understand why the expansion of the universe causes the entropy of the universe to increase!

  9. Strictly speaking, entropy is a property of a set of quantum states (specifically, the logarithm of the number of those states), not a property of a single quantum state; and the universe at any moment is in a single quantum state; therefore the universe does not have an entropy.
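As a side note on the state-counting argument raised in comments 3 and 9: the idea that entropy is (the log of) the number of ways a macrostate can be realized can be illustrated with per-symbol Shannon entropy of nucleotide sequences. This is a toy sketch, not anything from the episode; the sequences and function name are made up for illustration.

```python
import math
from collections import Counter

def shannon_entropy_per_symbol(seq: str) -> float:
    """Shannon entropy, in bits per symbol, of a sequence's letter frequencies.

    H = -sum_i p_i * log2(p_i), where p_i is the frequency of letter i.
    Roughly 2^(n*H) sequences of length n share these letter statistics,
    so higher H means many more "ways" to realize the same macrostate.
    """
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A sequence using all four letters equally often (illustrative, not real DNA):
uniform_like = "ACGTTGCAAGTCCATG" * 4
# A highly repetitive, low-information sequence:
repetitive = "AAAT" * 16

print(shannon_entropy_per_symbol(uniform_like))  # 2.0 bits (maximum for 4 letters)
print(shannon_entropy_per_symbol(repetitive))    # ≈ 0.81 bits
```

With four letters the maximum is 2 bits per symbol, reached at uniform letter usage; repetitive, predictable sequences score lower, matching the intuition that there are vastly more random-looking strands than structured ones.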

Comments are closed.
