Evolution and the Second Law

Since no one is blogging around here, and I’m still working on my book, I will cheat and just post an excerpt from the manuscript. Not an especially original one, either; in this section I steal shamelessly from the nice paper that Ted Bunn wrote last year about evolution and entropy (inspired by a previous paper by Daniel Styer).

————————————

Without even addressing the question of how “life” should be defined, we can ask what sounds like a subsequent question: does life make thermodynamic sense? The answer, before you get too excited, is “yes.” But the opposite has been claimed – not by any respectable scientists, but by creationists looking to discredit Darwinian natural selection as the correct explanation for the evolution of life on Earth. One of their arguments relies on a misunderstanding of the Second Law, which they read as “entropy always increases,” and then interpret as a universal tendency toward decay and disorder in all natural processes. Whatever life is, it’s pretty clear that life is complicated and orderly – how, then, can it be reconciled with the natural tendency toward disorder?

There is, of course, no contradiction whatsoever. The creationist argument would equally well imply that refrigerators are impossible, so it’s clearly not correct. The Second Law doesn’t say that entropy always increases. It says that entropy always increases (or stays constant) in a closed system, one that doesn’t interact noticeably with the external world. But it’s pretty obvious that life is not like that; living organisms interact very strongly with the external world. They are the quintessential examples of open systems. And that is pretty much that; we can wash our hands of the issue and get on with our lives.

But there’s a more sophisticated version of the argument, which you could imagine being true – although it still isn’t – and it’s illuminating (and fun) to see exactly how it fails. The more sophisticated argument is quantitative: sure, living beings are open systems, so in principle they can decrease entropy somewhere as long as it increases somewhere else. How do you know that the increase in entropy in the outside world is really enough to account for the low entropy of living beings?

As we mentioned way back in Chapter Two, the Earth and its biosphere are systems that are very far away from thermal equilibrium. In equilibrium, the temperature is the same everywhere, whereas when we look up we see a very hot Sun in an otherwise very cold sky. There is plenty of room for entropy to increase, and that’s exactly what’s happening. But it’s instructive to run the numbers.

The energy budget of the Earth, considered as a single system, is pretty simple. We get energy from the Sun, via radiation; we lose the same amount of energy to empty space, also via radiation. (Not exactly the same; processes such as nuclear decays also heat up the Earth and leak energy into space, and the rate at which energy is radiated is not strictly constant. Still, it’s an excellent approximation.) But while the amount is the same, there is a big difference in the quality of the energy we get and the energy we give back. Remember back in the pre-Boltzmann days, entropy was understood as a measurement of the uselessness of a certain amount of energy; low-entropy forms of energy could be put to useful work, such as powering an engine or grinding flour, while high-entropy forms of energy just sat there.
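
These numbers are easy to check. Here is a minimal sketch, with rough standard values plugged in for the solar constant, albedo, and effective temperature (illustrative inputs of my choosing, not figures from the manuscript):

```python
# Back-of-the-envelope check that Earth's energy budget balances:
# absorbed sunlight in ~= thermal infrared radiated back out.
import math

S0      = 1361.0    # solar constant at Earth's orbit, W/m^2
albedo  = 0.30      # fraction of sunlight reflected rather than absorbed
R_earth = 6.371e6   # Earth's radius, m
sigma   = 5.67e-8   # Stefan-Boltzmann constant, W/(m^2 K^4)
T_earth = 255.0     # Earth's effective radiating temperature, K

# Sunlight is intercepted over Earth's cross-section (a disk)...
P_in = S0 * (1 - albedo) * math.pi * R_earth**2

# ...but re-radiated from the whole spherical surface.
P_out = sigma * T_earth**4 * 4 * math.pi * R_earth**2

print(f"absorbed: {P_in:.2e} W")   # ~1.2e17 W
print(f"radiated: {P_out:.2e} W")  # ~1.2e17 W, balanced to this accuracy
```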

[Figure: Sun–Earth entropy flow]

The energy we get from the Sun is of a low-entropy, useful form, while the energy we radiate back out into space has a much higher entropy. The temperature of the Sun is about twenty times the average temperature of the Earth. The temperature of radiation is proportional to the average energy of the photons of which it is made, so the Earth needs to radiate twenty low-energy (long-wavelength, infrared) photons for every one high-energy (short-wavelength, visible) photon it receives. It turns out, after a bit of math, that twenty times as many photons directly translates into twenty times the entropy. The Earth emits the same amount of energy as it receives, but with twenty times higher entropy.
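
To see the factor of twenty numerically, one can lean on two standard blackbody facts: the mean photon energy is about 2.7 kT, and the entropy works out to about 3.6 k per photon regardless of temperature. A quick sketch, with rough illustrative values for the temperatures:

```python
# For a fixed amount of energy, both the photon count and the entropy of
# blackbody radiation scale as 1/T (mean photon energy ~2.70 kT, entropy
# ~3.60 k per photon). So cold outgoing radiation carries more entropy.

k = 1.380649e-23                 # Boltzmann constant, J/K
T_sun, T_earth = 5778.0, 255.0   # K; photosphere and effective Earth temps

print(f"T_sun / T_earth ~ {T_sun / T_earth:.1f}")   # ~22.7, 'about twenty'

E = 1.0  # one joule of radiation, say
n_in  = E / (2.70 * k * T_sun)    # visible photons received
n_out = E / (2.70 * k * T_earth)  # infrared photons radiated back
print(f"photons out per photon in: {n_out / n_in:.1f}")  # the same ~22.7
```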

The hard part is figuring out just what we mean when we say that the life forms here on Earth are “low-entropy.” How exactly do we do the coarse-graining? It is possible to come up with reasonable answers to that question, but it’s complicated. Fortunately, there is a dramatic shortcut we can take. Consider the entire biomass of the Earth – all of the molecules that are found in living organisms of any type. We can easily calculate the maximum entropy that collection of molecules could have, if it were in thermal equilibrium; plugging in the numbers (the biomass is 10^15 kilograms, the temperature of the Earth is 255 Kelvin), we find that its maximum entropy is 10^44 (in units of Boltzmann’s constant). And we can compare that to the absolute minimum entropy it could have – if it were in an exactly unique state, the entropy would be precisely zero.

So the largest conceivable change in entropy that would be required to take a completely disordered collection of molecules the size of our biomass and turn it into absolutely any configuration at all – including the actual ecosystem we currently have – is 10^44. If the evolution of life is consistent with the Second Law, it must be the case that the Earth has generated more entropy over the course of life’s evolution by converting high-energy photons into low-energy ones than it has decreased entropy by creating life. The number 10^44 is certainly an overly generous estimate – we don’t have to generate nearly that much entropy – but if we can generate that much, the Second Law is in good shape.
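
For a feel of where a number of that magnitude comes from, here is one crude estimate of the biomass’s equilibrium entropy. The assumption that the biomass is mostly water (so that liquid water’s standard entropy applies) is mine, purely for illustration; it lands a couple of orders of magnitude below 10^44, which is fine – 10^44 is deliberately generous:

```python
# Rough scale of the biomass's thermal-equilibrium entropy, in units of
# Boltzmann's constant k. Illustrative assumption: water-like composition.

k = 1.380649e-23        # Boltzmann constant, J/K
M_biomass = 1e15        # kg, the figure quoted above
s_water = 69.9 / 0.018  # J/(K kg): standard molar entropy of water / molar mass

S_equilibrium = s_water * M_biomass / k   # dimensionless, in units of k
print(f"S_max ~ {S_equilibrium:.1e} k")   # ~3e41, comfortably below 1e44
```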

How long does it take to generate that much entropy by converting useful solar energy into useless radiated heat? The answer, once again plugging in the temperature of the Sun and so forth, is: about one year. Every year, if we were really efficient, we could take an undifferentiated mass as large as the entire biosphere and arrange it in a configuration with as small an entropy as we can imagine. In reality, life has evolved over billions of years, and the total entropy of the “Sun + Earth (including life) + escaping radiation” system has increased by quite a bit. So the Second Law is perfectly consistent with life as we know it; not that you were ever in doubt.
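
The arithmetic behind that “about one year” goes like this: heat absorbed at solar quality and re-radiated at terrestrial quality produces entropy at a rate P(1/T_Earth – 1/T_Sun). The exact figure depends on the inputs, but with any reasonable choice the time scale comes out at a year or less – trivially short compared to billions of years:

```python
# Entropy production rate from converting sunlight into infrared:
# dS/dt = P * (1/T_earth - 1/T_sun), expressed here in units of k.

k = 1.380649e-23   # Boltzmann constant, J/K
P = 1.2e17         # W, sunlight absorbed by Earth (see the budget sketch above)
T_sun, T_earth = 5778.0, 255.0   # K
year = 3.15e7      # seconds

dS_dt = P * (1/T_earth - 1/T_sun) / k    # ~3e37 k per second
print(f"entropy generated per year: {dS_dt * year:.1e} k")       # ~1e45 k
print(f"time to generate 1e44 k: {1e44 / dS_dt / year:.2f} yr")  # well under a year
```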

35 thoughts on “Evolution and the Second Law”

  1. Ever since I first learned about the Second Law, I’ve always wondered how to go about this quantitative analysis; thanks for the illuminating explanation!

    “It turns out, after a bit of math, that twenty times as many photons directly translates into twenty times the entropy.”

    I think I can handle a bit of math – can anyone point me to a reference (online or otherwise) that contains this derivation?

  2. Am I missing something: doesn’t one need to specify a number for the heat capacity of the 10^15 kg biomass… so many joules per kilogram per kelvin? Or is it assumed to be some specific hydrocarbon with a known Cv?

  3. @karthik

    Just look up the definitions of entropy; the math is very simple. The definitions involving information (i.e., from the information perspective) are a little easier to read than the physical ones.

    For photons, imagine the following: a photon can exist in a fixed number of configurations, or ‘states’. If you have more photons, the state space expands. How it expands depends on how you look at things. For every possible state of the first photon, there is a similar number of possibilities for the next photon, and so on, so the volume of the state space grows exponentially if you look carefully. If you look through a dusty lens, you don’t see that; you just see more photons (as many as you added). Entropy measures the logarithm of this number – the volume of the state space.

    The author said ’20 times’ because entropy is the logarithm of the state-space volume, and logarithms turn exponential growth into linear growth. With 20 particles instead of 1, the state space has 2^20 states instead of 2 (if each particle can be in 2 states) – but the entropy grows only by a factor of 20, which is the coarse-grained scaling physicists quote. Happy Mothers Day! 🙂
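
A tiny sketch makes the logarithm point in the comment above concrete (the two-states-per-photon model is just a toy assumption):

```python
# Entropy is the *logarithm* of the number of microstates, so while the
# state-space volume grows exponentially with particle number, the entropy
# grows only linearly: 20x the photons means 20x the entropy, not 2^20 x.
import math

m = 2    # states per photon (toy assumption)
n = 20   # number of photons

microstates = m ** n               # joint state-space volume: exponential
entropy_1 = math.log(m)            # entropy of one photon, in units of k
entropy_n = math.log(microstates)  # = n * log(m): linear in n

print(microstates)                 # 1048576 = 2^20
print(entropy_n / entropy_1)       # 20.0 -- entropy is additive
```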

  4. This argument about the tension between the second law and evolution is correct in its generality.

    But there is a more interesting comparison to make — by making the distinction between preparing a low entropy state (what Sean was really talking about) and maintaining a low entropy state.

    The 2nd law applies to the former. The latter is quite different, and the calculation of entropy production per UNIT TIME using T_{Sun} and T_{Earth} is really a statement about the latter.

    As Sean notes, life (say at a steady state) is a low entropy state – lower than thermal equilibrium by some amount Delta S. To prepare such a state, of course, some entropy must have been produced elsewhere.

    To maintain a system in a non-equilibrium steady state such as this, it turns out that one needs a constant RATE of entropy production elsewhere. The T_{Sun} and T_{Earth} argument really answers this question of rate.

    Crucially, the RATE of entropy production to maintain a system in a low entropy state is NOT simply directly related to the low entropy of that system — it depends on the kinetic constants between states of that system in a sense. For example, that required RATE of entropy production can be made as high as one wishes by scaling all kinetic constants without affecting the low entropy state one is able to maintain.

    It would be a very interesting and perhaps difficult generalization of Sean’s post here to estimate what RATE of entropy production is needed to maintain life of given entropy on earth, and compare that RATE to the quantity Q (1/T_{Earth} – 1/T_{Sun}).

  5. In case I wasn’t clear in my last post: the relation between the rate of entropy production needed to maintain a certain low entropy state and the low entropy of that state is NOT dictated by the 2nd law.

    In a sense, it can be called a much stronger version of the 2nd law, but this relationship is not really universal as far as I know. It depends on the details of the system, like the overall scale of the kinetic constants, and is usually studied in a limited context such as the Boltzmann master equation (i.e., a Markov process).

    But it would be interesting nonetheless.
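
A toy model makes the rate-versus-state decoupling described in the two comments above concrete. The three-state driven Markov ring below is my illustration, not anything from the comments: scaling all the kinetic constants by a common factor leaves the steady-state distribution (and hence its entropy) untouched, while the entropy production rate needed to maintain it grows in proportion:

```python
# Three-state Markov ring driven clockwise at rate a, counterclockwise at
# rate b. The stationary distribution is uniform for any a, b -- but the
# steady-state entropy production rate scales with the kinetic constants.
import math

def steady_state_and_entropy_rate(a, b):
    p = [1/3, 1/3, 1/3]                # uniform ring => uniform stationary state
    flux = (a - b) / 3                 # net probability current per edge
    sigma = 3 * flux * math.log(a / b) # entropy production, k per unit time
    return p, sigma

for lam in (1.0, 10.0, 100.0):   # scale ALL kinetic constants by lam
    p, sigma = steady_state_and_entropy_rate(2.0 * lam, 1.0 * lam)
    print(p, f"sigma = {sigma:.2f} k/time")
# p never changes (the maintained state is the same); sigma grows with lam.
```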

  6. So if the Earth were emitting the same amount of energy as it receives and thus remaining somewhat inert, how would you describe global warming?

    Mind-boggling. Here’s another little exercise: given the rate at which the earth gets energy from the sun, how many seconds of solar input would be needed to raise the temperature of the seas/atmosphere by the amount that global warming predicts?

    Here’s a hint, Chris: every morning the sun manages to raise the temperature of the earth by several degrees. Even in a global warming scenario, the energy in and out is almost exactly balanced. Or we’d all be either toast or icicles.
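
Running the exercise above with rough numbers supports the near-balance point. The ocean mass and specific heat below are standard ballpark values, and the sketch charitably assumes every absorbed watt goes into the water; in reality only a small imbalance does, which is why warming plays out over decades:

```python
# How long would the FULL solar input take to warm the oceans by one degree?

P_in    = 1.2e17   # W, sunlight absorbed by Earth (see the budget sketch above)
M_ocean = 1.4e21   # kg, approximate mass of the oceans
c_water = 4000.0   # J/(kg K), rough specific heat of seawater
dT      = 1.0      # K

E_needed = M_ocean * c_water * dT           # ~5.6e24 J
t = E_needed / P_in                         # seconds
print(f"{t:.1e} s ~ {t / 3.15e7:.1f} yr")   # ~4.7e7 s, about a year and a half
```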

  7. Gabe Feliciano

    Since the Universe is a closed system, did the Universe begin in high entropy or low entropy?

  8. In an infinite universe, entropy increases and decreases are equivalent:
    http://scientificphilosophy.com/Downloads/SLTOrder.pdf

    A finite universe would have low entropy at the beginning with continuing increases marked by divergence (expansion). As you can see from the link, I prefer the infinite universe model, as it explains the development of ordered systems through convergence as well as their destruction through divergence.
