Suck it DI

PLoS has an intriguing article providing additional reasons why the thermodynamic arguments against evolution are more than silly. It’s called the maximum entropy production (MEP) hypothesis, and John Whitfield describes why life may actually be favored by the second law of thermodynamics.

At first glance, life and the laws of thermodynamics seem to be at loggerheads. Most glaringly, the second law states that over time, any system will tend to the maximum level of entropy, meaning the minimum level of order and useful energy. Open a bottle of perfume in a closed room, and eventually the pool of scent will become a smelly cloud. Organisms do their damnedest to avoid the smelly cloud of equilibrium, otherwise known as death, and a common argument of anti-evolutionists is that the universe’s tendency toward disorder means that natural selection cannot make living things more complex.

But recently, some physicists have gone beyond this and argued that living things belong to a whole class of complex and orderly systems that exist not despite the second law of thermodynamics, but because of it. They argue that our view of evolution, and of life itself, should likewise be based in thermodynamics and what these physical laws say about flows of energy and matter. Darwinian selection, these researchers point out, isn’t the only thing that can create order. Throughout the universe, the interaction of energy and matter brings regular structures–be they stars, crystals, eddies in fluids, or weather systems in atmospheres–into being. Living things are the most complex and orderly systems known; could they be part of the same phenomenon? And could the process that brings them about–natural selection, driven by competition between organisms–be ultimately explicable in thermodynamic terms?

Eric Smith, a theoretical physicist at the Santa Fe Institute in New Mexico, certainly thinks so. “Darwinian competition and selection are not unique processes,” he says. “They’re a complicated version of more fundamental chemical competitive exclusion.” In a paper published last year [2], Smith and his colleagues argued that natural selection is a highly sophisticated version of a physical process called self-organization, the still poorly understood means by which energy plus matter can equal order.

Such orderly, self-organized systems are like engines designed to level out energy gradients–while they persist, they produce more entropy, more quickly, than a disordered mishmash of molecules. Weather systems, for example, transport heat from the tropics toward the poles far more quickly than a homogeneous, static atmosphere would. Life does the same thing, Smith points out. Indeed, he believes that this might have been the reason for its origin–that, under the conditions on early Earth, life was the best way to release the build-up of geothermal energy and an inevitable consequence of that energy [3]. Once biochemistry had got going, subsequent chemical and Darwinian selection would each favor the systems best at dissipating Earth’s pent-up energy, whether geothermal or, following the invention of photosynthesis, solar.

It has long been suggested that self-organized systems do not just level out energy gradients more quickly than disordered ones do, they do it as quickly as possible. Models that assume maximum entropy production (MEP) make good predictions about the climates of Earth [4] and Saturn’s moon Titan [5] and about the growth of crystals in solutions [6]. But until recently, MEP was just an assumption–there was no mechanism or theory to explain why such systems should tend to this state. Classical thermodynamics is no help– it explains entropy only in closed systems, with no energy going in or coming out. It says nothing about how much entropy open, nonequilibrium systems, such as organisms, ought to produce.

But what about information theory? You know, that theory that intelligent design creationists insist proves that evolution makes no sense?

Roderick Dewar, a theoretical physicist and ecosystem modeler working at the French agricultural research agency’s centre in Bordeaux, believes he has crossed this hurdle. Using information theory, a branch of mathematics that can reformulate the laws of thermodynamics (see the Box), Dewar has shown that MEP is the most probable behavior of an open, nonequilibrium system made up of many interacting elements, provided that system is free to “choose” its state and not subject to any strong external forces [7]. The large-scale state of MEP represents the largest proportion of the countless possible arrangements of the system’s microscopic parts, regardless of what those parts are up to.

It’s an interesting idea. I like the interaction of physics and biology, but something about it just doesn’t ring true. The way they start describing entropy reminds me of how quantum physicists describe light as “always making the right choice” about its wave- or particle-like behavior upon measurement. Whitfield seems to suggest that entropy similarly “always makes the right choice” in terms of organizing systems to maximize entropic release. At the very least, it’s an excellent explanation for how, if anything, evolution is more consistent with thermodynamic laws than the ID explanation of “a magic man done it”.
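To make the information-theory angle concrete, here is a toy sketch (my own illustration, not Dewar’s actual derivation): in the Jaynes-style maximum entropy view, if all you know about a system is its average energy, the most probable distribution over its states is the one that maximizes Shannon entropy, and that turns out to be the familiar Boltzmann distribution. Any other distribution with the same mean energy has lower entropy.

```python
import math

def boltzmann(energies, beta):
    """Boltzmann distribution p_i ∝ exp(-beta * E_i), normalized."""
    w = [math.exp(-beta * e) for e in energies]
    z = sum(w)
    return [x / z for x in w]

def mean_energy(p, energies):
    return sum(pi * e for pi, e in zip(p, energies))

def entropy(p):
    """Shannon entropy, skipping zero-probability states."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

# Four energy levels (arbitrary illustration values) and a fixed mean energy.
energies = [0.0, 1.0, 2.0, 3.0]
target_u = 1.0

# Bisection on beta: mean energy decreases monotonically as beta grows,
# so we tighten the bracket until the constraint is met.
lo, hi = -10.0, 10.0
for _ in range(100):
    mid = (lo + hi) / 2
    if mean_energy(boltzmann(energies, mid), energies) > target_u:
        lo = mid
    else:
        hi = mid
p = boltzmann(energies, (lo + hi) / 2)

# Another distribution with the same mean energy (all weight on level E=1):
q = [0.0, 1.0, 0.0, 0.0]

print(entropy(p) > entropy(q))  # prints True
```

The point of the sketch is only that “most probable state” and “maximum entropy” are the same statement once you count microscopic arrangements, which is the bridge Dewar is trying to extend to open, nonequilibrium systems.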


Comments

8 responses to “Suck it DI”

  1. darthWilliam

    I’ve been interested in self-organizing systems since playing “The game of life” on my TRS-80 micro back in 1980. Of course it drives evolution, that always seemed obvious to me. Nice that some real scientists are looking at it now.

  2. Of course it doesn’t support the DI’s nonsense – it hasn’t been Cordova-ized yet. Let’s see, clip the beginning of the first sentence, add a well-placed ellipsis and…

    [L]ife and the laws of thermodynamics seem to be at loggerheads. Most glaringly, the second law states that over time, any system will tend to the maximum level of entropy, meaning the minimum level of order and useful energy. Open a bottle of perfume in a closed room, and eventually the pool of scent will become a smelly cloud. Organisms do their damnedest to avoid the smelly cloud of equilibrium, otherwise known as death, … the universe’s tendency toward disorder means that natural selection cannot make living things more complex

    There we go!

  3. Pete Dunkelberg

    Oops! No data, no evidence no nuttin.

  4. I too am a bit skeptical; I submitted this comment to PLoS:

    It seems to me that this article, and some of the people interviewed, follow in a long tradition of confusing attempts to mix thermodynamics concepts from different fields in with evolutionary biology. I am not sure that much of this loose talk is productive. For example, is a monoculture forest actually thermodynamically different from a natural diverse old growth forest? Is either substantially different from a bacterial vat of equal biomass? If not, then involving traditional thermodynamics as anything more than a vague analogy is probably pointless.

    More evidence of this confusion:

    “Adding life to physical systems certainly increases entropy production. A pond full of plankton or a patch of grass absorbs more of the Sun’s energy, and so produces more entropy, than a sterile pool or bare rock.”

    Really? If light hits a black rock or water, it pretty much all converts to waste heat right away. Isn’t this the ultimate in maximizing entropy?

    “Earth turns sunlight into microwave radiation, closer to equilibrium with the background glow of the Universe, more efficiently than either Mars or Venus.”

    Really? The albedo of the 3 planets is as follows (from wikipedia):

    Venus: 0.65
    Earth: 0.367
    Mars: 0.15

    So the Earth reflects 36.7% of incoming radiation, but Mars reflects only 15%. So on Mars a greater proportion of incoming light energy is not reflected and is instead converted to waste heat and emitted as longwave radiation. On Mars I assume this is mostly via the “heating rocks” method; on Earth much of the light just heats rocks or water, but some of it is used for photosynthesis and the energy goes through a large number of transformations in chemical form in plants, animals, etc. This is obviously important for life, but in what sense is it “more efficient”? And how can you “maximize entropy” more than waste heat does?
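    The arithmetic above can be checked in a couple of lines (a trivial sketch; note that quoted albedo figures can mix Bond and geometric albedos, so treat the numbers as rough):

```python
# Absorbed fraction of incoming sunlight = 1 - albedo.
# Albedo figures as quoted in the comment above (from Wikipedia).
albedo = {"Venus": 0.65, "Earth": 0.367, "Mars": 0.15}

for planet, a in sorted(albedo.items(), key=lambda kv: kv[1]):
    print(f"{planet}: absorbs {1 - a:.1%} of incoming sunlight")
# prints Mars 85.0%, Earth 63.3%, Venus 35.0%
```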

    The article makes some attempts at conveying the complexity of the issues, and getting a variety of opinions, so it does shed some light as well as heat.

    But I think a much more serious attempt at connecting natural selection to physics is given in these articles by Addy Pross, who punches several holes in the whole “explaining life by thermodynamics” tradition and suggests instead that kinetics (Remember kinetics from Chemistry 101? The part of chemistry that tells you what will actually happen, not just what is thermodynamically allowed?) is the key. Since replication is essentially a kinetically driven process, this is huge and important, I think, in this sort of discussion:

    Pross, Addy (2003). “The driving force for life’s emergence. Kinetic and thermodynamic considerations.” J. theor. Biol. 220, 393-406. doi: 10.1006/jtbi.2003.3178

    Pross, Addy (2005). “On the Emergence of Biological Complexity: Life as a Kinetic State of Matter.” Origins of Life and Evolution of Biospheres, 35(2), 151-166. doi: 10.1007/s11084-005-5272-1

  5. Anonymous

    Excellent analysis in the post, the entropy and information descriptions are overgeneralizing.

    Dewar uses Bayesian methods to show that maximum entropy, as describing the most probable state, is reproduced as the extremum under the allowed constraints, even in non-equilibrium systems.

    Perhaps not so impressive, I am not sufficiently studied in thermodynamics to judge however. In any case Dewar also shows that one can expect maximum dissipation at maximum entropy, which is interesting albeit I believe already known.

    And he is aware, more than Whitfield makes clear perhaps, that one can have (and miss) constraints such as his expectation fails. (But he notes that this is at least informative on controlling constraints.)

    This is interesting to me also since it touches work on environmental principles in multiverse cosmology. For example, Bousso finds that the probability for observers is maximized where the total entropy production is maximized ( http://arxiv.org/abs/hep-th/0702115 ).

    And not to worry, Nick, it isn’t constrained by the kinetics but follows from simply finding that dust is the best dissipator into waste heat. And aggregated dust seems to be a great thing to live on and make life out of. 🙂

    So perhaps Whitfield et al are vaguely on to something, but they may be looking at the wrong scale. At the scale where we live, it is, as noted, kinetics that rules chemistry, dynamics, and life as we know it.

  6. Pete Dunkelberg

    In any case Dewar also shows that one can expect maximum dissipation at maximum entropy, which is interesting albeit I believe already known.

    Meaning that max entropy corresponds to max dissipation? This would not be surprising since entropy is the extent to which energy is spread out spatially, or the extent to which the internal energy of a chunk or volume of matter is spread over diverse energy levels, (or, as a tendency, the tendency for the above to increase).
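    A toy sketch of that spreading intuition (my own illustration, not from Dewar’s papers): start with energy concentrated in one cell of a 1-D box, apply a simple diffusion rule, and the Shannon entropy of the distribution never decreases as the energy spreads out.

```python
import math

def entropy(p):
    """Shannon entropy of a distribution, skipping empty cells."""
    return -sum(x * math.log(x) for x in p if x > 0)

# All the energy (or perfume) starts in one cell of a 1-D box.
n = 21
p = [0.0] * n
p[n // 2] = 1.0

history = [entropy(p)]
for _ in range(50):
    # Diffusion step: each cell keeps half its content and sends a
    # quarter to each neighbour (reflecting walls at the ends).
    new = [0.0] * n
    for i, x in enumerate(p):
        new[i] += x / 2
        new[i - 1 if i > 0 else i] += x / 4
        new[i + 1 if i < n - 1 else i] += x / 4
    p = new
    history.append(entropy(p))

# Entropy increases monotonically as the distribution spreads.
print(all(b >= a - 1e-12 for a, b in zip(history, history[1:])))  # prints True
```

    The monotone increase here is just the discrete version of “entropy is the extent to which energy is spread out spatially”: the diffusion rule is doubly stochastic, so each step can only flatten the distribution.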

  7. Torbjörn Larsson, OM

    Pete, thanks for those physical intuitions.

    I haven’t really thought about spatial spread outside simple diffusion processes (literally thinking inside the box 😮 ), but it seems obvious now.

    Either entropic process, in space or in states, must move energy through the dissipative mechanisms in a maximal way. Nice!

  8. Pete Dunkelberg

    Torbjörn, I must say that the idea is not original with me. Check The Entropy Site.

    I must also say that interrupting the spread of energy with photosynthesis, not to mention massive burial of reduced carbon, is not the way to increase entropy quickly.
