PLoS has an intriguing article providing additional reasons why the thermodynamic arguments against evolution are more than silly. The idea is called the maximum entropy production (MEP) hypothesis, and John Whitfield describes why life may actually be favored by the second law of thermodynamics.
At first glance, life and the laws of thermodynamics seem to be at loggerheads. Most glaringly, the second law states that over time, any system will tend to the maximum level of entropy, meaning the minimum level of order and useful energy. Open a bottle of perfume in a closed room, and eventually the pool of scent will become a smelly cloud. Organisms do their damnedest to avoid the smelly cloud of equilibrium, otherwise known as death, and a common argument of anti-evolutionists is that the universe’s tendency toward disorder means that natural selection cannot make living things more complex.
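The perfume-bottle picture is easy to simulate. Here's a toy sketch of my own (not from Whitfield's article): particles random-walking through a one-dimensional "room" spread out over time, and the Shannon entropy of their spatial distribution climbs toward its maximum, the "smelly cloud" of equilibrium. All the bin counts and step numbers are illustrative.

```python
import math
import random

random.seed(1)

N_PARTICLES = 2000
N_BINS = 20      # divide the room into 20 cells
STEPS = 500

def entropy(positions):
    """Shannon entropy of the particle distribution over the bins (nats)."""
    counts = [0] * N_BINS
    for p in positions:
        counts[p] += 1
    h = 0.0
    for c in counts:
        if c:
            frac = c / N_PARTICLES
            h -= frac * math.log(frac)
    return h

# All particles start in bin 0: the unopened perfume bottle.
positions = [0] * N_PARTICLES
h_start = entropy(positions)   # 0.0 -- perfectly ordered

# Let each particle take random steps, bouncing off the walls.
for _ in range(STEPS):
    for i in range(N_PARTICLES):
        positions[i] = min(N_BINS - 1, max(0, positions[i] + random.choice((-1, 1))))

h_end = entropy(positions)     # approaches log(N_BINS), the maximum
print(h_start, h_end, math.log(N_BINS))
```

No "anti-entropy" force is needed to reverse this in an open system, which is exactly the point the anti-evolution argument misses.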
But recently, some physicists have gone beyond this and argued that living things belong to a whole class of complex and orderly systems that exist not despite the second law of thermodynamics, but because of it. They argue that our view of evolution, and of life itself, should likewise be based in thermodynamics and what these physical laws say about flows of energy and matter. Darwinian selection, these researchers point out, isn’t the only thing that can create order. Throughout the universe, the interaction of energy and matter brings regular structures–be they stars, crystals, eddies in fluids, or weather systems in atmospheres–into being. Living things are the most complex and orderly systems known; could they be part of the same phenomenon? And could the process that brings them about–natural selection, driven by competition between organisms–be ultimately explicable in thermodynamic terms?
Eric Smith, a theoretical physicist at the Santa Fe Institute in New Mexico, certainly thinks so. “Darwinian competition and selection are not unique processes,” he says. “They’re a complicated version of more fundamental chemical competitive exclusion.” In a paper published last year [2], Smith and his colleagues argued that natural selection is a highly sophisticated version of a physical process called self-organization, the still poorly understood means by which energy plus matter can equal order.
Such orderly, self-organized systems are like engines designed to level out energy gradients–while they persist, they produce more entropy, more quickly, than a disordered mishmash of molecules. Weather systems, for example, transport heat from the tropics toward the poles far more quickly than a homogeneous, static atmosphere would. Life does the same thing, Smith points out. Indeed, he believes that this might have been the reason for its origin–that, under the conditions on early Earth, life was the best way to release the build-up of geothermal energy and an inevitable consequence of that energy [3]. Once biochemistry had got going, subsequent chemical and Darwinian selection would each favor the systems best at dissipating Earth’s pent-up energy, whether geothermal or, following the invention of photosynthesis, solar.
It has long been suggested that self-organized systems do not just level out energy gradients more quickly than disordered ones do, they do it as quickly as possible. Models that assume maximum entropy production (MEP) make good predictions about the climates of Earth [4] and Saturn’s moon Titan [5] and about the growth of crystals in solutions [6]. But until recently, MEP was just an assumption–there was no mechanism or theory to explain why such systems should tend to this state. Classical thermodynamics is no help–it explains entropy only in closed systems, with no energy going in or coming out. It says nothing about how much entropy open, nonequilibrium systems, such as organisms, ought to produce.
But what about information theory? You know, that theory that intelligent design creationists insist proves that evolution makes no sense?
Roderick Dewar, a theoretical physicist and ecosystem modeler working at the French agricultural research agency’s centre in Bordeaux, believes he has crossed this hurdle. Using information theory, a branch of mathematics that can reformulate the laws of thermodynamics (see the Box), Dewar has shown that MEP is the most probable behavior of an open, nonequilibrium system made up of many interacting elements, provided that system is free to “choose” its state and not subject to any strong external forces [7]. The large-scale state of MEP represents the largest proportion of the countless possible arrangements of the system’s microscopic parts, regardless of what those parts are up to.
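A toy version of that counting argument (my illustration, not Dewar's derivation): among all 2^n microstates of n coin flips, the macrostate "half heads" contains the most arrangements, so it is the most probable large-scale outcome even though every individual microstate is equally likely. MEP, on Dewar's view, is the thermodynamic analogue of "half heads."

```python
from math import comb

n = 20
# Number of microstates (flip sequences) belonging to each
# macrostate (total count of heads).
counts = {k: comb(n, k) for k in range(n + 1)}

most_probable = max(counts, key=counts.get)
print(most_probable)   # 10 -- the evenly mixed macrostate dominates
```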
It’s an interesting idea. I like the interaction of physics and biology, but something about it just doesn’t ring true. The way they describe entropy reminds me of how quantum physicists describe light as “always making the right choice” between its wave-like and particle-like behavior upon measurement. Whitfield seems to suggest that entropy similarly “always makes the right choice” in organizing systems to maximize entropic release. At the very least, it’s an excellent explanation of how, if anything, evolution is more consistent with the laws of thermodynamics than the ID explanation of “a magic man done it”.