Entropy law linked to intelligence, say researchers
By Jason Palmer
A modification to one of the most fundamental laws of physics may provide a link to the rise of intelligence, cooperation - even upright walking.
The idea of entropy describes the way in which the Universe heads inexorably toward a higher state of disorder.
A mathematical model published in Physical Review Letters proposes that physical systems maximise entropy not only in the present but over their possible futures.
Simple simulations based on the idea reproduce a variety of real-world cases that reflect intelligent behaviour.
The idea of entropy is fundamentally an intuitive one - that the Universe tends in general to a more disordered state.
The classic example is a dropped cup: it will smash into pieces, but those pieces will never spontaneously recombine back into a cup. Analogously, a hot cup of coffee will always cool down if left - it will never draw warmth from a room to heat back up.
But the idea of "causal entropy" goes further, suggesting that a given physical system not only maximises the entropy within its current conditions, but that it reaches a state that will allow it more entropy - in a real sense, more options - in the future.
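This notion of keeping future options open can be illustrated with a toy simulation (a hypothetical sketch for this article, not the researchers' actual model): an agent in a bounded corridor greedily moves toward whichever neighbouring position leaves the greatest entropy of future random-walk outcomes. It drifts away from the walls, where futures are pinned, toward the open middle. All parameters here are invented for illustration.

```python
import random
from collections import Counter
from math import log

def future_entropy(pos, steps=8, samples=2000, lo=0, hi=20):
    """Estimate the entropy (in nats) of where short random walks
    starting from `pos` end up, with walks clamped to [lo, hi]."""
    ends = Counter()
    for _ in range(samples):
        p = pos
        for _ in range(steps):
            p = min(hi, max(lo, p + random.choice((-1, 1))))
        ends[p] += 1
    return -sum(c / samples * log(c / samples) for c in ends.values())

def causal_entropy_walk(start, moves=30):
    """Greedy agent: at each step, move to whichever neighbouring
    position leaves the most entropy of future outcomes."""
    pos = start
    for _ in range(moves):
        pos = max((max(0, pos - 1), min(20, pos + 1)), key=future_entropy)
    return pos

# Starting hard against a wall, the agent drifts toward the middle of
# the corridor, where more distinct futures remain reachable.
print(causal_entropy_walk(1))
```

The agent is never told to seek the centre; preferring states with more future options is enough to push it there.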
Alex Wissner-Gross of Harvard University and the Massachusetts Institute of Technology in the US, and Cameron Freer of the University of Hawaii at Manoa, have now put together a mathematical model that ties this causal entropy idea - evident in a range of recent studies - into a single framework.
"In the past 10 to 15 years, there have been many hints from a variety of different disciplines that there was a deep link between entropy production and intelligence," Dr Wissner-Gross told BBC News.
The laws of thermodynamics
First law: Energy can be neither created nor destroyed (although, thanks to Einstein's most famous equation E=mc², energy can come from or be turned into mass)
Second law: The entropy of an isolated system never decreases. But put some energy in and order can be achieved - in a crystal of salt, in humans, even in galaxies
Third law: As the temperature of an ordered system - like that salt crystal - approaches "absolute zero", entropy approaches its lowest level. But never zero, because everything, everywhere is at least a little bit disordered
"This paper is really the first result that clarifies what that link precisely is... to the point that it's prescriptive - it actually allows you to calculate in a sensible way answers to questions that couldn't reasonably be answered before."
The simple model considers a number of examples, such as a pendulum hanging from a moving cart. Simulations based on the causal entropy idea show that the pendulum ends up pointing upward - an unstable situation, but one from which it can explore a wider variety of positions.
The researchers liken this to the development of upright walking.
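Why the unstable upright state counts as having "more options" can be sketched with a toy calculation (again a hypothetical stand-in for causal entropy, not the paper's actual simulation; the pendulum parameters are invented): count how many distinct future states a pendulum can reach under small random torques, starting near the stable hanging position versus near the unstable upright one.

```python
import math
import random

def reachable_states(theta0, horizon=200, samples=300, dt=0.02):
    """Count the distinct (angle, velocity) bins a pendulum can reach
    from rest at angle `theta0` under small random torques - a rough
    proxy for the entropy of its future options."""
    ends = set()
    for _ in range(samples):
        th, om = theta0, 0.0
        for _ in range(horizon):
            u = random.uniform(-0.5, 0.5)        # small random torque
            om += (-math.sin(th) + u) * dt       # simple pendulum, g/L = 1
            th += om * dt
        ends.add((round(th, 1), round(om, 1)))
    return len(ends)

hanging = reachable_states(0.05)             # near the stable hanging state
upright = reachable_states(math.pi - 0.05)   # near the unstable upright state
print(hanging, upright)   # the upright state opens up far more futures
```

Because the upright position is unstable, tiny perturbations are amplified into wildly different trajectories, so far more future states are reachable from it - which is exactly what a causal-entropy-maximising system would favour.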
Further simulations showed how the same idea could drive the development of tool use, social network formation and cooperation, and even the maximisation of profit in a simple financial market.
"While there were hints from a variety of other fields such as cosmology, it was so enormously surprising to see that one could take these principles, apply them to simple systems, and effectively for free have such behaviours pop out," Dr Wissner-Gross said.
Raphael Bousso of the University of California Berkeley said: "It has always mystified me how well this principle models intelligent observers, and it would be wonderful if Alex's work could shed some light on this."
Entropy-maximised simulations of trade routes saw ships spontaneously discover the Panama Canal
Prof Bousso showed in a 2007 paper in Physical Review D that models of the Universe that incorporated causal entropy were more likely to come up with a Universe that contains intelligent observers - that is, us.
However, he cautions that although the new paper bolsters the case for causal entropy, the idea still lacks explanatory power.
"The paper argues that intelligent behaviour, which is hard to quantify, can be reduced to maximising one's options, which is relatively easy to quantify. But it cannot explain intelligent behaviour from first principles," he told BBC News.
"It cannot explain how that 'intelligent agent' evolved in the first place, and why it seeks to maximise future options."
Axel Kleidon of the Max Planck Institute for Biogeochemistry in Germany authored a 2010 paper in Physics of Life Reviews that used maximised entropy to consider the machinery of life on Earth. He said the work "shows some very intriguing examples", but that only time would tell whether causal entropy was as fundamental as it might seem.
"It seems that it is beyond just luck and coincidence," he told BBC News.
"On the other hand, I know from my own research that applying thermodynamics to real-world systems is anything but simple and straightforward... I think it is through more examples that (we will see) how practical their approach will be, compared to other thermodynamic approaches."