The idea of the clockwork universe eventually turned into the idea of the self-organizing universe. The idea of a universe ruled by certain "laws of nature" turned into the idea that at the most fundamental level there is pure randomness and the so-called "laws of nature" are nothing but average behaviors. The idea of strict determinism turned into the idea that in principle everything is possible. The idea of eternal laws cast in stone turned into the idea of an ever-changing universe that invents its own laws and in which, as it evolves and becomes more complex, new laws emerge.

The key figures of this great paradigm shift of materialism were Ludwig Boltzmann and Willard Gibbs, the inventors of statistical mechanics; Charles Darwin, the creator of the theory of evolution; Richard Feynman, who refashioned quantum mechanics in a Boltzmann-like style; and Ilya Prigogine, who championed the ideas of self-organization and non-equilibrium thermodynamics.

**Boltzmann and Gibbs**

Boltzmann and Gibbs concerned themselves with the mechanical foundations of thermodynamics. They wanted to show that phenomena such as heat or friction can be understood from a mechanical point of view, thinking only about the motion of molecules.

At first sight, their task seemed hopeless: the laws of mechanics are entirely reversible, while the laws of thermodynamics are irreversible (temperatures tend to equalize, and that's it; there is no going back). How could one start from reversible laws and obtain irreversibility without adding any irreversible ingredient? Many argued at the time that this is simply absurd.

However, there is a particular ingredient that isn't irreversible in itself but which can nevertheless generate irreversibility: randomness. If something is changed at random repeatedly, one random change may undo a previous one, so randomness in itself isn't irreversible. Nonetheless, such an undoing is unlikely, and the more steps you take, the more unlikely it is that you will end up back at the starting point.

This idea was later given a mathematical treatment by Einstein. He solved the so-called drunkard's walk problem: imagine a drunken man trying to move away from a lamppost. Because he is drunk, every step he takes is in a random direction. How far from the lamppost will he be after N steps? The answer is not zero: on average, his distance grows in proportion to the square root of N.
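The square-root growth of the drunkard's distance is easy to check numerically. Here is a minimal simulation sketch of a one-dimensional walk (the function name and parameters are my own, chosen for illustration):

```python
import math
import random

def average_distance(n_steps, n_trials=2000, seed=0):
    """Mean distance from the lamppost after n_steps random unit steps,
    averaged over n_trials independent walks."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_trials):
        # each step is +1 or -1 with equal probability
        position = sum(rng.choice((-1, 1)) for _ in range(n_steps))
        total += abs(position)
    return total / n_trials

# The drunk does drift away: quadrupling the number of steps roughly
# doubles the mean distance, i.e. distance grows like sqrt(N).
for n in (100, 400, 1600):
    print(n, round(average_distance(n), 1), round(math.sqrt(n), 1))
```

Randomness alone, with no irreversible rule anywhere, produces a systematic drift away from the starting point.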

The statistical theory of Boltzmann and Gibbs had one final subtlety which is the key point of the present story. In order to reduce thermodynamics to mechanics one has to make assumptions about how the molecules move. Thus, one imagines something about the fundamental constituents of matter and then uses the statistical theory together with these assumptions to make predictions about how things should look at grand scales. Then one can check how well the predictions fit the observed facts.

Why is this important? Because one can use the statistical theory to check whether one's assumptions about the fundamental constituents are correct. Prior to Boltzmann and Gibbs no such mechanism existed. The earlier perspective, developed in philosophy largely by Descartes and Kant, was that one should know a priori what the fundamental constituents are, deriving a sort of absolute certainty from somewhere. They were simply postulated.

**Darwin**

The main importance of Darwin is that he showed that one can really understand how very complex structures gradually evolve from very simple ones. One does not need to simply postulate a great bulk of highly complex things from the start; things naturally evolve from simple to complex.

Darwin's theory says that life evolves via a trial-and-error process. There are small changes (variability); some of them are useful while others are dysfunctional. The useful ones get preserved and slowly accumulate, while the dysfunctional ones don't. This is how the process of natural selection generates complexity.
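This variation-plus-selection loop can be caricatured in a few lines of code, in the style of Dawkins's well-known "weasel" demonstration. It is a deliberately crude toy, not Darwin's actual mechanism; the target string, mutation rate, and function names are illustrative assumptions:

```python
import random

def evolve(target="METHINKS IT IS LIKE A WEASEL", mutation_rate=0.05, seed=0):
    """Cumulative selection: random variation plus preservation of
    non-regressive variants assembles the target step by step.
    Returns the number of generations taken."""
    rng = random.Random(seed)
    alphabet = "ABCDEFGHIJKLMNOPQRSTUVWXYZ "

    def fitness(s):
        return sum(a == b for a, b in zip(s, target))

    current = "".join(rng.choice(alphabet) for _ in target)
    generations = 0
    while current != target:
        # variation: copy the string with occasional random errors
        mutant = "".join(
            rng.choice(alphabet) if rng.random() < mutation_rate else c
            for c in current
        )
        # selection: keep the variant only if it is no worse
        if fitness(mutant) >= fitness(current):
            current = mutant
        generations += 1
    return generations

# Because improvements accumulate, the target is reached in vastly fewer
# generations than the ~27**28 attempts blind guessing would need.
print(evolve())
```

The crucial ingredient is not the random variation but the preservation step: without it, every generation would start from scratch.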

**Feynman**

When quantum mechanics was developed by Schrödinger, Heisenberg and Dirac, it had a structure similar to Newtonian mechanics rather than to statistical mechanics - it had laws. That first formulation claimed that the state of a particle is described by a certain function, and that there is a law (Schrödinger's equation) saying how this function evolves in time.

Feynman replaced this view with a more Boltzmannian one. He simply said: let us take every possibility into consideration and make a sort of average. Just as Boltzmann and Gibbs showed that the laws of thermodynamics are average laws grounded in mechanics, Feynman showed that the laws of mechanics themselves are average laws that aren't grounded in anything further. Everything that happens in any experiment is the result of an average over all imaginable possibilities. The point is to discover how to define the possibilities and how to take the average.
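A drastically simplified sketch of "averaging over the possibilities": in a two-slit experiment, keep only the two paths through the slits, add their phase factors, and square the total amplitude. The geometry and wavelength below are assumed values, and the real path integral sums over infinitely many paths, not two:

```python
import cmath
import math

def two_slit_intensity(x, slit_sep=1e-4, screen_dist=1.0, wavelength=5e-7):
    """Add the phase factors of the two available paths to a screen point
    at height x, then square the total amplitude: a 'sum over
    possibilities' reduced to just two terms."""
    amplitude = 0
    for slit_y in (slit_sep / 2, -slit_sep / 2):
        path = math.hypot(screen_dist, x - slit_y)  # slit -> screen point
        amplitude += cmath.exp(2j * math.pi * path / wavelength)
    return abs(amplitude) ** 2

# Where the two path lengths agree, the possibilities reinforce each other;
# half a wavelength of path difference later, they cancel.
print(two_slit_intensity(0.0))       # bright fringe (near 4)
print(two_slit_intensity(2.5e-3))    # dark fringe (near 0)
```

The interference fringes are not produced by any force; they are what the average over the possibilities looks like.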

The question is no longer "What are the constraints nature imposes?", the question now is how to make the idea that everything is possible precise and rigorous. What is "everything", and what is "possible"?

As Feynman wrote: "Throughout these lectures I have delighted in showing you that the price of gaining such an accurate theory [quantum electrodynamics] has been the erosion of our common sense. We must accept some very bizarre behavior: the amplification and suppression of probabilities, light reflecting from all parts of a mirror, light travelling in paths other than a straight line, photons going faster or slower than the conventional speed of light, electrons going backwards in time, photons suddenly disintegrating into a positron-electron pair, and so on. That we must do, in order to appreciate what Nature is really doing underneath nearly all the phenomena we see in the world."

**Prigogine**

Classical statistical mechanics, developed into a rigorous theory by Gibbs, deals only with equilibrium situations. If the external conditions are held fixed, the system will eventually reach equilibrium with its environment, and Gibbs tells us what the system will look like once equilibrium has been reached. However, his theory does not tell us what the system does on its path towards equilibrium, or what happens if the external conditions are constantly changing. These kinds of problems are not entirely solved even today. (It is worth mentioning that the idea of the appearance of "order out of chaos" was pioneered by Schrödinger in his book *What is life?*.)

Prigogine was one of the most important scientists advocating the idea of self-organization. His basic idea is that things organize because they are not in equilibrium with their environment, and their organization is a means by which this equilibrium is reached faster.

The image shows a classic example of such a self-organizing event. The liquid in the right image gets organized because the organized motion lets heat flow from bottom to top faster. Thus, local organization is a means to faster global disorganization.
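The mechanism - organized circulation moving heat faster than conduction alone - can be sketched in a toy bookkeeping model. This is not a fluid simulation; the two "columns" of parcels, the update rule, and every parameter below are my own arbitrary choices:

```python
def heat_flux(convect, n=20, k=0.2, steps=5000):
    """Average heat flux through a layer held hot (T=1) below and cold
    (T=0) above; `convect` switches on an organized circulation roll."""
    up = [0.5] * n      # column of parcels that will rise
    down = [0.5] * n    # column of parcels that will sink
    flux = 0.0
    for _ in range(steps):
        # plain conduction inside each column, walls included via padding
        for col in (up, down):
            padded = [1.0] + col + [0.0]
            for i in range(n):
                col[i] += k * (padded[i] - 2 * padded[i + 1] + padded[i + 2])
        if convect:
            # the circulation roll: the bottom parcel enters the rising
            # column, the top parcel enters the sinking one
            bottom, top = down[0], up[-1]
            up[:] = [bottom] + up[:-1]
            down[:] = down[1:] + [top]
        flux += k * (up[-1] + down[-1])   # heat handed to the cold top wall
    return flux / steps

# The organized roll delivers heat to the top far faster than conduction
# alone, which is exactly why the organized state arises.
print(heat_flux(convect=True), heat_flux(convect=False))
```

Even in this caricature, switching the circulation on multiplies the heat flux several times over: local order pays for itself by speeding up global equalization.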

This idea might not be the entire story of self-organization. As I have written on **another occasion**: "Things in general don't tend towards either complexity or simplicity; they tend towards more and more diversity and disorder. One can get disorder by decomposing existing ordered structures, thus obtaining a large diversity of simpler structures. Equally, one can get disorder by combining simple structures into highly complex ones which exhibit much more chaotic and diverse behavior than their components.

For a long time, physicists specialized in thermodynamics focused primarily on the static aspect - equilibrium thermodynamics - and thus saw only the decomposition effect. In recent decades, however, they have taken an increasing interest in the dynamic aspect - non-equilibrium thermodynamics - and have begun observing the complexity-creating effect (or information growth) in more and more detail."

And why does the world tend towards more and more diversity and disorder? Precisely because, at its most fundamental level, it is entirely random. This does not mean there are no regularities whatsoever in nature; it means that all the regularities are nothing but average behaviors.

**Conclusion**

There's no conclusion yet. The science of self-organization is not yet fully developed. However, the materialist worldview has witnessed a radical change. It is no longer dominated by the idea of the deterministic machine and it is no longer incompatible with humanistic values such as creativity, novelty and freedom.

*Picture credits: Feynman picture (Caltech)*