11 WHAT EMERGES FROM A PARTICULARITY



Why do the tall pine

and the pale poplar

intertwine their branches

to provide such sweet shade for us?

Why does the fleeting water

invent bright spirals

in the turbulent stream? (II, 9)


IT IS ENTROPY, NOT ENERGY, THAT DRIVES THE WORLD


At school I was told that it is energy that makes the world go round. We need to get energy—for example, from petroleum, from the sun, or from nuclear sources. Energy makes our engines run, helps plants to grow, and causes us to wake up every morning full of vitality.

But there is something that does not add up. Energy—as I was also told at school—is conserved. It is neither created nor destroyed. If it is conserved, why do we have to constantly resupply it? Why can’t we just keep using the same energy?

The truth is that there is plenty of energy and it is not consumed. It’s not energy that the world needs in order to keep going. What it needs is low entropy.

Energy (be it mechanical, chemical, electrical, or potential) transforms itself into thermal energy, that is to say, into heat: it goes into cold things, and there is no free way of getting it back from there to reuse it to make a plant grow, or to power a motor. In this process, the energy remains the same but the entropy increases, and it is this which cannot be turned back. The second law of thermodynamics demands it.

What makes the world go round are not sources of energy but sources of low entropy. Without low entropy, energy would dilute into uniform heat and the world would go to sleep in a state of thermal equilibrium—there would no longer be any distinction between past and future, and nothing would happen.

Near to the Earth we have a rich source of low entropy: the sun. The sun sends us hot photons. Then the Earth radiates heat toward the black sky, emitting colder photons. The energy that enters is more or less equal to the energy that exits; consequently, we do not generally gain energy in the exchange. (When we do gain energy in the exchange, the result is disastrous for us: global warming.) But for every hot photon that arrives, the Earth emits ten cold ones, since a hot photon from the sun has the same energy as ten cold photons emitted by the Earth. The hot photon has less entropy than the ten cold photons, because the number of configurations of a single (hot) photon is lower than the number of configurations of ten (cold) photons. The sun is therefore a continual, rich source of low entropy for us. We have at our disposal an abundance of low entropy, and it is this that allows plants and animals to grow, enables us to build motors and cities—and to think and to write books such as this one.
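The arithmetic behind this photon bookkeeping can be sketched in a few lines. The temperatures below are standard order-of-magnitude values assumed for illustration, not figures given in the text, and "ten" is to be read as an order of magnitude:

```latex
% Energy per thermal photon is of order k_B T.
% Assumed: sun's surface T_sun ~ 6000 K; Earth T_earth ~ 300 K.
E_{\text{hot}} \sim k_B\, T_{\text{sun}}, \qquad
E_{\text{cold}} \sim k_B\, T_{\text{earth}}

% Energy balance: N cold photons leave for each hot photon that arrives.
N \sim \frac{T_{\text{sun}}}{T_{\text{earth}}} \sim 10\text{--}20

% Entropy is of order k_B per photon, so the entropy radiated away
% exceeds the entropy received by roughly the same factor:
S_{\text{out}} \sim N \, S_{\text{in}} \gg S_{\text{in}}
```

The same energy leaves as arrived, but carried by many more photons; the difference between the entropy radiated away and the entropy received is the budget of low entropy that the Earth gets to spend.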

Where does the low entropy of the sun come from? From the fact that the sun, in turn, was born out of a configuration with even lower entropy: the primordial cloud from which the solar system formed. And so on, back into the past, until we reach the extremely low initial entropy of the universe.

It is the growth of this entropy that powers the great story of the cosmos.

But the increase in the entropy of the universe is not rapid, like the sudden expansion of gas in a box: it is gradual, it takes time. Even with a gigantic ladle it takes time to stir something as big as the universe. Above all, there are obstacles and closed doors to its growth—passages where it occurs only with great difficulty.

A pile of wood, for example, lasts a long time if left alone. It is not in a state of maximum entropy, because the elements of which it is made, such as carbon and hydrogen, are combined in a very particular manner (“ordered”) to give form to the wood. Entropy grows if these particular combinations are broken down. This is what happens when wood burns: its elements disengage from the particular structures that form wood and entropy increases sharply (fire being, in fact, a markedly irreversible process). But the wood does not start to burn on its own. It remains for a long time in a state of low entropy, until something opens a door that allows it to pass to a state of higher entropy. A pile of wood is in an unstable state, like a pack of cards: it does not collapse until something comes along to make it do so. This something might, for instance, be a match to light a flame. The flame is a process that opens a channel through which the wood can pass into a state of higher entropy.

There are situations that impede and hence slow down the increase of entropy throughout the universe. In the past, for instance, the universe was basically an immense expanse of hydrogen. Hydrogen can fuse into helium, and helium has a higher entropy than hydrogen. But for this to happen, it is necessary for a channel to be opened: a star must ignite for hydrogen to begin to burn there into helium. What causes stars to ignite? Another process that increases entropy: the contraction due to gravity of one of the large clouds of hydrogen that sail throughout the galaxy. A contracted cloud of hydrogen has higher entropy than a dispersed one.97 But the clouds of hydrogen are so vast that they take millions of years to contract. Only after they have become concentrated do they manage to heat up to the point that triggers nuclear fusion. The ignition of fusion opens the door that allows the further increase in entropy: hydrogen burning into helium.

The entire history of the universe consists of this halting and leaping cosmic growth of entropy. It is neither rapid nor uniform, because things remain trapped in basins of low entropy (the pile of wood, the cloud of hydrogen . . . ) until something opens a door onto a process that finally allows entropy to increase. The growth of entropy itself can open new doors through which entropy increases further. A dam in the mountains, for instance, retains water until the dam is gradually worn down over time and the freed water escapes downhill once again, causing entropy to grow. Over the course of this irregular trajectory, large or small portions of the universe remain isolated in relatively stable situations for periods that can be very prolonged.

Living beings are made up of similarly intertwined processes. Photosynthesis deposits low entropy from the sun into plants. Animals feed on low entropy by eating. (If all we needed was energy rather than low entropy, we would head for the heat of the Sahara rather than toward our next meal.) Inside every living cell, the complex web of chemical processes is a structure that opens and closes gates through which entropy can increase. Molecules function as the catalysts that allow the processes to intertwine; or, conversely, they put a brake on them. The increase of entropy in each individual process is what makes the whole thing work. Life is this network of processes for increasing entropy—processes that act as catalysts to each other.98 It isn’t true, as is sometimes stated, that life generates structures that are particularly ordered, or that locally diminish entropy: it is simply a process that degrades and consumes the low entropy of food; it is a self-structured disordering, no more and no less than in the rest of the universe.

Even the most banal phenomena are governed by the second law of thermodynamics. A stone falls to the ground. Why? One often reads that it’s because the stone places itself “in a state of lower energy” that it ends up lower down. But why does the stone put itself into a state of lower energy? Why should it lose energy if energy is conserved? The answer is that when the stone hits the Earth, it warms it: its mechanical energy is transformed into heat. And there is no way back from there. If the second law of thermodynamics did not exist, if heat did not exist, if there existed no microscopic swarming, the stone would rebound perpetually; it would never land and be still.
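A back-of-the-envelope version of the stone's fall, with hypothetical numbers chosen for illustration: a stone of mass m falling from height h delivers its mechanical energy mgh to the ground as heat Q, and the entropy of the ground at temperature T grows:

```latex
Q = mgh, \qquad
\Delta S = \frac{Q}{T} = \frac{mgh}{T} > 0

% Example (assumed values): m = 1\,\mathrm{kg},\ h = 1\,\mathrm{m},\ T = 300\,\mathrm{K}
% \Delta S \approx \frac{9.8\,\mathrm{J}}{300\,\mathrm{K}} \approx 0.03\,\mathrm{J/K}
```

For the stone to bounce back up, this entropy would have to decrease again, and the second law forbids it. That is why the stone stays down.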

It is entropy, not energy, that keeps stones on the ground and the world turning.

The entire coming into being of the cosmos is a gradual process of disordering, like the pack of cards that begins in order and then becomes disordered through shuffling. There are no immense hands that shuffle the universe. It does this mixing by itself, in the interactions among its parts that open and close during the course of the mixing, step by step. Vast regions stay trapped in ordered configurations, until here and there new channels are opened through which disorder spreads.99

What causes events to happen in the world, what writes its history, is the irresistible mixing of all things, going from the few ordered configurations to the countless disordered ones. The entire universe is like a mountain that collapses in slow motion. Like a structure that very gradually crumbles.

From the most minute events to the more complex ones, it is this dance of ever-increasing entropy, nourished by the initial low entropy of the universe, that is the real dance of Shiva, the destroyer.


TRACES AND CAUSES

The fact that entropy has been low in the past leads to a phenomenon that is ubiquitous and crucial to the difference between past and future: the past leaves traces of itself in the present.

Traces are everywhere. The craters of the moon testify to impacts in the past. Fossils show the forms of living creatures from long ago. Telescopes show us far-off galaxies as they were in the past. Books contain our history; our brains swarm with memories.

Traces of the past exist, and not traces of the future, only because entropy was low in the past. There can be no other reason, since the only source of the difference between past and future is the low entropy of the past.

In order to leave a trace, it is necessary for something to become arrested, to stop moving, and this can happen only in an irreversible process—that is to say, by degrading energy into heat. In this way, computers heat up, the brain heats up, the meteors that fall onto the moon heat it; even the goose quill of a medieval scribe in a Benedictine abbey slightly heats the page on which he writes. In a world without heat, everything would rebound elastically, leaving no trace.100

It is the presence of abundant traces of the past that produces the familiar sensation that the past is determined. The absence of any analogous traces of the future produces the sensation that the future is open. The existence of traces makes it possible for our brain to have at its disposal extensive maps of past events. There is nothing analogous to this for future ones. This fact is at the origin of our sensation of being able to act freely in the world: choosing between different futures, even though we are unable to act upon the past.

The vast mechanisms of the brain about which we have no direct awareness (“I know not why I am so sad,” mumbles Antonio at the beginning of The Merchant of Venice) were shaped over the course of evolution to make calculations about possible futures. This is what we call “deciding.” Since they elaborate possible alternative futures that would follow if the present were exactly as it is except for some detail, we are naturally inclined to think in terms of “causes” that precede “effects”: the cause of a future event is a past event such that the future event would not follow in a world that was exactly the same except for this cause.101

In our experience, the notion of cause is thus asymmetrical in time: cause precedes effect. When we recognize in particular that two events “have the same cause,” we find this common cause102 in the past, not in the future. If two waves of a tsunami arrive together at two neighboring islands, we think that there has been an event in the past that has caused both. We do not look for it in the future. But this does not happen because there is a magical force of “causality” going from the past to the future. It happens because a correlation between two events is itself improbable, and only the low entropy of the past can supply such improbability. What else could? In other words, the existence of common causes in the past is nothing but a manifestation of low entropy in the past. In a state of thermal equilibrium, or in a purely mechanical system, there isn’t a direction to time identified by causality.

The laws of elementary physics do not speak of “causes” but only of “regularities,” and these are symmetrical with regard to past and future. Bertrand Russell noted this in a famous article, writing emphatically that “The law of causality . . . is a relic of a bygone age, surviving, like the monarchy, only because it is erroneously supposed to do no harm.”103 He exaggerates, of course, because the fact that there are no “causes” at an elementary level is not a sufficient reason to render obsolete the very notion of cause.104 At an elementary level there are no cats either, but we do not for this reason cease to bother with cats. The low entropy of the past renders the notion of cause an effective one.

But memory, causes and effects, flow, the determined nature of the past and the indeterminacy of the future are nothing but names that we give to the consequences of a statistical fact: the improbability of a past state of the universe.

Causes, memory, traces, the history itself of the becoming of the world that unfolds not only across centuries and millennia of human history but in the billions of years of the great cosmic narrative—all this stems simply from the fact that the configuration of things was “particular” a few billion years ago.105

And “particular” is a relative term: it is particular in relation to a perspective. It is a blurring. It is determined by the interactions that a physical system has with the rest of the world. Hence causality, memory, traces, the history of the happening of the world itself can only be an effect of perspective: like the turning of the heavens; an effect of our peculiar point of view in the world. . . . Inexorably, then, the study of time does nothing but return us to ourselves.
