9

Black Hole Thermodynamics

‘Black holes ain’t so black’

Stephen Hawking

Up to this point, we’ve thought about black holes as objects that, broadly speaking, mind their own business. Stuff can fall in, causing the black hole to grow, but nothing that crosses the horizon can come out. All traces of anything and everything that falls into a black hole would seem to be erased from the Universe forever. This is the description of a black hole according to general relativity. In 1972, John Wheeler and his graduate student Jacob Bekenstein realised that this raises a deep question. Wheeler tells the story of how he, in a ‘joking mood one day’, told Bekenstein that he always felt like a criminal when he put a cup of hot tea next to a glass of iced tea and let them come to the same temperature. The energy of the world doesn’t change, but the disorder of the Universe would be increased, and that crime ‘echoes down to the end of time’.25 Wheeler was referring to the Second Law of Thermodynamics, which roughly speaking says that when any change occurs in the world, the world becomes more disordered as a result. ‘But let a black hole swim by and let me drop the hot tea and the cold tea into it. Then is not all evidence of my crime erased forever?’ Jacob took the remark seriously, recounts Wheeler, and went away to think about it.

The physical quantity that measures disorder is known as entropy. In Wheeler’s words, ‘Whatever is composed of the fewest number of units arranged in the most orderly way (a single, cold molecule for instance) has the least entropy. Something large, complex and disorderly (a child’s bedroom perhaps) has a large entropy.’ The Second Law of Thermodynamics, phrased in terms of entropy, states that in any physical process, entropy always increases. Wheeler was concerned that he had increased the entropy of the Universe by placing his two cups of tea in contact, and then decreased it again by throwing them into a black hole.

As is so often the case, there is a lyrical Eddington quote that reflects the importance of the Second Law: ‘The law that entropy always increases holds, I think, the supreme position among the laws of Nature. If someone points out to you that your pet theory of the universe is in disagreement with Maxwell’s equations – then so much the worse for Maxwell’s equations. If it is found to be contradicted by observation – well, these experimentalists do bungle things sometimes. But if your theory is found to be against the Second Law of Thermodynamics I can give you no hope; there is nothing for it but to collapse in deepest humiliation.’

Bekenstein returned a few months later with an answer: the black hole does not conceal the crime. His answer was inspired by Hawking’s prior observation that the area of a black hole’s horizon always increases, no matter what. To Bekenstein, this ‘area always increases’ law reminded him of the ‘entropy always increases’ law. He therefore made the bold claim that throwing objects into a black hole causes an increase in the area of the event horizon, which in turn signals a corresponding increase in entropy. In other words, when something falls into a black hole, a record is kept, the Second Law is obeyed, and deepest humiliation is avoided. Looking back at Wheeler’s description of entropy in terms of molecules and messy bedrooms, however, the assignment of an entropy to a black hole would seem to be questionable, as Bekenstein and Wheeler were well aware. According to general relativity, black holes are simple things: a Schwarzschild black hole can be described by a single number; its mass. But Wheeler begins his description of entropy with the phrase ‘Whatever is composed of the fewest number of units arranged in the most orderly way … has the least entropy.’ With no apparent units to re-arrange, what, then, is the meaning of the entropy of a black hole?

Entropy was introduced in the nineteenth century as one of the fundamental quantities in the newly emerging science of thermodynamics, alongside the more familiar notions of heat, energy and temperature. Perhaps because of its historical origins in the industrial revolution, thermodynamics has the reputation of being closer to engineering than black holes, but this is emphatically not the case. Thermodynamics is connected at a deep level to quantum mechanics and the structure of matter. When applied to black holes, we will see that thermodynamics is connected at a deep level to quantum gravity and the structure of spacetime. Before we get to the thermodynamics of black holes, let’s go back in time to the nineteenth century for a tour of the origins of the subject and to introduce the important concepts of heat, energy, temperature and entropy.

The fundamental physics of the fridge

The foundations of thermodynamics were laid by practical people doing practical things: people like Salford brewer James Prescott Joule who were interested in building better steam engines, developing more efficient industrial processes, and beer.

In the early 1840s, Joule performed a range of experiments to demonstrate that heat and work are different but interchangeable forms of energy. Figure 9.1 illustrates his most famous experiment. A weight falling under gravity turns a paddle that stirs some water, causing the temperature of the water to rise. In thermodynamical jargon, the falling weight does work on the water. Joule’s skill was in being able to make very precise measurements of the temperature increase, which he demonstrated to be proportional to the amount of work done by the falling weight. Initially, his findings were not met with enthusiasm. They went against the thinking of the day, which held that heat is an ethereal fluid (‘caloric’) that flows from hot to cold objects. Joule submitted his findings to the Royal Society in 1844, but his paper was rejected, partly because it was not believed that he could measure temperature increases to the claimed 1/200th of a degree Fahrenheit. The Royal Society, then as now, was not awash with experts in the brewing of fine ales, the demands of which meant that Joule had access to instruments capable of the required precision. A notable exception is the former President of the Royal Society and Nobel Laureate Sir Paul Nurse, who began his distinguished career as a technician in a brewery because he wasn’t accepted onto a university degree course due to his lack of a modern language qualification. Sir Paul was subsequently awarded the 2001 Nobel Prize for his work on yeast. Joule remained undaunted, and by the mid-1850s his work had become widely accepted after a fruitful collaboration with William Thomson (later Lord Kelvin).

Figure 9.1. James Prescott Joule measured the increase in temperature of a vessel of water caused by the rotation of a paddle driven by a weight falling under gravity. The experiment demonstrated the conversion of mechanical work into heat. (Science History Images/Alamy Stock Photo)
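To get a feel for the numbers, here is a short, illustrative Python sketch (the weight, drop height and quantity of water are our own choices, not the details of Joule’s apparatus). A falling weight does work mgh, and if all of that work ends up as heat in the water, the temperature rise is that work divided by the water’s heat capacity.

    # Rough estimate of the temperature rise in a Joule-style paddle experiment.
    # All numbers are illustrative, not Joule's actual values.
    g = 9.81                 # gravitational acceleration, m/s^2
    m_weight = 10.0          # falling mass, kg
    height = 2.0             # distance fallen, m
    m_water = 1.0            # mass of stirred water, kg
    c_water = 4184.0         # specific heat capacity of water, J/(kg K)

    work = m_weight * g * height          # mechanical work done, in joules
    delta_T = work / (m_water * c_water)  # temperature rise if all the work becomes heat
    print(f"Work done: {work:.0f} J, temperature rise: {delta_T:.3f} degrees")

Even a ten-kilogram weight falling through two metres warms a kilogram of water by only about 0.05 degrees, which is why Joule needed such exquisitely sensitive thermometers.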

Joule’s results illustrate what we now know to be correct: heat is a form of energy associated with the motion of atoms and molecules – the building blocks of matter. As the paddle rotates, it delivers kinetic energy to the water molecules by hitting them. The molecules move around faster, and this is what we measure as an increase in the temperature of the water. At the time, this idea was radical because there was no direct evidence that matter is composed of atoms, although Joule was taught by one of the leading proponents of the atomic hypothesis, John Dalton. In the words of Jacob Abbott, writing about Joule’s experiments in 1869:

‘It is inferred from this that heat consists in some kind of subtle motion – undulatory, vibratory or gyratory – of the elemental atoms or molecules of which all material substances are supposed to be composed. This, however, is a mere theoretical inference.’26

The link between work, temperature, and the motion of the proposed atomic constituents of matter is a clue that thermodynamics is related to the behaviour of the hidden building blocks of the world, whatever those building blocks may be. Incidentally, one of the papers that settled the atomic debate was Einstein’s 1905 paper on Brownian motion (and his follow-up in 1908), which explained the jiggling of pollen grains suspended in water under the assumption that they were being bombarded by water molecules. Einstein’s predictions were confirmed experimentally in 1908 by Jean Baptiste Perrin, who received the Nobel Prize in 1926 for his work on ‘the discontinuous structure of matter’.

The results of Joule’s experiments, as well as providing evidence for the existence of atoms, are captured in what we now call the First Law of Thermodynamics, which expresses the fundamental idea that energy is conserved: The total energy of a system can be altered either by supplying or extracting heat or by doing work. Moreover, a certain amount of work can be converted into an equivalent amount of heat and vice versa, so long as the total energy is conserved. This is the theoretical basis of the steam engine. Burn some coal and use the energy that’s released to spin a wheel. This isn’t all there is to a steam engine, however, because there is another essential component – the environment in which the engine sits. Crucially, the surroundings of the steam engine must be colder than the furnace, otherwise the steam engine won’t work. Why?

The answer is that energy is always transferred from hot objects to cold objects and never the other way round. This has nothing to do with the conservation of energy. Energy would still be conserved if it was removed from a cold cup of tea, making it colder, and transferred to a hot cup of tea, making it hotter. But this is not what happens in Nature. To account for this one-way transfer of energy, another law of Nature is required, and that is the Second Law of Thermodynamics. One way to state the Second Law is simply to say that heat always flows from hot to cold. Described in these terms, a steam engine is a device that sits between a hot furnace and the cold world outside. As energy flows naturally from hot to cold, the engine syphons off some of the flow and converts it into useful work. Hardly of profound significance at first sight, but it turns out that this almost self-evident statement of the Second Law captures the essence of a much deeper idea. In his book The Laws of Thermodynamics, Peter Atkins begins his chapter on the Second Law with this remarkable sentence: ‘When I gave lectures on thermodynamics to an undergraduate chemistry audience I often began by saying that no other scientific law has contributed more to the liberation of the human spirit than the second law of thermodynamics.’ ‘The second law,’ he continues, ‘is of central importance in the whole of science, and hence our rational understanding of the universe, because it provides a foundation for understanding why any change occurs. Thus, not only is it a basis for understanding why engines run and chemical reactions occur, but it is also a foundation for understanding those most exquisite consequences of chemical reactions, the acts of literary, artistic, and musical creativity that enhance our culture.’27

German physicist Rudolf Clausius introduced the idea of entropy in 1865. In his words: ‘The energy of the world is constant. The entropy of the world strives for a maximum.’* This is a beautifully succinct statement of the first two laws of thermodynamics. Let’s see how things work out in the case of Wheeler’s teacups. According to the First Law, energy is always conserved. This can be true whichever way the energy flows, as long as the amount of energy removed from one teacup is equal to the amount of energy deposited in the other. Clausius defined entropy such that the entropy increase caused by adding heat energy to cold tea is greater than the entropy decrease when the same amount of energy is removed from hot tea.† Thus, the combined entropy of the two cups will increase if heat flows from hot to cold, but not the other way round.
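To make Clausius’s bookkeeping concrete, here is a rough numerical sketch in Python (the temperatures are our own illustrative choices, and we assume the amount of heat transferred is small enough that neither temperature changes appreciably).

    # Clausius bookkeeping for heat flowing between a hot and a cold cup of tea.
    # Entropy changes use dS = dQ/T for a small transfer of heat Q.
    Q = 100.0        # heat transferred, joules (small, so the temperatures barely change)
    T_hot = 350.0    # hot tea, kelvin (about 77 C)
    T_cold = 280.0   # iced tea, kelvin (about 7 C)

    dS_hot = -Q / T_hot    # the hot cup loses heat, so its entropy falls
    dS_cold = +Q / T_cold  # the cold cup gains heat, and its entropy rises by more
    print(f"Hot to cold: total entropy change = {dS_hot + dS_cold:+.3f} J/K")
    print(f"Cold to hot: total entropy change = {(Q / T_hot) + (-Q / T_cold):+.3f} J/K")

The total is positive when the heat flows from hot to cold and negative for the reverse, which is how the Second Law permits one direction and forbids the other.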

Energy can flow from a cooler object to a hotter object if, somewhere else, enough energy is dumped into another cooler object such that the overall entropy of everything increases. This is what happens in a fridge. Heat is removed from the inside, which lowers the entropy of the interior. The entropy of your kitchen must therefore increase by a larger amount to satisfy the Second Law. This is why the back of your fridge must be hotter than your kitchen. Here’s how it works.

A coolant circulates around the fridge from the interior to the exterior. On exiting the interior, the coolant is compressed and therefore heats up. It is then circulated around the element at the rear which, being hotter than your kitchen, transfers heat into the room. The coolant then goes back into the interior of the fridge. As it does so, it expands and cools to below the temperature inside the fridge. Being cooler than the interior, it now absorbs heat from the interior. It then goes through the compressor again and the whole cycle repeats. The net effect is the transfer of energy from inside the fridge to outside – from cold to hot – but at the cost of the energy needed to power the compressor, which is why your fridge doesn’t work unless you plug it in.

The energy to power the compressor comes from a power station, which could be a steam engine; a hot furnace in a cold environment. The power station might run off coal or gas, which came from plants, which stored energy from the Sun, which is a hot spot in a cold sky. The stars are the furnaces of the Universe – the ultimate steam engines. At each stage in the flow of energy from the shining stars to the formation of the ice cubes for your (late) afternoon gin and tonic, the overall entropy of the Universe increases as energy flows from hot to cold, and perhaps the cool gin and tonic stimulates ‘acts of literary, artistic, and musical creativity that enhance our culture’.

The stars were formed by the gravitational collapse of primordial clouds of hydrogen and helium in the early Universe which, for reasons we do not understand, began in an extraordinarily low entropy configuration. The origin of this special initial state of the Universe – a reservoir of low entropy without which we would not exist – is one of the great mysteries in modern physics.

The concept of entropy proved extremely useful for nineteenth-century steam engine designers because it allowed them to understand that the efficiency of a steam engine depends on the temperature difference between the furnace and the environment. If there is no temperature difference, no net energy can be transferred, and no work can be done. A larger temperature difference allows more work to be done because it allows for more energy to flow without violating the Second Law. But nowhere in the elegant logical edifice constructed by Joule, Clausius and many others, now known as classical thermodynamics, is there any mention of what entropy actually is; it’s just a very useful quantity.
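The quantitative version of this statement is the Carnot limit: at best, a fraction 1 - T_cold/T_hot of the heat drawn from the furnace can be converted into work, where the temperatures are measured in kelvin. A minimal Python sketch (the temperatures are illustrative):

    # Maximum possible efficiency of an engine running between a hot furnace
    # and a cold environment, as dictated by the Second Law (Carnot's result).
    def max_efficiency(T_hot, T_cold):
        return 1.0 - T_cold / T_hot   # fraction of the extracted heat that can become work

    print(max_efficiency(T_hot=500.0, T_cold=300.0))   # 0.4: at most 40% of the heat becomes work
    print(max_efficiency(T_hot=300.0, T_cold=300.0))   # 0.0: no temperature difference, no work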

What is entropy?

When John Wheeler brought his hot and cold teacups into contact, he was worried about increasing the disorder of the Universe. The link between entropy and disorder was appreciated by James Clerk Maxwell, whose work on electromagnetic theory led Einstein to special relativity. Maxwell realised that the Second Law is different to the other laws of Nature known to the nineteenth-century physicists in that it is inherently statistical. In 1870, he wrote: ‘The 2nd law of thermodynamics has the same degree of truth as the statement that if you throw a tumblerful of water into the sea you cannot get the same tumblerful out of the water again.’28

In 1877, Ludwig Boltzmann reinforced this idea with a brilliant new insight. Boltzmann understood that entropy puts a number on ignorance; in particular, on our ignorance of the exact configuration of the component parts of a system. Take Maxwell’s tumblerful of water. Before throwing it into the sea, we know that all the water molecules are in the tumbler. Afterwards, we have far less idea where they are, and the entropy of the system has increased. This idea is powerfully general – things that shuffle and jiggle will, if left alone, tend to mix and disperse, and our ignorance increases as a result.

Boltzmann’s insight connects Clausius’s definition of entropy, based on temperature and energy, with the arrangement of the internal constituents of a system. The resulting methodology, which treats matter as being built up out of component parts about which we have limited knowledge, is a branch of physics known as statistical mechanics. The subject is a challenging one, both technically and philosophically. David L. Goodstein begins his textbook States of Matter with the following paragraph: ‘Ludwig Boltzmann, who spent much of his life studying statistical mechanics, died in 1906, by his own hand. Paul Ehrenfest, carrying on the work, died similarly in 1933. Now it is our turn to study Statistical Mechanics.’29

A nice way to see the connection between entropy, temperature and the arrangement of the constituents of a system is to think about a particularly simple physical system – a collection of atoms in a box. The behaviour of atoms is the province of quantum theory, which we’ll explore in more detail later. For now, we need only one idea; atoms confined inside a box can only have certain specific energies. We say that the system has a discrete set of ‘energy levels’. This is where quantum mechanics gets its name; ‘quantised’ means ‘discrete’, as in a discrete set of energies. The lowest possible energy an atom can have is known as the ground state. If all the atoms are in the ground state, the temperature of the box of atoms is zero kelvin (-273 degrees Celsius). If energy is added, some of the atoms will move to higher energy levels. The parameter that determines how the atoms are distributed among the available energy levels is the temperature. The higher the temperature, the higher up the ladder of available energy levels the atoms can climb (as illustrated in Figure 9.2). The details of how much energy must be transferred into the box to change the configuration of the atoms depend on the types of atom present and the size of the box, but the key point is that there exists a single quantity – the temperature – which tells us how the atoms are most likely to be arranged across the allowed energy levels.

Figure 9.2. Atoms in a box occupying a ladder of energy levels. At zero temperature (on the left) all the atoms are in the lowest energy level (the ground state). As the temperature increases (from left to right), atoms increasingly occupy higher energy levels.
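A short Python sketch of the picture in Figure 9.2 may help (the energy levels and temperatures are invented for illustration, and the temperature is measured in the same arbitrary units as the level energies, with Boltzmann’s constant absorbed into it).

    # Occupation of a ladder of energy levels at different temperatures (cf. Figure 9.2).
    # Levels and temperatures are in arbitrary units; the trend is what matters.
    import math

    levels = [0, 1, 2, 3, 4, 5]   # energies of the levels, ground state first

    def occupations(T):
        """Boltzmann probabilities p_i proportional to exp(-E_i / T)."""
        if T == 0:
            return [1.0] + [0.0] * (len(levels) - 1)   # everything sits in the ground state
        weights = [math.exp(-E / T) for E in levels]
        total = sum(weights)
        return [w / total for w in weights]

    for T in [0, 1, 5]:
        print(T, [round(p, 2) for p in occupations(T)])

At zero temperature every atom sits in the ground state; as the temperature rises, the higher levels become progressively more populated, just as in the figure.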

Let’s now imagine placing a different box of different atoms in contact with our original box. The details of the energy levels will be different, but what’s important is that if the two boxes are at the same temperature, no net energy will be transferred between the boxes and the internal configurations will not change in a discernible way. This is the meaning of the nineteenth-century concept of temperature. To put it another way, if we place two systems in contact such that energy can be exchanged and nothing happens overall, then the two systems are at the same temperature. This is known as the Zeroth Law of Thermodynamics, because it was an afterthought. The Zeroth Law was always an essential part of the logical structure of classical thermodynamics because it is necessary to pin down the concept of temperature, but it wasn’t designated as a law until the early twentieth century, by which time everybody had got so used to speaking of the First and Second Laws of Thermodynamics that they didn’t want to change.

Richard Feynman came up with a nice analogy for temperature in his book The Character of Physical Law.30 Imagine sitting on a beach as the clouds sweep in off the ocean and it begins to rain. You grab your towels and rush into a beach hut. The towels are wet, but not as wet as you, and so you can start to dry yourself. You get drier until every towel is as wet as you, at which point there is no way of removing any more water. You could explain this by inventing a quantity called ‘ease of removing water’ and say that you and the towels all have the same value of that quantity. This doesn’t mean that everything contains the same amount of water. A big towel will contain more water than a small towel, but because they all have the same ‘ease of removing water’, there can be no net transfer of water between them. The reason why an object has a particular ‘ease of removing water’ is complicated and related to its internal atomic structure, but we don’t need to know the details if all we’re concerned about is getting dry. The analogy with thermodynamics is that the amount of water is the energy, and the ‘ease of removing water’ is the temperature. When we say that two objects have the same temperature, we don’t mean that they have the same energy. We mean that if we place them in contact their atoms or molecules will jiggle around and collide, just as the molecules in Joule’s paddle collided with molecules in the water and imparted energy to them, but if the objects are at the same temperature, the net transfer of energy will be zero and nothing will change on average.

Now recall Wheeler’s description of entropy; ‘Whatever is composed of the fewest number of units arranged in the most orderly way … has the least entropy.’ What does ‘order’ mean? Imagine we decide to select an atom at random from the box and ask: Which energy level did that atom come from? At zero temperature, we know the answer. The atom came from the ground state. The entropy in this case is zero.‡ This is what Wheeler means by ‘units’ being arranged in an orderly way. We know exactly what we are going to get when we pull an atom out of the box; we are not in the least bit ignorant. If we raise the temperature, the atoms will spread out across the available energy levels, and if we now select an atom at random, we can’t be sure which energy level it will come from. The atom could come from the ground state, or from one of the higher energy levels. This means our ignorance has increased as a result of raising the temperature. Equivalently, the entropy is larger, and continues to rise with increasing temperature as the atoms become more distributed among the allowed energy levels.
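We can put a number on this growing ignorance with a continuation of the sketch above (same invented energy levels): the entropy of the occupation probabilities, computed with the Gibbs–Shannon formula and quoted in units of kB, starts at zero and climbs as the temperature is raised.

    # Entropy (in units of kB) of the level occupations from the previous sketch.
    # It is zero when everything sits in the ground state and grows as atoms spread out.
    import math

    levels = [0, 1, 2, 3, 4, 5]

    def entropy(T):
        if T == 0:
            return 0.0                                  # only one possible arrangement
        weights = [math.exp(-E / T) for E in levels]
        Z = sum(weights)
        probs = [w / Z for w in weights]
        return -sum(p * math.log(p) for p in probs)     # Gibbs-Shannon entropy, units of kB

    for T in [0, 1, 5, 50]:
        print(f"T = {T:>2}: entropy = {entropy(T):.2f} kB")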

Temperature, energy and the change in entropy are quantities that appear in classical thermodynamics without any knowledge of the underlying structure of the ‘thing’ being studied (by ‘thing’ we mean anything from a box of gas to a galaxy of stars). Thanks to Boltzmann, we now understand that these quantities are intimately related to the constituents of the thing, how those constituents are arranged and how they share the total energy. Temperature, for example, tells us how fast the molecules in a box of gas are moving around on average. Similarly, entropy tells us about the number of possible internal configurations a thing can have. Boltzmann’s tombstone bears an inscription of his famous equation for the entropy of a system, which makes the connection with the component parts explicit:

S = kB log W

In this equation, W is the number of possible internal configurations and the entropy, denoted S, is proportional to the logarithm of W. Accordingly, larger W means larger entropy. The logarithm and Boltzmann’s constant kB are not important for what follows, other than to note that they allow us to put a precise number on the entropy that also agrees with Clausius’s definition in terms of energy and temperature. The important point is that W is the total number of different ways that the component parts of a system could be arranged in a manner consistent with what we know about the system. For atoms in a box, if the temperature is zero, there is only one way the atoms could be arranged, and so W = 1 and the entropy is zero.§ If the temperature is raised and some of the atoms hop into higher energy levels, there are more possible arrangements and so W is larger and the entropy is larger.

For a gas in a room, the component parts are atoms or molecules and the things we know about the system might be the volume of the room, the total weight of the gas inside and the temperature. Computing the entropy is then an exercise in counting the different ways the atoms could be arranged inside the room given what we know. One possible arrangement would be that all the atoms, bar one, are sitting still in one corner of the room, while a single lone atom carries almost all the energy. Or maybe the atoms share out the energy equally and are distributed uniformly around the room. And so on. Crucially, there are vastly more ways to arrange the atoms in the room such that the atoms are spread out across the room and share the energy reasonably evenly between them, compared to arrangements where all the atoms are in one corner or the energy is distributed very unevenly. Boltzmann understood that if the energy in the room is allowed to get shuffled around among the atoms because the atoms collide, then all the different arrangements will be more-or-less equally likely. Given that insight and given the numerical dominance of arrangements where the atoms are scattered all over the room, it follows that if we are in a ‘typical’ room then we are very likely to find the atoms distributed in a roughly uniform fashion. When everything has settled down and things are evenly distributed, we say that the system is in thermodynamic equilibrium. The entropy is then as big as it can be, and every region of the room is at the same temperature.
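A toy version of this counting (our own simplified set-up, not one used in the text) makes the point starkly: place N labelled particles in either the left or the right half of a room and count the number of arrangements W corresponding to each split.

    # Count the arrangements W of N labelled particles split between two halves of a room.
    # W for a given split is a binomial coefficient; near-even splits dominate enormously.
    from math import comb

    N = 100   # number of particles (tiny compared with a real room, but enough to see the point)

    W_even = comb(N, N // 2)   # arrangements with a 50/50 split
    W_corner = comb(N, 0)      # arrangements with every particle in one half: just 1
    print(f"W(50/50 split)   = {W_even:.3e}")
    print(f"W(all in corner) = {W_corner}")

    # Fraction of all 2**N arrangements that have between 40 and 60 particles on the left
    near_even = sum(comb(N, k) for k in range(40, 61)) / 2**N
    print(f"Fraction of arrangements with a 40-60 split or closer: {near_even:.3f}")

Even with only 100 particles, the even split can be realised in about 10²⁹ more ways than the ‘all in one corner’ arrangement, and over 96 per cent of all arrangements lie close to an even split. With the 10²⁷ or so molecules in a real room, the imbalance is unimaginably larger.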

Now, here is why the Second Law embodies the idea of change. If we begin with a system far from equilibrium, which is to say that the component parts are distributed in an unusual way, then as long as the components can interact with each other and share their energy, the system will inexorably head towards equilibrium because that’s the most likely thing to happen. We can now appreciate why Maxwell’s observation that the Second Law has a statistical element to it was so insightful. The Second Law deals, ultimately, with what is more or less likely, and it is far more likely that a system will head towards thermodynamic equilibrium because there are so many more ways for it to be in thermodynamic equilibrium.

This ‘one way’ evolution of a system is often called the thermodynamic arrow of time because it draws a sharp distinction between the past and the future: the past is more ordered than the future. In our Universe as a whole, the arrow of time can be traced back to the mysterious highly-ordered, low-entropy state of the Big Bang.

Entropy and information

Suppose we know the precise details of every atom in a room and choose to think in these terms rather than in terms of the volume, weight of gas and temperature. Then the entropy would be zero because we know the configuration exactly. This means that an omniscient being has no need for entropy. For mortal beings and physicists, however, the vast numbers of atoms in rooms and other large objects make keeping track of their individual motions impossible and, as a result, entropy is a very useful concept. Entropy is telling us about the amount of information that is hidden from us when we describe a system in terms of just a few numbers. Seen in this way, entropy is a measure of our lack of knowledge, our ignorance. The connection between entropy and information was made explicit in 1948 by Claude Shannon in one of the foundational works in what is now known as information theory, which is central to modern computing and communications technology.

Returning to our gas-filled room, there will be many possible configurations of the atoms inside that are consistent with our measurements of the volume, weight and temperature. The logarithm of that number, according to Boltzmann, is the entropy. Importantly, though, the atoms are actually in a particular configuration at some moment of time. We just don’t know what it is. Let’s imagine that we make a measurement and determine precisely which configuration the atoms are in. What have we learnt? To be more specific, exactly how much information have we gained? Following Shannon, the amount of information gained is defined to be the minimum number of binary digits (bits) required to distinguish the measured configuration from all the other possible configurations. Imagine, for example, that there are only four possible configurations. In binary code, we would label those configurations as 00, 01, 10 and 11. That means we gain two bits of information when we measure it. If there are eight possible configurations, we would label them 000, 001, 010, 011, 100, 101, 110 and 111. That’s three bits. And so on. If there were a million possible configurations, it would take us some time to write all the combinations out by hand, but we don’t need to because there is a simple formula that tells us how many bits we’d need. If the number of configurations is W, the number of bits is:

N = log₂ W

This is very similar to Boltzmann’s formula for the entropy of the box of gas. If you know a little mathematics, you’ll notice that the logarithm is now in base 2 rather than the natural logarithm in Boltzmann’s formula, but that just leads to an overall numerical factor.¶ The key point is that the information gained in our measurement of the precise state of the gas is directly proportional to the entropy of the gas before the measurement. Specifically:

S = (kB log 2) × N
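A small worked example may help (the configuration counts W are invented for illustration): the Python sketch below computes the number of bits for a few values of W and checks that dividing Boltzmann’s entropy by kB log 2 gives back the same number of bits.

    # Bits needed to single out one configuration among W, and the link to Boltzmann's entropy.
    import math

    k_B = 1.380649e-23   # Boltzmann's constant, J/K

    for W in [4, 8, 1_000_000]:
        bits = math.log2(W)       # N = log2(W)
        S = k_B * math.log(W)     # Boltzmann: S = kB log W, with the natural logarithm
        check = S / (k_B * math.log(2))   # should equal the number of bits
        print(f"W = {W:>9}: {bits:6.2f} bits, S = {S:.2e} J/K, S/(kB log 2) = {check:6.2f}")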

This is the key to understanding the fundamental importance of entropy. Entropy tells us about the internal structure of a thing – it is intimately related to the amount of information that the thing can store and as such it is intimately linked to the fundamental building blocks of the world. It is a window into the underlying structure of cups of tea, steam engines and stars. And, if we follow Bekenstein and associate an entropy with the area of the event horizon of a black hole, it is a window into the underlying structure of space and time.

The entropy of a black hole

The immediate issue is that black holes as we have described them so far have no component parts. They are pure spacetime geometry and quite featureless. Superficially, therefore, a black hole would appear to have zero entropy. Throw a couple of cups of tea into a black hole and its mass will increase, but that’s all, and so the entropy should still be zero. This was Wheeler’s point. To save the Second Law, in true Eddingtonian spirit, Bekenstein guessed that black holes must have an entropy, and that it must be proportional to the area of the horizon. Bekenstein did more than just guess, though. In an ingenious back-of-the-envelope calculation, he also estimated the numerical value of the entropy of a black hole and discovered something very deep.

Let’s imagine that we want to drop a single bit of information into a black hole. How might we achieve this? One answer is to drop a single photon into the black hole. A photon is a massless particle of light, and it can store a single bit of information: we can think of it as spinning clockwise (0) or anticlockwise (1). Each photon also carries a fixed amount of energy inversely proportional to its wavelength. This relationship between energy and wavelength was first proposed by Einstein in 1905 and is a key feature of quantum theory. Long-wavelength photons have lower energy and short-wavelength photons have higher energy. This is the reason UV light from the Sun can be dangerous but the light from a candle can’t: UV photons have short wavelengths and carry enough energy to damage your cells, whereas candlelight photons have longer wavelengths and don’t carry enough energy to cause damage. As a general rule, the position of a photon cannot be resolved to distances smaller than its wavelength. This means we would like to drop a photon with a wavelength roughly equal to or smaller than the Schwarzschild radius into the black hole, since longer-wavelength photons would typically reside outside the hole. Now we can calculate the largest possible number of such photons we can fit inside the black hole. This should give us a crude estimate of the maximum number of bits it can store, which is the entropy.** We go through the calculation in Box 9.1. The answer for the number of bits hidden in a Schwarzschild black hole is:

N ≈ c³A/(8πGh)

where A is the area of the event horizon.†† One very interesting thing about this equation is the collection of numbers in front of the horizon area, A. This combination of the speed of light, c, Newton’s gravitational constant, G, and Planck’s constant, h, a number that lies at the heart of quantum theory, is well known to physicists. It is, up to a numerical factor, one divided by the square of the so-called Planck length. We’ve described the significance of the Planck length in more detail in Box 9.2. The short version is that it is the fundamental length scale in our Universe and the smallest distance we can speak of as a distance. The suggestion is that the entropy of a black hole, which is to say the number of bits of information it hides, can be found by tiling the event horizon in Planck-length-sized pixels and assuming that the black hole stores one bit per pixel. This is illustrated in Figure 9.3.

Figure 9.3. The horizon of a black hole, with an imaginary tiling of Planck-area-sized cells. Remarkably, the total number of cells is equal to the entropy of the black hole.

It’s hard to overstate what an intriguing result this is. What is the nature of these Planckian pixels and why are they tiling the horizon when, according to general relativity, the horizon is just empty space? Recall, an astronaut freely falling through the horizon should experience nothing out of the ordinary according to the Equivalence Principle. And yet Bekenstein’s result suggests that they encounter a dense collection of bits. Furthermore, why should the information capacity of a black hole be proportional to the area of its horizon rather than its volume? How much information can be stored in a library? Surely the answer must depend on the number of books that fit inside. For a black hole library, however, it seems as if we are only allowed to paper the outside walls with the pages of the books. It is as if the interior doesn’t exist.

One might wonder whether the black hole is missing a trick from an information storage perspective, but a simple argument suggests that a black hole of a given mass has the maximum possible information storage capacity (entropy) for that mass. Imagine dropping an object into a Schwarzschild black hole. To obey the Second Law, the black hole entropy must increase by at least the entropy of the object it swallows. The area of its event horizon will grow accordingly, but the increase in area depends only on the mass of the object, because the horizon area is determined solely by the black hole’s mass. Now imagine instead dropping a super-high entropy object with the same mass as before into the black hole. The horizon area will increase by precisely the same amount, because the increase again depends only on the mass. The entropy increase signalled by that fixed growth in area must therefore be at least as large as the entropy of anything of that mass that we could possibly throw in. In other words, as a black hole gains mass it increases its entropy by the largest possible amount. It is as if objects thrown in get completely scrambled up, to guarantee that our ignorance is maximised.

A black hole therefore has the largest possible entropy. It can store the maximum possible amount of information in a given region of space, and the amount of information measured in bits is given by the surface area of the region in Planck units. This hints at something deeply hidden; everything that exists in a volume of space can be completely described by information on a surface surrounding the region. This is our first encounter with the holographic principle.

BOX 9.1. Black hole entropy

Roughly speaking, only photons whose wavelengths (as measured by a distant observer) are less than the Schwarzschild radius can fit inside the hole. According to quantum physics, a photon has an energy E = hc/λ, where λ is the wavelength and h is Planck’s constant. Therefore, the smallest possible photon energy is E = hc/R, where R is the Schwarzschild radius. The total energy of a black hole of mass M is, according to the famous Einstein relation, Mc². The maximum number of photons we can fit inside the hole is:

N = Mc²/(hc/R) = McR/h

Now, the Schwarzschild radius R = 2GM/c², which means we can write:

N = 2GM²/(hc)

The horizon area A = 4πR², and therefore:

N = c³A/(8πGh)

You may wonder if more information can be stored using other types of particle (electrons also spin and can be used to encode bits). Unlike photons, other particles carry mass which means fewer of them can fit inside the hole.
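To attach some rough numbers to this estimate, here is a short Python sketch (our own, using standard values of the constants and a black hole with the mass of the Sun as an example; as noted in the main text, a more careful calculation changes the numerical factor).

    # Rough count of the bits hidden in a solar-mass Schwarzschild black hole,
    # using the estimate from this box: N = c^3 A / (8 pi G h).
    import math

    G = 6.674e-11        # Newton's gravitational constant, m^3 kg^-1 s^-2
    c = 2.998e8          # speed of light, m/s
    h = 6.626e-34        # Planck's constant, J s
    M_sun = 1.989e30     # mass of the Sun, kg

    R = 2 * G * M_sun / c**2          # Schwarzschild radius, roughly 3 km
    A = 4 * math.pi * R**2            # horizon area, m^2
    N_bits = c**3 * A / (8 * math.pi * G * h)
    print(f"R = {R / 1000:.1f} km, A = {A:.2e} m^2, N = {N_bits:.1e} bits")

The estimate comes out at a few times 10⁷⁵ bits for a black hole only a few kilometres across.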

BOX 9.2. The Planck length

The Planck length is a combination of three fundamental physical constants; Planck’s constant, Newton’s gravitational constant and the speed of light. The Planck length is a very tiny length. The diameter of a proton is 100,000,000,000,000,000,000 Planck lengths. Max Planck first introduced his eponymous unit in 1899 as a system of measurement that depends only on fundamental physical constants. This is preferable to using things like metres or seconds, which reflect the vagaries of history and have more to do with the size of humans and the orbit of our planet than the underlying laws of Nature. The strength of gravity, the behaviour of atoms and the universal speed limit, however, are independent of humans. If we encountered an alien civilisation and asked them to tell us the area of the horizon of the M87 black hole in Planck units, they would come up with the same number that we do. In a formula, the Planck length is given by:

Planck length = √(ħG/c³) ≈ 1.6 × 10⁻³⁵ metres (where ħ = h/2π)

It is believed to be the smallest distance that makes any sense: smaller than this, it is likely that the idea of a continuous space breaks down.
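For the numerically inclined, here is a short Python check (using standard values of the constants and taking the diameter of a proton to be roughly 1.7 × 10⁻¹⁵ metres).

    # Numerical value of the Planck length and the proton-diameter comparison in the text.
    import math

    G = 6.674e-11        # Newton's gravitational constant, m^3 kg^-1 s^-2
    c = 2.998e8          # speed of light, m/s
    hbar = 1.055e-34     # reduced Planck constant (h divided by 2 pi), J s

    l_planck = math.sqrt(hbar * G / c**3)
    proton_diameter = 1.7e-15                      # metres, approximately
    print(f"Planck length: {l_planck:.2e} m")
    print(f"Proton diameter in Planck lengths: {proton_diameter / l_planck:.1e}")

The Planck length comes out at about 1.6 × 10⁻³⁵ metres, and the proton is indeed around 10²⁰ Planck lengths across, as stated above.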

History repeats itself

There is a parallel between Bekenstein’s proposal for the entropy of a black hole and the development of nineteenth-century statistical mechanics. When Boltzmann died in 1906, aged 62, his atomistic explanation of the Second Law was still not universally accepted. Led by the hugely influential Austrian physicist Ernst Mach, many scientists still doubted the very existence of atoms. Mach’s objections were initially of a philosophical nature, but they gathered momentum in part because Boltzmann’s work led to a good deal of confusion that Boltzmann himself struggled to dispel. The argument came to centre on the statistical nature of the Second Law. According to Boltzmann, if matter is made of atoms all moving around, then it is overwhelmingly likely that entropy will increase, but it is not guaranteed. For example, there is a near-vanishingly small probability that all the atoms in a room will end up clustered in one corner. Mach and his followers felt that a fundamental law of Nature should not be statistical. ‘Entropy almost always increases’ didn’t sound authoritative enough, especially since Clausius’s formulation of the Second Law has no ‘almost’ about it. Today, we know Boltzmann was right – the Second Law does involve an element of probability.

A similar debate now centres around the physical significance of the thermodynamic behaviour of black holes. If we accept the idea that the entropy of a black hole is signalling the presence of ‘moving parts’ of some sort, the astonishing implication is that general relativity is underpinned by a statistical theory just as classical thermodynamics is underpinned by statistical mechanics. This means that we should regard spacetime as an approximation; an averaged-out description of the world akin to the description of a box of gas in terms of temperature, volume and weight. In 1902, the pioneer of statistical mechanics Josiah Willard Gibbs wrote: ‘The laws of thermodynamics … express the approximate and probable behaviour of systems of a great number of particles, or, more precisely, they express the laws of mechanics for such systems as they appear to beings who have not the fineness of perception to enable them to appreciate quantities of the order of magnitude of those which relate to single particles …’ Could it be that, in the first decades of the twenty-first century, we find ourselves in a similar position, as beings who have not the fineness of perception to appreciate the underlying structure of space and time?

When Bekenstein made his suggestion that black holes have an entropy, however, there was one huge fly in the ointment. As we’ve seen, entropy and temperature go hand in hand, and the assignment of a thermodynamic entropy to a black hole requires it to have a temperature. But for an object to have a temperature it must be able to emit things as well as to absorb them. Temperature, after all, can be defined in terms of the net transfer of energy between objects – in Feynman’s analogy, the ‘ease of removing water’. If it is not possible to extract anything from a black hole then the temperature must be zero. And, as everyone knew in 1972 and as general relativity makes abundantly clear, nothing can escape from a black hole.

Then, in 1974, everything changed. Along came Stephen Hawking with a short paper entitled ‘Black Hole Explosions?’


* Taken from Clausius’s excellently titled 1865 paper, ‘The Main Equations of the Mechanical Heat Theory in Various Forms that are Convenient for Use’.


† Specifically, the change in entropy dS = dQ/T, where dQ is the amount of heat energy transferred to the cup of tea and T is its temperature.


‡ A technical note. The entropy of a system at zero temperature is zero if the ground state is not degenerate, which means that there are not multiple ground states of the same energy. Solid carbon monoxide and ice are two examples of solids that have degenerate ground states and therefore have a ‘residual entropy’ at zero temperature because there will still be uncertainty about which state a randomly selected molecule came from.


§ log 1 = 0. We use ‘log W’ to indicate the natural logarithm of W.


¶ log₂ W = log W / log 2 ≈ 1.4427 log W, since log 2 ≈ 0.693.


** We explain why a black hole has the largest possible entropy shortly.


†† A more careful calculation, which properly accounts for the quantum physics, gives an entropy equal to the horizon area divided by (4 x Planck length squared), which differs from our estimate by a numerical factor.
