CHAPTER 2

ENTRO, EVO, INFO


The first keystone in understanding the human condition is the concept of entropy or disorder, which emerged from 19th-century physics and was defined in its current form by the physicist Ludwig Boltzmann.1 The Second Law of Thermodynamics states that in an isolated system (one that is not interacting with its environment), entropy never decreases. (The First Law is that energy is conserved; the Third, that a temperature of absolute zero is unreachable.) Closed systems inexorably become less structured, less organized, less able to accomplish interesting and useful outcomes, until they slide into an equilibrium of gray, tepid, homogeneous monotony and stay there.

In its original formulation the Second Law referred to the process in which usable energy in the form of a difference in temperature between two bodies is inevitably dissipated as heat flows from the warmer to the cooler body. (As the musical team Flanders & Swann explained, “You can’t pass heat from the cooler to the hotter; Try it if you like but you far better notter.”) A cup of coffee, unless it is placed on a plugged-in hot plate, will cool down. When the coal feeding a steam engine is used up, the cooled-off steam on one side of the piston can no longer budge it because the warmed-up steam and air on the other side are pushing back just as hard.

Once it was appreciated that heat is not an invisible fluid but the energy in moving molecules, and that a difference in temperature between two bodies consists of a difference in the average speeds of those molecules, a more general, statistical version of the concept of entropy and the Second Law took shape. Now order could be characterized in terms of the set of all microscopically distinct states of a system (in the original example involving heat, the possible speeds and positions of all the molecules in the two bodies). Of all these states, the ones that we find useful from a bird’s-eye view (such as one body being hotter than the other, which translates into the average speed of the molecules in one body being higher than the average speed in the other) make up a tiny fraction of the possibilities, while all the disorderly or useless states (the ones without a temperature difference, in which the average speeds in the two bodies are the same) make up the vast majority. It follows that any perturbation of the system, whether it is a random jiggling of its parts or a whack from the outside, will, by the laws of probability, nudge the system toward disorder or uselessness—not because nature strives for disorder, but because there are so many more ways of being disorderly than of being orderly. If you walk away from a sandcastle, it won’t be there tomorrow, because as the wind, waves, seagulls, and small children push the grains of sand around, they’re more likely to arrange them into one of the vast number of configurations that don’t look like a castle than into the tiny few that do. I’ll often refer to the statistical version of the Second Law, which does not apply specifically to temperature differences evening out but to order dissipating, as the Law of Entropy.
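The counting argument behind the statistical Second Law can be made concrete with a toy model of my own devising (it is not in the physics literature in this form): treat a system as a row of two-state parts, think of "all parts in the same state" as the orderly macrostate and an even split as the disorderly one, and count what fraction of all configurations each macrostate claims.

```python
from math import comb

def ordered_fraction(n):
    """Fraction of the 2**n microstates that are maximally ordered
    (all n parts in the same state: all 'heads' or all 'tails')."""
    return 2 / 2**n

def near_even_fraction(n):
    """Fraction of microstates with an exactly even split between the
    two states -- the single most 'disorderly' macrostate."""
    return comb(n, n // 2) / 2**n

# As n grows, orderly configurations become a vanishing sliver,
# so a random jiggle almost always lands on disorder.
for n in (10, 50, 100):
    print(n, ordered_fraction(n), near_even_fraction(n))
```

Even at ten parts the orderly states are a fifth of one percent of the possibilities; at a hundred parts they are fewer than one in a billion billion billion, which is why the sandcastle never reassembles itself.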

How is entropy relevant to human affairs? Life and happiness depend on an infinitesimal sliver of orderly arrangements of matter amid the astronomical number of possibilities. Our bodies are improbable assemblies of molecules, and they maintain that order with the help of other improbabilities: the few substances that can nourish us, the few materials in the few shapes that can clothe us, shelter us, and move things around to our liking. Far more of the arrangements of matter found on Earth are of no worldly use to us, so when things change without a human agent directing the change, they are likely to change for the worse. The Law of Entropy is widely acknowledged in everyday life in sayings such as “Things fall apart,” “Rust never sleeps,” “Shit happens,” “Whatever can go wrong will go wrong,” and (from the Texas lawmaker Sam Rayburn) “Any jackass can kick down a barn, but it takes a carpenter to build one.”

Scientists appreciate that the Second Law is far more than an explanation of everyday nuisances. It is a foundation of our understanding of the universe and our place in it. In 1928 the physicist Arthur Eddington wrote:

The law that entropy always increases . . . holds, I think, the supreme position among the laws of Nature. If someone points out to you that your pet theory of the universe is in disagreement with Maxwell’s equations—then so much the worse for Maxwell’s equations. If it is found to be contradicted by observation—well, these experimentalists do bungle things sometimes. But if your theory is found to be against the second law of thermodynamics I can give you no hope; there is nothing for it but to collapse in deepest humiliation.2

In his famous 1959 Rede lectures, published as The Two Cultures and the Scientific Revolution, the scientist and novelist C. P. Snow commented on the disdain for science among educated Britons in his day:

A good many times I have been present at gatherings of people who, by the standards of the traditional culture, are thought highly educated and who have with considerable gusto been expressing their incredulity at the illiteracy of scientists. Once or twice I have been provoked and have asked the company how many of them could describe the Second Law of Thermodynamics. The response was cold: it was also negative. Yet I was asking something which is about the scientific equivalent of: Have you read a work of Shakespeare’s?3

The chemist Peter Atkins alludes to the Second Law in the title of his book Four Laws That Drive the Universe. And closer to home, the evolutionary psychologists John Tooby, Leda Cosmides, and Clark Barrett entitled a recent paper on the foundations of the science of mind “The Second Law of Thermodynamics Is the First Law of Psychology.”4

Why the awe for the Second Law? From an Olympian vantage point, it defines the fate of the universe and the ultimate purpose of life, mind, and human striving: to deploy energy and knowledge to fight back the tide of entropy and carve out refuges of beneficial order. From a terrestrial vantage point we can get more specific, but before we get to familiar ground I need to lay out the other two foundational ideas.


At first glance the Law of Entropy would seem to allow for only a discouraging history and a depressing future. The universe began in a state of low entropy, the Big Bang, with its unfathomably dense concentration of energy. From there everything went downhill, with the universe dispersing—as it will continue to do—into a thin gruel of particles evenly and sparsely distributed through space. In reality, of course, the universe as we find it is not a featureless gruel. It is enlivened with galaxies, planets, mountains, clouds, snowflakes, and an efflorescence of flora and fauna, including us.

One reason the cosmos is filled with so much interesting stuff is a set of processes called self-organization, which allow circumscribed zones of order to emerge.5 When energy is poured into a system, and the system dissipates that energy in its slide toward entropy, it can become poised in an orderly, indeed beautiful, configuration—a sphere, spiral, starburst, whirlpool, ripple, crystal, or fractal. The fact that we find these configurations beautiful, incidentally, suggests that beauty may not just be in the eye of the beholder. The brain’s aesthetic response may be a receptiveness to the counter-entropic patterns that can spring forth from nature.

But there is another kind of orderliness in nature that also must be explained: not the elegant symmetries and rhythms in the physical world, but the functional design in the living world. Living things are made of organs that have heterogeneous parts which are uncannily shaped and arranged to do things that keep the organism alive (that is, continuing to absorb energy to resist entropy).6

The customary illustration of biological design is the eye, but I will make the point with my second-favorite sense organ. The human ear contains an elastic drumhead that vibrates in response to the slightest puff of air, a bony lever that multiplies the vibration’s force, a piston that impresses the vibration into the fluid in a long tunnel (conveniently coiled to fit inside the wall of the skull), a tapering membrane that runs down the length of the tunnel and physically separates the waveform into its harmonics, and an array of cells with tiny hairs that are flexed back and forth by the vibrating membrane, sending a train of electrical impulses to the brain. It is impossible to explain why these membranes and bones and fluids and hairs are arranged in that improbable way without noting that this configuration allows the brain to register patterned sound. Even the fleshy outer ear—asymmetrical top to bottom and front to back, and crinkled with ridges and valleys—is shaped in a way that sculpts the incoming sound to inform the brain whether the soundmaker is above or below, in front or behind.

Organisms are replete with improbable configurations of flesh like eyes, ears, hearts, and stomachs which cry out for an explanation. Before Charles Darwin and Alfred Russel Wallace provided one in 1859, it was reasonable to think they were the handiwork of a divine designer—one of the reasons, I suspect, that so many Enlightenment thinkers were deists rather than outright atheists. Darwin and Wallace made the designer unnecessary. Once self-organizing processes of physics and chemistry gave rise to a configuration of matter that could replicate itself, the copies would make copies, which would make copies of the copies, and so on, in an exponential explosion. The replicating systems would compete for the material to make their copies and the energy to power the replication. Since no copying process is perfect—the Law of Entropy sees to that—errors will crop up, and though most of these mutations will degrade the replicator (entropy again), occasionally dumb luck will throw one up that’s more effective at replicating, and its descendants will swamp the competition. As copying errors that enhance stability and replication accumulate over the generations, the replicating system—we call it an organism—will appear to have been engineered for survival and reproduction in the future, though it only preserved the copying errors that led to survival and reproduction in the past.

Creationists commonly doctor the Second Law of Thermodynamics to claim that biological evolution, an increase in order over time, is physically impossible. The part of the law they omit is “in a closed system.” Organisms are open systems: they capture energy from the sun, food, or ocean vents to carve out temporary pockets of order in their bodies and nests while they dump heat and waste into the environment, increasing disorder in the world as a whole. Organisms’ use of energy to maintain their integrity against the press of entropy is a modern explanation of the principle of conatus (effort or striving), which Spinoza defined as “the endeavor to persist and flourish in one’s own being,” and which was a foundation of several Enlightenment-era theories of life and mind.7

The ironclad requirement to suck energy out of the environment leads to one of the tragedies of living things. While plants bask in solar energy, and a few creatures of the briny deep soak up the chemical broth spewing from cracks in the ocean floor, animals are born exploiters: they live off the hard-won energy stored in the bodies of plants and other animals by eating them. So do the viruses, bacteria, and other pathogens and parasites that gnaw at bodies from the inside. With the exception of fruit, everything we call “food” is the body part or energy store of some other organism, which would just as soon keep that treasure for itself. Nature is a war, and much of what captures our attention in the natural world is an arms race. Prey animals protect themselves with shells, spines, claws, horns, venom, camouflage, flight, or self-defense; plants have thorns, rinds, bark, and irritants and poisons saturating their tissues. Animals evolve weapons to penetrate these defenses: carnivores have speed, talons, and eagle-eyed vision, while herbivores have grinding teeth and livers that detoxify natural poisons.


And now we come to the third keystone, information.8 Information may be thought of as a reduction in entropy—as the ingredient that distinguishes an orderly, structured system from the vast set of random, useless ones.9 Imagine pages of random characters tapped out by a monkey at a typewriter, or a stretch of white noise from a radio tuned between channels, or a screenful of confetti from a corrupted computer file. Each of these objects can take trillions of different forms, each as boring as the next. But now suppose that the devices are controlled by a signal that arranges the characters or sound waves or pixels into a pattern that correlates with something in the world: the Declaration of Independence, the opening bars of “Hey Jude,” a cat wearing sunglasses. We say that the signal transmits information about the Declaration or the song or the cat.10

The information contained in a pattern depends on how coarsely or finely grained our view of the world is. If we cared about the exact sequence of characters in the monkey’s output, or the precise difference between one burst of noise and another, or the particular pattern of pixels in just one of the haphazard displays, then we would have to say that each of the items contains the same amount of information as the others. Indeed, the interesting ones would contain less information, because when you look at one part (like the letter q) you can guess others (such as the following letter, u) without needing the signal. But more commonly we lump together the immense majority of random-looking configurations as equivalently boring, and distinguish them all from the tiny few that correlate with something else. From that vantage point the cat photo contains more information than the confetti of pixels, because it takes a garrulous message to pinpoint a rare orderly configuration out of the vast number of equivalently disorderly ones. To say that the universe is orderly rather than random is to say that it contains information in this sense. Some physicists enshrine information as one of the basic constituents of the universe, together with matter and energy.11
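The fine-grained point about redundancy can be illustrated with a short calculation, using Shannon's formula for entropy in bits per character. This single-character estimate captures only letter-frequency redundancy, not the sequential q-then-u kind, but the moral is the same: a patterned signal is predictable, so each character of it carries fewer bits than a character drawn uniformly from the whole alphabet. (The two example strings are illustrative, not from the text.)

```python
from collections import Counter
from math import log2

def entropy_per_char(text):
    """Shannon entropy in bits per character, estimated from the
    character frequencies of the text itself."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * log2(c / n) for c in counts.values())

patterned = "abababababababab"  # highly redundant: two symbols, strict alternation
uniform = "abcdefghijklmnop"    # sixteen symbols, each used once

print(entropy_per_char(patterned))  # low: each character is predictable
print(entropy_per_char(uniform))    # high: nothing about one character helps with the next
```

The patterned string works out to one bit per character, the uniform one to four, matching the observation that from the fine-grained vantage point the interesting, orderly objects contain less information, not more.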

Information is what gets accumulated in a genome in the course of evolution. The sequence of bases in a DNA molecule correlates with the sequence of amino acids in the proteins that make up the organism’s body, and they got that sequence by structuring the organism’s ancestors—reducing their entropy—into the improbable configurations that allowed them to capture energy and grow and reproduce.

Information is also collected by an animal’s nervous system as it lives its life. When the ear transduces sound into neural firings, the two physical processes—vibrating air and diffusing ions—could not be more different. But thanks to the correlation between them, the pattern of neural activity in the animal’s brain carries information about the sound in the world. From there the information can switch from electrical to chemical and back as it crosses the synapses connecting one neuron to the next; through all these physical transformations, the information is preserved.

A momentous discovery of 20th-century theoretical neuroscience is that networks of neurons not only can preserve information but can transform it in ways that allow us to explain how brains can be intelligent. Two input neurons can be connected to an output neuron in such a way that their firing patterns correspond to logical relations such as AND, OR, and NOT, or to a statistical decision that depends on the weight of the incoming evidence. That gives neural networks the power to engage in information processing or computation. Given a large enough network built out of these logical and statistical circuits (and with billions of neurons, the brain has room for plenty), a brain can compute complex functions, the prerequisite for intelligence. It can transform the information about the world that it receives from the sense organs in a way that mirrors the laws governing that world, which in turn allows it to make useful inferences and predictions.12 Internal representations that reliably correlate with states of the world, and that participate in inferences that tend to derive true implications from true premises, may be called knowledge.13 We say that someone knows what a robin is if she thinks the thought “robin” whenever she sees one, and if she can infer that it is a kind of bird which appears in the spring and pulls worms out of the ground.
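The logic-from-neurons idea can be sketched in a few lines, in the spirit of McCulloch and Pitts's threshold units: an output neuron fires if the weighted sum of its inputs plus a bias crosses zero. The particular weights and biases below are one conventional choice among many that realize the same gates.

```python
def neuron(weights, bias, inputs):
    """A threshold unit: fire (1) iff the weighted sum of the
    inputs plus the bias is positive."""
    total = sum(w, ) if False else sum(w * x for w, x in zip(weights, inputs))
    return 1 if total + bias > 0 else 0

# Three wiring patterns, three logical relations.
AND = lambda a, b: neuron([1, 1], -1.5, [a, b])   # fires only if both inputs fire
OR  = lambda a, b: neuron([1, 1], -0.5, [a, b])   # fires if either input fires
NOT = lambda a:    neuron([-1], 0.5, [a])          # fires only if its input is silent

for a in (0, 1):
    for b in (0, 1):
        print(a, b, AND(a, b), OR(a, b))
```

Raising the bias toward zero while keeping the weights fixed turns the same two-input unit from an AND into an OR, a hint of how a statistical decision that "weighs the evidence" is just a threshold unit with graded weights.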

Getting back to evolution, a brain wired by information in the genome to perform computations on information coming in from the senses could organize the animal’s behavior in a way that allowed it to capture energy and resist entropy. It could, for example, implement the rule “If it squeaks, chase it; if it barks, flee from it.”

Chasing and fleeing, though, are not just sequences of muscle contractions—they are goal-directed. Chasing may consist of running or climbing or leaping or ambushing, depending on the circumstances, as long as it increases the chances of snagging the prey; fleeing may include hiding or freezing or zigzagging. And that brings up another momentous 20th-century idea, sometimes called cybernetics, feedback, or control. The idea explains how a physical system can appear to be teleological, that is, directed by purposes or goals. All it needs are a way of sensing the state of itself and its environment, a representation of a goal state (what it “wants,” what it’s “trying for”), an ability to compute the difference between the current state and the goal state, and a repertoire of actions that are tagged with their typical effects. If the system is wired so that it triggers actions that typically reduce the difference between the current state and the goal state, it can be said to pursue goals (and when the world is sufficiently predictable, it will attain them). The principle was discovered by natural selection in the form of homeostasis, as when our bodies regulate their temperature by shivering and sweating. When it was discovered by humans, it was engineered into analog systems like thermostats and cruise control and then into digital systems like chess-playing programs and autonomous robots.
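The feedback principle fits in a few lines of code: sense the current state, compute its difference from the goal state, and act in proportion to that difference. The thermostat model below, with its gain of 0.3, is a minimal sketch of my own, not a real controller design.

```python
def thermostat_step(current, goal, gain=0.3):
    """One cycle of negative feedback: sense the current state,
    compute the error relative to the goal, act to shrink it."""
    error = goal - current
    return current + gain * error  # action proportional to the error

temp, goal = 10.0, 20.0
for _ in range(20):
    temp = thermostat_step(temp, goal)
print(round(temp, 3))  # the system has closed in on its goal state
```

Nothing in the loop "wants" warmth, yet the system behaves as if directed by a purpose, because each cycle reduces the gap between where it is and where it is "trying" to be; when the world is predictable (here, when the gain stays between 0 and 2), the gap shrinks toward zero.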

The principles of information, computation, and control bridge the chasm between the physical world of cause and effect and the mental world of knowledge, intelligence, and purpose. It’s not just a rhetorical aspiration to say that ideas can change the world; it’s a fact about the physical makeup of brains. The Enlightenment thinkers had an inkling that thought could consist of patterns in matter—they likened ideas to impressions in wax, vibrations in a string, or waves from a boat. And some, like Hobbes, proposed that “reasoning is but reckoning,” in the original sense of reckoning as calculation. But before the concepts of information and computation were elucidated, it was reasonable for someone to be a mind-body dualist and attribute mental life to an immaterial soul (just as before the concept of evolution was elucidated, it was reasonable to be a creationist and attribute design in nature to a cosmic designer). That’s another reason, I suspect, that so many Enlightenment thinkers were deists.

Of course it’s natural to think twice about whether your cell phone truly “knows” a favorite number, your GPS is really “figuring out” the best route home, and your Roomba is genuinely “trying” to clean the floor. But as information-processing systems become more sophisticated—as their representations of the world become richer, their goals are arranged into hierarchies of subgoals within subgoals, and their actions for attaining the goals become more diverse and less predictable—it starts to look like hominid chauvinism to insist that they don’t. (Whether information and computation explain consciousness, in addition to knowledge, intelligence, and purpose, is a question I’ll turn to in the final chapter.)

Human intelligence remains the benchmark for the artificial kind, and what makes Homo sapiens an unusual species is that our ancestors invested in bigger brains that collected more information about the world, reasoned about it in more sophisticated ways, and deployed a greater variety of actions to achieve their goals. They specialized in the cognitive niche, also called the cultural niche and the hunter-gatherer niche.14 This embraced a suite of new adaptations, including the ability to manipulate mental models of the world and predict what would happen if one tried out new things; the ability to cooperate with others, which allowed teams of people to accomplish what a single person could not; and language, which allowed them to coordinate their actions and to pool the fruits of their experience into the collections of skills and norms we call cultures.15 These investments allowed early hominids to defeat the defenses of a wide range of plants and animals and reap the bounty in energy, which stoked their expanding brains, giving them still more know-how and access to still more energy. A well-studied contemporary hunter-gatherer tribe, the Hadza of Tanzania, who live in the ecosystem where modern humans first evolved and probably preserve much of their lifestyle, extract 3,000 calories daily per person from more than 880 species.16 They create this menu through ingenious and uniquely human ways of foraging, such as felling large animals with poison-tipped arrows, smoking bees out of their hives to steal their honey, and enhancing the nutritional value of meat and tubers by cooking them.

Energy channeled by knowledge is the elixir with which we stave off entropy, and advances in energy capture are advances in human destiny. The invention of farming around ten thousand years ago multiplied the availability of calories from cultivated plants and domesticated animals, freed a portion of the population from the demands of hunting and gathering, and eventually gave them the luxury of writing, thinking, and accumulating their ideas. Around 500 BCE, in what the philosopher Karl Jaspers called the Axial Age, several widely separated cultures pivoted from systems of ritual and sacrifice that merely warded off misfortune to systems of philosophical and religious belief that promoted selflessness and promised spiritual transcendence.17 Taoism and Confucianism in China, Hinduism, Buddhism, and Jainism in India, Zoroastrianism in Persia, Second Temple Judaism in Judea, and classical Greek philosophy and drama emerged within a few centuries of one another. (Confucius, Buddha, Pythagoras, Aeschylus, and the last of the Hebrew prophets walked the earth at the same time.) Recently an interdisciplinary team of scholars identified a common cause.18 It was not an aura of spirituality that descended on the planet but something more prosaic: energy capture. The Axial Age was when agricultural and economic advances provided a burst of energy: upwards of 20,000 calories per person per day in food, fodder, fuel, and raw materials. This surge allowed the civilizations to afford larger cities, a scholarly and priestly class, and a reorientation of their priorities from short-term survival to long-term harmony. As Bertolt Brecht put it millennia later: Grub first, then ethics.19

When the Industrial Revolution released a gusher of usable energy from coal, oil, and falling water, it launched a Great Escape from poverty, disease, hunger, illiteracy, and premature death, first in the West and increasingly in the rest of the world (as we shall see in chapters 5–8). And the next leap in human welfare—the end of extreme poverty and spread of abundance, with all its moral benefits—will depend on technological advances that provide energy at an acceptable economic and environmental cost to the entire world (chapter 10).


Entro, evo, info. These concepts define the narrative of human progress: the tragedy we were born into, and our means for eking out a better existence.

The first piece of wisdom they offer is that misfortune may be no one’s fault. A major breakthrough of the Scientific Revolution—perhaps its biggest breakthrough—was to refute the intuition that the universe is saturated with purpose. In this primitive but ubiquitous understanding, everything happens for a reason, so when bad things happen—accidents, disease, famine, poverty—some agent must have wanted them to happen. If a person can be fingered for the misfortune, he can be punished or squeezed for damages. If no individual can be singled out, one might blame the nearest ethnic or religious minority, who can be lynched or massacred in a pogrom. If no mortal can plausibly be indicted, one might cast about for witches, who may be burned or drowned. Failing that, one points to sadistic gods, who cannot be punished but can be placated with prayers and sacrifices. And then there are disembodied forces like karma, fate, spiritual messages, cosmic justice, and other guarantors of the intuition that “everything happens for a reason.”

Galileo, Newton, and Laplace replaced this cosmic morality play with a clockwork universe in which events are caused by conditions in the present, not goals for the future.20 People have goals, of course, but projecting goals onto the workings of nature is an illusion. Things can happen without anyone taking into account their effects on human happiness.

This insight of the Scientific Revolution and the Enlightenment was deepened by the discovery of entropy. Not only does the universe not care about our desires, but in the natural course of events it will appear to thwart them, because there are so many more ways for things to go wrong than for them to go right. Houses burn down, ships sink, battles are lost for want of a horseshoe nail.

Awareness of the indifference of the universe was deepened still further by an understanding of evolution. Predators, parasites, and pathogens are constantly trying to eat us, and pests and spoilage organisms try to eat our stuff. It may make us miserable, but that’s not their problem.

Poverty, too, needs no explanation. In a world governed by entropy and evolution, it is the default state of humankind. Matter does not arrange itself into shelter or clothing, and living things do everything they can to avoid becoming our food. As Adam Smith pointed out, what needs to be explained is wealth. Yet even today, when few people believe that accidents or diseases have perpetrators, discussions of poverty consist mostly of arguments about whom to blame for it.

None of this is to say that the natural world is free of malevolence. On the contrary, evolution guarantees there will be plenty of it. Natural selection consists of competition among genes to be represented in the next generation, and the organisms we see today are descendants of those that edged out their rivals in contests for mates, food, and dominance. This does not mean that all creatures are always rapacious; modern evolutionary theory explains how selfish genes can give rise to unselfish organisms. But the generosity is measured. Unlike the cells in a body or the individuals in a colonial organism, humans are genetically unique, each having accumulated and recombined a different set of mutations that arose over generations of entropy-prone replication in their lineage. Genetic individuality gives us our different tastes and needs, and it also sets the stage for strife. Families, couples, friends, allies, and societies seethe with partial conflicts of interest, which are played out in tension, arguments, and sometimes violence. Another implication of the Law of Entropy is that a complex system like an organism can easily be disabled, because its functioning depends on so many improbable conditions being satisfied at once. A rock against the head, a hand around the neck, a well-aimed poisoned arrow, and the competition is neutralized. More tempting still to a language-using organism, a threat of violence may be used to coerce a rival, opening the door to oppression and exploitation.

Evolution left us with another burden: our cognitive, emotional, and moral faculties are adapted to individual survival and reproduction in an archaic environment, not to universal thriving in a modern one. To appreciate this burden, one doesn’t have to believe that we are cavemen out of time, only that evolution, with its speed limit measured in generations, could not possibly have adapted our brains to modern technology and institutions. Humans today rely on cognitive faculties that worked well enough in traditional societies, but which we now see are infested with bugs.

People are by nature illiterate and innumerate, quantifying the world by “one, two, many” and by rough guesstimates.21 They understand physical things as having hidden essences that obey the laws of sympathetic magic or voodoo rather than physics and biology: objects can reach across time and space to affect things that resemble them or that had been in contact with them in the past (remember the beliefs of pre–Scientific Revolution Englishmen).22 They think that words and thoughts can impinge on the physical world in prayers and curses. They underestimate the prevalence of coincidence.23 They generalize from paltry samples, namely their own experience, and they reason by stereotype, projecting the typical traits of a group onto any individual that belongs to it. They infer causation from correlation. They think holistically, in black and white, and physically, treating abstract networks as concrete stuff. They are not so much intuitive scientists as intuitive lawyers and politicians, marshaling evidence that confirms their convictions while dismissing evidence that contradicts them.24 They overestimate their own knowledge, understanding, rectitude, competence, and luck.25

The human moral sense can also work at cross-purposes to our well-being.26 People demonize those they disagree with, attributing differences of opinion to stupidity and dishonesty. For every misfortune they seek a scapegoat. They see morality as a source of grounds for condemning rivals and mobilizing indignation against them.27 The grounds for condemnation may consist in the defendants’ having harmed others, but they also may consist in their having flouted custom, questioned authority, undermined tribal solidarity, or engaged in unclean sexual or dietary practices. People see violence as moral, not immoral: across the world and throughout history, more people have been murdered to mete out justice than to satisfy greed.28


But we’re not all bad. Human cognition comes with two features that give it the means to transcend its limitations.29 The first is abstraction. People can co-opt their concept of an object at a place and use it to conceptualize an entity in a circumstance, as when we take the pattern of a thought like The deer ran from the pond to the hill and apply it to The child went from sick to well. They can co-opt the concept of an agent exerting physical force and use it to conceptualize other kinds of causation, as when we extend the image in She forced the door to open to She forced Lisa to join her or She forced herself to be polite. These formulas give people the means to think about a variable with a value and about a cause and its effect—just the conceptual machinery one needs to frame theories and laws. They can do this not just with the elements of thought but with more complex assemblies, allowing them to think in metaphors and analogies: heat is a fluid, a message is a container, a society is a family, obligations are bonds.

The second stepladder of cognition is its combinatorial, recursive power. The mind can entertain an explosive variety of ideas by assembling basic concepts like thing, place, path, actor, cause, and goal into propositions. And it can entertain not only propositions, but propositions about the propositions, and propositions about the propositions about the propositions. Bodies contain humors; illness is an imbalance in the humors that bodies contain; I no longer believe the theory that illness is an imbalance in the humors that bodies contain.

Thanks to language, ideas are not just abstracted and combined inside the head of a single thinker but can be pooled across a community of thinkers. Thomas Jefferson explained the power of language with the help of an analogy: “He who receives an idea from me, receives instruction himself without lessening mine; as he who lights his taper at mine, receives light without darkening me.”30 The potency of language as the original sharing app was multiplied by the invention of writing (and again in later epochs by the printing press, the spread of literacy, and electronic media). The networks of communicating thinkers expanded over time as populations grew, mixed, and became concentrated in cities. And the availability of energy beyond the minimum needed for survival gave more of them the luxury to think and talk.

When large and connected communities take shape, they can come up with ways of organizing their affairs that work to their members’ mutual advantage. Though everyone wants to be right, as soon as people start to air their incompatible views it becomes clear that not everyone can be right about everything. Also, the desire to be right can collide with a second desire, to know the truth, which is uppermost in the minds of bystanders to an argument who are not invested in which side wins. Communities can thereby come up with rules that allow true beliefs to emerge from the rough-and-tumble of argument, such as that you have to provide reasons for your beliefs, you’re allowed to point out flaws in the beliefs of others, and you’re not allowed to forcibly shut people up who disagree with you. Add in the rule that you should allow the world to show you whether your beliefs are true or false, and we can call the rules science. With the right rules, a community of less than fully rational thinkers can cultivate rational thoughts.31

The wisdom of crowds can also elevate our moral sentiments. When a wide enough circle of people confer on how best to treat each other, the conversation is bound to go in certain directions. If my starting offer is “I get to rob, beat, enslave, and kill you and your kind, but you don’t get to rob, beat, enslave, or kill me or my kind,” I can’t expect you to agree to the deal or third parties to ratify it, because there’s no good reason that I should get privileges just because I’m me and you’re not.32 Nor are we likely to agree to the deal “I get to rob, beat, enslave, and kill you and your kind, and you get to rob, beat, enslave, and kill me and my kind,” despite its symmetry, because the advantages either of us might get in harming the other are massively outweighed by the disadvantages we would suffer in being harmed (yet another implication of the Law of Entropy: harms are easier to inflict and have larger effects than benefits). We’d be wiser to negotiate a social contract that puts us in a positive-sum game: neither gets to harm the other, and both are encouraged to help the other.

So for all the flaws in human nature, it contains the seeds of its own improvement, as long as it comes up with norms and institutions that channel parochial interests into universal benefits. Among those norms are free speech, nonviolence, cooperation, cosmopolitanism, human rights, and an acknowledgment of human fallibility, and among the institutions are science, education, media, democratic government, international organizations, and markets. Not coincidentally, these were the major brainchildren of the Enlightenment.