CHAPTER 12 SAFETY


The human body is a fragile thing. Even when people keep themselves fueled, functioning, and free of pathogens, they are vulnerable to “the thousand shocks that flesh is heir to.” Our ancestors were easy pickings for predators like crocodiles and large cats. They were done in by the venom of snakes, spiders, insects, snails, and frogs. Trapped in the omnivore’s dilemma, they could be poisoned by toxic ingredients in their expansive diets, including fish, beans, roots, seeds, and mushrooms. As they ventured up trees in pursuit of fruit and honey, their bodies obeyed Newton’s law of universal gravitation and were liable to accelerate toward the ground at a rate of 9.8 meters per second per second. If they waded too far into lakes and rivers, the water could cut off their air supply. They played with fire and sometimes got burned. And they could be victims of malice aforethought: any technology that can fell an animal can fell a human rival.

Few people get eaten today, but every year tens of thousands die from snakebites, and other hazards continue to kill us in large numbers.1 Accidents are the fourth-leading cause of death in the United States, after heart disease, cancer, and respiratory diseases. Worldwide, injuries account for about a tenth of all deaths, outnumbering the victims of AIDS, malaria, and tuberculosis combined, and are responsible for 11 percent of the years lost to death and disability.2 Personal violence also takes a toll: it is among the top five hazards for young people in the United States and for all people in Latin America and sub-Saharan Africa.3

People have long given thought to the causes of danger and how they might be forfended. Perhaps the most stirring moment in Jewish religious observance is a prayer recited before the open Torah ark during the Days of Awe:

On Rosh Hashanah will be inscribed and on Yom Kippur will be sealed: . . . who will live and who will die; who will die at his allotted time and who before his time, who by water and who by fire, who by sword and who by beast, who by famine and who by thirst, who by earthquake and who by plague, who by strangling and who by stoning. . . . But repentance, prayer, and charity annul the severity of the decree.

Fortunately, our knowledge of how fatalities are caused has gone beyond divine inscription, and our means of preventing them have become more reliable than repentance, prayer, and charity. Human ingenuity has been vanquishing the major hazards of life, including every one enumerated in the prayer, and we are now living in the safest time in history.

In previous chapters we have seen how cognitive and moralistic biases work to damn the present and absolve the past. In this one we will see another way in which they conceal our progress. Though lethal injuries are a major scourge of human life, bringing the numbers down is not a sexy cause. The inventor of the highway guard rail did not get a Nobel Prize, nor are humanitarian awards given to designers of clearer prescription drug labels. Yet humanity has benefited tremendously from unsung efforts that have decimated the death toll from every kind of injury.


Who by sword. Let’s begin with the category of injury that is the hardest to eliminate precisely because it is no accident: homicide. With the exception of the world wars, more people are killed in homicides than in wars.4 During the battle-scarred year of 2015 the ratio was around 4.5 to 1; more commonly it is 10 to 1 or higher. Homicides were an even greater threat to life in the past. In medieval Europe, lords massacred the serfs of their rivals, aristocrats and their retinues fought each other in duels, brigands and highwaymen murdered the victims of their robberies, and ordinary people stabbed each other over insults at the dinner table.5

But in a sweeping historical development that the German sociologist Norbert Elias called the Civilizing Process, Western Europeans, starting in the 14th century, began to resolve their disputes in less violent ways.6 Elias credited the change to the emergence of centralized kingdoms out of the medieval patchwork of baronies and duchies, so that the endemic feuding, brigandage, and warlording were tamed by a “king’s justice.” Then, in the 19th century, criminal justice systems were further professionalized by municipal police forces and a more deliberative court system. Over those centuries Europe also developed an infrastructure of commerce, both physical, in the form of better roads and vehicles, and financial, in the form of currency and contracts. Gentle commerce proliferated, and the zero-sum plundering of land gave way to a positive-sum trade of goods and services. People became enmeshed in networks of commercial and occupational obligations laid out in legal and bureaucratic rules. Their norms for everyday conduct shifted from a macho culture of honor, in which affronts had to be answered with violence, to a gentlemanly culture of dignity, in which status was won by displays of propriety and self-control.

The historical criminologist Manuel Eisner has assembled datasets on homicide in Europe which put numbers to the narrative that Elias had published in 1939.7 (Homicide rates are the most reliable indicator of violent crime across different times and places because a corpse is always hard to overlook, and rates of homicide correlate with rates of other violent crimes like robbery, assault, and rape.) Eisner argues that Elias’s theory was on the right track, and not just in Europe. Whenever a government brings a frontier region under the rule of law and its people become integrated into a commercial society, rates of violence fall. In figure 12-1, I show Eisner’s data for England, the Netherlands, and Italy, with updates through 2012; the curves for other Western European countries are similar. I have added lines for parts of the Americas in which law and order came later: colonial New England, followed by a region in the “Wild West,” followed by Mexico, notorious for its violence today but far more violent in the past.

When I introduced the concept of progress I noted that no progressive trend is inexorable, and violent crime is a case in point. Starting in the 1960s, most Western democracies saw a boom in personal violence that erased a century of progress.8 It was most dramatic in the United States, where the rate of homicide shot up by a factor of two and a half, and where urban and political life were upended by a widespread (and partly justified) fear of crime. Yet this reversal of progress has its own lessons for the nature of progress.

During the high-crime decades, most experts counseled that nothing could be done about violent crime. It was woven into the fabric of a violent American society, they said, and could not be controlled without solving the root causes of racism, poverty, and inequality. This version of historical pessimism may be called root-causism: the pseudo-profound idea that every social ill is a symptom of some deep moral sickness and can never be mitigated by simplistic treatments which fail to cure the gangrene at the core.9 The problem with root-causism is not that real-world problems are simple but the opposite: they are more complex than a typical root-cause theory allows, especially when the theory is based on moralizing rather than data. So complex, in fact, that treating the symptoms may be the best way of dealing with the problem, because it does not require omniscience about the intricate tissue of actual causes. Indeed, by seeing what really does reduce the symptoms, one can test hypotheses about the causes, rather than just assuming them to be true.


Figure 12-1: Homicide deaths, Western Europe, US, and Mexico, 1300–2015

Sources: England, Netherlands & Belgium, Italy, 1300–1994: Eisner 2003, plotted in fig. 3–3 of Pinker 2011. England, 2000–2014: UK Office for National Statistics. Italy and Netherlands, 2010–2012: United Nations Office on Drugs and Crime 2014. New England (New England, whites only, 1636–1790, and Vermont and New Hampshire, 1780–1890): Roth 2009, plotted in fig. 3–13 of Pinker 2011; 2006 and 2014 from FBI Uniform Crime Reports. Southwest US (Arizona, Nevada, and New Mexico), 1850 and 1914: Roth 2009, plotted in fig. 3–16 of Pinker 2011; 2006 and 2014 from FBI Uniform Crime Reports. Mexico: Carlos Vilalta, personal communication, originally from Instituto Nacional de Estadística y Geografía 2016 and Botello 2016, averaged over decades until 2010.

In the case of the 1960s crime explosion, even the facts at hand refuted the root-cause theory. That was the decade of civil rights, with racism in steep decline (chapter 15), and of an economic boom, with levels of inequality and unemployment for which we are nostalgic.10 The 1930s, in contrast, was the decade of the Great Depression, Jim Crow laws, and monthly lynchings, yet the overall rate of violent crime plummeted. The root-cause theory was truly deracinated by a development that took everyone by surprise. Starting in 1992, the American homicide rate went into free fall during an era of steeply rising inequality, and then took another dive during the Great Recession beginning in 2007 (figure 12-2).11 England, Canada, and most other industrialized countries also saw their homicide rates fall in the past two decades. (Conversely, in Venezuela during the Chávez-Maduro regime, inequality fell while homicide soared.)12 Though numbers for the entire world exist only for this millennium and include heroic guesstimates for countries that are data deserts, the trend appears to be downward as well, from 8.8 homicides per 100,000 people in 2000 to 6.2 in 2012. That means there are 180,000 people walking around today who would have been murdered just in the last year if the global homicide rate had remained at its level of a dozen years before.13
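That 180,000 figure can be checked on the back of an envelope. A minimal sketch, with the world population taken as a round assumed number rather than a figure from the cited sources:

```python
# Back-of-the-envelope check of the "180,000 lives" figure.
# The world population is an assumed round number, not from the cited sources.
rate_2000 = 8.8    # global homicides per 100,000 people, 2000
rate_2012 = 6.2    # global homicides per 100,000 people, 2012
world_pop = 7.0e9  # approximate world population circa 2012 (assumption)

lives_per_year = (rate_2000 - rate_2012) / 100_000 * world_pop
print(round(lives_per_year))  # ~182,000, consistent with "180,000"
```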


Figure 12-2: Homicide deaths, 1967–2015

Sources: United States: FBI Uniform Crime Reports, https://ucr.fbi.gov/, and Federal Bureau of Investigation 2016. England (data include Wales): Office for National Statistics 2017. World, 2000: Krug et al. 2002. World, 2003–2011: United Nations Economic and Social Council 2014, fig. 1; the percentages were converted to homicide rates by setting the 2012 rate at 6.2, the estimate reported in United Nations Office on Drugs and Crime 2014, p. 12. The arrows point to the most recent years plotted in Pinker 2011 for the world (2004, fig. 3–9), US (2009, fig. 3–18), and England (2009, fig. 3–19).

Violent crime is a solvable problem. We may never get the homicide rate for the world down to the levels of Kuwait (0.4 per 100,000 per year), Iceland (0.3), or Singapore (0.2), let alone all the way to 0.14 But in 2014, Eisner, in consultation with the World Health Organization, proposed a goal of reducing the rate of global homicide by 50 percent within thirty years.15 The aspiration is not utopian but practical, based on two facts about the statistics of homicide.

The first is that the distribution of homicide is highly skewed at every level of granularity. The homicide rates of the most dangerous countries are several hundred times those of the safest, including Honduras (90.4 homicides per 100,000 per year), Venezuela (53.7), El Salvador (41.2), Jamaica (39.3), Lesotho (38), and South Africa (31).16 Half of the world’s homicides are committed in just twenty-three countries containing about a tenth of humanity, and a quarter are committed in just four: Brazil (25.2), Colombia (25.9), Mexico (12.9), and Venezuela. (The world’s two murder zones—northern Latin America and southern sub-Saharan Africa—are distinct from its war zones, which stretch from Nigeria through the Middle East into Pakistan.) The lopsidedness continues down the fractal scale. Within a country, most of the homicides cluster in a few cities, such as Caracas (120 per 100,000) and San Pedro Sula (in Honduras, 187). Within cities, the homicides cluster in a few neighborhoods; within neighborhoods, they cluster in a few blocks; and within blocks, many are carried out by a few individuals.17 In my hometown of Boston, 70 percent of the shootings take place in 5 percent of the city, and half the shootings were perpetrated by one percent of the youths.18

The other inspiration for the 50-30 goal is evident from figure 12-2: high rates of homicide can be brought down quickly. The most murderous affluent democracy, the United States, saw its homicide rate plunge by almost half in nine years; New York City’s decline during that time was even steeper, around 75 percent.19 Countries that are still more famous for their violence have also enjoyed steep declines, including Russia (from 19 per 100,000 in 2004 to 9.2 in 2012), South Africa (from 60.0 in 1995 to 31.2 in 2012), and Colombia (from 79.3 in 1991 to 25.9 in 2015).20 Among the eighty-eight countries with reliable data, sixty-seven have seen a decline in the last fifteen years.21 The unlucky ones (mostly in Latin America) have been ravaged by terrible increases, but even there, when leaders of cities and regions set their mind to reducing the bloodshed, they often succeed.22 Figure 12-1 shows that Mexico, after suffering a reversal from 2007 to 2011 (entirely attributable to organized crime), enjoyed a reversal of the reversal by 2014, including an almost 90 percent drop from 2010 to 2012 in notorious Juárez.23 Bogotá and Medellín saw declines by four-fifths in two decades, and São Paulo and the favelas of Rio de Janeiro saw declines by two-thirds.24 Even the world’s murder capital, San Pedro Sula, has seen homicide rates plunge by 62 percent in just two years.25

Now, combine the cockeyed distribution of violent crime with the proven possibility that high rates of violent crime can be brought down quickly, and the math is straightforward: a 50 percent reduction in thirty years is not just practicable but almost conservative.26 And it’s no statistical trick. The moral value of quantification is that it treats all lives as equally valuable, so actions that bring down the highest numbers of homicides prevent the greatest amount of human tragedy.
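To see why the target is conservative rather than utopian, consider a sketch in which the skewed distribution reported above is combined with reduction rates that are assumptions for illustration only (though declines of this size have in fact occurred):

```python
# Illustrative sketch: how concentration makes a 50 percent global cut feasible.
# The two reduction percentages are assumptions for illustration only.
high_burden_share = 0.50  # half of world homicides occur in 23 countries (from the text)
cut_hot_spots = 0.75      # declines of this size have occurred within two decades
cut_elsewhere = 0.25      # a far more modest decline everywhere else

remaining = (high_burden_share * (1 - cut_hot_spots)
             + (1 - high_burden_share) * (1 - cut_elsewhere))
print(f"{(1 - remaining):.0%} global reduction")  # 50% under these assumptions
```

Under these assumptions, halving the rates in the hot spots does most of the work; everywhere else need only improve modestly.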

The lopsided skew of violent crime also points a flashing red arrow at the best way to reduce it.27 Forget root causes. Stay close to the symptoms—the neighborhoods and individuals responsible for the biggest wedges of violence—and chip away at the incentives and opportunities that drive them.

It begins with law enforcement. As Thomas Hobbes argued during the Age of Reason, zones of anarchy are always violent.28 It’s not because everyone wants to prey on everyone else, but because in the absence of a government the threat of violence can be self-inflating. If even a few potential predators lurk in the region or could show up on short notice, people must adopt an aggressive posture to deter them. This deterrent is credible only if they advertise their resolve by retaliating against any affront and avenging any depredation, regardless of the cost. This “Hobbesian trap,” as it is sometimes called, can easily set off cycles of feuding and vendetta: you have to be at least as violent as your adversaries lest you become their doormat. The largest category of homicide, and the one that varies the most across times and places, consists of confrontations between loosely acquainted young men over turf, reputation, or revenge. A disinterested third party with a monopoly on the legitimate use of force—that is, a state with a police force and judiciary—can nip this cycle in the bud. Not only does it disincentivize aggressors by the threat of punishment, but it reassures everyone else that the aggressors are disincentivized and thereby relieves them of the need for belligerent self-defense.

The most blatant evidence for the impact of law enforcement may be found in the sky-high rates of violence in the times and places where law enforcement is rudimentary, such as the upper left tips of the curves in figure 12-1. Equally persuasive is what happens when police go on strike: an eruption of looting and vigilantism.29 But crime rates can also soar when law enforcement is merely ineffective—when it is so inept, corrupt, or overwhelmed that people know they can break the law with impunity. That was a contributor to the 1960s crime boom, when the judicial system was no match for a wave of baby boomers entering their crime-prone years, and it is a contributor to the high-crime regions of Latin America today.30 Conversely, an expansion of policing and criminal punishment (though with a big overshoot in incarceration) explains a good part of the Great American Crime Decline of the 1990s.31

Here is Eisner’s one-sentence summary of how to halve the homicide rate within three decades: “An effective rule of law, based on legitimate law enforcement, victim protection, swift and fair adjudication, moderate punishment, and humane prisons is critical to sustainable reductions in lethal violence.”32 The adjectives effective, legitimate, swift, fair, moderate, and humane differentiate his advice from the get-tough-on-crime rhetoric favored by right-wing politicians. The reasons were explained by Cesare Beccaria two hundred and fifty years ago. While the threat of ever-harsher punishments is both cheap and emotionally satisfying, it’s not particularly effective, because scofflaws just treat them like rare accidents—horrible, yes, but a risk that comes with the job. Punishments that are predictable, even if less draconian, are likelier to be factored into day-to-day choices.

Together with the presence of law enforcement, the legitimacy of the regime appears to matter, because people not only respect legitimate authority themselves but factor in the degree to which they expect their potential adversaries to respect it. Eisner, together with the historian Randolph Roth, notes that crime often shoots up in decades in which people question their society and government, including the American Civil War, the 1960s, and post-Soviet Russia.33

Recent reviews of what does and doesn’t work in crime prevention back up Eisner’s advisory, particularly a massive meta-analysis by the sociologists Thomas Abt and Christopher Winship of 2,300 studies evaluating just about every policy, plan, program, project, initiative, intervention, nostrum, and gimmick that has been tried in recent decades.34 They concluded that the single most effective tactic for reducing violent crime is focused deterrence. A “laser-like focus” must first be directed on the neighborhoods where crime is rampant or even just starting to creep up, with the “hot spots” identified by data gathered in real time. It must be further beamed at the individuals and gangs who are picking on victims or roaring for a fight. And it must deliver a simple and concrete message about the behavior that is expected of them, like “Stop shooting and we will help you, keep shooting and we will put you in prison.” Getting the message through, and then enforcing it, depends on the cooperation of other members of the community—the store owners, preachers, coaches, probation officers, and relatives.

Also provably effective is cognitive behavioral therapy. This has nothing to do with psychoanalyzing an offender’s childhood conflicts or propping his eyelids open while he retches to violent film clips like in A Clockwork Orange. It is a set of protocols designed to override the habits of thought and behavior that lead to criminal acts. Troublemakers are impulsive: they seize on sudden opportunities to steal or vandalize, and lash out at people who cross them, heedless of the long-term consequences.35 These temptations can be counteracted with therapies that teach strategies of self-control. Troublemakers also have narcissistic and sociopathic thought patterns, such as that they are always in the right, that they are entitled to universal deference, that disagreements are personal insults, and that other people have no feelings or interests. Though they cannot be “cured” of these delusions, they can be trained to recognize and counteract them.36 This swaggering mindset is amplified in a culture of honor, and it can be deconstructed in therapies of anger management and social-skills training as part of counseling for at-risk youth or programs to prevent recidivism.

Whether or not their impetuousness has been brought under control, potential miscreants can stay out of trouble simply because opportunities for instant gratification have been removed from their environments.37 When cars are harder to steal, houses are harder to burgle, goods are harder to pilfer and fence, pedestrians carry more credit cards than cash, and dark alleys are lit and video-monitored, would-be criminals don’t seek another outlet for their larcenous urges. The temptation passes, and a crime is not committed. Cheap consumer goods are another development that has turned weak-willed delinquents into law-abiding citizens despite themselves. Who nowadays would take the risk of breaking into an apartment just to steal a clock radio?

Together with anarchy, impulsiveness, and opportunity, a major trigger of criminal violence is contraband. Entrepreneurs in illegal goods and pastimes cannot file a lawsuit when they feel they have been swindled, or call the police when someone threatens them, so they have to protect their interests with a credible threat of violence. Violent crime exploded in the United States when alcohol was prohibited in the 1920s and when crack cocaine became popular in the late 1980s, and it is rampant in Latin American and Caribbean countries in which cocaine, heroin, and marijuana are trafficked today. Drug-fueled violence remains an unsolved international problem. Perhaps the ongoing decriminalization of marijuana, and in the future other drugs, will lift these industries out of their lawless underworld. In the meantime, Abt and Winship observe that “aggressive drug enforcement yields little anti-drug benefits and generally increases violence,” while “drug courts and treatment have a long history of effectiveness.”38

Any evidence-based reckoning is bound to pour cold water on programs that seemed promising in the theater of the imagination. Conspicuous by their absence from the list of what works are bold initiatives like slum clearance, gun buybacks, zero-tolerance policing, wilderness ordeals, three-strikes-and-you’re-out mandatory sentencing, police-led drug awareness classes, and “scared straight” programs in which at-risk youths are exposed to squalid prisons and badass convicts. And perhaps most disappointing to those who hold strong opinions without needing evidence are the equivocal effects of gun legislation. Neither right-to-carry laws favored by the right, nor bans and restrictions favored by the left, have been shown to make much difference—though there is much we don’t know, and political and practical impediments to finding out more.39


As I sought to explain various declines of violence in The Better Angels of Our Nature I put little stock in the idea that in the past “human life was cheap” and that over time it became more precious. It seemed woolly and untestable, almost circular, so I stuck to explanations that were closer to the phenomena, such as governance and trade. After sending in the manuscript, I had an experience that gave me second thoughts. To reward myself for completing that massive undertaking I decided to replace my rusty old car, and in the course of car shopping I bought the latest issue of Car and Driver magazine. The issue opened with an article called “Safety in Numbers: Traffic Deaths Fall to an All-Time Low,” and it was illustrated with a graph that was instantly familiar: time on the x-axis, rate of death on the y-axis, and a line that snaked from the top left to the bottom right.40 Between 1950 and 2009, the rate of death in traffic accidents fell sixfold. Staring up at me was yet another decline in violent death, but this time dominance and hatred had nothing to do with it. Some combination of forces had been working over the decades to reduce the risk of death from driving—as if, yes, life had become more precious. As society became richer, it spent more of its income, ingenuity, and moral passion on saving lives on the roads.

Later I learned that Car and Driver had been conservative. Had they plotted the dataset from its first year, 1921, it would have shown an almost twenty-four-fold reduction in the death rate. Figure 12-3 shows the full time line—though not even the full story, since for every person who died there were others who were crippled, disfigured, and racked with pain.


Figure 12-3: Motor vehicle accident deaths, US, 1921–2015

Sources: National Highway Traffic Safety Administration, accessed from http://www.informedforlife.org/demos/FCKeditor/UserFiles/File/TRAFFICFATALITIES(1899-2005).pdf, http://www-fars.nhtsa.dot.gov/Main/index.aspx, and https://crashstats.nhtsa.dot.gov/Api/Public/ViewPublication/812384.

The magazine graph was annotated with landmarks in auto safety which identified the technological, commercial, political, and moralistic forces at work. Over the short run they sometimes pushed against each other, but over the long run they collectively pulled the death rate down, down, down. At times there were moral crusades to reduce the carnage, with automobile manufacturers as the villains. In 1965 a young lawyer named Ralph Nader published Unsafe at Any Speed, a j’accuse of the industry for neglecting safety in automotive design. Soon after, the National Highway Traffic Safety Administration was established and legislation was passed requiring new cars to be equipped with a number of safety features. Yet the graph shows that steeper reductions came before the activism and the legislation, and the auto industry was sometimes ahead of its customers and regulators. A signpost in the graph pointing to 1956 notes, “Ford Motor Company offers the ‘Lifeguard’ package. . . . It includes seatbelts, a padded dash, padded visors, and a recessed steering-wheel hub designed to not turn drivers into a kebab during a collision. It is a sales failure.” It took a decade for those features to become mandatory.

Sprinkled along the slope were other episodes of push and pull among engineers, consumers, corporate suits, and government bureaucrats. At various times, crumple zones, four-wheel dual braking systems, collapsible steering columns, high-mounted center brake lights, buzzing and garroting seat belts, and air bags and stability control systems wended their way from the lab to the showroom. Another lifesaver was the paving of long ribbons of countryside into divided, reflectored, guard-railed, smooth-curved, and broad-shouldered interstate highways. In 1980 Mothers Against Drunk Driving was formed, and they lobbied for higher drinking ages, lowered legal blood alcohol levels, and the stigmatization of drunk driving, which popular culture had treated as a source of comedy (such as in the movies North by Northwest and Arthur). Crash testing, traffic law enforcement, and driver education (together with unintentional boons like congested roads and economic recessions) saved still more lives. A lot of lives: since 1980, about 650,000 Americans have lived who would have died if traffic death rates had remained the same.41 The numbers are all the more remarkable when we consider that with each passing decade, Americans drove more miles (55 billion in 1920, 458 billion in 1950, 1.5 trillion in 1980, and 3 trillion in 2013), so they were enjoying all the pleasures of leafy suburbs, soccer-playing children, seeing the USA in their Chevrolet, or just cruising down the streets, feeling out of sight, spending all their money on a Saturday night.42 The additional miles driven did not eat up the safety gains: automobile deaths per capita (as opposed to per vehicle mile) peaked in 1937 at close to 30 per 100,000 per year, and have been in steady decline since the late 1970s, hitting 10.2 in 2014, the lowest rate since 1917.43

The progress in the number of motorists who arrive alive is not uniquely American. Fatality rates have sunk in other wealthy countries such as France, Australia, and of course safety-conscious Sweden. (I ended up buying a Volvo.) But it can be attributed to living in a wealthy country. Emerging nations like India, China, Brazil, and Nigeria have per capita traffic death rates that are double that of the United States and seven times that of Sweden.44 Wealth buys life.

A decline in road deaths would be a dubious achievement if it left us more endangered than we were before the automobile was invented. But life before the car was not so safe either. The pictorial curator Otto Bettmann recounts contemporary accounts of city streets in the horse-drawn era:

“It takes more skill to cross Broadway . . . than to cross the Atlantic in a clamboat.” . . . The engine of city mayhem was the horse. Underfed and nervous, this vital brute was often flogged to exhaustion by pitiless drivers, who exulted in pushing ahead “with utmost fury, defying law and delighting in destruction.” Runaways were common. The havoc killed thousands of people. According to the National Safety Council, the horse-associated fatality rate was ten times the car-associated rate of modern times [in 1974, which is more than double the per capita rate today—SP].45

The Brooklyn Dodgers, before they moved to Los Angeles, had been named after the city’s pedestrians, famous for their skill at darting out of the way of hurtling streetcars. (Not everyone in that era succeeded: my grandfather’s sister was killed by a streetcar in Warsaw in the 1910s.) Like the lives of drivers and passengers, the lives of pedestrians have become more precious, thanks to lights, crosswalks, overpasses, traffic law enforcement, and the demise of hood ornaments, bumper bullets, and other chrome-plated weaponry. Figure 12-4 shows that walking the streets of America today is six times as safe as it was in 1927.


Figure 12-4: Pedestrian deaths, US, 1927–2015

Sources: National Highway Traffic Safety Administration. For 1927–1984: Federal Highway Administration 2003. For 1985–1995: National Center for Statistics and Analysis 1995. For 1995–2005: National Center for Statistics and Analysis 2006. For 2005–2014: National Center for Statistics and Analysis 2016. For 2015: National Center for Statistics and Analysis 2017.

The almost 5,000 pedestrians killed in 2014 is still a shocking toll (just compare it with the 44 killed by terrorists to much greater publicity), but it’s better than the 15,500 who were mowed down in 1937, when the country had two-fifths as many people and far fewer cars. And the biggest salvation is to come. Within a decade of this writing, most new cars will be driven by computers rather than by slow-witted and scatterbrained humans. When robotic cars are ubiquitous, they could save more than a million lives a year, becoming one of the greatest gifts to human life since the invention of antibiotics.
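The per-capita comparison implicit in those numbers can be made explicit. In the sketch below, the population figures (in millions) are rounded assumptions consistent with “two-fifths as many people,” not figures from the chapter’s sources:

```python
# Rough per-capita pedestrian death rates, 1937 vs. 2014.
# Populations (in millions) are rounded assumptions consistent with
# "two-fifths as many people," not figures from the chapter's sources.
deaths_1937, pop_1937_m = 15_500, 129
deaths_2014, pop_2014_m = 5_000, 319

rate_1937 = deaths_1937 / (pop_1937_m * 1e6) * 100_000  # ~12 per 100,000
rate_2014 = deaths_2014 / (pop_2014_m * 1e6) * 100_000  # ~1.6 per 100,000
print(round(rate_1937 / rate_2014, 1))  # ~7.7, in line with the sixfold
                                        # gain since 1927 shown in figure 12-4
```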

A cliché in discussions of risk perception is that many people have a fear of flying but almost no one a fear of driving, despite the vastly greater safety of plane travel. But the overseers of air traffic safety are never satisfied. They scrutinize the black box and wreckage after every crash, and have steadily made an already safe mode of transportation even safer. Figure 12-5 shows that in 1970 the chance that an airline passenger would die in a plane crash was less than five in a million; by 2015 that small risk had fallen a hundredfold.


Figure 12-5: Plane crash deaths, 1970–2015

Source: Aviation Safety Network 2017. Data on the number of passengers are from World Bank 2016b.


Who by water and who by fire. Well before the invention of cars and planes, people were vulnerable to lethal dangers in their environments. The sociologist Robert Scott began his history of life in medieval Europe as follows: “On December 14, 1421, in the English city of Salisbury, a fourteen-year-old girl named Agnes suffered a grievous injury when a hot spit pierced her torso.” (She was reportedly cured by a prayer to Saint Osmund.)46 It was just one example of how the communities of medieval Europe were “very dangerous places.” Infants and toddlers, who were left unattended while their parents worked, were especially vulnerable, as the historian Carol Rawcliffe explains:

The juxtaposition in dark, cramped surroundings of open hearths, straw bedding, rush-covered floors and naked flames posed a constant threat to curious infants. [Even at play] children were in danger because of ponds, agricultural or industrial implements, stacks of timber, unattended boats and loaded wagons, all of which appear with depressing frequency in coroners’ reports as causes of death among the young.47

The Encyclopedia of Children and Childhood in History and Society notes that “to modern audiences, the image of a sow devouring a baby, which appears in Chaucer’s ‘The Knight’s Tale,’ borders on the bizarre, but it almost certainly reflected the common threat that animals posed to children.”48

Adults were no safer. A Web site called Everyday Life and Fatal Hazard in Sixteenth-Century England (sometimes known as the Tudor Darwin Awards) posts monthly updates on the historians’ analyses of coroners’ reports. The causes of death include eating tainted mackerel, getting stuck while climbing through a window, being crushed by a stack of peat slabs, being strangled by a strap that hung baskets from one’s shoulders, plunging off a cliff while hunting cormorants, and falling onto one’s knife while slaughtering a pig.49 In the absence of artificial lighting, anyone who ventured out after dark faced the risk of drowning in wells, rivers, ditches, moats, canals, and cesspools.

Today we don’t worry about babies getting eaten by sows, but other hazards are still with us. After car crashes, the likeliest cause of accidental death consists of falls, followed by drownings and fires, followed by poisonings. We know this because epidemiologists and safety engineers tabulate accidental deaths with almost plane-wreckage attention to detail, classifying and sub-classifying them to determine which kill the most people and how the risks may be reduced. (The International Classification of Diseases, tenth revision, has codes for 153 kinds of falls alone, together with 39 exclusions.) As their advisories are translated into laws, building codes, inspection regimes, and best practices, the world becomes safer. Since the 1930s, the chance that Americans will fall to their deaths has declined by 72 percent, because they have been protected by railings, signage, window guards, grab bars, worker harnesses, safer flooring and ladders, and inspections. (Most of the remaining deaths are of frail, elderly people.) Figure 12-6 shows the fall of falling,50 together with the trajectories of the other major risks of accidental death since 1903.


Figure 12-6: Deaths from falls, fire, drowning, and poison, US, 1903–2014

Source: National Safety Council 2016. Data for Fire, Drowning, and Poison (solid or liquid) are aggregated over 1903–1998 and 1999–2014 datasets. For 1999–2014, data for Poison (solid or liquid) include poisonings by gas or vapor. Data for Falls extend only to 1992 because of reporting artifacts in subsequent years (see note 50 for details).

The slopes for the liturgical categories of dying by fire and dying by water are almost identical, and the number of victims of each has declined by more than 90 percent. Fewer Americans drown today, thanks to lifejackets, lifeguards, fences around pools, instruction in swimming and lifesaving, and increased awareness of the vulnerability of small children, who can drown in bathtubs, toilets, even buckets.

Fewer are overcome by flames and smoke. In the 19th century, professional brigades were established to extinguish fires before they turned into conflagrations that could raze entire cities. In the middle of the 20th century, fire departments turned from just fighting fires to preventing them. The campaign was prompted by horrific blazes such as the 1942 Cocoanut Grove nightclub fire in Boston, which left 492 dead, and it was publicized with the help of heart-wrenching photos of firefighters carrying the lifeless bodies of small children out of smoldering houses. Fire was designated a nationwide moral emergency in reports from presidential commissions with titles like America Burning.51 The campaign led to the now-ubiquitous sprinklers, smoke detectors, fire doors, fire escapes, fire drills, fire extinguishers, fire-retardant materials, and fire safety education mascots like Smokey the Bear and Sparky the Fire Dog. As a result, fire departments are putting themselves out of business. About 96 percent of their calls are for cardiac arrests and other medical emergencies, and most of the remainder are for small fires. (Contrary to a charming image, they don’t rescue kittens from trees.) A typical firefighter will see just one burning building every other year.52

Fewer Americans are accidentally gassing themselves to death. One advance was a transition starting in the 1940s from toxic coal gas to nontoxic natural gas in household cooking and heating. Another was better design and maintenance of gas stoves and heaters so they wouldn’t burn their fuel incompletely and spew carbon monoxide into the house. Starting in the 1970s, cars were equipped with catalytic converters, which had been designed to reduce air pollution but which also prevented them from becoming mobile gas chambers. And throughout the century people were increasingly reminded that it’s a bad idea to run cars, generators, charcoal grills, and combustion heaters indoors or beneath windows.

Figure 12-6 shows an apparent exception to the conquest of accidents: the category called “Poison (solid or liquid).” The steep rise starting in the 1990s is anomalous in a society that is increasingly latched, alarmed, padded, guard-railed, and warning-stickered, and at first I could not understand why more Americans were apparently eating roach powder or drinking bleach. Then I realized that the category of accidental poisonings includes drug overdoses. (I should have recalled that Leonard Cohen’s song based on the Yom Kippur prayer contains the lines “Who in her lonely slip / Who by barbiturate.”) In 2013, 98 percent of the “Poison” deaths were from drugs (92 percent) or alcohol (6 percent), and almost all the others were from gases and vapors (mostly carbon monoxide). Household and occupational hazards like solvents, detergents, insecticides, and lighter fluid were responsible for less than a half of one percent of the poisoning deaths, and would scrape the bottom of figure 12-6.53 Though small children still rummage under sinks, taste the offerings, and get rushed to poison control centers, few of them die.

So the single rising curve in figure 12-6 is not a counterexample to humanity’s progress in reducing environmental hazards, though it certainly is a step backward with respect to a different kind of hazard, drug abuse. The curve begins to rise in the psychedelic 1960s, jerks up again during the crack cocaine epidemic of the 1980s, and blasts off during the far graver epidemic of opioid addiction in the 21st century. Starting in the 1990s, doctors overprescribed synthetic opioid painkillers like oxycodone, hydrocodone, and fentanyl, which are not just addictive but gateway drugs to heroin. Overdoses of both the legal and illegal opioids have become a major menace, killing more than 40,000 a year and lifting “poison” into the largest category of accidental death, exceeding even traffic accidents.54

Drug overdoses clearly are a different kind of phenomenon from car crashes, falls, fires, drownings, and gassings. People don’t get addicted to carbon monoxide, or crave taller and taller ladders, so the kinds of mechanical safeguards that worked so well for environmental hazards will not be enough to end the opioid epidemic. Politicians and public health officials are coming to grips with the enormity of the problem, and countermeasures are being implemented: monitoring prescriptions, encouraging the use of safer analgesics, shaming or punishing pharma companies that recklessly promote the drugs, making the antidote naloxone more available, and treating addicts with opiate antagonists and cognitive behavior therapy.55 A sign that the measures might be effective is that the number of overdoses of prescription opioids (though not of illicit heroin and fentanyl) peaked in 2010 and may be starting to come down.56

Also noteworthy is that opioid overdoses are largely an epidemic of the druggy Baby Boomer cohort reaching middle age. The peak age of poisoning deaths in 2011 was around fifty, up from the low forties in 2003, the late thirties in 1993, the early thirties in 1983, and the early twenties in 1973.57 Do the subtractions and you find that in every decade it’s the members of the generation born between 1953 and 1963 who are drugging themselves to death. Despite perennial panic about teenagers, today’s kids are, relatively speaking, all right, or at least better. According to a major longitudinal study of teenagers called Monitoring the Future, high schoolers’ use of alcohol, cigarettes, and drugs (other than marijuana and vaping) has dropped to the lowest levels since the survey began in 1976.58


With the shift from a manufacturing to a service economy, many social critics have expressed nostalgia for the era of factories, mines, and mills, probably because they never worked in one. On top of all the lethal hazards we’ve examined, industrial workplaces add countless others, because whatever a machine can do to its raw materials—sawing, crushing, baking, rendering, stamping, threshing, or butchering them—it can also do to the workers tending it. In 1892 President Benjamin Harrison noted that “American workmen are subjected to peril of life and limb as great as a soldier in time of war.” Bettmann comments on some of the gruesome pictures and captions he collected from the era:

The miner, it was said, “went down to work as to an open grave, not knowing when it might close on him.” . . . Unprotected powershafts maimed and killed hoopskirted workers. . . . The circus stuntman and test pilot today enjoy greater life assurance than did the [railroad] brakeman of yesterday, whose work called for precarious leaps between bucking freight cars at the command of the locomotive’s whistle. . . . Also subject to sudden death . . . were the train couplers, whose omnipresent hazard was loss of hands and fingers in the primitive link-and-pin devices. . . . Whether a worker was mutilated by a buzz saw, crushed by a beam, interred in a mine, or fell down a shaft, it was always “his own bad luck.”59

“Bad luck” was a convenient explanation for employers, and until recently it was a part of a widespread fatalism about lethal accidents, which were commonly attributed to destiny or acts of God. (Today, safety engineers and public health researchers don’t even use the word accident, since it implies a fickle finger of fate; the term of art is unintentional injury.) The first safety measures and insurance policies in the 18th and 19th centuries protected property, not people. As injuries and deaths started to increase unignorably during the Industrial Revolution, they were written off as “the price of progress,” according to a nonhumanistic definition of “progress” that was not reckoned in human welfare. A railroad superintendent, justifying his refusal to put a roof over a loading platform, explained that “men are cheaper than shingles. . . . There’s a dozen waiting when one drops out.”60 The inhuman pace of industrial production has been immortalized in cultural icons such as Charlie Chaplin on the assembly line in Modern Times and Lucille Ball in the chocolate factory in I Love Lucy.

Workplaces began to change in the late 19th century as the first labor unions organized, journalists took up the cause, and government agencies started to collect data quantifying the human toll.61 Bettmann’s comment on the lethality of work on trains was based on more than just pictures: in the 1890s, the annual death rate for trainmen was an astonishing 852 per 100,000, almost one percent a year. The carnage was reduced when an 1893 law mandated the use of air brakes and automatic couplers in all freight trains, the first federal law intended to improve workplace safety.

The safeguards spread to other occupations in the early decades of the 20th century, the Progressive Era. They were the result of agitation by reformers, labor unions, and muckraking journalists and novelists like Upton Sinclair.62 The most effective reform was a simple change in the law brought over from Europe: employers’ liability and workmen’s compensation. Previously, injured workers or their survivors had to sue for compensation, usually unsuccessfully. Now, employers were required to compensate them at a fixed rate. The change appealed to management as much as to workers, since it made their costs more predictable and the workers more cooperative. Most important, it yoked the interests of management and labor: both had a stake in making workplaces safer, as did the insurers and government agencies that underwrote the compensation. Companies set up safety committees and safety departments, hired safety engineers, and implemented many protections, sometimes out of economic or humanitarian motives, sometimes as a response to public shaming after a well-publicized disaster, often under the duress of lawsuits and government regulations. The results are plain to see in figure 12-7.63

At almost 5,000 deaths in 2015, the number of workers killed on the job is still too high, but it’s much better than the 20,000 deaths in 1929, when the population was less than two-fifths the size. Much of the savings is the result of the movement of the labor force from farms and factories to stores and offices. But much of it is a gift of the discovery that saving lives while producing the same number of widgets is a solvable engineering problem.


Figure 12-7: Occupational accident deaths, US, 1913–2015

Sources: Data are from different sources and may not be completely commensurable (see note 63 for details). For 1913, 1933, and 1980: Bureau of Labor Statistics, National Safety Council, and CDC National Institute for Occupational Safety and Health, respectively, cited in Centers for Disease Control 1999. For 1970: Occupational Safety and Health Administration, “Timeline of OSHA’s 40 Year History,” https://www.osha.gov/osha40/timeline.html. For 1993–1994: Bureau of Labor Statistics, cited in Pegula & Janocha 2013. For 1995–2005: National Center for Health Statistics 2014, table 38. For 2006–2014: Bureau of Labor Statistics 2016a. The latter data were reported as deaths per full-time-equivalent workers and are multiplied by .95 for rough commensurability with the preceding years, based on the year 2007, when the Census of Fatal Occupation Injuries reported rates both per worker (3.8) and per FTE (4.0).

Who by earthquake. Could the efforts of mortals even mitigate what lawyers call “acts of God”—the droughts, floods, wildfires, storms, volcanoes, avalanches, landslides, sinkholes, heat waves, cold snaps, meteor strikes, and yes, earthquakes that are the quintessentially uncontrollable catastrophes? The answer, shown in figure 12-8, is yes.

After the ironic 1910s, when the world was ravaged by a world war and an influenza pandemic but relatively spared from natural disasters, the rate of death from disasters has rapidly declined from its peak. It’s not that with each passing decade the world has miraculously been blessed with fewer earthquakes, volcanoes, and meteors. It’s that a richer and more technologically advanced society can prevent natural hazards from becoming human catastrophes. When an earthquake strikes, fewer people are crushed by collapsing masonry or burned in conflagrations. When the rains stop, they can use water impounded in reservoirs. When the temperature soars or plummets, they stay in climate-controlled interiors. When a river floods its banks, their drinking water is safeguarded from human and industrial waste. The dams and levees that impound water for drinking and irrigation, when properly designed and built, make floods less likely in the first place. Early warning systems allow people to evacuate or take shelter before a cyclone makes landfall. Though geologists can’t yet predict earthquakes, they can often predict volcanic eruptions, and can prepare the people who live along the Ring of Fire and other fault systems to take lifesaving precautions. And of course a richer world can rescue and treat its injured and quickly rebuild.


Figure 12-8: Natural disaster deaths, 1900–2015

Source: Our World in Data, Roser 2016q, based on data from EM-DAT, The International Disaster Database, www.emdat.be. The graph plots the sum of the death rates for Drought, Earthquake, Extreme temperature, Flood, Impact, Landslide, Mass movement (dry), Storm, Volcanic activity, and Wildfire (excluding Epidemics). In many decades a single disaster type dominates the numbers: droughts in the 1910s, 1920s, 1930s, and 1960s; floods in the 1930s and 1950s; earthquakes in the 1970s, 2000s, and 2010s.

It’s the poorer countries today that are most vulnerable to natural hazards. A 2010 earthquake in Haiti killed more than 200,000 people, while a stronger one in Chile a few weeks later killed just 500. Haiti also loses ten times as many of its citizens to hurricanes as the richer Dominican Republic, the country with which it shares the island of Hispaniola. The good news is that as poorer countries get richer, they get safer (at least as long as economic development outpaces climate change). The annual death rate from natural disasters in low-income countries has come down from 0.7 per 100,000 in the 1970s to 0.2 today, which is lower than the rate for upper-middle-income countries in the 1970s. That’s still higher than the rate for high-income countries today (0.05, down from 0.09), but it shows that rich and poor countries alike can make progress in defending themselves against a vengeful deity.64

And what about the very archetype of an act of God? The projectile that Zeus hurled down from Olympus? The standard idiom for an unpredictable date with death? The literal bolt from the blue? Figure 12-9 shows the history.

Yes, thanks to urbanization and to advances in weather prediction, safety education, medical treatment, and electrical systems, there has been a thirty-seven-fold decline since the turn of the 20th century in the chance that an American will be killed by a bolt of lightning.


Figure 12-9: Lightning strike deaths, US, 1900–2015

Source: Our World in Data, Roser 2016q, based on data from National Oceanic and Atmospheric Administration, http://www.lightningsafety.noaa.gov/victims.shtml, and López & Holle 1998.


Humanity’s conquest of everyday danger is a peculiarly unappreciated form of progress. (Some readers of a draft of this chapter wondered what it was even doing in a book on progress.) Though accidents kill more people than all but the worst wars, we seldom see them through a moral lens. As we say: Accidents will happen. Had we ever been confronted with the dilemma of whether a million deaths and tens of millions of injuries a year was a price worth paying for the convenience of driving our own cars at enjoyable speeds, few would have argued that it was. Yet that is the monstrous choice we tacitly made, because the dilemma was never put to us in those terms.65 Now and again a hazard is moralized and a crusade against it is mounted, particularly if a disaster makes the news and a villain can be fingered (a greedy factory owner, a negligent public official). But soon it recedes back into the lottery of life.

Just as people tend not to see accidents as atrocities (at least when they are not the victims), they don’t see gains in safety as moral triumphs, if they are aware of them at all. Yet the sparing of millions of lives, and the reduction of infirmity, disfigurement, and suffering on a massive scale, deserve our gratitude and demand an explanation. That is true even of murder, the most moralized of acts, whose rate has plummeted for reasons that defy standard narratives.

Like other forms of progress, the ascent of safety was led by some heroes, but it was also advanced by a motley of actors who pushed in the same direction inch by inch: grassroots activists, paternalistic legislators, and an unsung cadre of inventors, engineers, policy wonks, and number-crunchers. Though we sometimes chafe at the false alarms and the nanny-state intrusions, we get to enjoy the blessings of technology without the threats to life and limb.

And though the story of seat belts, smoke alarms, and hot-spot policing is not a customary part of the Enlightenment saga, it plays out the Enlightenment’s deepest themes. Who will live and who will die are not inscribed in a Book of Life. They are affected by human knowledge and agency, as the world becomes more intelligible and life becomes more precious.
