CHAPTER 7 SUSTENANCE


Together with senescence, childbirth, and pathogens, another mean trick has been played on us by evolution and entropy: our ceaseless need for energy. Famine has long been part of the human condition. The Hebrew Bible tells of seven lean years in Egypt; the Christian Bible has Famine as one of the four horsemen of the apocalypse. Well into the 19th century a crop failure could bring sudden misery even to privileged parts of the world. Johan Norberg quotes the childhood reminiscence of a contemporary of one of his ancestors in Sweden in the winter of 1868:

We often saw mother weeping to herself, and it was hard on a mother, not having any food to put on the table for her hungry children. Emaciated, starving children were often seen going from farm to farm, begging for a few crumbs of bread. One day three children came to us, crying and begging for something to still the pangs of hunger. Sadly, her eyes brimming with tears, our mother was forced to tell them that we had nothing but a few crumbs of bread which we ourselves needed. When we children saw the anguish in the unknown children’s supplicatory eyes, we burst into tears and begged mother to share with them what crumbs we had. Hesitantly she acceded to our request, and the unknown children wolfed down the food before going on to the next farm, which was a good way off from our home. The following day all three were found dead between our farm and the next.1

The historian Fernand Braudel has documented that premodern Europe suffered from famines every few decades.2 Desperate peasants would harvest grain before it was ripe, eat grass or human flesh, and pour into cities to beg. Even in good times, many would get the bulk of their calories from bread or gruel, and not many at that: in The Escape from Hunger and Premature Death, 1700–2100, the economist Robert Fogel noted that “the energy value of the typical diet in France at the start of the eighteenth century was as low as that of Rwanda in 1965, the most malnourished nation for that year.”3 Many of those who were not starving were too weak to work, which locked them into poverty. Hungry Europeans titillated themselves with food pornography, such as tales of Cockaigne, a country where pancakes grew on trees, the streets were paved with pastry, roasted pigs wandered around with knives in their backs for easy carving, and cooked fish jumped out of the water and landed at one’s feet.

Today we live in Cockaigne, and our problem is not too few calories but too many. As the comedian Chris Rock observed, “This is the first society in history where the poor people are fat.” With the usual first-world ingratitude, modern social critics rail against the obesity epidemic with a level of outrage that might be appropriate for a famine (that is, when they are not railing at fat-shaming, slender fashion models, or eating disorders). Though obesity surely is a public health problem, by the standards of history it’s a good problem to have.

What about the rest of the world? The hunger that many Westerners associate with Africa and Asia is by no means a modern phenomenon. India and China have always been vulnerable to famine, because millions of people subsisted on rice that was watered by erratic monsoons or fragile irrigation systems and had to be transported across great distances. Braudel recounts the testimony of a Dutch merchant who was in India during a famine in 1630–31:

“Men abandoned towns and villages and wandered helplessly. It was easy to recognize their condition: eyes sunk deep in the head, lips pale and covered with slime, the skin hard, with the bones showing through, the belly nothing but a pouch hanging down empty. . . . One would cry and howl for hunger, while another lay stretched on the ground dying in misery.” The familiar human dramas followed: wives and children abandoned, children sold by parents, who either abandoned them or sold themselves in order to survive, collective suicides. . . . Then came the stage when the starving split open the stomachs of the dead or dying and “drew at the entrails to fill their own bellies.” “Many hundred thousands of men died of hunger, so that the whole country was covered with corpses lying unburied, which caused such a stench that the whole air was filled and infected with it. . . . In the village of Susuntra . . . human flesh was sold in open market.”4

But in recent times the world has been blessed with another remarkable and little-noticed advance: in spite of burgeoning numbers, the developing world is feeding itself. This is most obvious in China, whose 1.3 billion people now have access to an average of 3,100 calories per person per day, which, according to US government guidelines, is the number needed by a highly active young man.5 India’s billion people get an average of 2,400 calories a day, the number recommended for a highly active young woman or an active middle-aged man. The figure for the continent of Africa comes in between the two at 2,600.6 Figure 7-1, which plots available calories for a representative sample of developed and developing nations and for the world as a whole, shows a pattern familiar from earlier graphs: hardship everywhere before the 19th century, rapid improvement in Europe and the United States over the next two centuries, and, in recent decades, the developing world catching up.


Figure 7-1: Calories, 1700–2013

Sources: United States, England, and France: Our World in Data, Roser 2016d, based on data from Fogel 2004. China, India, and the World: Food and Agriculture Organization of the United Nations, http://www.fao.org/faostat/en/#data.

The numbers plotted in figure 7-1 are averages, and they would be a misleading index of well-being if they were just lifted by rich people scarfing down more calories (if no one was getting fat except Mama Cass). Fortunately, the numbers reflect an increase in the availability of calories throughout the range, including the bottom. When children are underfed, their growth is stunted, and throughout their lives they have a higher risk of getting sick and dying. Figure 7-2 shows the proportion of children who are stunted in a representative sample of countries which have data for the longest spans of time. Though the proportion of stunted children in poor countries like Kenya and Bangladesh is deplorable, we see that in just two decades the rate of stunting has been cut in half. Countries like Colombia and China also had high rates of stunting not long ago and have managed to bring them even lower.


Figure 7-2: Childhood stunting, 1966–2014

Source: Our World in Data, Roser 2016j, based on data from the World Health Organization’s Nutrition Landscape Information System, http://www.who.int/nutrition/nlis/en/.

Figure 7-3 offers another look at how the world has been feeding the hungry. It shows the rate of undernourishment (a year or more of insufficient food) for developing countries in five regions and for the world as a whole. In developed countries, which are not included in the estimates, the rate of undernourishment was less than 5 percent during the entire period, statistically indistinguishable from zero. Though 13 percent of people in the developing world being undernourished is far too much, it’s better than 35 percent, which was the level forty-five years earlier, or for that matter 50 percent, an estimate for the entire world in 1947 (not shown on the graph).7 Remember that these figures are proportions. The world added almost five billion people in those seventy years, which means that as the world was reducing the rate of hunger it was also feeding billions of additional mouths.


Figure 7-3: Undernourishment, 1970–2015

Source: Our World in Data, Roser 2016j, based on data from the Food and Agriculture Organization 2014, also reported in http://www.fao.org/economic/ess/ess-fs/ess-fadata/en/.

Not only has chronic undernourishment been in decline, but so have catastrophic famines—the crises that kill people in large numbers and cause widespread wasting (the condition of being two standard deviations below one’s expected weight) and kwashiorkor (the protein deficiency which causes the swollen bellies of the children in photographs that have become icons of famine).8 Figure 7-4 shows the number of deaths in major famines in each decade for the past 150 years, scaled by world population at the time.

Writing in 2000, the economist Stephen Devereux summarized the world’s progress in the 20th century:

Vulnerability to famine appears to have been virtually eradicated from all regions outside Africa. . . . Famine as an endemic problem in Asia and Europe seems to have been consigned to history. The grim label “land of famine” has left China, Russia, India and Bangladesh, and since the 1970s has resided only in Ethiopia and Sudan.

[In addition,] the link from crop failure to famine has been broken. Most recent drought- or flood-triggered food crises have been adequately met by a combination of local and international humanitarian response. . . .

If this trend continues, the 20th century should go down as the last during which tens of millions of people died for lack of access to food.9


Figure 7-4: Famine deaths, 1860–2016

Sources: Our World in Data, Hasell & Roser 2017, based on data from Devereux 2000; Ó Gráda 2009; White 2011; and EM-DAT, The International Disaster Database, http://www.emdat.be/; and other sources. “Famine” is defined as in Ó Gráda 2009.

So far, the trend has continued. There is still hunger (including among the poor in developed countries), and there were famines in East Africa in 2011, the Sahel in 2012, and South Sudan in 2016, together with near-famines in Somalia, Nigeria, and Yemen. But they did not kill on the scale of the catastrophes that were regular occurrences in earlier centuries.

None of this was supposed to happen. In 1798 Thomas Malthus explained that the frequent famines of his era were unavoidable and would only get worse, because “population, when unchecked, increases in a geometrical ratio. Subsistence increases only in an arithmetic ratio. A slight acquaintance with numbers will show the immensity of the first power in comparison with the second.” The implication was that efforts to feed the hungry would only lead to more misery, because they would breed more children who were doomed to hunger in their turn.
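
In modern notation, Malthus’s claim amounts to pitting exponential growth against linear growth (a sketch of the argument, not his own formulation):

P(t) = P0 · (1 + r)^t        versus        S(t) = S0 + k·t

Here P0 and S0 are the starting population and food supply, r is the annual rate of population growth, and k is the annual increment in food production. For any positive r, the exponential curve must eventually overtake the linear one, however large k may be, which is why the conclusion seemed inescapable. As we will see below, it was the premises, not the arithmetic, that turned out to be wrong.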

Not long ago, Malthusian thinking was revived with a vengeance. In 1967 William and Paul Paddock wrote Famine 1975!, and in 1968 the biologist Paul R. Ehrlich wrote The Population Bomb, in which he proclaimed that “the battle to feed all of humanity is over” and predicted that by the 1980s sixty-five million Americans and four billion other people would starve to death. New York Times Magazine readers were introduced to the battlefield term triage (the emergency practice of separating wounded soldiers into the savable and the doomed) and to philosophy-seminar arguments about whether it is morally permissible to throw someone overboard from a crowded lifeboat to prevent it from capsizing and drowning everyone.10 Ehrlich and other environmentalists argued for cutting off food aid to countries they deemed basket cases.11 Robert McNamara, president of the World Bank from 1968 to 1981, discouraged financing of health care “unless it was very strictly related to population control, because usually health facilities contributed to the decline of the death rate, and thereby to the population explosion.” Population-control programs in India and China (especially under China’s one-child policy) coerced women into sterilizations, abortions, and being implanted with painful and septic IUDs.12

Where did Malthus’s math go wrong? Looking at the first of his curves, we already saw that population growth needn’t increase in a geometric ratio indefinitely, because when people get richer and more of their babies survive, they have fewer babies (see also figure 10-1). Conversely, famines don’t reduce population growth for long. They disproportionately kill children and the elderly, and when conditions improve, the survivors quickly replenish the population.13 As Hans Rosling put it, “You can’t stop population growth by letting poor children die.”14

Looking at the second curve, we discover that the food supply can grow geometrically when knowledge is applied to increase the amount of food that can be coaxed out of a patch of land. Since the birth of agriculture ten thousand years ago, humans have been genetically engineering plants and animals by selectively breeding the ones that had the most calories and fewest toxins and that were the easiest to plant and harvest. The wild ancestor of corn was a grass with a few tough seeds; the ancestor of carrots looked and tasted like a dandelion root; the ancestors of many wild fruits were bitter, astringent, and more stone than flesh. Clever farmers also tinkered with irrigation, plows, and organic fertilizers, but Malthus always had the last word.

It was only at the time of the Enlightenment and the Industrial Revolution that people figured out how to bend the curve upward.15 In Jonathan Swift’s 1726 novel, the moral imperative was explained to Gulliver by the King of Brobdingnag: “Whoever makes two ears of corn, or two blades of grass to grow where only one grew before, deserves better of humanity, and does more essential service to his country than the whole race of politicians put together.” Soon after that, as figure 7-1 shows, more ears of corn were indeed made to grow, in what has been called the British Agricultural Revolution.16 Crop rotation and improvements to plows and seed drills were followed by mechanization, with fossil fuels replacing human and animal muscle. In the mid-19th century it took twenty-five men a full day to harvest and thresh a ton of grain; today one person operating a combine harvester can do it in six minutes.17

Machines also solve an inherent problem with food. As any zucchini gardener in August knows, a lot becomes available all at once, and then it quickly rots or gets eaten by vermin. Railroads, canals, trucks, granaries, and refrigeration evened out the peaks and troughs in the supply and matched it with demand, coordinated by the information carried in prices. But the truly gargantuan boost would come from chemistry. The N in SPONCH, the acronym taught to schoolchildren for the chemical elements that make up the bulk of our bodies, stands for nitrogen, a major ingredient of protein, DNA, chlorophyll, and the energy carrier ATP. Nitrogen atoms are plentiful in the air but bound in pairs (hence the chemical formula N2), which are hard to split apart so that plants can use them. In 1909 Carl Bosch perfected a process invented by Fritz Haber which used methane and steam to pull nitrogen out of the air and turn it into fertilizer on an industrial scale, replacing the massive quantities of bird poop that had previously been needed to return nitrogen to depleted soils. Those two chemists top the list of the 20th-century scientists who saved the greatest number of lives in history, with 2.7 billion.18
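
In rough outline, and leaving aside the catalysts, high pressures, and purification steps of the actual industrial process, the chemistry can be sketched in two steps: steam reforming of methane supplies hydrogen, which is then combined with atmospheric nitrogen to yield ammonia:

CH4 + H2O → CO + 3 H2        then        N2 + 3 H2 → 2 NH3

The ammonia in turn is converted into the nitrate and urea fertilizers that farmers spread on their fields.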

So forget arithmetic ratios: over the past century, grain yields per hectare have swooped upward while real prices have plunged. The savings are mind-boggling. If the food grown today had to be grown with pre-nitrogen-farming techniques, an area the size of Russia would go under the plow.19 In the United States in 1901, an hour’s wages could buy around three quarts of milk; a century later, the same wages would buy sixteen quarts. The amount of every other foodstuff that can be bought with an hour of labor has multiplied as well: from a pound of butter to five pounds, a dozen eggs to twelve dozen, two pounds of pork chops to five pounds, and nine pounds of flour to forty-nine pounds.20

In the 1950s and ’60s, another giga-lifesaver, Norman Borlaug, outsmarted evolution to foment the Green Revolution in the developing world.21 Plants in nature invest a lot of energy and nutrients in woody stalks that raise their leaves and blossoms above the shade of neighboring weeds and of each other. Like fans at a rock concert, everyone stands up, but no one gets a better view. That’s the way evolution works: it myopically selects for individual advantage, not the greater good of the species, let alone the good of some other species. From a farmer’s perspective, not only do tall wheat plants waste energy in inedible stalks, but when they are enriched with fertilizer they collapse under the weight of the heavy seedhead. Borlaug took evolution into his own hands, crossing thousands of strains of wheat and then selecting the offspring with dwarfed stalks, high yields, resistance to rust, and an insensitivity to day length. After several years of this “mind-warpingly tedious work,” Borlaug evolved strains of wheat (and then corn and rice) with many times the yield of their ancestors. By combining these strains with modern techniques of irrigation, fertilization, and crop management, Borlaug turned Mexico and then India, Pakistan, and other famine-prone countries into grain exporters almost overnight. The Green Revolution continues—it has been called “Africa’s best-kept secret”—driven by improvements in sorghum, millet, cassava, and tubers.22

Thanks to the Green Revolution, the world needs less than a third of the land it used to need to produce a given amount of food.23 Another way of stating the bounty is that between 1961 and 2009 the amount of land used to grow food increased by 12 percent, but the amount of food that was grown increased by 300 percent.24 In addition to beating back hunger, the ability to grow more food from less land has been, on the whole, good for the planet. Despite their bucolic charm, farms are biological deserts which sprawl over the landscape at the expense of forests and grasslands. Now that farms have receded in some parts of the world, temperate forests have been bouncing back, a phenomenon we will return to in chapter 10.25 If agricultural efficiency had remained the same over the past fifty years while the world grew the same amount of food, an area the size of the United States, Canada, and China combined would have had to be cleared and plowed.26 The environmental scientist Jesse Ausubel has estimated that the world has reached Peak Farmland: we may never again need as much as we use today.27

Like all advances, the Green Revolution came under attack as soon as it began. High-tech agriculture, the critics said, consumes fossil fuels and groundwater, uses herbicides and pesticides, disrupts traditional subsistence agriculture, is biologically unnatural, and generates profits for corporations. Given that it saved a billion lives and helped consign major famines to the dustbin of history, this seems to me like a reasonable price to pay. More important, the price need not be with us forever. The beauty of scientific progress is that it never locks us into a technology but can develop new ones with fewer problems than the old ones (a dynamic we will return to later).

Genetic engineering can now accomplish in days what traditional farmers accomplished in millennia and Borlaug accomplished in his years of “mind-warping tedium.” Transgenic crops are being developed with high yields, lifesaving vitamins, tolerance of drought and salinity, resistance to disease, pests, and spoilage, and reduced need for land, fertilizer, and plowing. Hundreds of studies, every major health and science organization, and more than a hundred Nobel laureates have testified to their safety (unsurprisingly, since there is no such thing as a genetically unmodified crop).28 Yet traditional environmentalist groups, with what the ecology writer Stewart Brand has called their “customary indifference to starvation,” have prosecuted a fanatical crusade to keep transgenic crops from people—not just from whole-food gourmets in rich countries but from poor farmers in developing ones.29 Their opposition begins with a commitment to the sacred yet meaningless value of “naturalness,” which leads them to decry “genetic pollution” and “playing with nature” and to promote “real food” based on “ecological agriculture.” From there they capitalize on primitive intuitions of essentialism and contamination among the scientifically illiterate public. Depressing studies have shown that about half of the populace believes that ordinary tomatoes don’t have genes but genetically modified ones do, that a gene inserted into a food might migrate into the genomes of people who eat it, and that a spinach gene inserted into an orange would make it taste like spinach. Eighty percent favored a law that would mandate labels on all foods “containing DNA.”30 As Brand put it, “I daresay the environmental movement has done more harm with its opposition to genetic engineering than with any other thing we’ve been wrong about. We’ve starved people, hindered science, hurt the natural environment, and denied our own practitioners a crucial tool.”31

One reason for Brand’s harsh judgment is that opposition to transgenic crops has been perniciously effective in the part of the world that could most benefit from it. Sub-Saharan Africa has been cursed by nature with thin soil, capricious rainfall, and a paucity of harbors and navigable rivers, and it never developed an extensive network of roads, rails, or canals.32 Like all farmed land, its soils have been depleted, but unlike those in the rest of the world, Africa’s have not been replenished with synthetic fertilizer. Adoption of transgenic crops, both those already in use and ones customized for Africa, grown with other modern practices such as no-till farming and drip irrigation, could allow Africa to leapfrog the more invasive practices of the first Green Revolution and eliminate its remaining undernourishment.

For all the importance of agronomy, food security is not just about farming. Famines are caused not only when food is scarce but when people can’t afford it, when armies prevent them from getting it, or when their governments don’t care how much of it they have.33 The peaks and valleys in figure 7-4 show that the conquest of famine was not a story of steady gains in agricultural efficiency. In the 19th century, famines were triggered by the usual droughts and blights, but they were exacerbated in colonial India and Africa by the callousness, bungling, and sometimes deliberate policies of administrators who had no benevolent interest in their subjects’ welfare.34 By the early 20th century, colonial policies had become more responsive to food crises, and advances in agriculture had taken a bite out of hunger.35 But then a horror show of political catastrophes triggered sporadic famines for the rest of the century.

Of the seventy million people who died in major 20th-century famines, 80 percent were victims of Communist regimes’ forced collectivization, punitive confiscation, and totalitarian central planning.36 These included famines in the Soviet Union in the aftermaths of the Russian Revolution, the Russian Civil War, and World War II; Stalin’s Holodomor (terror-famine) in Ukraine in 1932–33; Mao’s Great Leap Forward in 1958–61; Pol Pot’s Year Zero in 1975–79; and Kim Jong-il’s Arduous March in North Korea as recently as the late 1990s. The first governments in postcolonial Africa and Asia often implemented ideologically fashionable but economically disastrous policies such as the mass collectivization of farming, import restrictions to promote “self-sufficiency,” and artificially low food prices which benefited politically influential city-dwellers at the expense of farmers.37 When the countries fell into civil war, as they so often did, not only was food distribution disrupted, but both sides could use hunger as a weapon, sometimes with the complicity of their Cold War patrons.

Fortunately, since the 1990s the prerequisites to plenty have been falling into place in more of the world. Once the secrets to growing food in abundance are unlocked and the infrastructure to move it around is in place, the decline of famine depends on the decline of poverty, war, and autocracy. Let’s turn to the progress that has been made against each of these scourges.
