8. TRUE WISDOM

God grant me the serenity to accept the things I cannot change, courage to change the things I can, and wisdom to know the difference.

— REINHOLD NIEBUHR

To know that one knows what one knows, and to know that one doesn’t know what one doesn’t know, there lies true wisdom.

— CONFUCIUS

HUMAN BEINGS HAVE INTELLECTUAL skills of unparalleled power. We can talk, we can reason, we can dance, we can sing. We can debate politics and justice; we can work for the betterment not just of ourselves but our species. We can learn calculus and physics, we can invent, educate, and wax poetic. No other species comes close.

But not every advance has been to the good. The machinery of language and deliberative reason has led to enormous cultural and technological advances, but our brain, which developed over a billion years of pre-hominid ancestry, hasn’t caught up. The bulk of our genetic material evolved before there was language, before there was explicit reasoning, and before creatures like us even existed. Plenty of rough spots remain.

In this book, we’ve discussed several bugs in our cognitive makeup: confirmation bias, mental contamination, anchoring, framing, inadequate self-control, the ruminative cycle, the focusing illusion, motivated reasoning, and false memory, not to mention absentmindedness, an ambiguous linguistic system, and vulnerability to mental disorders. Our memory, contextually driven as it is, is ill suited to many of the demands of modern life, and our self-control systems are almost hopelessly split. Our ancestral mechanisms were shaped in a different world, and our more modern deliberative mechanisms can’t shake the influence of that past. In every domain we have considered, from memory to belief, choice, language, and pleasure, we have seen that a mind built largely through the progressive overlay of technologies is far from perfect. None of these aspects of human psychology would be expected from an intelligent designer; instead, the only reasonable way to interpret them is as relics, leftovers of evolution.

In a sense, the argument I have presented here is part of a long tradition. Gould’s notion of remnants of history, a key inspiration for this book, goes back to Darwin, who started his legendary work The Descent of Man with a list of a dozen “useless, or nearly useless” features — body hair, wisdom teeth, the vestigial tail bone known as the coccyx. Such quirks of nature were essential to Darwin’s argument.

Yet imperfections of the mind have rarely been discussed in the context of evolution. Why should that be? My guess is that there are at least two reasons. The first, plain and simple, is that many of us just don’t want human cognition to turn out to be less than perfect, either because it would be at odds with our beliefs (or fondest desires) or because it leads to a picture of humankind that we find unattractive. The latter factor arises with special force in scientific fields that try to characterize human behavior; the more we stubbornly deviate from rationality, the harder it is for mathematicians and economists to capture our choices in neat sets of equations.

A second factor may stem from the almost mystifying popularity of creationism, and its recent variant, intelligent design. Few theories are as well supported by evidence as the theory of evolution, yet a large portion of the general public refuses to accept it. To any scientist familiar with the facts — ranging from those garnered through the painstaking day-to-day studies of evolution in the contemporary Galapagos Islands (described in Jonathan Weiner’s wonderful book The Beak of the Finch) to the details of molecular change emerging from the raft of recently completed genomes — this continued resistance to evolution seems absurd.[54] Since so much of that resistance comes from people who have trouble accepting the notion that well-organized structure could have emerged without forethought, scientists often feel compelled to emphasize evolution’s high points — the cases in which well-organized structure emerged from a blind, undirected process.

Such emphasis has led to a great understanding of how a blind process like evolution can produce systems of tremendous beauty — but at the expense of an equally impassioned exploration of the illuminating power of imperfection. While there is nothing inherently wrong in examining nature’s greatest hits, one can’t possibly get a complete and balanced picture by looking only at the highlights.

The value of imperfections extends far beyond simple balance, however. Scientifically, every kluge contains a clue to our past; wherever there is a cumbersome solution, there is insight into how nature layered our brain together; it is no exaggeration to say that the history of evolution is a history of overlaid technologies, and kluges help expose the seams.

Every kluge also underscores what is fundamentally wrongheaded about creationism: the presumption that we are the product of an all-seeing entity. Creationists may hold on to the bitter end, but imperfection (unlike perfection) beggars the imagination. It’s one thing to imagine an all-knowing engineer designing a perfect eyeball, another to imagine that engineer slacking off and building a half-baked spine.

There’s a practical side too: investigations into human idiosyncrasy can provide a great deal of useful insight into the human condition; as they say in Alcoholics Anonymous, recognition is the first step. The more we can understand our clumsy nature, the more we can do something about it.

When we look at imperfections as a source of insight, the first thing to realize is that not every imperfection is worth fixing. I’ve long since come to terms with the fact that my calculator is better than I am at extracting square roots, and I see little point in cheering for Garry Kasparov over his computer opponent, Deep Blue, in their celebrated chess matches. Computers already beat us at chess; if they can’t yet beat us at Trivial Pursuit, they will someday soon. John Henry’s fin-de-siècle Race Against the Machine was noble but, in hindsight, a lost cause. In many ways machines have (or eventually will have) the edge, and we might as well accept it. The German chemist Ernst Fischer mused that “as machines become more and more efficient and perfect, so it will become clear that imperfection is the greatness of man.” A creature designed by an engineer might never know love, never enjoy art, never see the point of poetry. From the perspective of brute rationality, time spent making and appreciating art is time that could be “better” spent gathering nuts for winter. From my perspective, the arts are part of the joy of human existence. By all means, let us make poetry out of ambiguity, song and literature out of emotion and irrationality.

That said, not every quirk of human cognition ought to be celebrated. Poetry is good, but stereotyping, egocentrism, and our species-wide vulnerability to paranoia and depression are not. To accept everything that is inherent to our biological makeup would be to commit a version of the “naturalistic fallacy,” confusing what is natural with what is good. The trick, obviously, is to sort through our cognitive idiosyncrasies and decide which are worth addressing and which are worth letting go (or even celebrating).

For example, it makes little sense to worry about ambiguity in everyday conversation because we can almost always use context and interaction to figure out what our conversational partners have in mind. It makes little sense to try to memorize the phone numbers of everyone we know because our memory just isn’t built that way (and now we have cell phones to do that for us). For much of our daily business, our mind is more than sufficient. It generally keeps us well-fed, employed, away from obstacles, and out of harm’s way. As much as I envy the worry-free life of the average domesticated cat, I wouldn’t trade my brain for Fluffy’s for all the catnip in China.

But that doesn’t mean that we can’t, as thinkers, do even better. In that spirit, I offer, herewith, 13 suggestions, each founded on careful empirical research:

1. Whenever possible, consider alternative hypotheses. As we have seen, we humans are not in the habit of evaluating evidence in dispassionate and objective ways. One of the simplest things we can do to improve our capacity to think and reason is to discipline ourselves to consider alternative hypotheses. Something as simple as merely forcing ourselves to list alternatives can improve the reliability of reasoning.

One series of studies has shown the value of the simple maxim “Consider the opposite”; another set has shown the value of “counterfactual thinking” — contemplating what might have been or what could be, rather than focusing on what currently is.

The more we can reflect on ideas and possibilities other than those to which we are most attached, the better. As Robert Rubin (treasury secretary under Bill Clinton) said, “Some people I’ve encountered in various phases of my career seem more certain about everything than I am about anything.” Making the right choice often requires an understanding of the road not traveled as well as the road ultimately taken.

2. Reframe the question. Is that soap 99.4 percent pure or 0.6 percent toxic? Politicians, advertisers, and even our local supermarket staff routinely spin just about everything we hear, see, and read. Everything is presented to be as positive as possible. Our job — as consumers, voters, and citizens — must be to perpetually cast a skeptical eye and develop a habit of rethinking whatever we are asked. (Should I construe this “assisted suicide” legislation as an effort to protect people from murderous doctors or as a way of allowing folks to die with dignity? Should I think about the possibility of reducing my hours to part-time work as a pay cut or as an opportunity to spend more time with my kids?) If there’s another way to think about a problem, do it. Contextual memory means that we are always swimming upstream: how we think about a question invariably shapes what we remember, and what we remember affects the answers we reach. Asking every question in more than one way is a powerful way to counter that bias.

3. Always remember that correlation does not entail causation. Believe it or not, if we look across the population of the United States, shoe size is highly correlated with general knowledge; people with bigger shoes tend to know more history and more geography than people with smaller shoes. But that doesn’t mean buying bigger shoes will make you smarter, or even that having big feet makes you smart. This correlation, like so many others, seems more important than it really is because we have a natural tendency to confuse correlation with causation. The correlation I described is real, but the natural inference — that one factor must be causing the other — doesn’t follow. In this example, the reason that the correlation holds is that the people with the littlest feet (and tiniest shoes) are our planet’s newest visitors: infants and toddlers, human beings too young to have yet taken their first history class. We learn as we grow, but that doesn’t mean that growing (per se) makes us learn.[55]
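For readers who want to see the machinery at work, here is a minimal simulation, offered purely as an illustration (every number in it is invented, and the variables are stand-ins for the shoe-size example above): a hidden third factor, age, drives both shoe size and quiz scores, and a strong correlation appears even though neither variable has any effect on the other.

```python
# Illustrative only: invented numbers, no real data.
import random
import statistics

random.seed(0)

# Age drives both variables; neither causes the other.
ages = [random.uniform(1, 40) for _ in range(10_000)]                     # years
shoe_sizes = [min(2 + 0.6 * a, 11) + random.gauss(0, 0.5) for a in ages]  # rough US sizes
knowledge = [min(5 * a, 100) + random.gauss(0, 5) for a in ages]          # quiz score, 0-100

r = statistics.correlation(shoe_sizes, knowledge)
print(f"correlation(shoe size, knowledge) = {r:.2f}")  # comes out strongly positive
```

By construction, buying bigger shoes could not possibly raise the quiz scores; age alone does the work, which is precisely the trap that confusing correlation with causation sets for us.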

4. Never forget the size of your sample. From medicine to baseball statistics, people often fail to take into account the amount of data they’ve used in drawing their conclusions. Any single event may be random, but recurrence of the same pattern over and again is less likely to be an accident. Mathematically speaking, the bigger the sample, the more reliable the estimate. That’s why, on average, a poll of 2,000 people is a lot more reliable than a poll of 200 people, and seeing someone bat .400 (successfully getting a hit in 40 percent of their tries) over 10 baseball games doesn’t mean nearly as much as seeing them bat .400 over a 162-game season.

As obvious as this fact is, it’s easy to forget. The person who first formalized this notion, known as the law of large numbers, thought it was so obvious that “even the stupidest man knows [it] by some instinct of nature,” yet people routinely ignore it. We can’t help but search for “explanations” of patterns in our data, even in small samples (say, a handful of baseball games or a single day’s stock market results) that may well reflect nothing more than random chance. Boomer hit .400 in the last ten games because “he’s seeing the ball real well,” never because (statistically speaking) a .300 hitter is likely to occasionally look like a .400 hitter for a few days. Stock market analysts do the same thing, tying every day’s market moves to some particular fact of the news. “The market went up today because Acme Federated reported unexpectedly high fourth-quarter results.” When was the last time you heard any analyst say “Actually, today’s rise in the market was probably nothing more than a random fluctuation”?

Happily, psychologist Richard Nisbett has shown that ordinary folks can be taught to be more sensitive to the law of large numbers in less than half an hour.
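To make the arithmetic of sample size concrete, here is a small simulation, again offered only as an illustration; the at-bat counts (roughly 40 for ten games, 600 for a full season) are ballpark assumptions rather than figures from any study. It asks how often a true .300 hitter will appear to bat .400 or better purely by chance.

```python
# Illustrative sketch: how often a true .300 hitter "bats .400" by luck alone.
import random

random.seed(0)
TRUE_AVG = 0.300
TRIALS = 10_000  # simulated stretches of play

def chance_of_looking_like_400(at_bats: int) -> float:
    """Fraction of simulated stretches with an observed average of .400 or better."""
    hot = 0
    for _ in range(TRIALS):
        hits = sum(random.random() < TRUE_AVG for _ in range(at_bats))
        if hits / at_bats >= 0.400:
            hot += 1
    return hot / TRIALS

print(f"~10 games (about 40 at-bats): {chance_of_looking_like_400(40):.1%}")
print(f"162-game season (about 600):  {chance_of_looking_like_400(600):.2%}")
```

Over a ten-game stretch, the apparent hot streak turns up roughly one time in ten by luck alone; over a full season it essentially never does, which is exactly why the bigger sample is the one worth trusting.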

5. Anticipate your own impulsivity and pre-commit. Odysseus tied himself to the mast to resist the temptations of the Sirens; we would all do well to learn from him. Compare, for example, the groceries we might choose a week in advance, on a full stomach, to the junk we buy in the store when we are hungry. If we commit ourselves to purchasing only what we’ve decided on in advance, we come home with a more healthful basket of groceries. “Christmas Clubs,” which tie up money all year long for holiday shopping, are completely irrational from the perspective of an economist — why earmark money when liquidity is power? — but become completely sensible once we acknowledge our evolved limitations. Temptation is greatest when we can see it, so we are often better off trusting plans made in advance than impulses of the moment. The wise person acts accordingly.

6. Don’t just set goals. Make contingency plans. It’s often almost impossible for people to stick to vague goals like “I intend to lose weight” or “I plan to finish this article before the deadline.” And simply making a goal more specific (“I plan to lose six pounds”) is not enough. But research by the psychologist Peter Gollwitzer shows that by transforming goals into specific contingency plans — of the form “if X, then Y” (for example, “If I see French fries, then I will avoid them”) — we can markedly increase the chance of success.

A recognition of our klugey nature can help explain why our late-evolved deliberative reasoning, grafted onto a reflexive, ancestral system, has limited access to the brain’s steering wheel; instead, almost everything has to pass through the older ancestral, reflexive system. Specific contingency plans offer a way of working around that limitation by converting abstract goals into a format (if-then, basic to all reflexes) that our ancestral systems can understand. To the extent that we can speak the language of our older system, we increase our chances of achieving our goals.

7. Whenever possible, don’t make important decisions when you are tired or have other things on your mind. Thinking while tired (or distracted) is not so different from driving while drinking. As we get tired, we rely more on our reflexive system, less on deliberative reasoning; ditto as we get distracted. One study, for example, showed that a health-conscious consumer who is given a choice between a fruit salad and a chocolate cake becomes more likely to choose the cake when forced to remember a seven-digit number. If we want to reason by emotion alone, fine, but if we prefer rationality, it is important to create “winning conditions” — and that means, for important decisions, adequate rest and full concentration.

8. Always weigh benefits against costs. Sounds obvious, but it is not something that comes naturally to the human mind. People tend to find themselves in either a “prevention” frame of mind, emphasizing the costs of their actions (if I don’t go, I’ll waste the money I spent on concert tickets), or a “promotion” frame of mind, emphasizing the benefits (It’ll be fun! Who cares if I’ll be late for work in the morning?). Sound judgment obviously requires weighing both costs and benefits, but unless we are vigilant, our temperament and mood often stand in the way.

Pay special attention, by the way, to what some economists call “opportunity costs”; whenever you make an investment, financial or otherwise, ponder what else you might be doing instead. If you’re doing one thing, you can’t do another — a fact that we often forget. Say, for example, that people are trying to decide whether it makes sense to invest $100 million of public funds in a baseball stadium. That $100 million may well bring some benefits, but few people evaluate such projects in the context of what else that money might do, what opportunities (such as paying down the debt to reduce future interest payments or building three new elementary schools) must be forgone in order to make that stadium happen. Because such costs don’t come with a readily visible price tag, we often ignore them. On a personal level, taking opportunity costs into account means realizing that whenever we make a choice to do something, such as watch television, we are using time that could be spent in other ways, like cooking a nice meal or taking a bike ride with our kids.

9. Imagine that your decisions may be spot-checked. Research has shown that people who believe that they will have to justify their answers are less biased than people who don’t. When we expect to be held accountable for our decisions, we tend to invest more cognitive effort and make correspondingly more sophisticated decisions, analyzing information in more detail.

For that matter (and no, I’m not making this up), office workers are more likely to pay for coffee from a communal coffee machine if the machine is positioned under a poster featuring a pair of eyes — which somehow makes people feel that they are accountable — than under a poster that has a picture of flowers.

10. Distance yourself. Buddhists tell us that everything seems more important in the moment, and for the most part, they’re right. If an out-of-control car is bearing down on you, by all means, drop everything and focus all of your energies on the short-term goal of getting out of the way. But if I want to top off the meal with that chocolate cake, I should ask myself this: am I overvaluing my current goals (satisfying my sweet tooth) relative to my long-term goals (staying healthy)? It’ll feel good now to send that email excoriating your boss, but next week you’ll probably regret it.

Our mind is set up to ponder the near and the far in almost totally different ways, the near in concrete terms, the far in abstract terms. It’s not always better to think in more distant terms; remember the last time you promised to do something six months hence, say, attend a charity event or volunteer at your child’s school? Your promise probably seemed innocuous at the time but might have felt like an imposition when the date came to actually fulfill it. Whenever we can, we should ask, How will my future self feel about this decision? It pays to recognize the differences in how we treat the here and now versus the future, and try to use and balance both modes of thinking — immediate and distant — so we won’t fall prey to basing choices entirely on what happens to be in our mind in the immediate moment. (A fine corollary: wait awhile. If you still want it tomorrow, it may be important; if the need passes, it probably wasn’t.) Empirical research shows that irrationality often dissipates with time, and complex decisions work best if given time to steep.

11. Beware the vivid, the personal, and the anecdotal. This is another corollary to “distancing ourselves,” also easier said than done. In earlier chapters we saw the relative temptation prompted by cookies that we can see versus cookies that we merely read about. An even more potent illustration might be Timothy Wilson’s study of undergraduates and condom brands, which yielded a classic “do as I say, not as I do” result. Subjects in the experiment were given two sources of information, the results of a statistically robust study in Consumer Reports favoring condoms of Brand A and a single anecdotal tale (allegedly written by another student) recommending Brand B, on the grounds that a condom of Brand A had burst in the middle of intercourse, leading to considerable anxiety about possible pregnancy. Virtually all students agreed in principle that Consumer Reports would be more reliable and also that they would not want their friends to choose on the basis of anecdotal evidence. But when asked to choose for themselves, nearly a third (31 percent) still yielded to the vivid and anecdotal, and went with Brand B. Our four-legged ancestors perhaps couldn’t help but pay attention to whatever seemed most colorful or dramatic; we have the luxury to take the time to reflect, and it behooves us to use it, compensating for our vulnerability to the vivid by giving special weight to the impersonal but scientific.

12. Pick your spots. Decisions are psychologically, and even physically, costly, and it would be impossible to delay every decision until we had complete information and time to reflect on every contingency and counteralternative. The strategies I’ve given in this list are handy, but never forget the tale of Buridan’s Ass, the donkey that starved to death while trying to choose between two equally attractive, equally close patches of hay. Reserve your most careful decision making for the choices that matter most.

13. Try to be rational. This last suggestion may sound unbelievably trivial, on par with the world’s most worthless stock market advice (“Buy low, sell high” — theoretically sound yet utterly useless). But reminding yourself to be rational is not as pointless as it sounds.

Recall, for example, “mortality salience,” a phenomenon I described earlier in the chapter on belief: people who are led in passing to think about their own death tend to be harsher toward members of other groups. Simply telling them to consider their answers before responding and “to be as rational and analytic as possible” (instead of just answering with their “gut-level reactions”) reduces the effect. Another recent study shows similar results.

One of the most important reasons why it just might help to tell yourself to be rational is that in so doing, you can, with practice, automatically prime yourself to use some of the other techniques I’ve just described (such as considering alternatives or holding yourself accountable for your decisions). Telling ourselves to be rational isn’t, on its own, likely to be enough, but it might just help in tandem with the rest.

Every one of these suggestions is based on sound empirical studies of the limits of the human mind. Each addresses a different weakness, and each, in its own way, offers a technique for smoothing out some of the rough spots in our evolution.

With a properly nuanced understanding of the balance between the strengths and weaknesses of the human mind, we may have an opportunity to help not only ourselves but society. Consider, for example, our outmoded system of education, still primarily steeped in ideas from nineteenth-century pedagogy, with its outsized emphasis on memorization echoing the Industrial Revolution and Dickens’s stern schoolmaster, Mr. Gradgrind: “Now, what I want is, Facts. Teach these boys and girls nothing but Facts… Plant nothing else, and root out everything else.” But it scarcely does what education ought to do, which is to help our children learn how to fend for themselves. I doubt that such a heavy dose of memorization ever served a useful purpose, but in the age of Google, asking a child to memorize the state capitals has long since outlived its usefulness.

Deanna Kuhn, a leading educational psychologist and author of the recent book Education for Thinking, presents a vignette that reminds me entirely too much of my own middle-school experience: a seventh-grader at a considerably above-average school asked his (well-regarded) social studies teacher, “Why do we have to learn the names of all thirteen colonies?” The teacher’s answer, delivered without hesitation, was “Well, we’re going to learn all fifty states by June, so we might as well learn the first thirteen now.” Clear evidence that the memorization cart has come before the educational horse. There is value, to be sure, in teaching children the history of their own country and — especially in light of increasing globalization — the world, but a memorized list of states casts no real light on history and leaves a student with no genuine skills for understanding (say) current events. The result, in the words of one researcher, is that many students “are unable to give evidence of more than a superficial understanding of the concepts and relationships that are fundamental to the subjects they have studied, or of an ability to apply the content knowledge they have acquired to real-world problems… it is possible to finish 12 or 13 years of public education in the United States without developing much competence as a thinker.”

In the information age, children have no trouble finding information, but they have trouble interpreting it. The fact (discussed earlier) that we tend to believe first and ask questions later is truly dangerous in the era of the Internet — wherein anyone, even people with no credentials, can publish anything. Yet studies show that teenagers frequently take whatever they read on the Internet at face value. Most students only rarely or occasionally check to see who the author of a web page is, what the person’s credentials are, or whether other sources validate the information in question. In the words of two Wellesley College researchers, “Students use the Net as a primary source of information, usually with little regard as to the accuracy of that information.” The same is true for most adults; one Internet survey reported that “the average consumer paid far more attention to the superficial aspects of a [web] site, such as visual cues, than to its content. For example, nearly half of all consumers (or 46.1%) in the study assessed the credibility of sites based in part on the appeal of the overall visual design of a site, including layout, typography, font size, and color schemes.”[56]

Which is exactly why we need schools and not just Wikipedia and an Internet connection. If we were naturally good thinkers, innately skeptical and balanced, schools would be superfluous.

But the truth is that without special training, our species is inherently gullible. Children are born into a world of “revealed truths,” where they tend to accept what they are told as gospel truth. It takes work to get children to understand that often multiple opinions exist and that not everything they hear is true; it requires even more effort to get them to learn to evaluate conflicting evidence. Scientific reasoning is not something most people pick up naturally or automatically.

And, for that matter, we are not born knowing much about the inner operations of our brain and mind, least of all about our cognitive vulnerabilities. Scientists didn’t even determine with certainty that the brain was the source of thinking until the seventeenth century. (Aristotle, for one, thought the purpose of the brain was to cool the blood, inferring this backward from the fact that large-brained humans were less “hot-blooded” than other creatures.) Without lessons, we are in no better position to understand how our mind works than how our digestive system works. Most of us were never taught how to take notes, how to evaluate evidence, or what human beings are (and are not) naturally good at. Some people figure these things out on their own; some never do. I cannot recall a single high school class on informal argument, how to spot fallacies, or how to interpret statistics; it wasn’t until college that anybody explained to me the relation between causation and correlation.

But that doesn’t mean we couldn’t teach such things. Studies in teaching so-called critical thinking skills are showing increasingly promising results, with lasting effects that can make a difference. Among the most impressive is a recent study founded on a curriculum known as “Philosophy for Children,” which, as its name suggests, revolves around getting children to think about — and discuss — philosophy. Not Plato and Aristotle, mind you, but stories written for children that are explicitly aimed at engaging children in philosophical issues. The central book in the curriculum, Harry Stottlemeier’s Discovery (no relation to Harry Potter), begins with a section in which the eponymous Harry is asked to write an essay called “The Most Interesting Thing in the World.” Harry, a boy after my own heart, chooses to write his on thinking: “To me, the most interesting thing in the whole world is thinking. I know that lots of other things are also very important and wonderful, like electricity, and magnetism and gravitation. But although we understand them, they can’t understand us. So thinking must be something very special.”

Kids of ages 10-12 who were exposed to a version of this curriculum for 16 months, for just an hour a week, showed significant gains in verbal intelligence, nonverbal intelligence, self-confidence, and independence.

Harry Stottlemeier’s essay, like the “Philosophy for Children” curriculum as a whole, is really an example of what psychologists call metacognition, or knowing about knowing. By asking children to reflect on how they know what they know, we may significantly enhance their understanding of the world. Even a single course — call it “The Human Mind: A User’s Guide” — could go a long way.

No such guide will give us the mental power to extract square roots in our head, but many of our cognitive peccadilloes are addressable: we can train ourselves to consider evidence in a more balanced way, to be sensitive to biases in our reasoning, and to make plans and choices in ways that better suit our own long-term goals. If we do — if we learn to recognize our limitations and address them head on — we just might outwit our inner kluge.
