Cheating Ourselves

Imagine yourself on a soft, sandy beach. The tide is rolling out, creating a wide swath of wet sand for you to wander along. You’re heading to the place where you go from time to time to check out girls. Oh, and you’re a feisty blue crab. And in reality, you’re going to spar with a few other male crabs to see who will win the favor of the females.

Ahead you see a pretty little thing with cute claws. At the same time, you notice that your competition is quickly closing in. You know that the ideal way to handle the situation is to scare off the other crabs. That way you don't have to get into a fight and risk hurting yourself or, worse, losing your chance to mate. So you have to convince the other crabs that you're bigger and stronger. As you inch closer to your competition, you know you need to emphasize your size. However, if you simply pretend to be larger by standing on your toes and halfheartedly waving your claws around, you will probably give yourself away. What to do?

What you need to do is give yourself a pep talk and start believing that you are, in fact, bigger and tougher than you really are. “Knowing” you’re the biggest crab on the beach, you stand as high as you can on your hind legs and spread your claws as far and high above you as possible (antlers, peacock tails, and general puffing up help other male creatures do the same thing). Believing in your own fabrication means that you will not flinch. And your (exaggerated) self-confidence might cow your opponents.

NOW BACK TO us. As humans, we have slightly more sophisticated means of puffing ourselves up than our animal counterparts. We have the ability to lie—not just to others but also to ourselves. Self-deception is a useful strategy for believing the stories we tell, and if we are successful, it becomes less likely that we will flinch and accidentally signal that we’re anything other than what we pretend to be. I’m hardly endorsing lying as a means of attaining a partner, a job, or anything else. But in this chapter, we’ll look at the ways we succeed in fooling ourselves as we try to fool others.

Of course, we can’t instantly believe every one of our lies. For instance, let’s say you’re a guy at a speed-dating event and you’re trying to impress an attractive woman. A wild idea enters your mind: you tell her that you have a pilot’s license. Even if you sold her this story, it’s unlikely you will convince yourself that you do, in fact, have such a license and start suggesting to the pilots on your next flight how to improve their landings. On the other hand, let’s say you go out running with your buddy and you get into a discussion about best running times. You tell your friend that you’ve run a mile in under seven minutes, when in reality your best time was a tiny bit over seven minutes. A few days later, you tell someone else the same thing. After repeating this slightly exaggerated claim over and over, you could eventually forget that you hadn’t actually broken the seven-minute mark. You may come to believe it to such a degree that you might even be willing to bet money on it.

ALLOW ME TO tell you a story of a time when I embraced my own deception. In the summer of 1989—about two years after I left the hospital—my friend Ken and I decided to fly from New York to London to see another friend. We bought the cheapest flight to London, which happened to be on Air India. When the taxi dropped us off at the airport, we were dismayed to see a line of people trailing all the way out of the terminal. Thinking fast, Ken came up with an idea: “Why don’t we put you in a wheelchair?” I thought about his suggestion. Not only would I be more comfortable, but we could also get through much faster. (Truthfully speaking, it is difficult for me to stand for a prolonged time because the circulation in my legs is far from good. But I don’t need a wheelchair.)

We were both convinced that it was a good plan, so Ken jumped out of the cab and returned with the wheelchair. We breezed through check-in and, with an extra two hours to kill, we enjoyed coffee and a sandwich. But then I needed to use the restroom. So Ken pushed me in the wheelchair to the nearest bathroom, which unfortunately was not designed to accommodate a wheelchair. I maintained my role, though; we got the wheelchair as close to the toilet as possible and I tried to hit the mark from a distance, with limited success.

Once we made it through the bathroom challenge, it was time to board the plane. Our seats were in row 30, and as we neared the entrance to the plane, I realized that the wheelchair was going to be too wide for the aisle. So we did what my new role dictated: I left the wheelchair at the entrance of the plane, grabbed on to Ken’s shoulders, and he hauled me to our seats.

As I sat waiting for the flight to take off, I was annoyed that the bathroom in the airport wasn't handicap-accessible and that the airline hadn't provided me with a narrower wheelchair to get to my seat. My irritation increased when I realized that I shouldn't drink anything on the six-hour flight because there would be no way for me to keep up the act and use the bathroom. The next difficulty arose when we landed in London. Once again, Ken had to carry me to the entrance of the plane, and when the airline didn't have a wheelchair waiting for us, we had to wait for one.

This little adventure made me appreciate the daily irritations of handicapped people in general. In fact, I was so annoyed that I decided to go and complain to the head of Air India in London. Once we got the wheelchair, Ken rolled me to Air India’s office, and with an overblown air of indignation I described each difficulty and humiliation and reprimanded the regional head of Air India for the airline’s lack of concern for disabled people everywhere. Of course he apologized profusely, and after that we rolled away.

The odd thing is that throughout the process I knew I could walk, but I adopted my role so quickly and thoroughly that my self-righteousness felt as real as if I had a legitimate reason to be upset. Then after all that, we got to the baggage claim, where I simply picked up my backpack and walked away unhampered, like Keyser Söze in the film The Usual Suspects.

TO MORE SERIOUSLY examine self-deception, Zoë Chance (a postdoc at Yale), Mike Norton, Francesca Gino, and I set out to learn more about how and when we deceive ourselves into believing our own lies and whether there are ways to prevent ourselves from doing so.

In the first phase of our exploration, participants took an eight-question IQ-like test (one of the questions, for example, was this: "What is the number that is one half of one quarter of one tenth of 400?"). After they finished taking the quiz, participants in the control group handed their answers over to the experimenter, who checked their responses. This allowed us to establish the average performance on the test.*

In the condition where cheating was possible, participants had an answer key at the bottom of the page. They were told that the answer key was there so that they could score how well they did on the test and also to help them estimate in general how good they were at answering these types of questions. However, they were told to answer the questions first and only then use the key for verification. After answering all the questions, participants checked their own answers and reported their own performance.

What did the results from phase one of the study show? As we expected, the group that had the opportunity to “check their answers” scored a few points higher on average, which suggested that they had used the answer key not only to score themselves but also to improve their performance. As was the case with all of our other experiments, we found that people cheat when they have a chance to do so, but not by a whole lot.



Helping Myself to a Higher MENSA Score

The inspiration for this experimental setup came from one of those complimentary magazines that you find in seat-back pockets on airplanes. On one particular flight, I was flipping through a magazine and discovered a MENSA quiz (questions that are supposed to measure intelligence). Since I am rather competitive, I naturally had to try it. The directions said that the answers were in the back of the magazine. After I answered the first question, I flipped to the back to see if I was correct, and lo and behold, I was. But as I continued with the quiz, I also noticed that as I was checking the answer to the question I had just finished solving, my eyes strayed just a bit to the next answer. Having glanced at the answer to the next question, I found that problem to be much easier. At the end of the quiz, I was able to correctly solve most of the questions, which made it easier for me to believe that I was some sort of genius. But then I had to wonder whether my score was that high because I was supersmart or because I had seen the answers out of the corner of my eye (my inclination was, of course, to attribute it to my own intelligence).

The same basic process can take place in any test in which the answers are available on another page or are written upside down, as they often are in magazines and SAT study guides. We often use the answers when we practice taking tests to convince ourselves that we’re smart or, if we get an answer wrong, that we’ve made a silly mistake that we would never make during a real exam. Either way, we come away with an inflated idea of how bright we actually are—and that’s something we’re generally happy to accept.

THE RESULTS FROM phase one of our experiments showed that participants tended to look ahead at the answers as a way to improve their score. But this finding did not tell us whether they engaged in straight-up old-fashioned cheating or if they were actually deceiving themselves. In other words, we didn’t yet know if the participants knew they were cheating or if they convinced themselves that they legitimately knew the correct answers all along. To figure this out, we added another component to our next experiment.

Imagine that you are taking part in an experiment similar to the previous one. You took the eight-question quiz and answered four questions correctly (50 percent), but thanks to the answers at the bottom of the page, you claimed that you had solved six correctly (75 percent). Now, do you think that your actual ability is in the 50 percent range, or do you think it is in the 75 percent range? On the one hand, you may be aware that you used the answer key to inflate your score, and realize that your real ability is closer to the 50 percent mark. On the other hand, knowing that you were paid as if you really had solved six problems, you might be able to convince yourself that your ability to solve such questions is in reality closer to the 75 percent level.

This is where phase two of the experiment comes in. After finishing the math quiz, the experimenter asks you to predict how well you will do on the next test, in which you will be asked to answer a hundred questions of a similar nature. This time, it's clear that there are not going to be any answers at the bottom of the page (and therefore no chance to consult the key). What do you predict your performance will be on the next quiz? Will it be based on your real ability in the first phase (50 percent), or will it be based on your exaggerated ability (75 percent)? Here is the logic: if you are aware that you used the answer key in the previous test to artificially inflate your score, you would predict that you would correctly solve the same proportion of questions as you solved unassisted in the first test (four out of eight, or 50 percent). But let's say you started believing that you really did answer six questions correctly on your own and not because you looked at the answers. Now you might predict that in this next test, too, you would correctly solve a much larger percentage of the questions (closer to 75 percent). In truth, of course, you can solve only about half of the questions correctly, but your self-deception may puff you up, crablike, and increase your confidence in your ability.

The results showed that participants experienced the latter sort of self-puffery. The predictions of how well they would perform on the second phase of the test showed that participants not only used the answer key in the first phase to exaggerate their score, but had very quickly convinced themselves that they truly earned that score. Basically, those who had a chance to check their answers in the first phase (and cheated) started believing that their exaggerated performance was a reflection of their true skill.

But what would happen if we paid participants to predict their score accurately in the second phase? With money on the line, maybe our participants wouldn’t so patently ignore the fact that in phase one they had used the answer key to improve their scores. To that end, we repeated the same experiment with a new group of participants, this time offering them up to $20 if they correctly predicted their performance on the second test. Even with a financial incentive to be accurate, they still tended to take full credit for their scores and overestimate their abilities. Despite having a strong motivation to be accurate, self-deception ruled the day.




I KNEW IT ALL ALONG

I give a considerable number of lectures about my research to different groups, from academics to industry types. When I started giving talks, I would often describe an experiment, the results, and finally what I thought we could learn from it. But I often found that some people were rather unsurprised by the results and were eager to tell me so. I found this puzzling because, as the person who actually carried out the research, I’d often been surprised by the outcomes myself. I wondered, were the people in the audience really that insightful? How did they know the results sooner than I did? Or was it just an ex post facto feeling of intuition?

Eventually I discovered a way to combat this “I knew it all along” feeling. I started asking the audience to predict the results of the experiments. After I finished describing the setup and what we measured, I gave them a few seconds to think about it. Then I would ask them to vote on the outcome or write their prediction down. Only once they committed to their answer would I provide the results. The good news is that this approach works. Using this ask-first method, I rarely receive the “I knew it all along” response.

In honor of our natural tendency to convince ourselves that we knew the right answers all along, I call my research center at Duke University “The Center for Advanced Hindsight.”



Our Love of Exaggeration

Once upon a time—back in the early 1990s—the acclaimed movie director Stanley Kubrick began hearing stories through his assistant about a man who was pretending to be him. The man-who-would-be-Kubrick (whose real name was Alan Conway and who looked nothing like the dark-bearded director) went around London telling people who he famously was(n’t). Since the real Stanley Kubrick was a very private person who shunned the paparazzi, not many people had any idea of what he looked like. So a lot of gullible people, thrilled to “know” the famous director personally, eagerly took Conway’s bait. Warner Bros., which financed and distributed Kubrick’s films, began calling Kubrick’s office practically every day with new complaints from people who could not understand why “Stanley” would not get back to them. After all, they had treated him to drinks and dinner and paid for his cab, and he had promised them a part in his next film!

One day, Frank Rich (the former theater critic and op-ed columnist of The New York Times) was having dinner in a London restaurant with his wife and another couple. As it happened, the Kubrick imitator was sitting at a nearby table with a knighted MP and a few other young men, regaling them with stories of his moviemaking marvels. When the imposter saw Rich at the next table, he walked over to him and told the critic that he was inclined to sue the Times for having called him “creatively dormant.” Rich, excited to meet the reclusive “Kubrick,” asked for an interview. Conway told Rich to call him, gave Rich his home phone number, and … disappeared.

Very shortly after this encounter, things began to unravel for Conway as it dawned on Rich and others that they’d been conned. Eventually the truth came out when Conway began selling his story to journalists. He claimed to be a recovering victim of a mental disorder (“It was uncanny. Kubrick just took me over. I really did believe I was him!”). In the end Conway died a penniless alcoholic, just four months before Kubrick.*

Although this story is rather extreme, Conway may well have believed that he was Kubrick when he was parading around in disguise, which raises the question of whether some of us are more prone to believe our own fibs than others. To examine this possibility, we set up an experiment that repeated the basic self-deception task, but this time we also measured participants’ general tendency to turn a blind eye to their own failures. To measure this tendency, we asked participants to agree or disagree with a few statements, such as “My first impressions of people are usually right” and “I never cover up my mistakes.” We wanted to see whether people who answered “yes” to more of these questions also had a higher tendency for self-deception in our experiment.

Just as before, we saw that those in the answer-key condition cheated and got higher scores. Again, they predicted that they would correctly answer more questions in the following test. And once more, they lost money because they exaggerated their scores and overpredicted their ability. And what about those who answered "yes" to more of the statements about their own propensities? There were plenty of them, and they were also the ones who predicted that they would do best on our second-phase test.




HEROIC VETERANS?

In 1959, America's "last surviving Civil War veteran," Walter Williams, died. He was given a princely funeral, including a parade that tens of thousands gathered to see, and an official week of mourning. Many years later, however, a journalist named William Marvel discovered that Williams had been only five years old when the war began, which meant he wouldn't have been old enough at any point to serve in the military in any capacity. It gets worse, though. The title that Walter Williams bore falsely to the grave had been passed to him from a man named John Salling, who, as Marvel discovered, had also falsely called himself the oldest Civil War veteran. In fact, Marvel claims that the last dozen so-called oldest Civil War veterans were all phony.

There are countless other stories like these, even in recent wars, where one might think it would be more difficult to make up and sustain such claims. In one example, Sergeant Thomas Larez received multiple gunshot wounds fighting the Taliban in Afghanistan while helping an injured soldier to safety. Not only did he save his friend’s life, but he rallied from his own wounds and killed seven Taliban fighters. So went the reporting of Larez’s exploits aired by a Dallas news channel, which later had to run a retraction when it turned out that although Larez was indeed a marine, he had never been anywhere near Afghanistan—the entire story was a lie.

Journalists often uncover such false claims. But once in a while, it's the journalist who's the fibber. With teary eyes and a shaky voice, the longtime journalist Dan Rather described his own career in the marines, even though he had never made it out of basic training. Apparently, he had come to believe that his involvement was far more significant than it actually was.1

THERE ARE PROBABLY many reasons why people exaggerate their service records. But the frequency of stories about people lying on their résumés, diplomas, and personal histories brings up a few interesting questions: Could it be that when we lie publicly, the recorded lie acts as an achievement marker that “reminds” us of our false achievement and helps cement the fiction into the fabric of our lives? So if a trophy, ribbon, or certificate recognizes something that we never achieved, would the achievement marker help us hold on to false beliefs about our own ability? Would such certificates increase our capacity for self-deception?

BEFORE I TELL you about our experiments on this question I should point out that I proudly hang two diplomas on my office wall. One is an “MIT Bachelor of Science in Charm,” and the other is a “PhD in Charm,” also from MIT. I was awarded these diplomas by the Charm School, which is an activity that takes place at MIT during the cold and miserable month of January. To fulfill the requirements, I had to take many classes in ballroom dancing, poetry, tie tying, and other such cotillion-inspired skills. And in truth, the longer I have the certificates on my office wall, the more I believe that I am indeed quite charming.

WE TESTED THE effects of certificates by giving our participants a chance to cheat on our first math test (by giving them access to the answer key). After they exaggerated their performance, we gave some of them a certificate emphasizing their (false) achievement on that test. We even wrote their name and score on the certificate and printed it on nice, official-looking paper. The other participants did not receive a certificate. Would the achievement markers raise the participants’ confidence in their overstated performance, which in reality was partially based on consulting the answer key? Would it make them believe that their score was, indeed, a true reflection of their ability?

As it turns out, I am not alone in being influenced by diplomas hanging on the wall. The participants who received a certificate predicted that they would correctly answer more questions on the second test. It looks as though having a reminder of a “job well done” makes it easier for us to think that our achievements are all our own, regardless of how well the job was actually done.

THE NINETEENTH-CENTURY NOVELIST Jane Austen provided a fantastic example of the way our own selfish interests, together with the help of others around us, can get us to believe that our selfishness is really a mark of charity and generosity. In Sense and Sensibility there is a revealing scene in which John, the first and only son and legal heir, considers what, exactly, is involved in a promise he made to his father. At his father’s deathbed, John promises the old man to take care of his very kind but poor stepmother and three half sisters. Of his own accord, he decides to give the women £3,000, a mere fraction of his inheritance, which would take care of them nicely. After all, he genially reasons, “he could spare so considerable a sum with little inconvenience.”

Despite the satisfaction John gets from this idea and the ease with which the gift can be given, his clever and selfish wife convinces him—without much difficulty and with a great deal of specious reasoning—that any money he gives his step-family will leave him, his wife, and their son “impoverished to a most dreadful degree.” Like a wicked witch from a fairy tale, she argues that his father must have been light-headed. After all, the old man was minutes from death when he made the request. She then harps on the stepmother’s selfishness. How can John’s stepmother and half sisters think they deserve any money? How can he, her husband, squander his father’s fortune by providing for his greedy stepmom and sisters? The son, brainwashed, concludes that “It would be absolutely unnecessary, if not highly indecorous, to do more for the widow and his father’s three daughters …” Et voilà! Conscience appeased, avarice rationalized, fortune untouched.




SELF-DECEPTION IN SPORTS

All athletes know that steroid use is against the rules and that if they are ever caught using steroids, it will tarnish their records as well as their sport. Yet the desire to set new (steroid-fueled) records and to win media attention and fan adoration drives many athletes to cheat by doping. The problem is everywhere and in every sport. There was Floyd Landis, who was stripped of his 2006 Tour de France victory because of steroid use. The University of Waterloo in Canada suspended its entire football team for a year when eight players tested positive for anabolic steroids. A Bulgarian soccer coach was banned from the sport for four years for giving players steroids before a match in 2010. And yet we can only wonder what steroid users think as they win a match or receive a medal. Do they recognize that their praise is undeserved, or do they truly believe that their performance is a pure tribute to their own skill?

Then, of course, there’s baseball. Would Mark McGwire hold so many records if not for steroid use? Did he believe his achievement was owing to his own skill? After admitting to steroid use, McGwire stated, “I’m sure people will wonder if I could have hit all those home runs had I never taken steroids. I had good years when I didn’t take any, and I had bad years when I didn’t take any. I had good years when I took steroids, and I had bad years when I took steroids. But no matter what, I shouldn’t have done it and for that I’m truly sorry.”2

Sorry he may be, but in the end neither his fans nor McGwire himself can know exactly how good he really is.

AS YOU CAN see, people tend to believe their own exaggerated stories. Is it possible to stop or at least decrease this behavior? Since offering money to people to judge their performance more accurately did not seem to eliminate self-deception, we decided to intervene beforehand, right at the moment people were tempted by the opportunity to cheat. (This approach is related to our use of the Ten Commandments in chapter 2, "Fun with the Fudge Factor.") Since our participants were clearly able to ignore the effect that the answer key had on their scores, we wondered what would happen if we made the fact that they were relying on the answer key more obvious at the moment they were using it. If using the answer key to boost their scores was blatantly obvious, would they be less able to convince themselves that they had known the correct answers all along?

In our initial (paper-based) experiments, it was not possible to figure out exactly when our participants' eyes wandered to the answer key or how aware they were of the help that they got from the written answers. So in our next experiment, we had our participants take a computerized version of the same test. This time the answer key at the bottom of the screen was initially hidden from sight. To reveal the answers, participants had to move the cursor to the bottom of the screen, and when the cursor was moved away, the answer key was hidden again. That way the participants were forced to think about exactly when and for how long they used the answer key, and they could not as easily ignore such a clear and deliberate action.

Although almost all of the participants consulted the answer key at least once, we found that this time around (in contrast to the paper-based tests) they did not overestimate their performance in the second test. Despite the fact that they still cheated, consciously deciding to use the answer key—rather than merely glancing at the bottom of the page—eliminated their self-deceptive tendencies. It seems, then, that when we are made blatantly aware of the ways we cheat, we become far less able to take unwarranted credit for our performance.



Self-deception and Self-help

So where do we stand on self-deception? Should we maintain it? Eliminate it? I suspect that self-deception is similar to its cousins, overconfidence and optimism, and as with these other biases, it has both benefits and disadvantages. On the positive side, an unjustifiably elevated belief in ourselves can increase our general well-being by helping us cope with stress; it can increase our persistence while doing difficult or tedious tasks; and it can get us to try new and different experiences.

We persist in deceiving ourselves in part to maintain a positive self-image. We gloss over our failures, highlight our successes (even when they’re not entirely our own), and love to blame other people and outside circumstances when our failures are undeniable. Like our friend the crab, we can use self-deception to boost our confidence when we might not otherwise feel bold. Positioning ourselves on the basis of our finer points can help us snag a date, finish a big project, or land a job. (I am not suggesting that you puff up your résumé, of course, but a little extra confidence can often work in our favor.)

On the negative side, to the extent that an overly optimistic view of ourselves can form the basis of our actions, we may wrongly assume that things will turn out for the best and, as a consequence, fail to make the best decisions. Self-deception can also cause us to "enhance" our life stories with, say, a degree from a prestigious university, which can lead us to suffer a great deal when the truth is ultimately revealed. And, of course, there is the general cost of deception. When we and those around us are dishonest, we start suspecting everyone, and without trust our lives become more difficult in almost every way.

As in other aspects of life, here too the balance lies between happiness (partially driven by self-deception) and optimal decisions for the future (and a more realistic view of ourselves). Sure, it is exciting to be bright-eyed, with hopes for a wonderful future—but in the case of self-deception, our exaggerated beliefs can devastate us when reality comes crashing in.



Some Upsides of Lying

When we lie for another person’s benefit, we call it a “white lie.” When we tell a white lie, we’re expanding the fudge factor, but we’re not doing it for selfish reasons. For example, consider the importance of insincere compliments. We all know the gold standard of white lies, in which a woman who is less than svelte puts on a slinky new dress and asks her husband, “Do I look fat in this?” The man does a quick cost-benefit analysis; he sees his whole life pass before his eyes if he answers with the brutal truth. So he tells her, “Darling, you look beautiful.” Another evening (marriage) saved.

Sometimes white lies are just social niceties, but other times they can work wonders to help people get through the most difficult of circumstances, as I learned as an eighteen-year-old burn victim.

After an accident that nearly killed me, I found myself in the hospital with third-degree burns covering over 70 percent of my body. From the beginning, the doctors and the nurses kept telling me, “Everything will be okay.” And I wanted to believe them. To my young mind, “Everything will be okay” meant that the scars from my burns and many, many skin transplants would eventually fade and go away, just as when someone burns himself while making popcorn or roasting marshmallows over a campfire.

One day toward the end of my first year in the hospital, the occupational therapist said she wanted to introduce me to a recovered burn victim who’d suffered a similar fate a decade earlier. She wanted to demonstrate to me that it was possible for me to go out into the world and do things that I used to do—basically, that everything would be okay. But when the visitor came in, I was horrified. The man was badly scarred—so badly that he looked deformed. He was able to move his hands and use them in all kinds of creative ways, but they were barely functional. This image was far from the way I imagined my own recovery, my ability to function, and the way I would look once I left the hospital. After this meeting I became deeply depressed, realizing that my scars and functionality would be much worse than I had imagined up to that point.

The doctors and nurses told me other well-meaning lies about what kind of pain to expect. During one unbearably long operation on my hands, the doctors inserted long needles from the tips of my fingers through the joints in order to hold my fingers straight so that the skin could heal properly. At the top of each needle they placed a cork so that I couldn't unintentionally scratch myself or poke my eyes. After a couple of months of living with this unearthly contraption, I learned that it would be removed in the clinic—not under anesthesia. That worried me a lot, because I imagined that the pain would be pretty awful. But the nurses said, "Oh, don't worry. This is a simple procedure and it's not even painful." For the next few weeks I felt much less worried about the procedure.

When the time came to withdraw the needles, one nurse held my elbow and the other slowly pulled out each needle with pliers. Of course, the pain was excruciating and lasted for days—very much in contrast to how they described the procedure. Still, in hindsight, I was very glad they had lied to me. If they had told me the truth about what to expect, I would have spent the weeks before the extraction anticipating the procedure in misery, dread, and stress—which in turn might have compromised my much-needed immune system. So in the end, I came to believe that there are certain circumstances in which white lies are justified.
