7

The Stories We Live By

I looked around, it was like a horror movie, people were mounted on each other, the smell of burnt skin and people’s insides was gagging. I kept thinking about my fiancé, about our wedding, I wanted to wear that white dress and swear my love for him. Something gave me the strength to get up. I believe today that it was my fiancé on his way to heaven.

Tania Head1

Who can forget the day they saw the attack on the Twin Towers? You didn’t even have to be there. It was the first live televised terrorist atrocity witnessed by the world. I was at work in Bristol, England, where I had recently had a television mounted on my office wall to review research videos, but that afternoon I turned it on to watch the horror unfold on that crisp, sunny September morning in New York. It was surreal – it couldn’t be happening. I remember trying to stay disconnected from it – as if it were just another piece of news. I did not want to think too hard about what I was seeing. And yet I will not forget that day. It is seared into the memories of all who witnessed the events that have simply become known as 9/11.

As discussed, memories are not recordings but stories we retrieve from the compost heap that is our long-term memory; we construct these stories to make sense of the events we have experienced. They change over time as they become distorted, merged, turned over, mixed in and mangled with other experiences that eventually fade. But some memories remain as vivid as the day they happened – or at least they seem so – those episodes that refuse to decompose. These are the events that we can’t forget. When we witness something truly terrifying, a memory can be branded into our brain, like a searing hot iron that marks our mind forever. This is because emotionally charged memories are fuel-injected by the electrical activity of the limbic system.2 Arousal, triggered in the amygdala, produces heightened sensitivity and increased attention. The dilation of our pupils reveals that our vigilance systems have been put on high alert to look out for danger. The world suddenly becomes very clear and enriched as we notice all manner of trivial details that we would not normally care about. It is as if the scene has suddenly been illuminated – as if some paparazzi photographer has lit up the world in a brilliant blaze during our moments of terror – which is why these recollections are called ‘flashbulb’ memories.3 And we experience the emotion – we feel the past. It is the heightened arousal and emotional significance that seem to lay down the lasting memory trace in the brain that becomes a flashbulb memory.

We usually lament our loss of memory as we age but sometimes it is better to forget. While many flashbulb memories are associated with the more joyous events in life, such as births and weddings, most are generated by the horrors. Victims and survivors typically experience traumatic memories that they cannot erase – a common symptom of post-traumatic stress disorder (PTSD). Following 9/11, one in five New Yorkers living in the vicinity of the Twin Towers developed PTSD.4 They were haunted by nightmares and constantly ruminated on the events of that terrible day. Our emotional systems seem compelled never to let us forget the worst things that can happen to us. In truth, the details of flashbulb memories can be as false as any other memory; they just seem so accurate. For example, many people (including George Bush) remember seeing the plane hit the first tower on 9/11 even though video footage did not emerge until much later. Maybe flashbulb memories serve some evolutionary value in always remembering the worst-case scenario. When it comes to surviving, it would seem that Mother Nature has decided that it is more important to remember how we felt when endangered than to remember the pleasures of life.

One way to combat PTSD is to administer a beta-blocker such as propranolol immediately after the event.5 Beta-blockers dampen the arousal of the limbic system so that events are not encoded with the same degree of emotional kick. People still remember what happened but feel less upset. Research is currently underway at Yale Medical School by Deane Aikins to determine whether propranolol alleviates PTSD in combat troops, but some have even suggested that the drug should be given to all soldiers. This raises concerns. Do we really want a moral morning-after pill that shuts down a system that usually prevents us from doing things that might lead to remorse and regret?6 Others have gone further and proposed giving propranolol to soldiers before they go into battle as a prophylaxis. If you have no pangs of guilt then you could become immune to suffering. But do we really want blind obedience without a moral compass in our soldiers? Remember the lessons of Milgram and Abu Ghraib. There is a big difference between inoculating against PTSD and helping people to overcome events that were beyond their control and no fault of their own. The future of the psychopharmacological treatment of PTSD is a moral minefield.

In any event, the scale of the emotional devastation created by 9/11 was unlikely to be solved so easily with a pill, and certainly not for those who had managed to escape the collapsing towers. The survivors of 9/11 were left traumatized and tormented by their flashbulb memories. Initially, the nation joined them in their grief as everyone tried to comprehend the sheer horror of 9/11 but, eventually, things started to return to normal. Memories started to fade and people wanted to move on, but not those who had been there. Two years after the event, survivors sought each other out and met up in small groups to share their experiences, nightmares and pain. There was a lot of guilt that they had survived and they needed to talk. Gerry Bogacz, who co-founded the Survivors’ Network, explained: ‘After a while, you can’t talk about this anymore with your family. You’ve worn them out. Your need to talk is greater than their ability to listen.’ The Survivors’ Network began to expand its meetings across Manhattan. More and more sought the solace and comfort of fellow survivors because only those who had undergone the same ordeal themselves could relate to the legacy left by 9/11. At these meetings, they would exchange stories, and each retelling seemed to release or ease the feelings and emotions that had been bottled up.

Very soon, one particular story started to spread among the groups. It was the story of Tania Head, who had survived the attack on the South Tower. She had been on the seventy-eighth floor, waiting for an elevator, when United Airlines flight 175 slammed into her tower. Tania had been badly burned by aviation fuel but managed to crawl through the rubble, and even encountered a dying man who handed her his wedding ring, which she later returned to his widow. She would be one of only nineteen people above the impact point to survive that day. Tania recalled how she was rescued by a twenty-four-year-old volunteer firefighter, Welles Crowther, who always wore a red bandanna. Witnesses say he was later killed making his fourth return to the collapsing tower to save more victims trapped in the debris. But Tania was not entirely without loss. Though she was saved, she later discovered that her fiancé, Dave, who had been in the North Tower, had been killed. The wedding, for which she had bought her dress only weeks earlier, was never to be.

Like other survivors, Tania needed to do something to deal with the emotional aftermath. She started an internet group for survivors and eventually news of her efforts reached Gerry Bogacz who invited her to join the Survivors’ Network. Tania’s story was noteworthy. She had lost more than most others but somehow she had found the courage and conviction to overcome adversity. Tania’s tale was a story of triumph. She offered hope to those who had been lost in the pits of despair. How could anyone wallow in self-pity when Tania had managed to overcome her own loss?

Soon Tania was campaigning for the survivors. Their voice had to be heard. She championed the group’s right to visit Ground Zero, the site of the collapsed towers, which up to that point had been off-limits to all. She became the spokesperson for the Survivors’ Network and then their president. She gave the inaugural guided tour of the Tribute W.T.C. Visitor Center in 2005 when she showed New York City Mayor Bloomberg, former Mayor Giuliani, former New York Governor Pataki and other important dignitaries around the facility, regaling them with her experiences during that fateful day.

Tania Head had become the figurehead of 9/11 survivors. Except … Tania had never been there. She did not have a false memory. She was a fraud. Like me, Tania had watched the events on television, back in her native Spain. She had not been in the South Tower. She did not work for Merrill Lynch. She had not been on the seventy-eighth floor of the World Trade Center. She had not crawled through rubble to retrieve a dying man’s wedding ring. She had not been saved by a real hero, Welles Crowther. And she did not lose her fiancé, Dave, in the collapsed North Tower. Tania was really Alicia Esteve Head, who only arrived in the United States in 2003 – two years after 9/11. She had made everything up.7 However, the authorities could not arrest Tania because she had not broken any law. In 2007 she disappeared and, in February 2008, a message was sent from a Spanish account to the Survivors’ Network informing them that Alicia Esteve Head had committed suicide. Not surprisingly, very few believe this.

Alicia Head came from a wealthy Spanish background, but something must have been missing in her life that money could not buy. She needed the attention and sympathy of others, and saw herself as the victim in a romantic tragedy set against the backdrop of the world’s worst terrorist attack. As Tania, Alicia would have lived out this lie had she not been exposed. We will probably never know exactly why she created this charade, but we must assume that this was the story she wanted to live. She may even have come to believe her own false memories, locked in a fantasy world where she recast herself as a survivor against the odds.

We Are Our Memories

What is a memory? Can you hold one? Can you make one? Can you copy a memory? If we are our memories, can we be re-created? Memory is information stored as a pattern of electrical activity that ‘re-presents’ the original pattern at the time it was formed. This representation is what memories are – although human memories are not rigid but dynamic and continually changing as new information is encountered. If we are our brains and our brains are a network of physical cells connected together in a pattern of weighted electrical activity, then it really should be possible to copy a memory in the same way we can copy any information. We should be able to copy our selves.

The possibility of copying memory is at the heart of what it is to be unique. Imagine a machine that can copy any physical object right down to its basic atomic structure. It can perfectly duplicate any material thing irrespective of what it is made of or how complicated it is. Remarkably, engineers are working on precisely this type of machine, known as a 3D printer. These typically work by using a laser to scan a target object to calculate its dimensions and then relaying that information to a jet-moulding device where liquid plastic is squirted to gradually build up a reproduction of the object. It’s the sort of technology that would make constructing colonies on distant planets more feasible without having physically to transport every object. At the moment the technology is fairly crude, and working out how to build the internal structures of complicated objects made of different substances presents considerable challenges. However, just as Johannes Gutenberg’s printing press was considered a technological marvel of the fifteenth century and yet seems so primitive by today’s standards, it may simply be a matter of time before we can reliably manipulate matter to create accurate duplicates.

Whatever way we achieve it, let us assume that we have the technology to reliably duplicate anything. Imagine now that you step into the machine and an identical physical copy of you is created. What would this new you be like? Let’s also assume that you accept that there is no immaterial spirit or soul that cannot be reproduced. Would this identical copy be you? It’s the sort of question that has entertained philosophers and writers8 in one form or another for centuries, though in recent years it has enjoyed a resurgence of interest because of rapid developments in technology such as gene sequencing and 3D printers. In all of these different scenarios, the same fundamental question of identity is raised: what makes us unique?

John Locke thought about this issue in the context of reincarnation9 – something that was of interest in the seventeenth century when it came to the notion of the immortal soul. Locke was of the opinion that conscious awareness of one’s own history was important when it came to unique personal identity. In short, he was thinking about the role of autobiographical memories in defining the self. Even if one does not believe in the immortal soul, modern adults also regard personal memory as the most important thing that defines who we are. In one study, adults were told about the unfortunate Jim, who was in a serious accident in which his body was irreparably damaged so that he needed a transplant.10 Only this was a science-fiction story in which the transplantation technology was very advanced. In one version of the story, Jim had lost all his memories but his amnesic brain could be transplanted into either a robot or a genetically engineered biological body. In another version of the story, doctors had managed to download all of Jim’s life memories before his brain died and could transfer them to the replacement body. After the transplantation, Jim’s original body was cremated. Adults were then asked if each operation was a success – was the patient still Jim?

The most important thing that determined whether adults considered the patient to be Jim was whether his memories had been saved irrespective of whether they were now stored in a mechanical body or a biologically engineered one. In fact, the biologically engineered body that contained Jim’s original brain was considered less Jim than the robot with his memories. In the absence of his memories, Jim was gone.

The relationship between memory and identity is an intuition that starts to emerge in children from around four to five years of age. We used duplication machine studies11 to see if children would think that a live hamster could be copied exactly and whether its doppelganger would have the same memories.12 To achieve the illusion of duplication we used two identical-looking Russian hamsters that were indistinguishable to the untrained eye. Once children had been convinced of the power of the machine to faithfully duplicate any toy, we introduced our pet hamster and proceeded to tell them about some of the hamster’s unique physical properties that could not be directly seen. We said it had swallowed a marble in its tummy, had a broken back tooth and had a blue heart. We then created some memories that were unique to the hamster. Of course, memories are also invisible but they are not physical like marbles and blue hearts. We showed the hamster a drawing by each child, whispered the child’s name into the hamster’s ear and got the child to tickle the hamster. These are all episodes that can be stored in memory. We then placed the hamster in the duplicating machine and, after the buzzer sounded, we opened both boxes to reveal that there were now two identical hamsters. What would the children think? Would the invisible physical properties and the memories be the same or different for each animal? We asked each child whether each hamster had a blue heart, a broken tooth and a marble in its tummy. We also asked about the memories. Did each hamster know the child’s name and what picture the child drew, and remember being tickled?

What’s your intuition? If you walked into the machine, would the duplicate of you have the same memories? We conducted a straw poll online with sixty adults to get a sense of their intuition about duplicating hamsters and themselves in our machine. We asked whether the identical copy created by the machine would have the same body and the same memories. Around four out of five adults agreed that both the copied animal (84 per cent) and human (80 per cent) would have the same body. About half (46 per cent) thought the hamster would have the same memories, compared to just over a third when it came to the human (35 per cent). So, overall, adults thought that bodies were more likely to be copied than memories, and this belief was stronger for humans than for hamsters.

Back in the lab, we repeated the hamster study a number of times with variations to check the results and found the same basic pattern. About one third of children from four to six years thought that the second hamster was completely different on both mental and physical properties. Maybe they did not believe that the machine could copy something alive. Another third thought that the second hamster was identical on all properties. However, the interesting group was the remaining third of children who thought that the physical properties but not the memories were copied. In other words, they believed that the machine worked but could not copy the mind, just as the adults did.

In another version, we found that this uniqueness effect of memories was stronger when we gave our first hamster a name, suggesting that it really does have something to do with identity. It’s remarkable how naming an animal confers a new sense of identity, which is why you should not name your livestock if you intend to eat them. The uniqueness and identity conferred by names may also explain why young children are often affronted when they first learn that they share their name with another child.

We think our findings show that children begin to appreciate how minds and memories, in particular, create the unique individual. Earlier, we saw that there is an increasing awareness of other people’s mental states from four years of age, as shown by ‘theory of mind’ research. Initially, young children appreciate that other people have minds. As they develop, children come increasingly to appreciate that their own mind and its contents are different from those of others, and unique.

By the time we are adults, most of us think that our autobiographical memories are crucial to our sense of self. Our bodies could be copied but not our memories. Our memories are what make us who we are. Aside from the science fiction movies we have already discussed, anyone unfortunate enough in real life to witness the decline of a loved one with rapidly progressing dementia, which destroys memory, knows how the person’s identity and sense of self can unravel. That loss of identity is one of the reasons memory failure is such a traumatizing symptom for relatives: the sufferer no longer recognizes those around him.

Once again, neurologist Oliver Sacks reminds us how we rely on others to create our sense of self-identity. One of his patients, a former grocer called William Thompson, had Korsakoff’s syndrome, which produced such a profound amnesia that he was unable to remember anything for more than a second or two – just like Clive Wearing, whom we encountered earlier. He lived in the eternal present and was unable to generate a stable sense of self. In one exchange, Sacks walked on to the ward in a white coat to see William, who greeted him:

‘What’ll it be today?’ he says, rubbing his hands. ‘Half a pound of Virginia, a nice piece of Nova?’

(Evidently he saw me as a customer – he often would pick up the phone on the ward, and say ‘Thompson’s Delicatessen.’)

‘Oh Mr. Thompson!’ I exclaim. ‘And who do you think I am?’

‘Good heavens, the light’s bad – I took you for a customer. As if it isn’t my old friend Tom Pitkins … Me and Tom’ (he whispers in an aside to the nurse) ‘was always going to the races together.’

‘Mr. Thompson, you are mistaken again.’

‘So I am,’ he rejoins, not put out for one moment. ‘Why would you be wearing a white coat if you were Tom? You’re Hymie, the kosher butcher next door. No bloodstains on your coat though. Business bad today? You’ll look like a slaughterhouse by the end of the week.’13

It was as if William reeled effortlessly from one self-reflected identity to the next depending on who he thought Sacks was. He was oblivious to his circumstances. He had no awareness that he was a Korsakoff’s patient in a psychiatric hospital but rather, as Sacks put it, had to ‘literally make himself (and his world) up every moment’. Unlike the woman with Tourette’s Syndrome who could not stop incorporating the mannerisms of others, William used the identity of those around him in order to create his own.

Being in Two Minds

Constructing a plausible story is known as confabulation and is found in various forms of dementia as the patient attempts to make sense of their circumstances. Remember TH, who could not recognize himself in the mirror and thought his reflection belonged to his neighbour who had snuck into the house? However, we can all confabulate to some extent, even though we are not aware that we are doing it. This produces the biases, selective interpretations, reframing and cognitive dissonance processes in which we are less objective than usual. We are all naturally inclined to interpret the world in terms of meaningful stories, and this probably reflects the activity of a system known as the ‘interpreter’, which appears to be localized to the left hemisphere.14

We are not normally aware of this system, as our brain processes are effortlessly and invisibly integrated below our level of awareness. We simply experience the output of the interpreter as our conscious appraisal of our situations, our thoughts and our behaviours. After all, we are our minds, and if the mind is largely constructed by unconscious processes, why should we ever become aware of the so-called interpreter? However, the activity of the interpreter was revealed by neuroscientist Michael Gazzaniga in his research on split-brain patients.

The normal brain is really a tale of two cities: the left and the right. Gazzaniga demonstrated that you could reveal the autonomy of the two hemispheres by selectively feeding different information to each. To do this, he presented words and images on the left and right sides of a computer screen while the patient stared at a spot in the middle. This ensured that each hemisphere processed only the stimuli on the opposite side and, because the hemispheres were no longer connected through the corpus callosum, there was no exchange of information between them. For example, if the words ‘Key’ and ‘Ring’ were briefly flashed in the left and right halves of the screen respectively, the patient reported seeing the word ‘Ring’ because this was processed by the left hemisphere, which controls language. However, if the patient was asked to choose the corresponding object from a selection on the table, they would pick up a key with the left hand, which is controlled by the right hemisphere. Experiment after experiment revealed that the two hemispheres were functioning independently of each other. In one study, a picture of a naked man was flashed to the right hemisphere, causing the female patient to laugh but leaving her unable to say what she was finding amusing. Her left hemisphere was unaware of the naked man and so could not explain what was funny.

Sometimes, however, the patients make up a story to make sense of their unconscious activity. In one classic example told by Gazzaniga, one of his split-brain patients, Paul, was shown a snow scene in his left visual field and a picture of a chicken foot in his right visual field, and was asked to choose the corresponding images from a selection on the table. He picked out a picture of a shovel with his left hand and a picture of a chicken with his right. When his attention was drawn to the discrepancy and he was asked why he had chosen two different images, Paul replied, ‘Oh that’s simple, the chicken claw goes with the chicken, and you need the shovel to clean out the chicken shit!’

Gazzaniga has proposed that there are not two separate minds or selves in these split-brain patients. Rather, the mind is a product of the mental processes of the brain that are shared across the two hemispheres. Language has the advantage of providing the narrative output, so the interpreter in the left hemisphere is able to articulate a coherent account that integrates the different pieces of information. Normally these processes are a collaborative effort, with information streaming in from all the different processing regions. But in the split-brain patient no such shared communication is possible. Presented with choices made by the right hemisphere that are inconsistent with the information in the left, the interpreter reconciles the difference with a plausible story.

What the split-brain studies reveal is that the self illusion is really the culmination of a multitude of processes. These usually work together in synchrony to produce a unified self but, when inconsistencies arise, the system, strongly influenced by language, works to re-establish coherence. Probably one of the most compelling examples of this process comes from a personal anecdote that Gazzaniga15 tells about the late Mark Rayport, a neurosurgeon from Ohio. During one operation in which Rayport was stimulating a patient’s olfactory bulb, the brain region associated with smell, the patient reported experiencing different aromas depending on the context. When the patient was asked to reminisce about a happy time in his life, stimulation of the region produced the sensation of roses. Rayport then asked the patient to think about a bad time in his life. This time stimulation of the same cluster of neurons produced the sensation of rotten eggs! This anecdote suggests that the neural networks of the brain store associations that fit together into a coherent story. In many ways, the confabulation of the patient who is unaware of the true nature of their surroundings or of their disrupted brain processes is the same storytelling we all use to make sense of the inconsistencies that punctuate our lives when we deviate from the normal storyline of what we believe our selves to be.

Know Thy Self

Psychologist Dan McAdams proposes that, when it comes to making sense of our lives, we create narratives or personal myths to explain where we have come from, what we do and where we are going.16 This is the narrative arc of our lives – the background, the struggle, the climax and the resolution that people readily attribute to the story of their lives. For example, some may see themselves as victims of circumstances beyond their control, reinterpreting events to fit with this perspective. Another could take the same set of circumstances and cast themselves as the resilient hero, triumphing over adversity to get where they are today. Presumably these myths reflect the influences of culture and of those close to us when it comes to providing a meaning to our lives. These accounts are myths because they are not grounded in reality but rather follow a well-worn narrative path of a protagonist (our self) and what the world throws at them. In some individuals, the personal myth is complete fantasy, as in the case of Tania Head.

Our self-centred way of constructing the story means that we only pay attention to those events that we see as relating to us. This personal myth is constantly being revised and updated throughout our life by both conscious and unconscious processes, and it re-emerges either through deliberate retelling to others to explain who we are, or at times of insight when something from our past seems to become surprisingly poignant or relevant. Even cultures continually recycle the same old stories in the form of myths.17 For example, Star Wars may look futuristic but it is just as much a Greek myth as Homer’s Iliad. We like stories that are about journeys and conflicts, with goodies and baddies. The same is true for our own personal stories.

The problem with self-narratives is that we are the ones writing the story, which means our myths are open to all manner of distortions of what we think we should be like. This has been called the ‘totalitarian ego’ in which we repress, distort and ignore negative aspects of our lives that do not fit with our idealized self-narrative.18 For example, we tend to remember information that fits with our idealized self and conveniently ignore that which does not. If we believe that we have a particular trait of personality then we selectively interpret events that are consistent with that belief. In fact, we can easily interpret general statements to make them seem particularly relevant to us. For example, how accurate is this description of your personality?

You have a great need for other people to like and admire you. You have a tendency to be critical of yourself. You have a great deal of unused capacity which you have not turned to your advantage. While you have some personality weaknesses, you are generally able to compensate for them. Disciplined and self-controlled outside, you tend to be worrisome and insecure inside. At times you have serious doubts as to whether you have made the right decision or done the right thing. You prefer a certain amount of change and variety and become dissatisfied when hemmed in by restrictions and limitations. You pride yourself as an independent thinker and do not accept others’ statements without satisfactory proof. You have found it unwise to be too frank in revealing yourself to others. At times you are extroverted, affable, sociable, while at other times you are introverted, wary, reserved. Some of your aspirations tend to be pretty unrealistic. Security is one of your major goals in life.

Spookily accurate, isn’t it? In 1948, psychologist Bertram Forer gave a personality test to his students and then provided them with an individual analysis based on their performance.19 In fact, every student got the description above as their ‘unique’ analysis. Just about everyone thought the analysis was an accurate description, but Forer had in fact assembled it from various horoscopes to prove that the descriptions were sufficiently general to apply to most people. This effect, known as the ‘Barnum effect’ after the showman who famously quipped, ‘We’ve got something for everyone’, shows that our self-stories are probably more similar than we imagine. The Barnum effect is particularly strong when the analysis contains many positive traits that support the inherent bias most of us hold.20 Most of us think that we are funnier, smarter, better looking and kinder than the average person, which, of course, is statistically impossible. Some of us have to be less funny, less clever, less beautiful and crueller to balance up the sums.

The Barnum effect reveals that we all entertain illusions of a unique self, which turns out to be remarkably consistent and familiar across different people. Our uniqueness is closer to the average than we think. Also, if you look at the sort of generic statements in Forer’s description, most are to do with how we think others perceive us – social anxieties and concerns that we are more complicated than others realize. Again, this is more damning evidence that most of us are preoccupied with what others think and less independent than we imagine!

Swimming in the OCEAN

Although the Barnum effect reveals that we share many beliefs and attitudes, we are clearly not clones of each other like aphids or other simple organisms. When we describe different people, we come up with varied accounts that emphasize those characteristics that we think are the most notable. Even babies are not identical. We are born with different temperaments and form varying patterns of social attachment that appear to be strongly influenced by continual interaction with the environment. In short, we believe in the concept of personality – a stable set of characteristic styles of behaving, thinking and feeling that define us as individuals.

Assessing personality is a major industry backed up by decades of research showing that some people are better suited to particular occupations. The science of personality can be traced back as far as the Greek scholar Theophrastus (c.371–c.287 BCE) who described his fellow Athenians in terms of a limited number of characters.21 More recently, psychologists have argued that personality is the culmination of a combination of five distinct traits or the ‘Big-Five’ model of Openness (the willingness to try new and imaginative experiences), Conscientiousness (the extent of self-disciplined organization), Extraversion (the extent of social gregariousness), Agreeableness (the willingness to help others) and Neuroticism (the extent of insecure self-centred worry): OCEAN for short.22 The Big-Five approach is one of the most commonly used measures of personality assessment to predict how happy people are with their lives, the quality of social relationships, life expectancy and even job success and satisfaction.23

With such high praise for the Big-Five, one might be tempted to conclude that personality psychologists have dispelled the self illusion – that there is indeed a core personality that defines each of us. However, in seeking to find stable measures of the Big-Five, personality theorists have ignored the variation in OCEAN scores that can come about through changes in the different situations and roles we adopt.24 For example, students were asked to consider themselves in five roles that they typically occupied at that time in their lives: as a student at college, as a temporary employee working to put themselves through college, as a friend of other students, as a child of their parents and as a romantic partner. They were then assessed on the OCEAN measures, which revealed both inconsistency and consistency in their personality. The inconsistency was that individuals varied in their self-assessment on the OCEAN measures across the different roles they imagined themselves in; the consistency was that, as a group, they agreed on which personality factors were most prominent in each role. On the Big-Five measures, respondents were consistently most Open to experience in the role of the romantic partner, most Conscientious in the employee role, most Extraverted in the friend role, and least Agreeable and most Neurotic in the student role.

These findings indicate that although the Big-Five factors of personality might be reliable within an individual in one role, they can change completely in another, just as the looking glass self predicts. In other words, people are not necessarily consistent in all aspects of their lives. This is why you can live with someone who is fastidious at work when it comes to detail but hopelessly disorganized in the domestic situation. This influence of context on the self has been shown over and over again. In one classic study, Princeton theology students were asked to present a sermon on the ‘Good Samaritan’ in a building across campus.25 If they were told that they were running late, only one in ten stopped to help a sickly man in a doorway on their way to the meeting, compared to six out of ten who were not in a hurry. What were they thinking? Clearly nothing about the message of the sermon. How do they deal with such inconsistencies?

The answer is that we use cognitive dissonance to reframe events and justify our actions. Alicia Esteve Head may not have been a true victim of 9/11, but what Tania initially did was good for the survivors: Alicia could not have achieved this without becoming Tania. The theology students were aware of the sick man, but it was more important for their calling to deliver a sermon that would have greater impact on more people. It’s all too easy to reframe a story to protect the self-narrative from unravelling when presented with inconsistency.

Why do we create these distortions? Isn’t it better to be honest with oneself? Otherwise, we will only end up fooling our selves. For one thing, positive illusions (that we are better than most others) may actually be beneficial to our mental well-being.26 These positive illusions protect our self-esteem by downgrading our failings (‘Everybody cheats on their tax returns’) and overegging our positive attributes as special (‘Unlike most people, I have a really creative mind’). Armed with these positive illusions, we feel that we have more control over situations when in fact we have little or none. Remember how the illusion of control inoculates us from the stress of uncertainty?27 Positive illusions mean that we tend to see positive outcomes as a direct consequence of our actions, whereas negative outcomes are someone else’s fault.28 This makes us unrealistically optimistic but, given the trials and tribulations that life can throw at us, positive illusions make us more resilient and willing to carry on.

Maybe this resilience gave us a selective advantage as we evolved. Somewhere back in the mists of time, this way of thinking may have been the difference between the hunter on the Serengeti who was willing to keep trying that bit harder and the hunter who gave up the chase too early and failed to make it back to camp to mate. It is speculation, of course, but believing you will succeed means that sometimes you will, whereas believing that you will fail means that you inevitably do.

Listen With Mother

When we describe our self to others we refer to our past experiences by way of an explanation of who we are and how we have arrived at this point in our life. This seems such an objective exercise that we never really question the truthfulness of our storytelling. However, culture plays an influential role in how we interpret the world around us. It turns out the individualism that is so characteristic of Western thinking and the collectivism of the East shape our autobiographical memories as well.

Qi Wang, a developmental psychologist at Cornell University, has shown that childhood memories differ between Eastern and Western cultures, with a greater focus on the individual in the West when it comes to recounting past experiences.29 The self-obsessed Western perspective (‘I remember the time I won the class test’) drives our thought processes to focus on an elaborate encoding of moment-to-moment personal events. This is why Western children recall more specific details than their Eastern counterparts.30 Those Eastern children who demonstrated greater detail in their personal memories also scored higher on measures of individualism, indicating that it was not culture or language as such that determined autobiographical memory but rather the way they viewed the world.31

The way children remember is partly aided by parents reminiscing with them. As we learned earlier, if parents talk over events with their young children, then the amnesia barrier that is typically reported in two- to three-year-olds can be pushed back much earlier. This indicates that the framework of interpretation provided by adults helps the child to make sense of their experiences and form better memories.32 However, studies have also shown that parents from the East and the West differ in the way they reminisce, showing the typical individualistic or collectivist frameworks when talking to their children about their memories.33

What’s more surprising is that the full content of memories is not always lost either. If you prime individuals from either the East or the West to think more individualistically or collectively, then they recall more personal or group-oriented memories accordingly. This means that the memories are still available: it is just that they are not usually retrieved. The context in which we find our selves even defines how we retrieve memories to describe our inner self – memories that we know are selectively processed. As Sir Frederick Bartlett said, ‘Social organization gives a persistent framework into which all detailed recall must fit, and it very powerfully influences both the manner and the matter of recall.’34 Even the memories we recall to define our self-story are defined by the groups to which we belong.

A Flight From Reality

For some individuals, their self-story is unacceptable – it’s too much to cope with, so they seek to create a new self or at least lose the one they had. Take the case of Gene Saunders, who had been experiencing considerable difficulties in his home life and had a huge argument with his eighteen-year-old son, who called him a failure. Gene simply packed his bags and ended up 200 miles away in another town, where he became ‘Burt’ – a short-order cook who had no memory of his past existence. This kind of memory loss is known as a dissociative ‘fugue’ state, from the Latin for ‘flight’.

‘Fugue states’ typically emerge in early adulthood and not very often after fifty years of age. They usually occur rapidly but also end abruptly and are thought to be a reaction to stress where the individual ceases to acknowledge who he or she is.

For example, Jeffrey Alan Ingram turned up at a television news conference in Denver in 2006 looking for his identity. All he knew was that his name was ‘Al’. He asked the viewing audience, ‘If anybody recognizes me, knows who I am, please let somebody know.’ It turned out that he had been on his way to visit a friend dying of cancer but, on arrival in Denver, had gone into a fugue state. Eventually, his fiancée’s brother, who had watched the news, recognized Jeffrey, who lived over a thousand miles away in Olympia, Washington. His own mother explained that this was not the first time Jeffrey had entered a fugue state: a similar disappearance had occurred in 1995 on a trip to the grocery store. He turned up nine months later with no knowledge of who he was.

Fugue states are just one of a number of conditions known as dissociative identity disorders (DIDs), formerly called multiple personality disorder, in which alternative selves or ‘alter egos’ are present. The first popular fictionalized account of DID was The Strange Case of Dr Jekyll and Mr Hyde (1886) by Robert Louis Stevenson, but the idea that an individual can split into different personalities is a recurrent theme in modern culture. A notable recent example is the alter ego of Ed Norton’s character, Tyler Durden, played by Brad Pitt in Fight Club (1999). As with Jekyll and Hyde, we watch as the anarchic Tyler Durden increasingly drags Norton’s upstanding character into criminality, only to discover at the end of the movie that Durden is in fact that character’s own alter ego.

The notion that we all have a good and a bad side has become accepted wisdom, although few of us would regard the different characterizations as different individuals. And yet this is exactly the claim made with DID, to the extent that it has been used in criminal cases as a defence plea. The first such case was in 1978, when twenty-three-year-old Billy Milligan was arrested, following an anonymous tip-off, for the rape of four college women on the Ohio State campus the previous year. At first he seemed like the typical drifter: troubled childhood, abused by his stepfather, constantly in trouble. That changed after a psychological examination indicated he had at least ten different personalities, two of which were women. In fact, it was one of the women, Adelena, a nineteen-year-old lesbian, who claimed responsibility for the rapes. Another of Milligan’s personalities was the fearful and abused nine-year-old child, David, who it was claimed made the telephone call turning in Billy. According to Time, investigators found the police telephone number on a pad next to Milligan’s phone.35 Milligan ended up spending the next decade in secure psychiatric hospitals before being released in 1988.

Another famous case in which DID was used as a defence was that of the Hillside Strangler, Ken Bianchi, who claimed that an alter ego, Steven Walker, had been responsible. However, this defence fell apart when a sceptical psychiatrist suggested that most genuine cases of DID had at least two alter egos. In the following hypnosis session, Bianchi conveniently manifested another alter ego, Billy. When police investigated further, they found that Steven Walker had been a real psychology student whose identity Bianchi had tried to steal in order to commit fraud. Bianchi is currently serving a life sentence.

Although DID is recognized in the major psychiatric manuals, it is still considered highly controversial. The first cases, reported in the nineteenth century, are linked to the psychoanalytic movement. Not much was heard of DID again until 1957, with the release of a popular movie, The Three Faces of Eve, about a woman with DID, followed by a similar movie, Sybil, in 1976. Prior to the 1970s there had been very few cases of DID, but suddenly the incidence exploded, which led many to question whether it was a real medical disorder or a fashionable fad. DID was also primarily a North American problem, with few cases reported in other countries. Those that were reported in North America tended to come from the same specialists, which cast doubt on the source of the disorder.

Just like hypnosis and the actions of student prison guards, DID has been dismissed as an extremely elaborate example of role-playing in which a belief about dissociated states is promoted by society and supported by a few influential experts, namely the psychiatrists who specialize in the field. That is not to say that individuals with DID are deliberately faking their symptoms. Support for this comes from studies revealing that different brain states can be manifested when the individual is in one of their alternative personalities. For example, brain-imaging studies have shown that patients with DID can display different patterns of brain activity when in different characters.36 In one patient, the memory regions seemed to shut down during the transition from one personality to another, as if a different set of memories was being retrieved.

The evidence from brain-imaging studies is less convincing if one considers that we can alter our brain activity simply by thinking about different things. If I think about a time when I was upset or angry, my brain activity will change. However, one dramatic case in which the brain science backs up the claim of truly separated selves comes from a recent German DID patient who, after fifteen years of being diagnosed as blind, gradually regained sight after undergoing psychotherapy.37 At first, only a few of the personalities regained vision, whereas others remained blind. Was the patient faking? Not according to the electrical measurements recorded from her visual cortex – one of the early sensory processing areas in the brain. When her personality was sighted, electrical activity over this region was normal, but it was absent when the patient was experiencing a blind personality. Somehow, the parts of her brain that were generating the multiple personalities were also switching the activity of the visual part of the brain on and off. This finding is beyond belief – literally. To believe that you are blind is one thing, but to switch off parts of the lower-level sensory processing areas of your own brain is astounding. Somehow, the network of connections that operates further upstream in the brain to deal with complex concepts, such as the self and personality, can control more basic input relay stations downstream in the brain.

How the Mighty Have Fallen

If we are not brain-damaged or suffering from DID, to what extent can we experience a different self? In modern Westernized cultures, some people appear to lead complicated, multifaceted lives, juggling private and public personas, whereas others lead a simpler existence, such as subsistence farming in rural villages. The selves we present to the world must be a reflection of the different circles we inhabit. Sometimes those worlds can clash, as when we discover a different side to individuals whom we thought we knew so well – the unfaithful spouse, the paedophile priest, the sadistic nurse or the corrupt politician. These are the contradictions in the self that we see so often in others. Public figures seem constantly to fall from grace by engaging in activities that seem so out of character. Is there anything that the science of the self illusion can do to cast some light on these transgressions?

The first question to ask is why people put their public self-image unnecessarily at risk. For example, why do upstanding members of society with supposedly impeccable moral standards often seem to get caught with their pants down? Why did Sir Allan Green, the former Director of Public Prosecutions in the United Kingdom, go cruising around London’s King’s Cross station – at the time, a notorious hangout for prostitutes – where he must have known there was a good chance he could be arrested? Likewise, few could understand why Hollywood heartthrob Hugh Grant would pay Divine Brown for sex in a car on Sunset Boulevard, a notorious nightspot where the vice squad regularly operated. He must have known how risky his actions were. But maybe that is the whole point. There might be something thrilling and exciting about taking risks, and it is only a risk when you have something to lose. When called to account, many are at a loss to explain their actions and say they were not their usual self.

Another fascinating facet of this type of behaviour is sexual role-playing, where people act out a very different sort of self from the one they exhibit in their daily lives. For the most part, members of repressed societies have to maintain dignity and decorum, none more so than our leaders. For example, they have to be dominant, and yet how often do we hear about captains of industry or politicians engaging in submissive sexual fantasies where they pay to be dominated and subjected to humiliation? In 2010, three Long Island lawyers teamed up with a New York dominatrix to run a $50 million mortgage scam. Their victims were the many willing and wealthy clients who attended the dominatrix’s private dungeon in Manhattan. Paying for bondage, discipline, sadism and masochism (BDSM) appears to be fairly common in the corridors of power. But why?

For obvious reasons, getting people to talk about their sexual behaviour is very difficult. Luckily there are some who ask the sort of questions about sexual behaviour that the rest of us would shy away from. Katherine Morris, a psychologist from the United States, interviewed 460 heterosexual men who regularly engaged in BDSM. The majority of the men she interviewed were high-level professionals, including a fair number of corporate executives, several of them chief executives of corporations. Her interviewees also included psychiatrists, attorneys, engineers, scientists and other professionals who spend a great deal of time in high-pressure intellectual pursuits on a daily basis.38

Morris detected a pattern in which the emotions these men expressed during their private BDSM sessions were ones that they did not feel they could express as part of their daily public lives. It was as if something was missing in their lives that needed addressing for total satisfaction. These men felt compelled to reintegrate the missing components in private as part of a sexual ritual. Morris described how many of the high-level corporate executives felt that they were frauds and that, in her view, ‘humans seek balance’ between their public and private lives.

It is not only inadequate fulfilment that drives people to seek out such role-playing. BDSM also allows the individual to lose their identity and adopt a role that they find sexually gratifying precisely because they are being someone else.39 For the most part, our sexual activities are private compared to our public behaviour. It is almost as if we are allowed to become a different person in the bedroom. The cliché is the shy and demure wallflower in public who transforms into a sexual demon behind closed doors. It’s as if the persona we portray in public is just a front for the real person in private. Certainly, we are all expected to control our sexual behaviours in public – something that we are taught from a very early age. Those who cannot are regarded as perverts or mentally ill. In some societies there are very strict codes of conduct, very often based on religion, but all societies have some rules about what sexual behaviours are permitted in public. Members of these societies must conform to these rules but, ultimately, one consequence is to suppress thoughts and behaviours that do not go away and may eventually need to be vented, like the profanities that Tourette’s patients lack the ability to suppress. The more they try to stop themselves, the stronger the compulsion becomes. This is the ego-depletion effect again. The illusion is that we have the self-control to decide whether or not we give in to our urges. The problem is that abstinence may lead to a pent-up urge to do exactly the things we are trying to avoid. That’s when the mighty fall in such a spectacular way.

Technology may provide us with a way out. One place where we may be able to play out these fantasies and urges without exposure is on the internet. The next chapter considers how the internet is changing the way we interact and share stories, and ultimately how the web is going to play a major role in the self illusion we construct. This is because storytelling on the internet flows in two directions, with everyone having the capability to receive, generate and send information to others. If our looking glass self is a reflection of those with whom we surround our selves, then there are inevitably going to be implications for our self illusion in the way new media change, open up or restrict the others with whom we come into contact.
