3

The Looking Glass Self

After his career has faltered, über-male model Derek Zoolander, protagonist of the 2001 film Zoolander, looks at his reflection in the muddy puddle next to the sidewalk and asks himself, ‘Who am I?’ To answer this, he decides that he must embark on a journey home. It’s a familiar story of self-discovery – where we seek to find the answer to who we are by following the trail of evidence right back to our childhood. Most of us, including superstar male models, have this sense of origins. We think of our self as travelling a path in time from childhood to adulthood, punctuated by life events and the people along the way who have influenced us and shaped who we are.

Our self exists in the reflection that the world holds up to us. In 1902, American sociologist Charles Horton Cooley coined the term ‘the looking glass self’ to express the way that the self is shaped by the reflected opinions of others around us.1 People shape themselves to fit other people’s perceptions, and these vary from one person and context to the next. Spouse, family, boss, colleagues, lover, adoring fans and beggar in the street each hold a looking glass up to us every time we interact, and we present a different self to each. Each person or group may think they know us but they cannot, because they are not privy to all the different contexts in which we exist. This is the familiar lament of celebrities who complain that the persona they present to the general public is not the true personality they hold privately. More than that, Cooley argued that there is no real identity that exists separately from the one created by others. We are a product of those around us, or at least of what we believe they expect from us. He summed up this notion of the self illusion in this tongue-twister of logic: ‘I am not what I think I am and I am not what you think I am; I am what I think that you think I am.’

Consider the different questions and implications raised by Cooley’s looking glass self. How do we develop a sense of self in the first place? How do children develop an understanding of what others think and, more importantly, what they think about them? This must be especially important during that most difficult time of adolescence when children try to find their true self. How is our identity shaped by the characteristics that are imposed on us by biology and cultural stereotypes? All of these questions reflect upon the sense that the self is defined by those around us.

Man in the Mirror

When Derek Zoolander looked in the puddle and saw an incredibly good-looking face, he immediately knew who it was staring back at him in the reflection. However, this seemingly trivial ability to recognize one’s self is not something that everyone can do. As we age, brain disease can progressively destroy everyday functions that we take for granted – including those that generate our sense of identity. Take TH, a seventy-seven-year-old Australian man who would look in the mirror and describe the gentleman staring back at him as a ‘dead ringer’ for himself, but it was definitely someone else.2 TH had normal intelligence and was not crazy, but he could not appreciate that the reflection in the mirror was his own. When asked where this man in the mirror had come from, TH replied that he must be a neighbour in an adjoining apartment. He confabulated a bizarre story to fit with his experience of the stranger in the mirror, but the truth is that TH had a rare neurological condition called ‘mirror misidentification’ in which patients think their own reflection does not belong to them. They appreciate the likeness, but there is no self-recognition. Something in the face-processing circuitry of their brain has failed to register their own outward identity. There is no flicker of familiarity.

Mirror misidentification is one of the dissociative disorders in which individuals do not feel connected to reality. Their sense of self and identity within the world is distorted. Sometimes people even believe that they are dead and that the world around them and all their experience are an illusion. This death delusion, known as Cotard’s syndrome,3 is rare, but I got an insight into the condition from a colleague whose own father had it and described it as like living in an artificial world where nothing was real. Experiencing the here and now as real is part of being consciously aware of your present surroundings, but disorders such as Cotard’s remind us that we need a healthy brain to keep us in touch with reality. Sometimes we can all experience a disconnection or depersonalization in which we feel a sense of unreality and detachment from our self. Symptoms include dreamlike states, loss of empathy and a sense of disconnection from our bodies.4 It can seem like we are actors in a play or that we are watching the world from behind glass. It is surprisingly common. Estimates vary, but up to three out of four of us have felt like this at some time in our lives, especially after a stressful life event. Over half the combat troops returning from tours of duty are thought to experience depersonalization. Clearly, if brain disorders and stressful life events can distort the personal experience of self such that an individual no longer feels that they are really themselves, then these episodes reveal the fragility of the self in the first place.

Even mirror misidentification may not be all that rare. Many of us have had that fleeting experience when the face we observe in the mirror does not seem to be our own – especially when we have been under the influence of recreational drugs that distort reality. You can even induce mirror misidentification with hypnosis.5 But you don’t have to be wasted or in an altered state of consciousness to experience a temporary disconnection between your sense of self and your own reflection. Try this out. Turn the room lights down or, better still, light a candle. Now have a good look at your self in a mirror. Stare into the eyes that are reflected back at you. Scrutinize the features of your face. After a minute or so you will experience a strange sensation of depersonalization. Most people who stare like this start to see their face distort to the extent that it no longer looks like their own but rather that of a stranger.6 Whatever the self is that we experience when looking in the mirror, it is one that is easily disrupted when we look at it more closely.

What do babies or, for that matter, animals make of their own reflections when they see them for the first time? Following an observation by Charles Darwin that an orang-utan did not seem to recognize itself in a mirror at the London zoo, psychologist Gordon Gallup7 developed a way of measuring self-recognition in animals by placing a small dab of odourless red rouge makeup on their foreheads while they were asleep and then seeing how they responded when they saw themselves in a mirror. If the animal noticed that something about its appearance was not quite right, Gallup argued that it had a self-concept – an idea of who it was.

Gallup found that many animals, including some adult apes, could recognize themselves, since they tried to remove the makeup, but that other animals failed. Numerous other studies have shown that the animals that pass the mirror test are those that live in social groups. It is surprising, then, that human infants do not typically recognize themselves in the mirror test until well into their second year.8 They simply treat the baby in the mirror as another baby. In effect, very young infants are experiencing mirror misidentification when they see the other baby in the mirror. Some would argue that without this self-recognition in the mirror, they have not yet constructed their own sense of self.9

Figure 6: Somewhere around eighteen months, human infants pass the rouge test

Why We Lose Our Self in Reflection

Why can’t we remember what it was like to be a baby? Why can’t we remember our infant self? What’s the earliest memory you have? If you are like most people, it will be sometime around your third or fourth birthday, and really patchy. There are always the odd few (and, indeed, they are odd) who say they can remember being born – passing down the birth canal and being slapped on the bottom by the midwife. Most have no memory of self before their second birthday and, even then, the memories from around that time are fragmented and unconnected.10 It’s not that you have forgotten what it was like to be an infant – you simply were not ‘you’ at that age because there was no constructed self, and so you cannot make sense of early experiences in the context of the person to whom these events happened. You can look at photographs and recognize your self, but you cannot get back inside the toddler you once were. Why is this?

Has the passage of time worn out the trace of your memory, like a photograph fading? This seems unlikely. An articulate twelve-year-old is just as oblivious to their own infant memories as a forty-year-old, even though the forty-year-old can remember events from when they were twelve, almost thirty years earlier.11 The lack of memory cannot be because too much time has passed. Is it the case, then, that babies do not form memories in the first place? Without the ability to form memories, your sense of self would be utterly shattered. This loss happened to Clive Wearing, an eminent musicologist at Cambridge University, who was struck down with Herpes simplex encephalitis in 1985. Herpes simplex is the same infection that produces cold sores, but for Clive it had infiltrated the protective tissue surrounding the brain, causing it to swell and crush the delicate structures of the hippocampus – a region where the neural circuits encode memories. Even though he survived the encephalitis, Clive was left with severe amnesia and is now unable to remember from one moment to the next. In her 2005 memoir, Forever Today, Deborah Wearing describes her husband Clive’s tormented existence:

It was as if every waking moment was the first waking moment. Clive was under the constant impression that he had just emerged from unconsciousness because he had no evidence in his own mind of ever being awake before … ‘I haven’t heard anything, seen anything, touched anything, smelled anything,’ he would say. ‘It’s like being dead.’12

Probably the most harrowing aspect of Clive’s condition is that he still remembers fragments of his previous life and knows exactly who Deborah is – each time he sees her, he runs tearfully into her arms like the reunion of long-lost lovers, when in reality she may have left the room only minutes earlier. Without the ability to store new memories, Clive is permanently trapped in the here and now. He maintains a diary in an attempt to keep track of his existence, but it makes for painful reading: ‘2.00 p.m. – I am awake for the very first time. 2.14 p.m. – I am now conscious. 2.19 p.m. – have just woken for the first time.’ Each previous entry is crossed out as he asserts that he has only just become conscious. Deborah describes how one day she found Clive holding a chocolate in one hand and repeatedly covering and uncovering it with the other hand as if practising a magic trick.13 Each time he removed his hand he was amazed by the appearance of the chocolate. Without the ability to form new memories, everything that is out of sight is out of mind for Clive Wearing.

The child psychologist Jean Piaget believed that infants begin life just like Clive – unable to remember anything that cannot be immediately perceived. He thought that infants lacked the capacity to form enduring memories of the world around them.14 However, we now know that Piaget’s vision is not entirely accurate because infants can form memories. Babies learn in the womb, and that requires forming a memory in the neural networks of the brain. Hundreds of experiments conducted on young infants over the past thirty years show that they possess memories that can be surprisingly enduring. For example, three-month-olds who learn to kick their legs to activate a mobile that is tied to their foot by a ribbon will remember that experience one month later.15 If you bring them back into the lab, they start kicking much faster than infants who never had the initial training. So it can’t be true that young infants do not have any memories. Whatever memories they may possess, however, do not become part of the self story that most of us rehearse and recall when we are much older and asked to reminisce.

Rather, the question is what kind of memories infants form. One possibility is that they only have memory for events when you place them back in the same situation, which is why they can learn and remember things they have encountered before. For example, memory researchers contacted a dozen people who had taken part in a memory test in which they saw fragments of pictures presented for one to three seconds, to see if they could still remember them. Even though they had seen the pictures only briefly, the participants could recognize them. Not too amazing, you might say, until you discover that this follow-up test took place in 1999, seventeen years after the original study! The participants, who had been students back in 1982 at the University of Minnesota, were now adults with busy lives, and some could not even remember taking part in the original study. Yet stored somewhere in their memory networks were traces of the original experience, because they identified pictures that they could not remember having been shown.16

Even Clive Wearing seems to have this ability to learn, but he can’t remember that he has learned. It’s like unconscious knowledge. Both Clive and young babies may lack the ability to consciously recall or reflect upon previous experiences. In contrast, most of us can recall what we had for breakfast yesterday by actively reconstructing the events in our minds. That requires a different kind of memory that psychologists call ‘episodic’ – one that reflects the actual experience of remembering the episodes that punctuate our lives.17 Memories of these episodes are crucial for constructing the self story, and those which are particularly personal are known as autobiographical memories – events that we can recall in which we are the main player.18 One might be tempted to assume that our autobiographical memories are accurate recollections but, just like any memory, they are not like photographs or recordings. One of the greatest discoveries in psychology is that human memories are reconstructed and malleable. We do not have a recording of our own personal experiences in our head like some video archive. There are no microfilms in our memory banks.

Memories are constantly active – like a story being retold over and over again. Moreover, when we encounter related new experiences, we interpret them in terms of our existing memories, which in turn are transformed by the new experiences. We are constantly integrating the here and now into our past. Consider the following powerful demonstration. Read the following list of fifteen words and try to remember them as best you can. Take a couple of seconds on each word to memorize it well.

thread

pin

eye

sewing

sharp

point

prick

thimble

haystack

thorn

hurt

injection

syringe

cloth

knitting

Now turn to the end of the chapter and answer the questions to see how good your memory is. Most people fail this test19 and yet they are pretty sure that they got the right answer, which makes the effect all the more dramatic. How can most of us be so convinced and yet so wrong?

The neural networks encountered earlier show how all information is stored as a pattern of activation across networks of neurons. You falsely remember the occurrence of a word that was never presented because it was related in meaning to the other words in the list. In the neural networks that process language and meaning, the pattern representing the word you believe you encountered was triggered as part of the collateral activity of all the other words that were processed and encoded. When one considers that memory is a process of constant neuronal updating, it is remarkable that we remember anything with much clarity at all.
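If it helps to see that mechanism in miniature, here is a purely illustrative sketch in Python – a toy spreading-activation model, not the actual experiment or a model of real neurons, with association strengths, spreading fraction and familiarity threshold all invented for the example. Each studied word lays down its own trace and leaks a little activation to a closely related word that never appears in the list, which then feels ‘old’ at test:

# Toy spreading-activation sketch (illustrative values only). "LURE" stands in
# for the related word that never appears in the list (not named here, so as
# not to spoil the test at the end of the chapter).
studied = ["thread", "pin", "eye", "sewing", "sharp", "point", "prick",
           "thimble", "haystack", "thorn", "hurt", "injection", "syringe",
           "cloth", "knitting"]

strength_to_lure = {word: 0.6 for word in studied}  # invented associative strengths

activation = {word: 0.0 for word in studied + ["LURE", "button"]}
for word in studied:
    activation[word] += 1.0                              # direct memory trace
    activation["LURE"] += 0.3 * strength_to_lure[word]   # collateral spread to the lure

THRESHOLD = 1.0  # familiarity needed to judge a test word as 'old'
for probe in ["thread", "LURE", "button"]:
    print(probe, round(activation[probe], 2), activation[probe] >= THRESHOLD)

# 'thread' and the lure both cross the threshold; 'button' does not - yet the
# lure was never presented: a false memory in miniature.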

In 1932, the British psychologist Sir Frederic Bartlett, one of the few psychologists ever to be knighted, demonstrated that memories are not exact copies of past events but rather are reconstructed – like stories.20 As in the game of Chinese Whispers, every time the story is told and retold, it changes. In fact, completely false memories can be constructed simply by asking leading questions. In some of the most influential experiments in human psychology, Elizabeth Loftus demonstrated that if you show adults a video of a car accident and then ask them leading questions such as, ‘Did the white car jump the red light?’, they correctly deny that there was a white car in the sequence.21 However, if several weeks pass and the adults are asked to recall the video, they are more likely to report seeing a white car jump the red light even though it was never in the video. The mere mention of a white car during the initial questioning has become incorporated into their memory. The neural networks that encoded the memory have become contaminated by the activity of networks recruited to scrutinize the memory for the presence of white cars and red lights. Likewise, when children are told they were once lost in a shopping mall, they can give vivid recollections of the event even though it never actually happened.22

Confabulations of memory are not restricted to the young and naive. Piaget used to describe an attempt to abduct him as a young child.23 Years later, he had vivid memories of how his nanny fought off the would-be abductors. However, eventually racked by guilt, the nanny confessed that she had made the whole abduction story up so that Piaget’s parents would be indebted to her. Half the adults shown a doctored photograph of themselves as children taking a hot-air balloon ride recall the fictitious event and can describe it in detail.24 Even Elizabeth Loftus, the world’s greatest authority on false memories, is not immune to them.25 When she was only fourteen years old, Loftus’s mother drowned in a swimming pool. Thirty years later at a birthday party, Loftus’s uncle reminded her that she had found her mother’s body. Over the next couple of days, lucid memories of that terrible moment came flooding back to haunt Loftus – except that these memories were false. Her uncle had made a mistake. It was Loftus’s aunt, not Loftus, who had discovered the body. Later, Loftus said, ‘The most horrifying idea is that what we believe with all our hearts is not necessarily the truth.’

Memory as a Compost Heap

We all know that we forget things, but to discover that a recollection is completely fabricated is something else. It is shocking because it makes us question our own minds. If we can all vividly remember events that never happened, then this undermines the reliability of memory and ultimately the reality of our self. This is because part of the self illusion is that we know our own minds and recognize our own memories. But we are often mistaken. The reason we find false memories so shocking is that most people do not understand how memory works. Psychologists Dan Simons and Chris Chabris recently surveyed 1,500 US adults and discovered fundamental misunderstandings held by the general public.26 About two out of three adults (63 per cent) thought that memory works like a video camera, recording experiences that can be played back later. Half of the respondents believed that once a memory is formed it is unchanged and reliable. These misconceptions have led to comparisons with other ways of storing information that evoke some notion of a permanent store. A common metaphor is to liken human memory to a vast library storing volumes of information, which is wrong. Human memory is neither like a computer hard drive nor a pre-industrial blank slate upon which experience writes a trace.

If any metaphor is going to capture memory, then it is more like a compost heap in a constant state of reorganization.27 Just like the garden refuse that you put on the compost heap, experiences are laid down with the most recent still retaining much detail and structure, but with time they eventually break down and become mixed in and integrated with the rest of our experiences. Some events stand out and take a long time to decompose, but these are the rare instances. The remainder becomes a mush. When it comes to memory, Dan Simons reminds us that, ‘People tend to place greater faith in the accuracy, completeness and vividness of their memories than they probably should.’28

Our self illusion is so interwoven with personal memories that when we recall an event, we believe we are retrieving a reliable episode from our history like opening a photograph album and examining a snapshot in time. If we then discover the episode never really happened, then our whole self is called into question. But that’s only because we are so committed to the illusion that our self is a reliable story in the first place.

Not Total Recall

In Hollywood’s adaptation of Philip K. Dick’s brilliant story, ‘We Can Remember It for You Wholesale’,29 Arnold Schwarzenegger plays the role of Douglas Quaid, a freedom fighter on a Mars colony who has had the false memories of a construction worker on Earth implanted into his brain. The movie adaptation, Total Recall,30 is a roller-coaster ride with a plot full of twists and turns. What makes it relevant to the discussion of self is that the identity of Quaid changes as the content of his memory is altered. This is why Elizabeth Loftus was so appalled to discover that she held false memories. It means that we are not necessarily who we think we are. Our identity is the sum of our memories, but it turns out that memories are fluid, modified by context and sometimes simply confabulated. This means we cannot trust them, and our sense of self is compromised. Note how this leaves us with a glaring paradox – without a sense of self, memories have no meaning, and yet the self is a product of our memories.

This may be why there is no memory of the infantile self. As infants we did not have the capacity to integrate our experiences into meaningful stories. We did not have world knowledge. Most important, we did not have an idea of who we were. We did not have an initial sense of self with which to integrate our experiences. That requires world experience and, in particular, learning from those around us who spend so much time in our company. Somewhere around two years, children start to have conversations with parents about past events. Kids whose parents spent a lot of time talking to them and discussing past events between their second and fourth birthdays had much better memories about their lives when they were twelve to thirteen years old. It is not simply language, but the way parents discuss events with their children. By scaffolding their children’s early experiences, parents enable kids to organize those experiences into a meaningful story. This is because it is easier to remember stories in which we are a main character. The adults had the experience and the context to organize the events into a coherent account that made sense to the child, which led to better encoding and storage in their brains.31 One thing we know from decades of psychological research is that meaning and context improve memory.

This is why memory researcher Mark Howe argues that babies who fail the Gallup mirror test lack a sense of self, and so their memories are disconnected events – impressions that do not seem to make sense in any meaningful way.32 In order for memories to possess meaning, they have to be embedded within the self. However, Philippe Rochat, who has made a lifetime study of self-development, argues that the mirror test in humans is actually a measure of being self-conscious about how one looks to others.33 He reasons that, at eighteen months, infants are not bothered by what they look like to others and so are not particularly concerned if they have a red smudge on their nose. Somewhere around the second year, children become more concerned with their appearance and how they look to others.

This self-conscious account would explain the surprising finding that mirror self-recognition with the rouge test is not universal. In one study of Kenyan children between two and seven years of age, Rochat found that only two out of 104 children removed a sticker that had been surreptitiously placed on their forehead when they looked in a mirror. Why? It cannot be that they do not recognize themselves in a mirror. They have seen and groomed themselves plenty of times in front of one. Rather, Rochat argues that, unlike their American counterparts, Kenyan children are not sure what to do in this unusual situation. They don’t know whether they should remove a sticker from their forehead that must have been placed there by the strange Western scientist visiting the village.

This is a fascinating twist on Gallup’s self-recognition interpretation. It may be that passing the mirror test is not so much a measure of self-recognition as a measure of embarrassment in the context of others. The mirror test reveals the point at which you become more concerned about what others must be thinking about you. However, before you can be self-conscious, you must first appreciate that others are thinking about you. You need a concept of what you are in order to compare that self with the expectations of others. And before you can have that concern, you need to understand what is on their minds.

Theory of Mind

If we are worried about what others think of us, then it stands to reason that we need to understand what’s going on in other people’s minds. We need to figure out what they are thinking and, for that, we need to develop a ‘theory of mind’. This term was originally coined by David Premack, who wanted to know if chimpanzees understood that others had thoughts and what those thoughts might be.34

Most of us assume that people do things because they want to. In other words, they have thoughts about goals and intentions that motivate their actions. Again, this is something that is so familiar that we take it for granted when it comes to humans, but there is good evidence that this capacity takes time to develop and may not be shared with all members of the animal kingdom.

Animals can pay attention to humans and their actions, but it is not clear that they understand that others possess minds that support those actions. Animals do not engage with their human keepers in social behaviours such as imitation and copying. And yet we are inclined to attribute sophisticated mental states to animals. Do you remember Binti, the female gorilla who saved the little boy who fell into her enclosure at a zoo near Chicago back in 1996? We watched in amazement as this wild animal gently picked up the limp body of the three-year-old boy and carried him to the door where the paramedics could attend to him. The world’s press was quick to attribute empathy and care to Binti, but what they did not know was that she had been trained by her keepers to bring a doll to them in anticipation of her possible pregnancy.35

Even our closest primate cousin, the chimpanzee, can be a distant relative when observed in the wild. Jane Goodall, the famous primatologist, observed a chimpanzee named Passion who repeatedly kidnapped the babies of other mothers and, with the help of her own children, consumed them. Despite our inclination to anthropomorphism – the attribution of human qualities to non-humans – we are unique as a species in our capacity to represent the complex mental states of others, the bread and butter of our daily social interactions.

That capacity starts early. There is ample evidence that human infants are pre-adapted by evolution to seek out other humans and engage with them.36 For example, babies pay attention to what others are looking at so they understand the link between gaze and actions – people tend to want what they look at. If an adult stares longer at one of two different toys, but then picks up the toy they were not looking at, babies are surprised.37 Where we look reveals the focus of our interest and desires, and this is something the baby understands intuitively.

Expressions are also a good indicator of what someone else is thinking. When an eighteen-month-old infant is offered broccoli or crackers, they usually choose the salty biscuits. Crackers are much tastier than broccoli to a baby. However, if the infant watches an adult wrinkle their nose at the sight of the crackers but make a smiley ‘num num’ face at the vegetable, then, when that adult asks the baby to pass them something to eat, the baby knows to offer the broccoli.38 The baby can figure out what the adult likes.

But none of this people-watching really requires a theory of mind. Likes and dislikes can be easily worked out by simply watching whether people smile or frown. We do this all the time, looking for external markers of behaviour that reveal preferences. Even animals can do this.39 As many pet owners can attest, animals learn when their masters are angry or pleased with them, but this does not require understanding what is on their master’s mind. Rather, to prove that we can understand what is really on someone else’s mind, we have to appreciate when they hold a mistaken or false belief.40 A belief is simply an idea that we think is true; but sometimes we may be mistaken. If you can understand that someone holds a false belief, then you can imagine what they are thinking even when what they are thinking is factually wrong. That’s a powerful level of insight into someone else’s worldview. For example, if you show me a confectionery box and ask me what is inside, then I am likely to say sweets or candy, depending on which continent I am on. However, if you open it up and reveal that it actually contains pencils, then I will realize I was understandably mistaken. My belief was false. Three-year-olds will also make the same mistake.41 After all, they don’t have X-ray vision. But if you now ask me to imagine what my neighbour would reply if he were asked what is inside the box, I know that he too will make the same mistake as I initially did. I can understand that he will not know what is actually in the box. In contrast, a three-year-old will assume that someone else who comes along will know that there are pencils in the box and answer, ‘Pencils’. They don’t appreciate that others will also come to the wrong conclusion about what’s in the box and can hold the same false belief. By four years of age, most children understand that people will answer, ‘Candy’, when asked what’s in the box.

Psychologists think that young children initially lack a theory of mind when it comes to understanding mistaken beliefs.42 It’s as if they cannot take another’s perspective. In one classic experiment, children see a doll called ‘Sally’ hide her marble in a cupboard before she goes out. While she is out, another doll, ‘Anne’, comes in, takes Sally’s marble and hides it in the kitchen drawer. The critical question is where Sally thinks her marble is. When children watch this scenario, three-year-olds think that, on her return, Sally will look in the kitchen drawer for her marble, whereas four-year-olds say that she will look in the cupboard. Clearly, once you understand that people can hold false beliefs, you can lie to them to make them think something that isn’t true. When you consider how much social manipulation involves deceiving others, you can understand why having a theory of mind is such a valuable tool. You can outwit others by leading them to false assumptions.

An underdeveloped theory of mind in children also explains why they can make such bad liars. Initially, when a child realizes that punishment is imminent – ‘Did you eat the cake?’ – they simply say no, despite the fact that they have chocolate cake smeared across their face. Only later do children get more sophisticated in generating plausible stories for why they might have the tell-tale chocolate on their face – invariably they blame someone else.

Theory of mind is really a form of mental perspective-taking – understanding things from another’s point of view – a ‘he thinks that she thinks’ sort of thing. In order to do this, you have to be able to keep track of what developmental psychologist Alison Gopnik43 calls ‘counterfactuals – the woulda-coulda-shoulda’s of life’. Counterfactuals are what enable you to imagine different scenarios, including what people may do in the future, based on what you know now. It’s how we second-guess others and, to do that, we have to possess the mental machinery to generate different possible outcomes and play them out in our heads. This happens mostly in situations of social competition where you have to anticipate what others will do next, which may explain why theory of mind emerges earlier in children who have siblings.44 The constant battle to keep their place in the pecking order means that children have to learn to outwit their brothers and sisters.

Mindblindness

Not everyone develops a theory of mind. In his book, The Empathic Brain, neuroscientist Christian Keysers45 describes his encounter with a young graduate student, Jerome, who was finishing his PhD in theoretical physics. His colleague Bruno Wicker introduced Jerome who, on entering the room, spoke with a flat voice and never looked Christian in the eye.

Bruno: ‘We would like to ask you something.’ (Bruno shows Jerome a box of Danish cookies.) ‘What do you think is in this box?’

Jerome: ‘Cookies.’

Bruno then opens the box to reveal a set of coloured pencils instead of the expected cookies.

(His female research assistant then enters the room.)

Bruno: ‘What do you think she would think the box contains?’

Jerome: ‘Coloured pencils.’

Here is a man with the mental capacity to think about abstract properties of the universe that would baffle most of us and yet he cannot imagine what someone else might think is inside a cookie box. Jerome has autism – a developmental disorder that affects around one in 500 individuals,46 though this figure appears to be on the increase and depends largely on how you define it. In general, autism can be thought of as a disorder with three major disabilities: a profound lack of social skills, poor communication and repetitive behaviours. It is regarded as a spectrum disorder because individuals vary in the extent to which they are affected. Most are intellectually challenged, some are within the normal range, and a few may have rare abilities such as being able to tell you what day of the week any date in history falls upon. But all individuals with autism spectrum disorder have problems with social interactions.

These individuals have a problem with social interaction because they lack the repertoire of developmental social skills that enable humans to become expert mind-readers. Over the course of early childhood, typical children become increasingly sophisticated at understanding other people because of their developing theory of mind. By the time they are around four years of age, an average child sees other people as goal-directed and purposeful, and as having preferences, desires, beliefs and even misconceptions.

Not only do typical children become intuitive mind-readers, but they also become counsellors. They begin to understand others’ sadness, joy, disappointment and jealousy as the emotional correlates of the behaviours that make humans do the things they do. Again, by four years of age, children have become expert at working the social arena. They will copy, imitate, mimic and generally empathize with others, thereby signalling that they too are part of the social circles that we all must join in order to become members of the tribe. They share the same socially contagious behaviours of crying, yawning, smiling, laughing and showing disgust.

However, individuals with autism lack this repertoire of social skills.47 They are effectively ‘mindblind’.48 Alison Gopnik captured this notion of mindblindness in her terrifying vision of what it must be like to be at a dinner party if you have autism:

Around me bags of skin are draped over chairs, and stuffed into pieces of cloth, they shift and protrude in unexpected ways . . . Two dark spots near the top of them swivel restlessly back and forth. A hole beneath the spots fills with food and from it comes a stream of noises. Imagine that the noisy skin-bags suddenly moved towards you, and their noises grew loud, and you had no idea why, no way of explaining them or predicting what they would do next.49

No wonder individuals with autism find direct social interaction frightening. If you can’t figure out other people, social encounters must be intensely baffling. They cannot easily infer what others are thinking and generally withdraw into activities that do not involve people. Maybe this is why many individuals with autism do not like direct eye contact, do not copy, do not mimic, do not yawn, retch or laugh contagiously, and do not join in with the rich tapestry of social signals we share as a species.50

Temple Grandin provides remarkable insight into what it is like to live with autism.51 She has a PhD and is one of the world’s authorities on animal husbandry, but she is also a highly intelligent or ‘high-functioning’ individual with autism, able to provide a window into what it is like to be mindblind. Temple was diagnosed with autism in early childhood. She went to progressive schools and eventually college, but always had difficulty interacting with other people. She could not understand or predict their behaviours and so turned her attention towards animals, which seemed less complex. She could get inside the minds of animals better than she could humans, and eventually went on to study animal welfare and develop techniques to soothe and calm cattle before slaughter. Humans, on the other hand, were unpredictable. Temple taught herself to study people – to pay close attention to their routines and behaviours. In this way she was able to predict what they would do in familiar situations so that she could behave appropriately. She described her experience of predicting other people’s behaviours to Oliver Sacks as being like ‘an anthropologist on Mars’, a phrase which would go on to become the title of one of Sacks’ bestsellers.52

Although there is no definitive neurological test for Temple’s condition, autism must be some form of brain disorder. The incidence of autism is higher in identical than in non-identical twins, which suggests that there is a genetic component to the disorder.53 Autism is also, on average, four times more likely in boys than in girls, which again strongly implicates a biological basis. To date there is tantalizing evidence based on brain-imaging studies that regions in the front part of the brain – most notably the fronto-insular cortex (FIC) and the anterior cingulate cortex (ACC), which are activated by social interaction in typical individuals – operate differently in individuals with autistic spectrum disorder.54 The ACC is like an ‘alarm centre’ that monitors goals and conflicts, including social interactions. If these interactions do not go according to plan, if people start to get the wrong idea about us, we get anxious. These regions are part of the mirror neuron system that activates when we imitate others voluntarily or when our experiences are hijacked by watching others.

So far, the brain-imaging studies of the mirror system in individuals with autism are inconclusive and, according to Christian Keysers, indicate that the system is not broken but may be very delayed in development because such individuals are not attending to the relevant information during normal social encounters.55

Others have targeted specific types of neurons. Neuroscientist John Allman at the California Institute of Technology has proposed that the social deficit in autism may be due to a lack of a special class of spindle neurons called Von Economo neurons (VENs), named after their discoverer, who identified them in 1925.56 VENs are cortical neurons with highly connective fibres that are thought to branch out to the different brain regions activated by social learning. This may explain why VENs have only been found in species that are particularly social, including all the great apes, elephants, whales and dolphins.

Humans have the largest population of VENs, found only in the FIC and ACC areas – the same regions that may be disrupted in autism. VENs are thought to work by keeping track of social experiences – a strategy that would facilitate a rapid appreciation of similar social situations in the future. They form the neural networks that provide the basis of intuitive social learning when we watch and copy others. VENs may help to create and sculpt the self from copying and reading others.

One intriguing discovery is that the density of VENs in these social regions increases from infancy to reach adult levels somewhere around the fourth birthday in typical children – a time when most child development experts agree that there is a noticeable change in social interaction skills and an emerging sense of identity. This may also explain why autistic individuals, who have disrupted VEN regions, have difficulty working out what the rest of us simply know without having to think very much. I recently discussed this with a good friend who is the mother of a high-functioning daughter with autism. Her daughter compensated for her condition by asking those around her to write down a description of who they were and their life stories as a way of understanding them. This was because she was unable to integrate information and backgrounds spontaneously to generate narratives that describe others. Without this capacity to read others and integrate socially, someone with severe autism is going to have a very different sense of self that does not include those around them. I can only speculate, as I do not have autism, but I would imagine that individuals with severe autism inhabit a solitary world, very much in isolation from others.

The Agony of Adolescence

Perhaps you remember a party you went to when you were fifteen, and everyone stopped talking and stared at you when you walked into the room. Or maybe there was a time when the teacher made you stand up in class and everyone was looking at you. Do you remember feeling your face flush bright red and your palms sweating? It was so embarrassing. You felt so self-conscious.

Most of us have had some embarrassing event in our lives that at the time was the worst possible thing we could imagine. We felt we could have died and wished the ground would open up and swallow us. Being embarrassed and becoming self-conscious are key components of the looking glass self. If we did not care about what others think, then we would not be embarrassed. Initially, young children are so egocentric, and so much the apple of their parents’ eye, that it is not clear that others would ever be of concern to them. However, with a developing sense of self, the child increasingly starts to care about what others think, aided by their emerging theory of mind, which enables them to take another person’s perspective. This self-conscious awareness can provide the basis for a moral compass. Even our own reflection can make us acutely aware that we are potentially the focus of other people’s attention. For example, in one classic Canadian study of social transgression,57 children on Halloween night were secretly observed after being told to take only one piece of candy from a bowl while the owner went into another room. If a mirror was placed so that it reflected a child as they approached the bowl, the children became self-conscious and did as they were told. However, in households where there was no mirror, children took more than one candy. There was no mirror to remind them that they could be seen.

By the time children hit their early teens, they are especially sensitive to the judgement of others. In fact, they often think that there is an imaginary audience evaluating them.58 How often do we see children (and quite often adults who think they are unobserved) receiving adulation from an imaginary audience that has just watched them perform some amazing feat or talent? But this imaginary audience is also the agony of adolescence. By the time they reach their teens, adolescents believe that others are constantly judging them even when this is not the case. They think they are the centre of attention and are hypersensitive to criticism.

Neuroscientist Sarah-Jayne Blakemore has used brain-imaging techniques to investigate what is going on inside adolescent heads.59 She found that regions normally triggered by thoughts about one’s self are more active during the adolescent years than in young adulthood. In particular, the prefrontal cortex (PFC) is activated when adolescents are asked to reflect upon just about any task that forces them to consider things from their own perspective. Whether it’s thinking about self-reflected actions such as reading,60 making intentional plans,61 or simply reflecting on a socially painful memory,62 the adolescent PFC is hyperactive.

The kids simply feel that, as my teenage daughter says, ‘Everyone is getting at me.’ What a hyperactive PFC actually means is not clear, but it does support the idea that this region is specialized for ‘mentalizing’ about others, and much of that mental effort during adolescence is concerned with what others, especially the peer group, think. No wonder adolescents are susceptible to peer pressure, which explains why they are more likely to get into trouble and engage in risky behaviour in order to establish their own identity and position in the pecking order.63 And who are the worst offenders for risky behaviour? Boys, of course. But what are little boys made of? Is it all biology, or does society shape the behaviour of little boys more than we have previously thought?

Boys Will Be Boys

The first thing anyone asks when hearing the news of a birth is invariably, ‘Is it a boy or a girl?’ So when Toronto couple Kathy Witterick, thirty-eight, and David Stocker, thirty-nine, announced the birth of Storm in 2011 but told friends and family that they were not disclosing the sex of their third child, their announcement was met with stony silence. They explained that they did not want their child to be labelled; rather, they wanted Storm to be free to develop its own identity. The problem was that no one knew how to treat the New Year’s Day baby. Four months later, news of the ‘genderless’ Storm broke, creating a media storm with a flood of criticism and ridicule of the parents.64 But Kathy and David have a point. Our identity based on whether we are a boy or a girl is greatly influenced by those around us.

We are so preoccupied with the question of sex because it is a core component of how people define themselves, how they should behave and how others should behave towards them. It is one of the first important distinctions we make growing up as children and, without knowing which sex someone is, we are at a loss to know how to interact with them. Being a boy or a girl is a sexual difference defined in terms of the chromosomes we inherit from our parents. Normally, we inherit twenty-three chromosomes from each parent, giving us twenty-three pairs. One of these pairs is known as the sex chromosomes (X and Y) and the other twenty-two pairs are known as the autosomes. Human females have two X chromosomes, and males have one X and one Y chromosome. In the absence of a Y chromosome, we develop into little girls with two X chromosomes.

Gender, on the other hand, is not simply biological but rather relates to the psychological profile of the individual. Gender is not genetic but is shaped by the group consensus. It is what it means to think and behave in masculine or feminine ways. By three years of age, boys prefer the company of other boys and girls prefer other little girls,65 and by five years, children are already ‘gender detectives’ with a rich set of rules about what is appropriate for boys and girls to do.66 Some gender stereotypes are universal, such as the expectation that mothers should be responsible for childcare and preparing food.67 However, such stereotypes have shifted in recent years as men and women are increasingly able to play a greater role in what were traditionally considered separate activities. This is why ‘gender-benders’ such as Boy George or Marlene Dietrich arouse passions: they challenge stereotypes.

Although not cast in stone, gender stereotypes do tend to be perpetuated across generations. This is what Storm’s parents were trying to avoid. Many parents are eager to know the sex of their children before they are born, which sets up gender expectations such as painting the nursery in either pink or blue.68 When they eventually arrive, newborn baby girls are described mainly in terms of beauty, whereas boys are described in terms of strength. In one study, adults attributed more anger to a boy than to a girl reacting to a jack-in-the-box toy, even though it was always the same infant.69 Parents also tend to buy gender-appropriate toys, with dolls for girls and guns for boys.70 In another study, different adults were introduced to the same child wearing either blue or pink clothes and told that it was either Sarah or Nathan. If adults thought it was a baby girl, they praised her beauty. If they thought it was a boy, they never commented on beauty but rather talked about what occupation he would eventually have. When it came to play, they were boisterous with the boy baby, throwing him into the air, but cuddled the baby when they thought it was a girl. In fact, the adults seemed to need to know which sex the baby was in order to play with it appropriately.71 Of course, it was the same baby, so the only difference was whether it was wearing blue or pink. It is worth bearing in mind that the association of blue with boys is only recent – a hundred years ago it would have been the boys wearing pink and the girls wearing blue.72

With all this encouragement from adults during the early months, is it any surprise that, by two years of age, most children easily identify with their own gender and the roles and appearances that they believe are appropriate? However, this understanding is still very superficial. For example, up until four years of age, children think that long hair and dresses determine whether you are a boy or a girl. We know this because if you show four-year-olds a Ken doll and then put a dress on the male doll, they think that he is now a girl. By six years, children’s gender understanding is more sophisticated and goes over and above outward appearances. They know that changing clothes and hair does not change boys into girls or vice versa. They are already demonstrating an understanding of what it means to be essentially a boy or a girl. When they identify gender as a core component of the self, they will tend to see it as unchanging and foundational to who they and others are.73

As children develop, they become more fixed in their outlook about what properties are acquired and what seem to be built in. For example, by six years, children think that men make better mechanics and women are better secretaries. Even the way parents talk to their children reinforces this generalized view of what is essential to gender.74 For example, parents tend to make statements such as ‘Boys play soccer’ and ‘Girls take ballet’ rather than qualifying the statements with ‘Some boys play soccer’ or ‘Some girls take ballet’. We can’t help but fall into the gender trap. Our interaction with children reinforces these gender divisions. Mothers tend to discuss emotional problems with their daughters more than with their sons.75 On a visit to a science museum, parents were three times more likely to explain the exhibits to the boys than to the girls.76

And it’s not just the parents. Teachers perpetuate gender stereotypes. In mixed classes, boys are more likely to volunteer answers, receive more attention from teachers and earn more praise. By the time they are eight to ten years old, girls report lower self-esteem than boys, but it’s not because they are less able.77 According to 2007 data from the UK Office for National Statistics, girls outperform boys at all levels of education from preschool right through to university. There may be some often-reported superior abilities in boys when it comes to mathematics, but that difference does not appear until adolescence, by which time there has been ample opportunity to strengthen stereotypes.78 Male brains are different from female brains in many ways that we don’t yet understand (for example, the shape of the bundle of fibres connecting the two hemispheres, known as the corpus callosum, differs), but commentators may have overstated the case for biology when it comes to some gender stereotypes about the way children should think and behave that are perpetuated by society.79

Stereotypes both support and undermine the self illusion. On the one hand, most of us conform to stereotypes because that is what is expected of those in the categories to which we belong, and not many of us want to be isolated. On the other hand, we may acknowledge the existence of stereotypes but maintain that, as individuals, we are not the same as everyone else. Our self illusion assumes that we could act differently if we wished. Then there are those who maintain that they do not conform to any stereotypes because they are individuals. But who is really individual in a species that requires the presence of others against which to judge whether they are the same or different? By definition, you need others to conform with, or rebel against. For example, consider tattoos as a mark of individuality – an individuality that is increasingly mainstream, as evidenced by the rising popularity of getting inked! Even those who go to the extremes of self-mutilation are inadvertently using others to calibrate the extent of their individuality. The self illusion is a mighty tricky perspective to avoid.

The Supermale Myth of Aggression

Consider another universal self stereotype – that of male aggression. Why do men fight so much? Is it simply in their nature? It’s an area of psychology that has generated a multitude of explanations. Typical accounts are that males need physically to compete for dominance so that they attract the best females with whom to mate, or that males lack the same negotiation skills as women and have to resolve conflicts through action. These notions have been popularized by the ‘women are from Venus, men are from Mars’ mentality. It is true that men have higher levels of testosterone and this can facilitate aggressive behaviour because this hormone makes you stronger. But these may be predispositions that cultures shape. When we consider the nature of our self from the gender perspective, we are invariably viewing this through a lens, shaped by society, of what males and females should be.

Males may end up more aggressive but, surprisingly, they may not start out that way. Studies have shown equal levels of physical aggression in one-year-old males and females, but by the time they are two years of age, boys are more physically aggressive than girls, and this difference generally continues throughout development.80 In contrast, girls come to rely less on physical violence during conflicts and are more inclined to taunt and exclude individuals as a way of exerting their influence during bullying.81 Males and females may simply differ in the ways in which they express their aggression.

It seems unquestionable that male biology makes men more physically aggressive, which has led to the ‘supermale’ myth. Some males inherit an extra Y chromosome (XYY), which makes them taller, leaner and more prone to acne than other males. About fifty years ago, it was claimed that these supermales are more aggressive, following reports of a higher incidence of XYY males in Scottish prisons during the 1960s.82 The belief was further substantiated in the public’s mind by the notorious case of Richard Speck, an American mass murderer who tortured, raped and murdered eight student nurses from South Chicago Community Hospital in one night of terror on 14 July 1966. Speck, who had a history of violence, broke into the nurses’ home and held the women hostage. He led them out of the room, one by one, to be strangled or stabbed to death. At his hearing, the defence lawyers claimed diminished responsibility on the grounds that Speck had the XYY supermale genotype. It later transpired that Richard Speck’s defence lawyer knew that Speck did not have an XYY genotype but perpetuated the myth in order to protect his client.

Even if Speck had possessed the XYY genotype, many of the claims for the link with violence have not stood up to scrutiny. Early studies were poorly conducted using very small samples and, amazingly, if a criminal had acne this was sometimes taken as sufficient evidence that he possessed the XYY genotype, in the absence of any genetic analysis.83 Speck was tall and had acne. Today the XYY myth persists, with many experts still disagreeing about a possible link between the genotype and violence. One extensive Danish study84 concluded that the prevalence of XYY was about one in 1,000 males and that the only reliable characteristic was above-average height. This physical difference may have contributed to their exhibiting behaviour that is considered more aggressive than normal. It may also explain why nearly half of XYY males are arrested, compared with an average of one in ten XY males. Overall, it would appear that XYY males do have behavioural problems, especially during adolescence, which may be compounded by their unusual height. They also tend to have lower IQs and more impulsive behaviour, which could contribute to the higher incidence of criminality, but their crimes are not typically ones of violence against others but rather property crimes such as shoplifting.

What makes the supermale myth worth considering in the context of gender stereotyping is that such biological beliefs can have unfortunate consequences. During the 1970s and 1980s, many parents took the decision to abort male foetuses diagnosed with the extra Y chromosome during prenatal examinations because of the supermale myth. The truth is that most males with XYY do not know that they have an extra Y chromosome because most are generally indistinguishable from other XY males.

Even if the XYY genotype were associated with aggression, in all likelihood the environment still plays an important triggering role. In other words, it is a predisposition that requires certain environmental conditions. For example, another gene abnormality linked to aggression affects the production of an enzyme (MAOA) that influences serotonin and dopamine neurotransmitter activity. This has been nicknamed the 'warrior' gene because it disrupts signalling in the PFC, and this disruption has been linked with impulsivity and increased violence. In 2009, Bradley Waldroup escaped the death penalty in Tennessee after a murderous rampage, on the grounds that he had the warrior gene. According to his defence, it was his genes that made him do it. The trouble is that around one in three individuals of European descent carry this variant of the gene, yet the murder rate in this population is less than one in a hundred. Why don't the rest of us with the gene go on a bloody rampage?

Researchers followed over 440 New Zealand males from birth to adulthood to look for the biological basis of antisocial behaviour.85 They discovered that over eight out of ten males who had the MAOA gene abnormality went on to develop antisocial behaviours, but only if they had been raised in an environment in which they were maltreated as children. In contrast, only two out of ten males with the same abnormality developed antisocial behaviour as adults if they had been raised with little maltreatment. This explains why not all victims of maltreatment go on to victimize others. It is the environment that appears to play the crucial role in triggering whether these individuals become antisocial.86 This is why it makes no sense to talk about nature and nurture as separate when we consider how individuals develop.

Natural Born Killers

If early abuse turns on the effects of the warrior gene, can these negative attributes also be turned off? Neuroscientist Jim Fallon studies what makes psychopaths tick by looking at their brain activity and genes. One day, as he was sorting through lots of scans of psychopathic murderers, he noted that they all seemed to show low activity in the orbital cortex, a region of the prefrontal cortex. The orbital cortex is related to social behaviours such as smiling, and is also a region associated with moral decision-making and the control of impulsive antisocial behaviour. People with low activity in this region tend to be free-wheeling types or psychopaths. Perhaps these psychopaths had bad brains?

At the time, Jim was also working on Alzheimer’s disease and needed control data to compare with patients. He persuaded members of his family to have their brains scanned and provide blood samples to match against the clinical sample. Every one of his relatives’ brain scans was normal – except one – his own. Jim discovered that he had the identical lack of activity in the orbital cortex that he had observed in the psychopathic killers. The irony of the neuroscientist discovering that he also had the same abnormal brain pattern as the killers was not lost on Jim.87

About a month later at a family barbecue, he was pointing this irony out to the other family members when his eighty-eight-year-old mother, Jenny, suggested that maybe he should do a little research into the family history, as he might be surprised. What Jim discovered was truly shocking. It turned out that his ancestor, Thomas Cornell, was infamous in American history as the killer of his own mother in 1667, the first documented case of matricide. But it didn't stop there. There were another seven murderers in the family line from which Jim was directly descended! This was worrying. Jim looked for other evidence. Did he have the genes associated with aggression and violence? He had the blood taken for the Alzheimer's study analysed. Jim's blood was positive for the warrior gene and he had all the genetic risk factors that could predispose him to become a killer. At the time, geneticists likened the odds of Jim possessing this constellation of genes to walking into a casino and throwing double-six fifteen times in a row.
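To get a rough sense of how long those odds are – a back-of-the-envelope calculation, assuming fair dice (so any single throw shows double-six with probability 1/36) and independent throws – the probability works out as (1/36)^15, or roughly 4.5 × 10^-24: a chance so remote that it would effectively never happen by luck alone.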

According to the biology, Jim should have been a natural born killer and a menace to society, but he wasn't. Why not? Dr Jim Fallon used to be the type of scientist who took a fairly genetic-determinist line, believing that your genes pretty much determine your outcome, but his discoveries in brain imaging and genetics forced him to rethink his own rigid view of human nature. He had to accept that, in his case, the environment had protected him and, in particular, that the nurturing from his own parents had played a major part in the way he turned out. From the very start, Jim was a precious child for his parents: his mother had had four miscarriages in a row before he was finally born, it would be a long time before she had any more children, and so a great deal of attention and affection was directed towards him. He believes all this nurturing offset the warrior gene that could have sent him off on a path of destruction.

Jim has avoided a life of crime and violence but recognizes that he still has many of the personality attributes associated with low orbital cortex activity, and that his own flaws may be residuals of his genetic predisposition. Rather than harming people, Jim simply does not make a strong emotional connection with others. He does not generally care about other people, even those who are close to him, and he acknowledges that he is close to the edge of being a psychopath.88 I expect that we all know someone like that.

Incubated in Terror

How does someone become a psychopath? Bruce Perry is a psychiatrist who believes that the origins of human violence can be traced to the environment in which we raise our children. If that environment is lacking in appropriate role models and examples of how to behave and treat others, then children fail to develop an appropriate moral dimension to their sense of self. Combine that with the stress of poverty and lack of education necessary to raise one’s self out of these conditions, and you have a recipe for disaster. Perry was called as an expert witness in several high-profile cases – the Columbine High School massacre, the Oklahoma City bombing and the Waco siege. He is a highly acclaimed and respected scientist who argues that human violence is a vicious cycle that begins early in development. To illustrate his case, Perry describes an example of a pointless teenage murder:

A fifteen year old boy sees some shoes he wants. Another child is wearing them – so he pulls out his gun and demands the shoes. The younger child, at gunpoint, takes off his shoes and gives them up. The fifteen year old puts the gun to the child’s head, smiles and pulls the trigger. When he was arrested, the officers are chilled by the apparent lack of remorse. Asked later whether he could turn back the clock and do anything differently, he thinks and replies, ‘I would have cleaned my shoes.’ His bloody shoes led to his arrest.89

Perry thinks such blindness to the plight of others is a form of retardation that results from a lack of appropriate emotional and social interaction as a child. This is an extreme case of Bowlby’s social isolation, in which the child has failed to develop a moral dimension to the sense of self. Like Bowlby, Perry argues that such retardation is a consequence of not exposing the child to appropriate experiences in which negative emotions are triggered but are then resolved. Without this experience, vulnerable children fail to lay down the models of appropriate behaviour during sensitive periods of social development.

According to Perry, this failure is due to the disruption of the development of neural circuitry that regulates behaviour. If you remember back to the organization of the functional structures of the brain, the lower brain systems are the origins for impulsive behaviour, including aggression. Perry argues that regulated behaviour depends on the relative strength of activation arising from the lower, more primitive portions of the brain and the modulating inhibitory action of higher cortical areas. Factors that increase the activity or reactivity of the brain stem, such as chronic stress and abuse, or that decrease the moderating capacity of the limbic or cortical areas, such as isolation and neglect, will increase an individual’s aggression, impulsivity and tendency to be violent. Only by raising children in nurturing environments can we provide the experiences within the right context that enable them to regulate their impulses and drives.

Examples of early violent behaviour are not rare. For instance, there has been a much-reported epidemic of fatal stabbings among teenagers in the United Kingdom over the past couple of years. However, the majority of children raised in impoverished backgrounds are not destined to become remorseless killers. According to Perry, they nevertheless carry the emotional scars. They tend to move through life in a series of destructive relationships, often with a profound sense of disconnection and emotional emptiness. This leads to the associated problems of addiction, crime and social poverty, thus establishing a destructive cycle for the next generation raised in this environment. Life loses its value and effectively becomes cheap, thus providing a fertile ground in which to breed a disregard for others. With over five million child victims of domestic violence in the United States alone and, worldwide, vast numbers of children impoverished by war and famine, Perry makes a convincing case that, despite our advances as a civilization, we are still raising children incubated in terror.

Learning to Take Control Over Your Life

As every parent knows, young children are impulsive. It’s as though they have no way of stopping themselves. They lack self-control. They dash across busy roads, laugh at fat people and shout out in public.

This inability to control thoughts and actions has been one of my research interests for decades, because children have to develop the capacity for self-control in order to become clever and successful adults. Otherwise, we would always be at the mercy of all the different urges and drives that compete for our attention and action. Young children lack adequate ways of stopping their urges, which manifests as impulsive behaviour.

All children go through a phase of impulsivity early in development, but by the time they are ready for preschool they are beginning to demonstrate the capacity to regulate their behaviour. They can withhold doing things in order to achieve a greater goal. In medieval Germany, it was thought that, given the choice between an apple and a coin, the child who could resist the temptation of the fruit and take the coin was ready for schooling. They were in control of their childish impulses and urges.

In my laboratory, we don't offer children apples but we do sometimes offer them marshmallows. In what is now a classic set of studies from the late 1960s, Stanford psychologist Walter Mischel offered four-year-olds a plate with two marshmallows and told them that they could have one now but, if they waited while he left the room, they could have both on his return.90 In our lab, to avoid the various ethical problems of using marshmallows, we use a similar test in which we ask the child to turn their back while we wrap a present that they can have if they wait. They are told not to turn around and peek at the present while we leave the room to fetch some tape to finish the wrapping. From behind a one-way mirror in the adjacent room, we record the child's behaviour and how long they can wait.

Whether it is tempting marshmallows or hidden presents, both of these situations measure what is known as 'delay of gratification' – the amount of time that children can wait before succumbing to temptation – and it turns out to be a very useful predictor of how children perform on other tasks that require self-control. What was most remarkable in Mischel's original studies was that delay of gratification measured at four years of age predicted a child's academic performance and how sociable they were at fourteen.91 When these children were followed up as twenty-seven-year-old adults, those who had exhibited better self-control at four were more successful, more sociable and less likely to have succumbed to drug taking.92

The reason is simple. If you can regulate and control your impulses, then you are more patient at solving tasks, do not get bored so easily and can resist temptation. When it comes to other people, you are less selfish, which makes you more likeable. Very often social interactions produce conflicts of interest between individuals that must somehow be resolved. The abilities needed to coordinate with others depend on self-control, and without it we can become antisocial.

Regulating our self is one of the major roles of the prefrontal cortex, whose regions coordinate competing thoughts and behaviour by inhibiting excitatory commands arising from other areas of the brain. Without the executive control of our frontal lobes, we would be at the mercy of every whim, distraction, impulse, tic or urge that could threaten to sabotage any chance of achieving acceptance by the rest of society or of fulfilling the goals we have set for our future self.

That’s why children with attention deficit hyperactivity disorder (ADHD) are thought to have poor self-control.93 They find it very hard to sit still. They can be disruptive. They cannot concentrate on a task. They are easily distracted. They are more likely to be shunned by other children and find it difficult to make friends. Their hyperactivity and impulsivity can become such a problem that the child becomes uncontrollable. For many decades, such children were labelled naughty and undisciplined. Not surprisingly, children with ADHD perform below their classmates on academic tests, and many require special schooling. Around half of the children diagnosed with ADHD grow out of it by adulthood, but the remainder still experience problems in later life. ADHD emerges in the preschool years and affects around one in twenty children, with about three times as many boys affected as girls.94 Since the disorder was recognized in the 1970s, it has remained controversial. However, twin studies support a strong biological predisposition: if one identical twin has ADHD, then in around three out of every four cases the other twin also has the disorder.

The behaviour of children with ADHD is sometimes described as ‘wired’, as if they are on speed. This is ironic because one of the treatments is to give them Ritalin, a stimulant similar to amphetamine. Such drugs increase the activity of neurotransmitters that operate in the frontal lobes, which is thought to increase inhibition and the capacity to concentrate. This is why many university students who have no medical need for Ritalin nevertheless use it to improve their academic performance: it helps them concentrate. In contrast, alcohol, which is a depressant, reduces the activity of the frontal lobes and our capacity to inhibit drives, which is why people can become hungry, harmful and horny when they are drunk.

However, there may be ways of controlling your self other than drugs. Delay of gratification tasks reveal that children who manage to wait are not just sitting there staring at the marshmallow and using willpower to control their urges. In fact, that would be the wrong thing to do. Rather, successful children use strategies to take their mind off the temptation, very often distracting themselves by singing a song or doing something with their hands. Coming up with alternatives might be the secret to resisting temptation. You can even train children to distract themselves, or tell them to imagine that the marshmallow is only a picture and not real. All of these strategies reduce the attention-grabbing properties of the goal, thereby making restraint more possible. It also means that self-control is something that can be practised, which explains the counterintuitive finding that children raised in very strict households perform worse on delay of gratification. By being too controlling, parents do not allow children to develop their own internalized self-control,95 which might explain many of the stereotypes of individuals who have led sheltered lives running amok when they are no longer under the control of others.

But who is this person who is out of control, if not the juvenile self? Who is distracting whom? Some colleagues argue that the whole notion of self-control seems to demand that there is a self in the first place to lose control. Where is the illusion of self here?

One way to think about it is to imagine the self constructed like a spider’s web, but without the spider. Each strand represents an influence pulling on the overall structure. The self is the resulting pattern of influences pulling together, trying to find a common ground: the thoughts and behaviours that compete for our activity. Some strands are stronger than others and, if they snap, the shape of the web can become distorted. In the same way, our lives are made of different strands holding our self together. The young child without self-control is still constructing his web of influences and has not yet established ways of offsetting the strong impulses that want to take over. The arrangement of strands is self-organizing precisely because the strands compete. There need not be a self at the centre of the web holding it together.

The Essential Self

The self can be thought of as something at the core of someone’s existence. This is sometimes referred to as the essence of who someone is. People often refer to a person’s essential properties – what they are really like. In many ways, the self illusion could become an argument about whether the essential self really exists. This notion of essence is worth considering further.

Imagine that I take your wedding ring, or any other object of sentimental value, and, using some futuristic machine, replace it gradually, atom by atom, until it no longer contains any of the original material yet is indistinguishable from the ring that existed before the process began. Would it still be the same ring at each stage? Most would accept that a ring with a few atoms replaced was essentially the same ring, while a ring with everything replaced was essentially different. But at what stage would the ring change identity, and why would one atom alone make the difference? If the process was gradual, most people would be inclined to say that it was the same ring, maintaining its identity over time even though it contained none of the original material. But now imagine that we recombine all the material from the original ring so that we have two rings. Which is the original? Does the identity of one object suddenly cease to exist when another is reconstructed?

Clearly the identity of material objects is called into question under these circumstances, but what about the identity of persons? Imagine we perform the same sort of replacement using a person. Philosopher Derek Parfit uses these types of scenario to challenge the reality of the self.96 He asks us to imagine replacing a person cell by cell so that the original person no longer contains any of the physical material before the process started. In one example, he asks us to imagine replacing our cells one by one with those from Greta Garbo. At what point do we become the famous Swedish actress? When does our self become her self? Using such logic, Parfit dismisses the notion of an essential self in the first place.

These are compelling thought experiments that challenge our intuitions about the nature of reality and identity. Frustratingly, there are no right or wrong answers to these questions, and the academic exchanges between philosophers highlight the disagreements that such scenarios generate even among those who have pondered them professionally for years. To the man in the street, however, they reveal a common psychological intuition: that there must be some enduring self that exists independently of the physical body – an essential self that defines who we are.

When we think essentially, we believe that an internal property exists that defines the true nature of things, and this way of reasoning emerges somewhere around the third to fourth birthday. In her seminal book, The Essential Child, Susan Gelman97 makes a convincing case that essentialism is a naturally developing way of thinking that children use to chop up the living world into all the different species. When children learn that all dogs are members of the same group, they do so on the basis of assuming that all dogs must have some form of doggy essence inside them that makes them different from cats, which have catty essence. They understand that if you change the outward appearance of a dog so that it now looks like a cat, it is still essentially a dog and will behave like one.

In truth, this distinction could be made at the biological level when it comes to considering the DNA sequences of both species, but few children are ever told about genetics and yet they assume there must be an invisible property that differentiates the animals. Essentialism operates initially in young children’s reasoning about the biological world but eventually becomes part of categorizing the important things in their world in general. This is especially so when they come to see others as unique individuals with unique minds.

My colleague Paul Bloom argues that essentialism is also at the heart of why we value certain objects or experiences: we believe them to have certain essential truths.98 For example, we prefer and admire original works of art until we discover they are forgeries. Fakes are never worth as much as the original, even if you could not tell them apart. Most heterosexuals would enjoy sex with a good-looking member of the opposite sex until they discover that the person is a transsexual. For many heterosexuals, the thought of penetration with a member of the same sex is disgusting, even though they may never be aware of the true biological origins of their partner. Although the physical pleasure could be identical, the discovery that things are not what you believe reveals that our enjoyment depends on an assumption of authenticity. This is because we believe that a deception of identity has taken place. The same can be said for our common-sense notions of the self. The true nature of a person is their essential identity and when they are not true to their self, we call them fakes, cheats and hypocrites, lacking strong core values. All of this language betrays a notion of some internal truth or self that has been violated.

This core self, wandering down the path of development and enduring the things that life throws at it, is, however, the illusion. Like every other aspect of human development, the emergence of the self is epigenetic – an interaction of genes and environment. The self emerges out of the journey through the epigenetic landscape, combining the legacy of our genetic inheritance with the influence of the early environment to produce profound and lasting effects on how we develop socially. These effects, in turn, shape the way we interact with others and raise our own children. Our thoughts and behaviours may seem to originate from within us, but they emerge largely in a social context. In a sense, who we are really comes down to those around us. We may all be born with different biological properties and dispositions, but even these emerge in the context of others and in some cases can be triggered, or turned off, by environmental factors. The extent of these reactions, and how they happen, is what scientists are trying to discover. We may feel that we are the self treading down the path of life, making our own decisions at the various junctions and forks, but that assumes we are free to make our choices. The freedom to make choices, however, is another aspect of the self illusion.

Memory Test

Were the following words present in the list of words you read?

a) needle

b) river
