8

Caught in the Web

It started with a man shouting in the crowd. Why was he shouting? Did someone say ‘Bomb’? No one was certain but no one stuck around to check. Within seconds there was a mass stampede. It was the Second World War remembrance ceremony in Amsterdam in May 2010. One witness reported,

Just after the moment of silence, a very loud shriek could be heard not that far away from where I was standing. Immediately, a huge amount of people started to move away from where the sound had come from. It was scary when this mass of people attempted to run away from the shout.1

This is an example of stampeding – a common phenomenon found among animals that live in groups. It occurs whenever something triggers a sudden movement of a small number that causes the whole to respond en masse. This is the herd mentality, the hive mind. When someone in a crowd starts to run, we instinctively follow. In the same way, we tend automatically to copy and mimic others. This makes a lot of sense. Like meerkats, we can benefit from the collective wisdom and awareness of others when some threat arises. Because we instinctively respond to other humans, a simple action in one individual can rapidly spread and escalate to complex group activity. The problems occur when large numbers gather in limited spaces and the perceived threat is trivial compared to the danger posed by the moving crowd itself. Over sixty people were injured in the Amsterdam stampede but they were lucky. Every year, hundreds of people are killed when large crowds gather in confined spaces and panic breaks out.

It is in our nature to assemble in groups. Many of us seek out crowds and groups to satisfy a deep need to belong, and in doing so we cluster with like-minded individuals who share common interests (this is why most stampedes occur at religious festivals and sporting events). In the crowd, some of us find our self substantiated; others feel their individual self is lost as they become one with those around them. Whether we feel lost or found, our self is ultimately influenced by the collective properties of the groups we join. As soon as we join others, our self is reflected in the crowd.

This relationship between the individual and the crowd is a key interest in the field of social networking where scientists try to understand the nature of groups in terms of how they form, how they operate, how they change and how they influence the individual. Some of the most dramatic examples are the riots that periodically erupt in otherwise civilized societies. In 2011, the police shooting of a black man set a London mob burning and smashing their way through the capital. Although the killing was in London, copycat rioting broke out in other English cities. Commentators were quick to look for culprits – social class, education, ethnic group, poor parenting, unemployment, boredom and so on. When they started to look at the profiles of those arrested in the London disturbances, however, it soon became apparent that there was not just one type of rioter but a variety from different backgrounds, ages and opportunities. Many were disaffected youths from deprived backgrounds but there was an Oxford law graduate, a primary school teacher, an organic chef, children of a pastor and other unlikely ‘criminals’. In attempting to categorize the typical looter, the authorities had failed to understand that coherent groups emerge out of very different individuals.

It doesn’t even have to be a perceived miscarriage of justice that triggers riots. In 2011 violence exploded in Vancouver in response to the outcome of the Stanley Cup ice hockey final, when the Canucks’ loss to the Boston Bruins sparked a flurry of rioting and looting. Canadians take their ice hockey very seriously!

Harvard’s Nicholas Christakis says that when you take a bird’s-eye view of humans through the prism of social networks, the picture of both the individual and the group changes.2 He draws the analogy with graphite and diamonds. Both materials are made of carbon atoms but it is the way these individual atoms are connected that determines why one material is soft and dark and the other is hard and clear. The layered lattice arrangement of graphite carbon atoms means that it shears easily whereas the highly interconnected arrangement of diamond carbon atoms means that it is as hard as – well, diamonds of course. Therefore, when it comes to carbon atoms, the whole is greater than the sum of its parts. Similarly, understanding the individual self only really makes sense in terms of the groups to which that individual is connected. To extend the carbon metaphor, when we are well connected we are more resilient because there is safety and strength in numbers. Alone, we are more vulnerable and weaker.

The mechanisms for joining groups are not completely random. We all possess individual differences that mean we join some groups and not others. There are strong historical, geographic and physical factors at play. We tend to form friendships and associate with others who represent our culture, live close by, resemble us and with whom we can easily connect.3 We also form friendships with those who share the same interests and worldviews. We tend to like those who resemble us physically. For example, obese people are more likely to hang out with other obese people and the friends of obese people.4 If one friend is overweight, there is a 45 per cent increased likelihood above chance that the other friend will also be overweight. If you are the friend of a friend who has another overweight buddy, then your likelihood of being overweight is going to be 25 per cent above chance. This is known as ‘homophily’ – the tendency for birds of a feather to flock together, for like to be attracted to like. Only by the time the relationship is as distant as a friend of a friend of a friend of a friend has the link to obesity disappeared.
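To make that decay concrete, here is a minimal back-of-the-envelope sketch in Python. Only the one- and two-degree increases and the four-degree vanishing point come from the figures quoted above; the baseline rate is an invented assumption for illustration.

```python
# A back-of-the-envelope illustration of how the obesity link decays
# with social distance. The 25 per cent baseline rate is an invented
# assumption; the relative increases at one and two degrees are the
# figures quoted above, and by four degrees the link has vanished.
baseline = 0.25  # hypothetical population obesity rate

relative_increase = {
    1: 0.45,  # your friend is overweight
    2: 0.25,  # a friend of a friend is overweight
    4: 0.00,  # a friend of a friend of a friend of a friend: no link
}

for degree, boost in relative_increase.items():
    print(f"{degree} degree(s) away: {baseline * (1 + boost):.1%}")
```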

Homophily can arise for various reasons such as shared external environments or interests. National identity, religious beliefs, team allegiances or musical tastes are examples of homophilic groupings resulting from external factors. There is nothing genetic about being a British, Christian, Manchester Utd supporter who likes Dolly Parton. More surprising, however, is the recent discovery of genetic factors in homophily in social groupings. It has long been known that good-looking people tend to hang out with each other and that looks are partly genetic, but a recent study by Christakis and colleagues has shown that genes associated with behavioural traits are also related to friendship formation.5 For example, one gene, DRD2, associated with the disposition to alcoholism, was found to predict homophily in clusters of friends whereas another, CYP2A6, linked with openness, surprisingly produced the opposite effect of heterophily – the tendency to associate with others where there are no shared interests (‘opposites attract’). The causal mechanisms by which genes might exert this influence on behaviour are unclear and investigation of the genetic factors implicated in social networking is in its early days, but the discovery that genes operate in social environments means that we have to rethink the extent to which our biology influences our behaviour.

The Technology Savannah

Technology is changing the way we communicate and this is going to have an impact on the way we behave socially. Specifically, social networking may have very significant consequences for the way we develop. Our human mind, which was forged and selected for group interaction on the Serengeti, is now expected to operate in an alien environment of instant communication with distant, often anonymous individuals. Our face-to-face interaction that was so finely tuned by natural selection is largely disappearing as we spend more time staring at terminal screens that were only invented a generation ago. The subtle nuance of an intonation of voice or a facial micro-expression6 is lost in this new form of communication. The physicality of interaction is disappearing – something we may need to adapt to. But ultimately, it will change the way we assemble our sense of self because of the influence of groups. Even if this turns out not to be correct, we would be wise to give these new technologies some careful consideration as they could have profound effects on the way we live.

For some time now, man has had the capability to shape his own future. With our capacity to communicate and our ability to form societies, we hand down knowledge from one generation to the next. We have used this communication to develop technologies such as writing. With the advent of science, most of us in modern societies have been freed from the shackles of hostile environments and hard times. Civilization has enabled humans to take control of processes that used to weed out the weak. In the distant past, natural selection ensured that the old, the sick and the infertile lost out in the mating game. This has been changed by technological innovation. Modern medicine, with its fertility treatments and healthcare, has shifted the goal posts. Of course, natural selection will always be with us, but we can use our science to outwit its relentless cull of the least suited. Human development is increasingly shifting away from natural selection to Lamarckian inheritance – the idea, named after the French biologist Jean-Baptiste Lamarck, that we can change our selves while we are still alive and pass on the benefits of that change to our children by tailoring their environments. It’s not clear how we will continue to evolve, but science and technology seem unlimited in their ingenuity to bend the rules. Similarly, our technologies and advances in communication through the Web will forever shape the future of humankind in ways that are not yet clear. One thing that is certain is that the Web will influence our sense of self as we increasingly live our lives online, as members of virtual groups.

I remember when the Web first emerged. I had just arrived at MIT’s Department of Brain and Cognitive Sciences in the autumn of 1994 as a visiting scientist. I was checking email in the computer room where Zoubin Ghahramani and Daniel Wolpert, two brilliant young scientists, were getting excited about Netscape, one of the first Web browsers, which had just loaded some image files on Zoubin’s terminal. The internet had already been in existence for years to allow academics and the military to exchange emails but the invention of HTML (the Web’s markup language) provided the first opportunity to share more than just text. This was my first encounter with the Web. In the past, people could email you files directly but now anyone with the right address could visit and view material on a remote website. It was like creating something permanent in the electronic landscape that others could come and visit well after you were gone – almost like the virtual immortality of scratching a mark on an electronic wall for others to find. As nerdy scientists, that afternoon we all recognized the importance of this capacity to share remote information, but I doubt any of us fully understood its potential power.

The subsequent rise and spread of the Web into everyone’s lives has been astonishing for those of us who remember the pre-Web days but my daughters seem oblivious because they have grown up with today’s rapid pace of change and assume it has always existed. I tell my own children that they are living during one of the major transitions in human civilization, that humankind is currently in the midst of the next great evolutionary leap. This sort of statement may sound sensationalist. It may sound nostalgic as some of us hanker for simpler times. It may even sound like the curmudgeonly grumblings of a middle-aged dad who laments that, ‘Things were different in my day.’ Indeed, every generation probably says this, but I cannot overstate the importance of this transition. I think that most of us are sleepwalking into uncharted territory. We need not fear it. It is one of the most exciting times to be alive in the history of humankind.

Who Hath Not Googled Thyself?

Have you ever searched for your self on the Web, entering your name into the Google search engine to see if you come up? Go on. Be honest. Only a rare few are not curious to know what’s been said about them, if anything at all. And where better to find your self than on the Web? It’s the electronic version of looking first for your self in the group photograph or hearing your name mentioned at a crowded cocktail party and then straining to listen to what is being said about you. The advent of the Web has given our natural preoccupation with what others think about us a whole new outlet. For better or worse, most of us in industrialized countries are now on the Web whether we like it or not.

Many of us enjoy being on the Web and actively use it socially to interact with others. Social networking sites such as MySpace, Facebook and Bebo have attracted millions of users, many of whom have integrated these sites into their daily lives. The mightiest at the moment is Facebook, which currently has over 750 million active users. The different social networking sites share several core features. First, they enable users to construct public and semi-public profiles on a website. Second, they enable users to view the profiles of others who subscribe to the service and, most importantly, to communicate with each other through messaging and file-sharing facilities. It’s like a 24/7 virtual cocktail party where you mix with friends but sometimes meet new acquaintances, swap stories and opinions, share a joke, maybe look at each other’s family photographs and flirt (or sometimes even more). Or perhaps you sign a petition or start a cause to change things. As the new media expert Jenny Sundén succinctly put it, social networking sites enable users to ‘type oneself into being’.7

Not surprisingly, an analysis of personal profiles posted on social networks reflects a great deal of narcissism8 – the tendency to be preoccupied with one’s self and with what others think about us. After all, why wouldn’t we want others to know how successful our lives are? However, this obsession with our self on the Web depends mostly on who you are. Being online is not for everyone. For example, my wife refuses to join social networks, but then she also does not want to appear in the public eye. Like my wife, many of the pre-Web generation cannot understand what is going on and frankly do not feel the need to surrender precious time, effort and especially privacy to join online communities. They don’t get YouTube, they don’t get Facebook and they certainly don’t get Twitter, which seems to be the ultimate in broadcasting trivial information about one’s self. However, even stalwarts against the onslaught of social networks are being dragged, kicking and screaming, into a new era. The social networking sites that have sprung up in this last decade are changing communication between people and will play an important role in self-identity. If the self illusion is correct, social networking sites will continue to expand in popularity and will increasingly shape the sense of who we are for the next generation and those that follow. As long as we remain a social animal, social networks in one form or another are here to stay.

This is because most of us want to be noticed. Surveys consistently show that the West has embraced the celebrity culture. When 3,000 British parents were asked what their pre-teen children wanted to be when they grew up, one in three said they wanted to be a sports figure, actor or pop star. Compare that to the professions that topped the aspiration list twenty-five years ago: teachers, bankers and doctors.9 Children now want to be famous for the sake of being famous because they equate fame with success. A recent survey of the UK Association of Teachers and Lecturers revealed that the majority of students would prefer to be famous than academically gifted.10 The Web facilitates this obsession with fame and popularity by providing a readily accessible and updatable medium where individuals can indulge their interest in the famous but also begin to make an impact of their own. Anyone can gather a following on the Web. It has levelled the popularity playing field so we can all be noticed.

Also, for most people, the Web is first and foremost a social medium. According to the Nielsen Company, which specializes in analyzing consumer behaviour, the majority of time spent online is spent on social networking sites, and that share is increasing each year.11 By August 2011, we were spending over 700 billion minutes per month on Facebook alone. One in five US adults publishes a blog and over half of the American population have one or more social networking profiles. Even when we are at work, we are social networking: on average a US worker spends 5.5 hours each month engaged in this activity on company time.12

It is even more pervasive in adolescents and young adults. At the moment, if you grow up in the West and are between sixteen and twenty-four years of age, being online is essential. This age group spends over half of their online time engaged in social networks in comparison to older age groups. Many Western teenagers feel they do not exist unless they have an online presence. Life online has taken over from the school playground and the shopping mall where the kids used to hang out.13 It has extended the window of opportunity to socialize anytime and anywhere. We used to tell our kids to get off the phone. Now they use their own phones and can be chatting online whenever they want. According to the most recent report by Ofcom, the industry regulator of communications, half of all UK teenagers compared to a fifth of adults possess a smartphone.14 The most common use of the phone is not for making calls but visiting social networking sites. Two-thirds of teenagers use their smartphones while socializing with others; a third of teenagers use them during mealtimes; and nearly half of teenagers use their phones to social network in the bathroom or on the toilet. No wonder six out of ten teenage users consider themselves addicted to their smartphones. They get to continue socializing well after the school is shut, the mall is closed or their families have moved them to another town.

How is this online activity going to affect their development, if at all? A child’s social development progresses from being the focus of their parents’ attention as an infant and preschooler to stepping out and competing with other children in the playground and class. Initially, children’s role models are their parents but as they move through childhood and develop into adolescents, they seek to distance themselves from the family so that they can establish an independent identity among their peers. As a parent, I have come to accept that what peers think can easily trump whatever a parent wants for their child. This may not be such a bad thing. I am not a therapist but I can easily believe that overbearing parenting creates later problems if children are not allowed to establish their identity among their peers. These are the popularity contests that preoccupy most adolescents in Western culture. It is through this chaotic period of self-construction that the adolescent hopefully emerges out the other side as a young adult with the confidence to face the world.

As every parent in the West knows, adolescence is typically a time of rebellion, bravado, showing-off, risk-taking, pushing boundaries, arguments, tears, strategic allegiances and Machiavellian negotiation. Some blame immature brains, which has perpetuated the ‘teen brain’ hypothesis – that the disruptive behaviour of this age group is the inevitable consequence of a lack of inhibitory control in the frontal lobes, which are some of the last neurological structures to reach adult levels of maturity. Teenagers are hypersensitive to social evaluation, but does that explain the increase in risky behaviour? Psychologist Robert Epstein believes the teen-brain account of delinquency is a myth – that adolescent turmoil is more to do with culture and the way we treat our children.15 He points out, for instance, that teenage misbehaviour is absent in most pre-industrialized societies. He argues that teenage delinquency is an invention of modern societies and media stereotypes, and describes how, before the arrival of Western media and schooling, the Inuit of Canada did not have the problems of teenage rebellion. For him, the problems we see in Western teenagers are more to do with the way we isolate this age group and effectively let them establish their own social groups and hierarchies. These are the pecking orders of popularity built through the processes of affiliation, competition and establishing one’s self-esteem in the eyes of one’s peers. In this situation, teenagers think that in order to gain a reputation among their peers, they have to be outsiders to the rest of society.

Others argue that the data on teenage risk-taking are incontrovertible. Risk-taking may be somewhat culturally influenced, but far from being erratic, teenagers are just as good as adults at reasoning about risk. They simply consider some risks to be worth taking to prove themselves. Developmental psychologist Laurence Steinberg has shown that teenagers perform just as well as adults on simulated driving tasks when they are alone, but run more risks when friends are watching them.16 When friends were present, risk-taking by speeding and running red lights increased by 50 per cent in teenagers whereas there was no increase in adults. One neural account is that the reward centres in the teenage brain are highly active during this period of development. Rewards are thus super-charged when individuals succeed in front of their friends, which makes success all the sweeter and the risks worth taking. But it is not enough to succeed. One has to be seen to succeed.

In the West, adolescents are faced with the paradox of wanting to be accepted by their peers but at the same time needing to be different. Music, fashion, films and, of course, sex are the things adolescents care about the most because these are the very things that help to create the unique identities they seek. It is no coincidence that these are the main topics of ‘likes’ and ‘dislikes’ in online social networks. Whenever you put something up on a social network, you are inviting a response from your peers. It is not for your own private viewing; rather, you are broadcasting your presence to the world. The number of comments and hits your activities generate tell you and, more importantly, others that you matter. Most of us want to be noticed and social networking sites make this universal yearning the core of their activity. Having others validate your presence is the currency of popularity that individuals seek.

What a Twit I Am

One day earlier this year, the famous British actress, Dame Helen Mirren, started to follow me. I wasn’t walking down the road in London or Los Angeles where the Oscar-winner probably spends most of her time. Rather, I was seated at the kitchen table in my Somerset barn, taking a break to look at my Twitter account, when I saw that @HelenMirrenDBE was following me. Or at least I thought she was. My heart skipped a beat.

For the uninitiated, Twitter is a site where you can post any message so long as it is no more than 140 characters. It’s like open-access texting to the world where anyone who follows you can read your messages or link to images and other websites that you might put up. I currently have over 3,000 Twitter followers. I don’t even personally know that many people and if you sat me down, I would be hard-pressed to name a couple of hundred individuals. Even though my following may sound impressive, I am way down on the Twitter hierarchy. Individuals whom you would never imagine being icons of interest are huge on Twitter. Lance Armstrong, the top cyclist, has well over a million followers. So does the actor Brent Spiner. Who was Brent Spiner, I wondered? None other than the actor who played the android, ‘Data’, on Star Trek. There are a lot of Trekkies out there!

What is it about Twitter that makes it so appealing? Why do we follow and want to be followed? It probably comes down to a number of related issues to do with the self. First, the human brain is a gossiping brain – we are nosey and want to know what others are up to even if that includes what they ate for breakfast that day. Second, we like our opinions to be validated by others. When someone responds positively to an opinion or shares it with others, we feel vindicated. Of course, if our opinion is rejected or ridiculed then our self-esteem is deflated. Having the option to follow or unfollow others means that individuals within a social network tend to share the same values and attitudes.

We also like to be the first to know something or spread the word. This is something we did as children. Remember how important it was to be the first to break the news in the playground? If you were the last to find out something, that reflected your lowly standing in the pecking order. By being the first to know something, we cement our self-importance with others. However, one of the most powerful draws of social networking sites like Twitter is that they make you feel important if you have a large number of friends or followers. Your self-worth is validated and the more followers and friends you have, the more you value your self.

Another reason why Twitter has taken off (it is the fastest-growing social network) is that celebrities happily post their thoughts and updates on a regular basis. These used to be off-limits to the general public. Suddenly we have access to the famous in a way that was never possible before. The public’s appetite for celebrity trivia has long been insatiable. A whole industry of paparazzi and tabloid press has evolved out of the primeval slime to provide the gossip that feeds the masses, but Twitter is far superior because it comes directly from the celebrities. Of course, celebrities need their followers because without the fans, they are out of the public eye, which usually also means out of work. So most now have Twitter presences. In fact, many employ writers to compose their tweets so that the illusion of accessibility and visibility is sustained.

The biggest boost to your self-esteem is if a celebrity such as Helen Mirren follows you. Whenever someone of perceived higher status befriends us, our standing is raised by association. This is known as basking in reflected glory. Many of us take vicarious pleasure in associating with the success of others. This is why fans are so passionate about the individuals and the teams they support. Sports fans are probably the most common example. I have heard many a pub argument where fans lament the team manager’s decisions as if it were a family feud. Fans even talk as if they are a member of the team by using the pronoun, ‘we’.17 Twitter facilitates these distortions of reality by generating the illusion of easy accessibility to the famous. Anyone can follow a celebrity who is on Twitter, thus creating an interesting social phenomenon where we feel an intimacy with others whom we would never have the opportunity to meet in our normal lives. The relatively open access of Twitter also creates problems. Strangers feel that they are on a familiar basis with those they follow – which is not so very different from celebrity stalkers who are deluded in thinking that they share a life with their victims.

Karl Quinn, an Australian journalist, pointed out that Twitter is perfect for mob nastiness. It enables individuals to make cruel comments and then pass them on: ‘Many of us are in two minds about whether celebrities are flesh-and-blood human beings or merely life-sized piñatas in need of a damned good whacking.’18 The trouble is that as soon as a victim is identified, most of us are more willing to join in with the bullying than we imagine. Remember how that worked in the playground? It was easier than standing up to the mob. The same is true for Twitter – people join in with the mob rule. Also, with the problem of polarization (discussed shortly) that is endemic in social networking sites, attitudes and opinions will naturally shift towards greater extremism as those who seem to agree with us egg us on or we feel the need to be more judgemental. With their combination of distorted opinions, rapid communication without time for reflection, and perceived distance as well as anonymity, social networks are a perfect platform for us to behave in ways that we would not in real life.

This raises an important point with regard to the difference between online and offline selves. If we behave differently when we are online, then where is the true self, if the self really does exist? Can we draw a real difference between thoughts and actions? If our actions are virtual and anonymous, are they any less representative of our true self? One could argue that because social rules and the pressure to conform in real life are so powerful for many, offline activities do not reflect our true attitudes and thoughts. If they can only be expressed online in the absence of the threat of any repercussions or social rejection, what kind of true self is that? That’s one reason why we need to be reminded that the self is an illusion if we believe that it can exist independently of the different contexts and influences of others. One might counter that there is only one self that behaves differently depending on the situation, but that is the heart of the illusion. We are far more under the influence of contexts and others than we appreciate. Just like the alcoholic who thinks they can control their drinking, our self illusion prevents us from seeing just how far we are at the mercy of influences outside of our control.

But I am sure you want to hear more about Helen Mirren. What’s she like? What does she eat for breakfast? Sadly, I was deluding myself with my own self-importance. When I looked at her profile it was clear that, with only 216 followers, my Helen Mirren was most definitely a ‘troll’. Trolls are individuals who take delight in disrupting social networking sites by posting offensive comments or pretending to be someone else. I don’t even know if Helen Mirren is on Twitter but, if she is, I have no doubt she has thousands of followers. For one tantalizing moment that morning, my heart skipped a beat as I thought that my adolescent crush was taking an interest in me. That would have been an enormous boost to my ego, but why would a great British actress like Helen bother with a lowly egghead like me? Then again, even celebrity actresses are sometimes intrigued by the mundane lives of mere mortals. She is human, after all.

The Human Borg?

Some commentators have expressed anxiety over the rapid rise of social networks and have predicted a breakdown in human civilization. We have heard similar prophets of doom decrying all media from books to radio to television. One fear is that we are allowing the brains of our children to be destroyed forever as they lose the skills necessary to interact with others in real life, missing a critical period of psychological development that is essential for healthy socialization.19 As the plasticity of their frontal neural circuits hardens, we are told that they will be forever locked out of normal social development and grow up into retarded adults. The claim is that they may never acquire adequate attention spans, which are stimulated by real-life social interaction. Social networking sites and online activity in general are depriving them of normal social environments. More alarming is the suggestion that the rise in childhood autism may be linked to increased online activity.

The scientific evidence for such claims is sparse to say the least and indeed the Internet is arguably beneficial for those who find normal social communication difficult.20 Also, critical periods are restricted to total deprivation at a very early age. Remember the Romanian orphans and the critical first six months? There are very few children using the Web before their first birthday! Also, as developmental neuropsychologist Dorothy Bishop pointed out, the claim that online activity causes autism is ludicrous as the condition appears well before school age and the use of computers.21 When it comes to social development, the human brain is incredibly resilient and resourceful. So long as there is some social interaction then all should be fine. Just like language, humans are wired for social interaction but incredibly flexible in the actual way they do it. Yes, children may not learn the same Ps and Qs of social etiquette that their parents acquired during real interactions, but they will develop their own ways of interacting both on and offline. Consider one example of how children communicate using abbreviations in texting such as LOL (‘laugh out loud’), OMG (‘oh my God’), SNM (‘say no more’), BRB (‘be right back’), GTG (‘got to go’), or ROFL (‘rolling on the floor laughing’). This is a highly efficient way of transmitting common phrases. It was not deliberately invented and handed down by the custodians of social networks but, like much of the etiquette on the Web, emerged in a bottom-up fashion. Left to their own devices, the kids will be all right.

In fact, there are arguments that rather than threatening the future of human psychological development, the new social media is returning us to the situation before the old media of print, radio and television infiltrated all of our lives. One of the gifted prophets of this new social revolution, June Cohen from the TED organization, makes this counterintuitive point.22 For much of human civilization, she argues, media was what happened between people in the exchange of news, stories, myths, jokes, education and art. We mostly communicated with one another around the Serengeti campfires. Up to a few hundred years ago, very few of us could actually read. Then the old media of books, radio and television appeared. If all of human history were compressed into a single twenty-four-hour day, these old media only emerged in the last two minutes before midnight. But this media was different from the village gossip we used to spend our time engaged in. Unlike normal communication, which flows in both directions, the media that entered our homes was one-directional. We read the news, listened to the radio and watched the television. We stopped communicating with each other. As Cohen puts it, ‘TV created a global audience, but destroyed the village in the process.’

Then Tim Berners-Lee invented the Web, providing a different kind of social experience. This new media, which by the same analogy just appeared seconds ago on the clock of human history, is much more democratized, decentralized and interactive. Cohen believes that we are returning to a point in human development where we really can communicate with each other again, only this time we are not restricted to the physical size and location of our village.

This may be true but there are some cautionary tales that we must bear in mind. We are interacting once again but the Web is very different to the campfire or garden fence. We are unlikely to become socially retarded but the way we construct our sense of self will be affected. The process won’t stop, only the way we go about it. This is because the Web is changing the way we live our lives. It is not just the amount and range of readily accessible information or the way we do business or find entertainment. It is the very way we behave toward one another. After all, interaction with one another through a computer is not natural. Then again, neither are telephone conversations and the telephone hardly reshaped our social development. The real difference is the power of each of us to communicate simultaneously with the group as a whole. That’s a first.

Never in the history of humankind have we had the ability to communicate rich information with practically anyone on the planet instantaneously. Each new innovation from the printing press to the telephone and eventually the computer has been regarded as a milestone in human technological advancement, but the invention of the Web will outstrip them all in terms of its impact on the human race. Now we can potentially communicate with anyone. We can harness the collective power of multiple brains. Many of us are amazed by what computers and software can do. For example, there is more technology and power in today’s programmable microwave oven than was needed to put a man on the moon. Moore’s Law tells us that computer power doubles approximately every two years, which is one of the reasons I always seem to delay replacing computers in anticipation of the more powerful model just around the corner. Eventually we will reach a technical limit to Moore’s Law and require a new type of computing. But the Web is different. The Web will not be so limited. This is because the Web is primarily a medium for sharing knowledge and ideas generated by brains.
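To see what that doubling implies, here is a quick worked example – nothing more than compound growth applied to the two-year doubling period quoted above:

```python
# Moore's Law as compound growth: if power doubles every two years,
# the multiple after n years is 2 ** (n / 2).
def moores_law_multiple(years: float, doubling_period: float = 2.0) -> float:
    return 2 ** (years / doubling_period)

print(moores_law_multiple(10))  # ~32x more powerful after a decade
print(moores_law_multiple(20))  # ~1,000x after twenty years
```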

Every normal human brain is more powerful than any computer so far built. By connecting them together through the Web, we have the potential to harness the collective thinking of millions of individual brains that are constantly checking and rechecking material on the Web. In 2005, the premier science journal Nature declared that the online encyclopaedia ‘Wikipedia’, created entirely from voluntary contributions from Web users, was about as accurate as the Encyclopaedia Britannica, the multi-volume traditional source of knowledge produced by teams of paid experts, first published in the latter half of the eighteenth century. Web users were simply motivated to distribute their knowledge free of charge and, to this day, Wikipedia is funded almost entirely from public donations.

Consider decision-making and the problem of analysis paralysis, which occurs when there are too many choices. Much of that problem is solved for us on the Web. When was the last time you made a purchase online and ignored the ratings and comments left by others? When did you choose the third or fourth rated item on a list? I expect never. Whether it is choosing a book, film, hotel or microwave, we ignore the expert review and pay more attention to other users’ feedback, as we trust their experience as being more honest. They have no vested interest in lying. Everywhere on the Web we are invited to give our opinion, making pronouncements with a gladiatorial thumbs up or down. According to the 2010 Nielsen report mentioned earlier, up to one in five Web users regularly provides feedback on movies, books, television, music, services and consumer products. The collective experience of multiple users produces a consensus of opinion that shapes our decisions. Of course, you are the one making the choice, but it is a decision based on what others think.

This hive-mind process is not flawless, however. We still tend to follow the herd, as stampeding shows, but this compliance effect is much reduced on the Web. There is more honesty and dissent when we can remain anonymous online. Of course, there are always those who attempt to subvert the process with false recommendations and condemnations, but they are eventually rumbled. Last year there was an almighty hullabaloo in academia when the eminent British historian, Orlando Figes, was accused of trashing other historians’ books on Amazon in the guise of an anonymous reviewer who simply called themselves ‘Historian’. Figes threatened to take legal action only to discover to his embarrassment that it was his own wife who had been writing the reviews to discredit her husband’s competition.23 Some would call that charming wifely support.

With all its benefits, the spread of the Web will be relentless. Over the next few years, accessing the Web will no doubt improve in ease, efficiency, speed and volume as platforms increase our ability to interact with each other. We may even one day make the unsettling transition of being integrated into the Web through biologically compatible interfaces, but the fundamental change that is most important to human civilization is that, in the West, we are all now potentially connected to each other. We can benefit from the wisdom of the crowd – the collective power of billions of brains. We have become the human equivalent of the Borg – the science-fiction race of cyborgs from the Star Trek series who are all simultaneously interconnected. But we are not drones. We are independent autonomous individuals – or at least that’s what we think.

Mining the Mountain of Data

The march of the Web may be relentless but there is a big problem with it – literally. Natural selection tells us that when something increases in size it becomes inefficient. In the case of the Web, it is becoming too big – too unwieldy. Cisco Systems, the worldwide leader in networking, estimates how much data are generated and stored on the Web. According to their Chief Futurist, Dave Evans, one of the guys who plans the future strategy of the company, ‘Humans generated more data in 2009 than in the previous 5,000 years combined.’24 With numbers like that, you might as well say the whole history of humankind. There is simply too much information out there to process. Most of it is junk – nuggets of gossip or titillation. As social media scientist danah boyd (she avoids capitalizing her name for some reason) has commented, ‘If we’re not careful, we’re going to develop the psychological equivalent of obesity. We’ll find ourselves consuming content that is least beneficial for ourselves or society as a whole.’25 Much of what is on the Web is the equivalent of information junk food, so search engines like Google sift the mass of data for the relevant information using clever modelling algorithms. Whenever we look for information, search engines analyse Web pages that have been viewed by other users seeking similar information and then rank the most relevant pages for review. They harness the power of the crowd to establish what we are looking for. This is wonderful. We can use the collective knowledge of others to mine through the impossible mountain of data and filter out what is not relevant to us.
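A toy sketch of that crowd-harnessing idea is given below. Google’s actual ranking algorithms are proprietary and vastly more sophisticated; the click log and the simple counting here are invented purely to illustrate the principle of other users’ behaviour doing the filtering for us.

```python
from collections import Counter

# Hypothetical click log: (query, page the user ended up choosing).
click_log = [
    ("egypt", "en.wikipedia.org/wiki/Egypt"),
    ("egypt", "news.example.com/egypt-latest"),
    ("egypt", "en.wikipedia.org/wiki/Egypt"),
    ("egypt holidays", "travel.example.com/egypt"),
]

def rank_pages(query: str, log: list[tuple[str, str]]) -> list[tuple[str, int]]:
    """Rank pages by how often past users with a similar query
    chose them - the crowd doing the sifting for us."""
    counts = Counter(page for q, page in log if query in q)
    return counts.most_common()

print(rank_pages("egypt", click_log))
# The Wikipedia page tops the list because most past users chose it.
```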

The problem is that filtering excludes information. Every time we surf the Web, the search engines are recording what we do and what information we provide about our selves. It’s not evil. It’s not spying or an attempt to control our behaviour. The machines are simply trying to provide us with the most relevant information. However, Eli Pariser thinks this is a big problem. In his book, The Filter Bubble: What the Internet Is Hiding from You, he explains why search engines are potentially dangerous.26 Try this out for your self. Log on to Google and search for information about ‘Egypt’. Then call up a relative or friend in a different part of the country and ask him or her to do exactly the same thing. What Eli noted was that his friends received totally different lists of links.27 This difference is important because most people only look at the first page of links. In other words, they are not being allowed to see the full picture.

The reason for this discrepancy is that Google produces a personalized search result tailored for each user by using a filter. According to an unnamed Google engineer to whom Pariser spoke, the filter is based on a profile created from fifty-seven variables known about the user. (One wonders if the engineer was pulling Eli’s leg given the famous Heinz marketing ploy of fifty-seven varieties!) Eli noted how personalization was distorting the sorts of information that were being retrieved for him. For example, in an attempt to broaden his view on issues, Eli had deliberately followed Conservatives on his Facebook account even though he mostly had Liberal friends. However, after a while he noticed that the Facebook software was filtering out the postings from the Conservatives because these were deemed less relevant than those from the majority of his Liberal friends. Filtering software was encapsulating him inside a bubble of ignorance about contrasting views. This was the filter bubble.
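Pariser’s point can be made in a few lines of code. The engagement scores and threshold below are invented, and Facebook’s real ranking signals are proprietary, but the mechanism is the same: posts from sources you rarely interact with fall below the cut-off and quietly disappear.

```python
# Invented engagement scores: how often this user clicks or comments
# on posts from each friend (not Facebook's real signals).
engagement = {"liberal_friend": 0.9, "conservative_friend": 0.1}

posts = [
    {"source": "liberal_friend", "text": "Op-ed you already agree with"},
    {"source": "conservative_friend", "text": "Op-ed that might challenge you"},
]

def personalized_feed(posts, engagement, threshold=0.5):
    """Show only posts whose source clears the engagement threshold -
    the filtering that quietly builds the bubble."""
    return [p for p in posts if engagement[p["source"]] >= threshold]

for post in personalized_feed(posts, engagement):
    print(post["text"])  # the conservative post never appears
```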

Birds of a Feather

The vision of being connected to everyone on the Web to get a broad perspective on life is false. The software that is necessary for sifting through the impossible volumes of information is only showing us what it thinks we want to see. But we can’t blame the software. In real life, we also filter whom we pay attention to. People tend to social network with like-minded individuals who share the same values and opinions and reciprocate communications. We tend to befriend those who are most like us. We tend to read the same newspapers, like the same TV shows and enjoy the same pastimes. This homophily may lead to increased group cohesion but it also isolates us from other groups who hold different values. In other words, it fosters increasing polarization. For example, in one study on attitudes about global warming, the proportion of Republicans who believed the planet was warming up fell from 49 per cent in 2001 to 29 per cent in 2010. In contrast, the proportion of Democrats who believed it was a problem increased from 60 per cent to 70 per cent over the same period.28 It was as if they were living on different planets.

One might think that the Web should counter this tendency towards homophily and broaden our minds to different viewpoints. Indeed, Twitter activity encourages total strangers to become connected. If your followers like or dislike what they hear, they can comment or communicate by ‘mentioning’ you in an open post. That way you can tell whether anyone is paying any attention to you. Twitter users ‘retweet’ messages and links they like. It’s like saying, ‘Hey everybody, look at what this person over here is saying,’ thereby spreading the influence from someone they follow to other users not directly connected to them. If you say something influential it can spread more rapidly across the ‘Twittersphere’ than through conventional channels. This is how Twitter users were made aware of the top-secret US assault on Osama Bin Laden’s compound as it was happening in May 2011: one Twitter user, Sohaib Athar a.k.a. @reallyvirtual, who lived near Bin Laden, live-tweeted the raid without realizing what was going on. He later tweeted, ‘Uh oh, now I’m the guy who live-blogged the Osama raid without knowing it.’ Prior to the raid, Sohaib had 750 people following him. After the raid, he had over 90,000. No wonder Twitter makes surfing blogs and the Web look boring and long-winded. You don’t even have to be at a computer terminal as these social networking sites are now all accessible on mobile phones. Twitter is the crack cocaine of social networking.

Despite the ease of Twitter connectivity, it leads to homophily, with people of the same age, race, religion, education and even temperament tending to follow each other and unfollow those with different views. For example, in one study of over 102,000 Twitter users who produced a combined 129 million tweets in six months, researchers analysed their postings for positive or negative content.29 Upbeat tweets were things like, ‘Nothing feels like a good shower, shave and haircut … love it’, or ‘thanks for your follow, I am following you back, great group of amazing people’. Those of a more miserable disposition posted tweets such as ‘She doesn’t deserve the tears but I cry them anyway’ or ‘I’m sick and my body decides to attack my face and make me break out!! WTF’. When the researchers analysed the social networking of the group they found that those who clustered together shared the same dispositions. This type of clustering is illustrated in Figure 10. Happy users are connected to other happy users and unhappy users are connected to other miserable sods. It was as if there was emotional contagion between Twitter users, like the mimicry of the mirror system we encountered earlier, only this time the transfer was entirely virtual. Of course, this type of clustering increases polarization. An analysis of 250,000 tweets during the US congressional midterm elections in 2010 revealed that liberals and conservatives readily retweeted partisan messages consistent with their own party line, but not those from the opposing camp.30

Furthermore, the promise of communication with thousands of users is not fulfilled because of one major stumbling block – our evolved human brain. When the tweets of 1.7 million users over six months were analysed, the researchers made a remarkable discovery.31 As the number of followers increases, the capacity to interact with others becomes more difficult in this ‘economy of attention’. We cannot have meaningful exchanges with unlimited numbers of other people. There simply is not enough time and effort available to respond to everyone. It turns out that within this vast ocean of social networking, the optimum number at which reciprocal communication can be maintained peaks at somewhere between 100 and 200 followers. Likewise, on Facebook, the average user has 130 friends. Does that number seem familiar? It should. It’s close to Dunbar’s number again, which describes the relationship between the primate cortex and social group size. It turns out to predict our social activity in the virtual world of social networking sites just as accurately as in the real world.

Figure 10: Analysis of communication on Twitter reveals significant grouping (based on a study by Bollen et al., 2011. Copyright permission given).

Time for Our Self

Technology was supposed to liberate us from the mundane chores in life. It was supposed to make us happier. Twentieth-century advertisements promised a world of automaticity and instant gratification. When the computer first came along in the 1960s and then into many Western households during the 1980s and 1990s, we were told that we would have increased freedom to pursue leisure and entertainment. We were supposed to have more time for each other. The computer has certainly made many tasks easier, but paradoxically many of us spend more time alone at our computers than engaging with the people with whom we live and work. My colleague, Simon Baron-Cohen, an expert on autism, has estimated that he answers fifty emails a day and spends over 1,000 hours a year doing so.32 I think that my online time is much worse. I don’t get as many emails as Simon but I am online every day and cannot remember the last time I had a day offline. Even on holiday or trips, I am connected.

If I am not researching articles or preparing teaching material, then I am keeping in contact with people through social networking sites. I email, write a blog, tweet on Twitter, talk on Skype, have a LinkedIn profile and drop in and out of Facebook. I have joined Google+, the latest development in social networking. I surf the Web relentlessly. I can do this via my office computer, portable laptop, iPad or smartphone. I am all wired up. Even when I watch some important event on television, I have my social network feed running so I can keep track of other people’s opinions on the same broadcast. I estimate that I spend at least half of my waking day online from 7 a.m. to midnight. That’s well over 3,000 hours per year – excessive by anyone’s standards. I know that this level of Web presence is not typical and probably not healthy but, if my teenage daughters are anything to go by, many people in the West are becoming increasingly immersed in their online involvement. Some argue that excessive dependence on Web activity should be considered like any other addiction, though psychiatrists are not in agreement that it really constitutes a well-defined disorder.

My addiction to the Web began in 2009 when I started my online presence and social networking at the request of the publisher of my first book. Initially, I was asked to write a blog – a website where you write stories and hope that people visit and read what you write. From the outset I thought that blogging was a self-indulgent activity but I agreed to give it a whirl to help promote my book. In spite of my initial reluctance I soon became addicted to feedback. Readers could leave comments about each posting and, as an administrator of my site, I could see who was visiting and in what numbers. It was not enough to post blogs for some unseen audience. I needed the validation from visitors that my efforts and opinions were appreciated. These were recorded as ‘hits’ – the number of times people visited my site. This feedback process is supercharged by the accelerated nature of communication on the Web. Unlike peer-reviewed scientific papers or critics’ reviews of your books, which can take ages and are unpredictable, social networking sites can create instantaneous gratification from feedback. If the public responds positively to something we have written by increasing traffic or leaving kind comments, this makes us feel very good indeed. It justifies our efforts.

We know the reason for this pleasure from experiments on conditioning behaviour. Conditioning was originally discovered in the 1890s by the Russian physiologist Ivan Pavlov, who noted that the dogs he had been studying learned to anticipate feeding time because they would salivate before the food arrived.33 He then paired the sound of a buzzer (not a bell, as popular culture portrays it) with the food so that eventually just the sound elicited salivation. The dog had learned to associate the sound with the food. This was an important discovery. The experimenter could shape the behaviour of the dog to respond to a variety of different stimuli. Animals could be trained, or conditioned, by reward. Conditioning was soon developed into a whole school of psychological theory called Behaviourism, championed in the United States by individuals like J. B. Watson and B. F. Skinner who claimed that any complex behaviour could be shaped by rewards and punishments.34 In fact, we now know that it is not the rewards themselves that strengthen behaviours but rather the anticipation of rewards that is so satisfying.

This is because deep inside our brain, close to the brainstem, is a reward system that is invigorated by a cluster of around 15,000–20,000 dopamine neurons that send out long fibres to other regions of the brain. Given the billions of neurons in the brain, it is remarkable that this tiny population forms the pleasure centre critical in controlling our behaviour. These neurons enable us to predict and anticipate rewards and punishment.35 Without them, we would be hopelessly inept in decision-making and our behaviour would be erratic. When an animal in a conditioning experiment learns that pressing a lever or pecking a disc will deliver a reward, anticipatory dopamine is released, and it is this, rather than the actual reward, that reinforces the behaviour. We know this because rats with electrodes implanted in the pleasure centre, wired so that they can trigger the current themselves, will continue to self-stimulate in the absence of any food reward – to the point of starvation.36 The dopamine rush alone is sufficient to condition the behaviour. When patients have electrodes implanted in this same brain region for the treatment of intractable epilepsy, they report feeling pleasure. Like many addictive behaviours from gambling to sex, it’s the thrill of expectation that gives us the best buzz.

What’s more, the best way to strengthen behaviour is to reward it only occasionally – this is called intermittent reinforcement. This is because our brains are always seeking out patterns in the environment. Information and feedback from the environment are often fragmented and incomplete, but our brains allow for such inconsistency. When we do something that seems to cause some form of positive reward, we repeat the action in an attempt to recreate the pleasure. If the reward is only intermittent, we will persist for much longer in repeating our attempts. This is the reinforcement principle behind gambling. We gamble more and for longer just waiting for that occasional reward.37 Slot machines only need to pay out every so often on an intermittent reinforcement schedule for players to persist in pumping more coins into them. It’s that dopamine hit of anticipation that perpetuates our behaviour.
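A minimal simulation of such a schedule is sketched below. The payout ratio is invented, and real slot machines are tuned far more carefully, but it shows the key property: rewards arrive at unpredictable intervals, so there is never a point at which giving up obviously makes sense.

```python
import random

def variable_ratio_schedule(responses: int, mean_gap: int = 20) -> list[int]:
    """Return which responses (lever presses, coin drops) are rewarded.
    Rewards arrive on average every `mean_gap` responses, but the gap
    is unpredictable - the schedule that sustains behaviour longest."""
    rewarded = []
    next_payout = random.randint(1, 2 * mean_gap)
    for response in range(1, responses + 1):
        if response == next_payout:
            rewarded.append(response)
            next_payout = response + random.randint(1, 2 * mean_gap)
    return rewarded

random.seed(1)
print(variable_ratio_schedule(100))  # rewards at unpredictable response numbers
```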

In the same way, conditioning explains our online behaviour. We are compelled to check our emails or look for approval from our online community just in case something really important or juicy comes along. Every time I checked my email or the hit activity on my blog, I was like a rat in one of Skinner’s conditioning experiments. At first, the numbers were only a handful but every week they increased. Within a month, I was checking activity every day – thrilled when there was a peak or a kind comment, depressed by the dips and disparaging remarks. Most days there was nothing but every so often, I would be rewarded. The dopamine spurt triggered by anticipation had become my drug of choice and I had become a numbers junkie looking for more and more hits.

So the Internet can become addictive, and it can also be dangerous, especially in the case of immersive gaming, where individuals can play for hours in fantasy worlds. In 2010, South Korea had a greater proportion of its population online than any other nation (81 per cent of forty-six million). Most Koreans spend their online time in internet cafés that provide fast, cheap connections. This can have devastating consequences. Many develop serious medical conditions related to hours of online activity and the accompanying physical inactivity. Their joints swell up. They develop muscular pain. Sometimes it is others who get hurt. In the same year, a South Korean couple who met online married in real life but, sadly, had a sickly premature baby.38 They then decided to continue their lives online in the café across the road, in a game where they raised a virtual baby, returning to the house only once a day to feed their real one. This lack of care meant that their own child eventually died of severe dehydration and malnutrition. This is undoubtedly an extreme case, and many children raised in poverty are neglected, but it highlights the compulsion of the Web. I recently hosted a highly educated academic family visiting from the United States and, after the initial social conversation and exchange of anecdotes over dinner, we soon dispersed to check our email, Facebook and other online lives. It was not only the adults in the group but the children as well. At one point, I looked up from my laptop and saw everyone else in the room silently immersed in their own Web. Whereas we once used to compartmentalize our lives into the working day and time with the family, the Web has destroyed those boundaries forever. Most of us are connected, and we like it that way. Just as with drug addiction, many of us get withdrawal symptoms of anxiety and irritability when we are denied our Web access.

We have become shaped and controlled by our technology in a way predicted by Marshall McLuhan when he introduced phrases such as the ‘global village’ and ‘the medium is the message’.39 Even in the 1960s, before the invention of the Web, McLuhan predicted that society would become dependent on, and shaped by, its communications technology. He understood that we extend our self out to others and, in doing so, become influenced by their reciprocal extensions. To this extent, we are intricately interrelated through the mediums by which we communicate. Likewise, Sherry Turkle, the MIT sociologist, has described this shift from face-to-face interaction to terminal-to-terminal interaction in her recent book, Alone Together.40 As we spend more time online, we necessarily spend less time offline, which means that we will cease to live lives shaped chiefly by those immediately around us. Rather, who we are will increasingly be shaped by the mediums in which we exist. Some people find this scary. For many it is liberating.

We All Want a Second Life

What do you do if you are unemployed, overweight and living off benefits with no prospect of escaping the poverty trap? Since 2003, there has been another world you can live in – a world where you can get a second chance. This is Second Life, a virtual online world where you reinvent your self and live a life among other avatars who never grow old, have perfect bodies, never get ill, have fabulous homes and lead interesting lives.

David Pollard and Amy Taylor are two individuals who separately wanted to escape the dreariness of their mundane lives.41 Both of them lived in Newquay, a seaside resort in southwest England that has become a Mecca for drunken teenagers who come in their hordes to party away the summer. The town is far from idyllic, and I imagine living there without a job or prospects must be depressing. To escape the drudgery, David and Amy (who initially met in an online chatroom) joined Second Life, where they became ‘Dave Barmy’ and ‘Laura Skye’ (see Figure 11). Dave Barmy was in his mid-twenties, six foot four, slim, with long dark hair, and was a nightclub owner who lived in a sprawling villa. He had a penchant for smart suits and bling. In reality, David Pollard was forty, overweight at 160 kg, balding and living off incapacity benefits in a bedsit. He wore T-shirts and tracksuit bottoms.

Figure 11: Dave Barmy and Laura Skye

Laura Skye was an equally exotic character. She was also in her mid-twenties, a slim six foot with long, dark hair, living in a large house. She liked the country-and-western look of tight denim blouses and boots. In reality, Amy was an overweight, five-foot-four redhead who was also living off benefits. The contrast between reality and fiction could hardly have been greater (see Figure 12).

When the couple met online as Dave Barmy and Laura Skye, they fell in love and married in Second Life. They also met up in real life, with Amy moving in with David in Newquay. After two years, they married for real – just like the Korean couple. However, as so often in real life, that’s when things started to go wrong. Laura (Amy) suspected that Dave was playing around in Second Life, so she hired a virtual detective to check up on her virtual husband. At one point, she discovered Dave Barmy having sex with a call girl in the game. In real life, David apologized and begged for forgiveness. The final straw came when Amy caught her real husband in front of the computer in their small flat, watching his avatar cuddling affectionately on a couch with another Second Life character, Modesty McDonnell – the creation of Linda Brinkley, a fifty-five-year-old twice-divorced woman from Arkansas. Amy was devastated. She filed for divorce on the grounds of unreasonable behaviour, even though Dave had not actually had sex or an affair in real life. Soon after, Dave proposed to Modesty online and in real life, even though the two had never met in person.

Figure 12: The real Dave Barmy and Laura Skye

When the world discovered that a couple was divorcing on the grounds of make-believe unreasonable behaviour, the press flocked to Newquay. However, in what can only be described as reality imitating art imitating reality, the Cornish couple initially declined to give interviews and would not answer the door. Then something very odd happened. Two enterprising journalists from the South West News hit on the bright idea of going into Second Life to secure an interview. From their offices miles away in Bristol, Jo Pickering and Paul Adcock created virtual ace reporters ‘Meggy Paulse’ and ‘Jashly Gothley’ to seek out Dave Barmy and Laura Skye for an interview.

Jo still works at the South West News, and she told me that she had the idea after speaking to a colleague who had been using avatars to attend online courses. As Meggy Paulse, Jo found Laura Skye in Second Life. She told me that the online Laura Skye was much more approachable and confident than the real-life Amy. Eventually, Meggy Paulse persuaded Amy to log off, go downstairs and open the door to the reporters camped on her doorstep. They finally got their story.

Jo explained that Amy had felt the betrayal online was far worse than betrayal in real life, because both she and David had created these perfect selves and still that was not good enough. In real life, we are all flawed and often put up with each other’s weaknesses, but in Second Life there were supposed to be no weaknesses. That’s why the online betrayal hurt. As Jo put it, ‘She had created this perfect version of herself – and even that wasn’t good enough for him.’

I asked Jo what ever happened to the couple. Apparently, Dave did eventually meet up with Linda Brinkley, but reality must have kicked in: a real marriage is hard to sustain when you are both poor and living on different continents. When online Dave Barmy met online Modesty McDonnell for real, David Pollard and Linda Brinkley got real. What this morality tale tells us is that the boundaries between reality and fantasy can sometimes become blurred. The psychologist Paul Bloom tells of a research assistant who was asked by her professor to do some research on these virtual communities.42 Apparently, the young woman never came back. Like a member of some electronic cult, she preferred life in the virtual world to the real one. If the urge to live a life online is so compelling, it does make you wonder what the future holds. Surely something has to give: one cannot be in two places at the same time, even between virtual and real worlds. Both require the limited resource of time, and that is something that cannot easily be divided.

When Online Behaviour is Off

Some individuals in power seek out sexual gratification by engaging in risky encounters. They step over the boundaries of decent behaviour. The Web has made this type of transgression all too easy. With what must be the most unfortunate of surnames, Democratic Congressman Anthony Weiner found himself at the centre of a career-destroying scandal in 2011, when he was forced to resign after confessing to sending pictures of his penis to women whom he followed on Twitter.43 ‘Weinergate’, as it became known, was just another example of high-profile men using the Web to send naked images of themselves to women. In the past, men exposed themselves for sexual gratification in public places but, with the advent of social networking sites, offline flashing has moved online, where it is much more common.

Indeed, some argue that one of the main uses of the Internet is for sex. A 2008 survey of 1,280 teenagers (13–19 years) and young adults (20–26 years) revealed that one in five teenagers and one in three young adults had sent nude or semi-nude photographs of themselves over the internet.44 One online dating site, Friendfinder.com, estimates that nearly half of its subscribers are married – either looking for new partners or for the opportunity to flirt.45 Probably one of the most remarkable cases was that of US Army Colonel Kassem Saleh, who simultaneously wooed over fifty women online and made marriage proposals to many of them, despite already being married.

‘Sexting’ is a relatively new phenomenon in which individuals use technology to engage in sexual activities at a distance. Susan Lipkins, a psychologist from Port Washington, New York, reports that in her online survey of thirteen- to seventy-two-year-olds, two-thirds of the sample had sent sexually explicit messages, with activity peaking among older teenagers and young adults. What was interesting was that this behaviour was associated with personality measures of power – assertiveness, dominance and aggression – but only in those over the age of twenty-seven: power was not a factor in the younger group, yet was significantly related to sexting in the older men.46 The ease and speed of the Web, as well as the perceived dissociation and distance from reality, lead to an escalation of brazen activity. This can easily slide into moral indiscretions that, unlike in real life, go unregulated by social norms. As with bullying, the apparent anonymity, distance and remoteness of being online allow us to behave in ways our real-world self never would.

The Cyber Rape by Mr Bungle the Clown

When it comes to the boundaries between reality and fantasy, and between moral and immoral acts, probably the most poignant tale revealing how blurred these can become is the story of Mr Bungle the Clown. Mr Bungle was a cyber character who inhabited the virtual world of LambdaMOO – one of the first online communities, dating back to the early 1990s, in which multiple players created and controlled virtual characters. Mr Bungle was a particularly nasty piece of work. In one notorious incident, one evening in a virtual room in a virtual mansion, he violated members of his online community, using software subroutines (sections of code designed to perform a particular task) to make the other characters perform perverted sexual acts.47

Of course, this terrifying vision of Mr Bungle was all in the users’ minds. He didn’t really exist. If you logged on to LambdaMOO back in those early days of virtual communities, you simply accessed a database stored inside a Xerox Corporation research computer in Silicon Valley that presented you with scrolling lines of description. The environment, the objects and all the characters were just subroutines of text – fairly basic stuff compared to the rich visual environments expected of today’s technologically advanced online communities. LambdaMOO was nothing compared to the graphical 3D worlds of Second Life or World of Warcraft, but then human imagination doesn’t require very much to generate a vivid impression.
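
To give a rough sense of how little machinery such a world needed, here is a toy sketch in Python of a text-only virtual world: a small database of descriptions in which every event reaches the user as scrolling lines of text. The room, names and functions are all invented for illustration; LambdaMOO’s actual MOO programming language and Xerox-era code were, of course, far richer than this.

```python
# A toy text-only virtual world, in the spirit of early MOOs.
# Everything - rooms, objects, characters - is just an entry in a
# database, rendered to the connected user as scrolling lines of text.
# (Illustrative names only; this is not LambdaMOO's real code.)

world = {
    "living room": {
        "description": "A cosy room in a sprawling virtual mansion.",
        "occupants": ["Legba", "Mr_Bungle"],
    },
}

def describe(room_name):
    """Render a room as the lines of text a connected user would see."""
    room = world[room_name]
    return "\n".join([room["description"],
                      "Present: " + ", ".join(room["occupants"])])

def emote(actor, action):
    """Broadcast an action as a line of text attributed to a character.

    A hacked subroutine that could emit such lines in *another*
    character's name is essentially all the attack described
    above required.
    """
    return f"{actor} {action}"

print(describe("living room"))
print(emote("Mr_Bungle", "laughs sadistically."))
```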

Mr Bungle was the disturbed creation of a young hacker, logging on from New York University, who had exploited the system’s software to produce a subroutine that presented other users with unsolicited text. During the event in question, several female users were online when they were presented with text describing how their characters inserted various utensils and derived sexual pleasure while Mr Bungle watched, laughing sadistically. Again, it was all in the mind: the whole attack played out as a series of scrolling lines of text.

Afterwards, one female user from Seattle, whose character ‘Legba’ had been virtually abused, publicly posted an account of her assault on LambdaMOO’s community chatboard and called for Mr Bungle’s castration. Months later, she told the reporter who first covered the story that, as she wrote those words, ‘post-traumatic tears were streaming down her face’. Clearly this was no longer a virtual incident in her mind – as a victim, she had taken it to heart. The assault had crossed the boundary of imagination to affect the real-life emotions of those concerned.

They say words can never harm you, but for the self illusion, words from other people can be everything. The case of Mr Bungle raises many interesting issues about identity, the self and the way these operate in online communities. Everything that happened – the characters, the assault, the reaction and the eventual retribution – was nothing more than words, the frenetic typing of cyber geeks at their keyboards. So why the outrage? Why did people feel emotionally upset when no physical contact had ever taken place? Clearly the players were not deluded into believing that a real assault had happened, but psychologically they felt violated. In the minds of the players, it had gone beyond role-playing; their indignation was real. Ostracism and the pain of social rejection can be triggered easily by simple computer simulations of communities, because such simulations are a sufficient substitute for reality: they stimulate our deep-seated need for social interaction.

So where is the real self in these different examples of online communities and virtual worlds? Most of us believe that we are not hypocrites or duplicitous; we like to think we have integrity. If the self were a coherent, integrated entity, one would predict that the way we behave online should accurately mirror the way we behave offline. That does not appear to be the case. How people behave depends on the context in which they find themselves, and the Web is no different. The way you behave online would often not be acceptable offline, and vice versa: online, you are expected to be open, engaging and willing to share, and you are more likely to tell others what you think of them, to flirt, and generally to act in a way that would get you into trouble in real life.

Sometimes we surprise our self by the way we behave online, as if we have become a different person. Maybe this is why online life is so popular: we get to be a different self – perhaps someone we aspire to be. At the very least, we get to interact with others who are missing from our daily lives. This need for an online identity that seems so different from our offline self perplexes pre-Web adults, but we need to understand how this technological escapism has become integrated into human psychological development, because the Web will eventually swallow up everyone on the planet, and it is important to consider how it may influence and change the next generation. We are not likely to become like the Borg, but we do seem to shift effortlessly between our online and offline selves. Consequently, the Web dramatically reveals the extent to which the notion of a core self is an illusion.
