Conclusion
THE POSITIVE HOUR
I do not hope to know again
The infirm glory of the positive hour
T. S. Eliot, Ash Wednesday, 1930
Who can doubt Eliot’s sentiment, that the twentieth century was the positive hour, or that its glory, however glorious, was also infirm? He continues in magnificent dissatisfaction, and resolve:
Because I know that time is always time
And place is always and only place
And what is actual is actual only for one time
And only for one place
I rejoice that things are as they are…
Consequently I rejoice, having to construct something
Upon which to rejoice…
Because these wings are no longer wings to fly
But merely vans to beat the air
The air which is now thoroughly small and dry
Smaller and dryer than the will
Teach us to care and not to care
Teach us to sit still.1
Eliot was writing in the middle of the golden age of physics but also in the golden age of Heidegger, before the fall of both. ‘Teach us to sit still’ was his way of saying, as Heidegger put it, ‘submit.’ Submit to the world as it is and rejoice, celebrate, without forever looking to explain it all. Relish the mystery, which allows us to be as we wish to be. But Eliot, as the rest of the poem and its elegiac tone make clear, was not entirely happy with this as a complete solution. Like too many others, he found the case that science advanced convincing, too convincing to go back wholly to the status quo ante. No more than the next man could he unknow what was now known. But, as a poet, he could mark what was happening. And crucially, 1930, when Ash Wednesday appeared, was perhaps the earliest date at which all three great intellectual forces of the twentieth century became apparent. These three forces were: science; free-market economics; and the mass media.
This is not to say, of course, that science, or free-market economics, or the mass media, were entirely twentieth-century phenomena: they were not. But there were important aspects of the twentieth century which meant that each of these forces took on a new potency, one that emerged for all to see only in the 1920s.
What was just emerging in science at the time of Ash Wednesday, particularly as a result of Edwin Hubble’s discoveries, gathered force as the century went on more than Eliot – or anyone – could have guessed. Whatever impact individual discoveries had, the most important development intellectually, which added immeasurably to the authority of science, and changed man’s conception of himself, was the extent to which science began to come together, the way in which the various disciplines could be seen as telling the same story from different angles. First physics and chemistry came together; then physics and astronomy/cosmology; then physics and geology; more recently physics and mathematics, though they have always been close. In the same way economics and sociology came together. Even more strongly biology, in the form of genetics, came together with linguistics, anthropology, and archaeology. Biology and physics have not yet come together in the sense that we understand how inert substances can combine to create life. But they have come together, as Ian Stewart’s work showed in the last chapter, in the way physics and mathematics help explain biological structures; even more so in the expanded concept of evolution, producing a single narrative from the Big Bang onward, throughout the billions of years of the history of the universe, giving us the creation of galaxies, the solar system, the earth, the oceans and continents, all the way through to life itself and the distribution about our planet of plants and animals. This is surely the most powerful empirically based idea there has ever been.
The final layer of this narrative has been provided only recently by Jared Diamond. Diamond, a professor of physiology at California Medical School but also an anthropologist who has worked in New Guinea, won the Rhône-Poulenc Science Book Prize in 1998 for Guns, Germs and Steel.2 In this book, he set out to explain nothing less than the whole pattern of evolution over the last 13,000 years – i.e., since the last ice age – and his answer was as bold as it was original. He was in particular concerned to explore why it was that evolution brought us to the point where the Europeans invaded and conquered the Americas in 1492 and afterward, and not the other way round. Why had the Incas, say, not crossed the Atlantic from west to east and subdued the Moroccans or the Portuguese? He found that the explanation lay in the general layout of the earth, in particular the way the continents are arranged over the surface of the globe. Simply put, the continents of the Americas and Africa have their main axis running north/south, whereas in Eurasia it is east/west.3 The significance of this is that the diffusion of domesticated plants and animals is much easier from east to west, or west to east, because similar latitudes imply similar geographical and climatic conditions, such as mean temperatures, rainfall, or hours of daylight. Diffusion from north to south, or south to north, on the other hand, is correspondingly harder and therefore inhibited the spread of domesticated plants and animals. Thus the spread of cattle, sheep, and goats was much more rapid, and thorough, in Eurasia than it was in either Africa or the Americas.4 In this way, says Diamond, the dispersal of farming meant the buildup of greater population densities in Eurasia as opposed to the other continents, and this in turn had two effects. First, competition between different societies fuelled the evolution of new cultural practices, in particular the development of weapons, which were so important in the conquest of the Americas. The second consequence was the evolution of diseases contracted from (largely domesticated) animals. These diseases could only survive among relatively large populations of humans, and when they were introduced to peoples who had developed no immune systems, they devastated them. Thus the global pattern was set, says Diamond. In particular, Africa, which had ‘six million years’ start’ in evolutionary terms compared with other parts of the world, failed to develop because it was isolated by vast oceans on three sides and desert on the north, and had few species of animals or plants that could be domesticated along its north/south axis.5
Diamond’s account – an expanded version of la longue durée – although it has been criticised as being speculative (which it undoubtedly is), does, if accepted, bring a measure of closure to one area of human thought, showing why different races around the world have reached different stages of development, or had done so by, say, 1500 AD. In doing this, Diamond, as he specifically intended, defused some of the racist sentiment that sought to explain the alleged superiority of Europeans over other groupings around the globe. He therefore used science to counter certain socially disruptive ideas still current in some quarters at the end of the century.
The fundamental importance of science, if it needs further underlining, shows in the very different fates of Germany and France in the twentieth century. Germany, the world leader in many areas of thought until 1933, had its brains ripped out by Hitler in his inquisition, and has not yet recovered. (Remember Allan Bloom’s wide-ranging references to German culture in The Closing of the American Mind?) World War II was not only about territory and Lebensraum; in a very real sense it was also about ideas. In France the situation was different. Many continental thinkers, especially the French and those from the German-speaking lands, were devoted to the marriage of Freud and Marx, one of the main intellectual preoccupations of the century, and maybe the biggest dead end, or folly, which had the effect, in France most of all, of blinding thinkers to the advances in the ‘harder’ sciences. This has created a cultural divide in intellectual terms between francophone and anglophone thought.
The strength of the second great force in the twentieth century – free-market economics – was highlighted by the great ‘experiment’ that was launched in Russia in 1917, and lasted until the late 1980s. The presence of the rival systems, and the subsequent collapse of communism, drew attention to the advantages of free-market economics in a way that Eliot, writing Ash Wednesday at the time of the Great Crash, could perhaps not have envisaged. This triumph of the free-market system was so complete that, to celebrate it, Francis Fukuyama published in 1992 The End of History and the Last Man.6 In this book, based on a lecture given at the invitation of Allan Bloom at the University of Chicago, Fukuyama took as his starting point the fact that the preceding years had seen the triumph of liberal democracies all over the world and that this marked the ‘endpoint of mankind’s ideological evolution’ and the ‘final form of human government.’7 He was talking not only about Russia but about the great number of countries that had embraced the free market and democracy, to some extent: Argentina, Botswana, Brazil, Chile, the Eastern European countries, Namibia, Portugal, South Korea, Spain, Thailand, Uruguay, and so on. More than that, though, Fukuyama sought to show that there is, as he put it, a Universal History, a single, coherent evolutionary process that takes into account ‘the experience of all peoples in all times.’8 His argument was that natural science is the mechanism by which this coherent story is achieved, that science is by consensus both cumulative and directional ‘even if its ultimate impact on human happiness is ambiguous.’9 He added, ‘Moreover, the logic of modern natural science would seem to dictate a universal evolution in the direction of capitalism.’ Fukuyama thought this accounted for many of the nonmaterial developments in twentieth-century life, most notably the psychological developments. He implied that modern natural science brought democratic progress – because the institutions of science are essentially democratic, and require widespread education for their successful operation – and that this in turn brought about in many people, as Hegel had predicted, a ‘desire for recognition,’ a desire to be appreciated in their own right. In such an environment, the individualistic developments we have seen in the twentieth century became almost unavoidable – from the psychological revolution to the civil rights movement and even postmodernism. In the same way, we have been living through a period equivalent or analogous to the Reformation. In the Reformation, religion and politics became divorced; in the twentieth century political liberation has been replaced by personal liberation. In this process Fukuyama discussed Christianity, following Hegel, as the ‘absolute religion,’ not out of any narrow-minded ethnocentrism, he said, but because Christianity regards all men as equal in the sight of God, ‘on the basis of their faculty for moral choice or belief,’ and because Christianity regards man as free, morally free to choose between right and wrong.10 In this sense, then, Christianity is a more ‘evolved’ religion than the other great faiths.
Just as there is an intimate link between science, capitalism, and liberal democracies, so too there is a link to the third force of the twentieth century, the mass media. Essentially democratic to start with, the mass media have in fact grown more so as the century has proceeded. The globalisation of the markets has been and is a parallel process. This is not to deny that these processes have brought with them their own set of problems, some of which will be addressed presently. But for now my point is simply to assert that science, free-market economics, and the mass media stem from the same impulse, and that this impulse has been dominant throughout the past century.
Jared Diamond’s thesis, and Francis Fukuyama’s, come together uncannily in David Landes’s Wealth and Poverty of Nations (1998).11 At one level, this book is a restatement of the ‘traditional’ historical narrative, describing the triumph of the West. At a deeper level it seeks to explain why it was that, for example, China, with its massive fleet in the Middle Ages, never embarked on a period of conquest as the Western nations did, or why Islamic technological innovation in the same period was interrupted, never to resume. Landes’s answer lay partly in geography (the distribution of parasites across the globe, which shapes mortality), partly in religion (Islam turned its back on the printing press, fearful of the sacrilege it might bring with it), partly in population density and immigration patterns (families of immigrants flooded into North America, single men into Latin America, to intermarry with the indigenous population), and partly in economic/political and ideological systems that promote self-esteem (and therefore hard work), in contrast to, say, the Spanish system in South America, where Catholicism was much less curious about the New World, less adaptable and innovative.12 Like Fukuyama, Landes linked capitalism and science, but in his case he argued that they are both systems of cumulative knowledge. For Landes these are all-important lessons; as he points out at the end of his book, convergence isn’t happening. The rich are getting richer and the poor poorer. Countries – civilisations – ignore these lessons at their peril.
But science brings problems too, and these need to be addressed. In The End of Science: Facing the Limits of Knowledge in the Twilight of the Scientific Age (1996), the science writer John Horgan explored two matters.13 He considered whether all the major questions in science had already been answered – that all biology, for example, is now merely a footnote to Darwin, or that all physics pales in the shadow of the Big Bang – and he looked at whether this marks a decisive phase in human history. He interviewed a surprisingly large number of scientists who thought that we are coming to the end of the scientific age, that there are limits to what we can know and, in general, that such a state of affairs might not be such a bad thing. By his own admission, Horgan was building on an idea of Gunther Stent, a biologist at the University of California at Berkeley, who in 1969 had published The Coming of the Golden Age: A View of the End of Progress. This book contended ‘that science, as well as technology, the arts and all progressive, cumulative enterprises were coming to an end.’14 The starting point for Stent was physics, which he felt was becoming more difficult to comprehend, more and more hypothetical and impractical.
One of the scientists Horgan interviewed who thought there is a limit to knowledge was Noam Chomsky, who divided scientific questions into problems, ‘which are at least potentially answerable, and mysteries, which are not.’15 According to Chomsky there has been ‘spectacular progress’ in some areas of science, but no progress at all in others – such as consciousness and free will. There, he said, ‘We don’t even have bad ideas.’16 In fact, Chomsky went further, arguing in his own book, Language and Problems of Knowledge (1988), that ‘it is quite possible – overwhelmingly probable, one might guess – that we will always learn more about human life and human personality from novels than from scientific psychology.’17
Horgan considered that there were perhaps two outstanding fundamental problems in science – immortality and consciousness. He thought that immortality was quite likely to be achieved in the next century and that, to an extent, as J. D. Bernal had predicted in 1929, man would eventually be able to direct his own evolution.
The challenge implicit in Horgan’s thesis was taken up by John Maddox, the recently retired editor of Nature, in his 1998 book, What Remains to Be Discovered.18 This was in fact an excellent review of what we know – and do not know – in physics, biology, and mathematics, and at the same time a useful corrective to the triumphalism of some scientists. For example, Maddox went out of his way to emphasise the provisional nature of much physics – he referred to black holes as ‘putative’ only, to the search for theories of everything as ‘the embodiment of a belief, even a hope,’ and stated that the reason why the quantum gravity project is ‘becalmed’ right now is that ‘the problem to be solved is not yet fully understood,’ and that the idea the universe began with a Big Bang ‘will be found false.’19 At the same time, Maddox thought science far from over. His thesis was that the world has been overwhelmed by science in the twentieth century for the first time. He thought that the twenty-first century is just as likely to throw up a ‘new physics’ as a Theory of Everything. In astronomy, for example, there is the need to confirm the existence of the hypothetical structure known as the ‘great attractor,’ toward which, since February 1996, it has been known that 600 visible galaxies are moving. In cosmology, there is the search for the ‘missing mass,’ perhaps as much as 80 percent of the known universe, which alone can explain the expansion rate after the Big Bang. Maddox also underlines that there is no direct evidence for inflation in the early universe, or indeed that the rapid expansion of a Big Bang ever took place. As he puts it, the Big Bang is ‘not so much a theory as a model.’ Even more pithily, he dismisses Lee Smolin’s ideas of parallel universes, with no unique beginning, as ‘no more persuasive than the account in Genesis of how the universe began.’20 In fact, Maddox says plainly, we do not know how the universe began; Hubble’s law urgently needs to be modified; and, ‘from all appearances, space-time in our neighborhood is not noticeably curved [as it should be according to relativity], but flat.’21
Maddox considers that even our understanding of fundamental particles is far from complete and may be crucially hampered after the new CERN accelerator comes on stream in 2005 – because experiments there will suggest new experiments that we don’t, and shan’t, have the capability for. He points out that in the early weeks of 1997 there were suggestions that even electrons may have internal structures, and be composite, and that therefore ‘the goal of specifying just why the particles in the real world are what they are is still a long way off.’22 In regard to string theory, Maddox makes a fundamental objection: If strings must exist in many dimensions, how can they relate to the real world in which we live? His answer is that string theory may be no more than a metaphor, that our understanding of space or time may be seriously flawed, that physics has been too concerned with, as he put it, ‘the naming of parts,’ in too much of a hurry to provide us with proper understanding. Maddox’s reserve about scientific progress is hugely refreshing, coming as it does from such an impeccable source, the editor who first allowed so many of these theories into print. He does agree with Horgan that life itself is one of the mysteries that will be unravelled in the twenty-first century, that cancer will finally be conquered, that huge advances will be made in understanding the link between genetics and individuality, and that the biggest remaining problem/mystery of all is consciousness.
The application of evolutionary thinking to consciousness, discussed in chapter 39, is only one of the areas where the neo-Darwinists have directed their most recent attention. In practice, we are now in an era of ‘universal Darwinism,’ when the algorithmic approach has been applied almost everywhere: evolutionary cosmology, evolutionary economics (and therefore politics), the evolution of technology. But perhaps the most radical idea of the neo- or ultra-Darwinians relates to knowledge itself and raises the intriguing question as to whether we are at present living through a new era in the evolution of knowledge forms.23 We are living at a time – the positive hour – when science is taking over from the arts, humanities, and religion as the main form of knowledge. Recall that in Max Planck’s family in Germany at the turn of the century, as was reported in chapter 1, the humanities were regarded as a form of knowledge superior to science. Richard Hofstadter was one of the first to air the possibility that all this was changing when he drew attention to the great impact in America in the 1960s of nonfiction and sociology, as compared with novels (see chapter 39). Let us also recall the way Eugène Ionesco was attuned to the achievements of science: ‘I wonder if art hasn’t reached a dead-end,’ he said in 1970. ‘If indeed in its present form, it hasn’t already reached its end. Once, writers and poets were venerated as seers and prophets. They had a certain intuition, a sharper sensitivity than their contemporaries, better still, they discovered things and their imaginations went beyond the discoveries even of science itself, to things science would only establish twenty-five or fifty years later…. But for some time now, science [has] been making enormous progress, whereas the empirical revelations of writers have been making very little … can literature still be considered as a means to knowledge?’24
In The Death of Literature (1990), Alvin Kernan quotes George Steiner: ‘We are now seeing, all of us today, the gradual end of the classical age of reading.’25 Kernan himself puts it this way: ‘Humanism’s long dream of learning, of arriving at some final truth by enough reading and writing, is breaking up in our time.’26 He has no doubt about the culprit. ‘Television, however, is not just a new way of doing old things but a radically different way of seeing and interpreting the world. Visual images not words, simple open meanings not complex and hidden, transience not permanence, episodes not structures, theater not truth. Literature’s ability to coexist with television, which many take for granted, seems less likely when we consider that as readers turn into viewers, as the skill of reading diminishes, and as the world as seen through a television screen feels and looks more pictorial and immediate, belief in a word-based literature will inevitably diminish.’27 ‘There is always the possibility that literature was so much a product of print culture and industrial capitalism, as bardic poetry and heroic epic were of tribal oral society, that … it will simply disappear in the electronic age, or dwindle to a merely ceremonial role, something like Peking opera perhaps.’28
Both Gunther Stent, referred to earlier, and John Barrow, an astronomer, have written about what they see as an evolutionary process in the arts ‘which has steadily relaxed the compositional constraints placed on the artist…. As the constraints imposed by convention, technology, or individual preference have been relaxed, so the resulting structure is less formally patterned, closer to the random, and harder to distinguish from the work of others working under similar freedom from constraint.’29 Stent argued that music actually has evolved like anything else. Studies have shown, for instance, that in order to be liked, music must strike a balance between the expected and the introduction of surprises. If it is too familiar, it is boring; if it is too surprising, it ‘jars.’ Physicists with a mathematical bias have actually calculated the familiarity/surprise ratio of music, and Stent was able to show that, beginning with ‘the maximal rigidity of rhythmic drumming in ancient times, music has exhausted the scope of each level of constraint for its listeners, before relaxing them and moving down to a new level of freedom of expression. At each stage, from ancient to medieval, renaissance, baroque, romantic, to the atonal and modern periods, evolution has proceeded down a staircase of ever-loosening constraints, the next step down provoked by the exhaustion of the previous level’s repertoire of novel patterns…. The culmination of this evolutionary process in the 1960s saw composers like John Cage relinquish all constraints, leaving the listeners to create what they would from what they heard: an acoustic version of the Rorschach inkblot test.’30 John Barrow added the thought that other creative activities like architecture, poetry, painting, and sculpture have all displayed similar trends away from constraint. ‘Stent’s suspicion,’ he wrote, ‘was that they were all quite close to reaching the asymptote of their stylistic evolution: a final structureless state that required purely subjective responses.’31
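The ‘familiarity/surprise ratio’ mentioned above can be made concrete with a toy calculation. The sketch below is a minimal, hypothetical illustration, not the actual method of the physicists Stent cites: it treats a melody as a sequence of note names, learns bigram transition probabilities from the melody itself, and scores each note by its surprisal (minus log-base-2 of its probability given the previous note). The function name and the example melodies are inventions for this illustration only.

```python
# A minimal, hypothetical sketch of a "familiarity/surprise" measure:
# surprisal of each note given the note before it, under a bigram model
# fitted to the melody itself. Low average surprisal = predictable
# (boring); high = surprising (potentially jarring).
from collections import Counter, defaultdict
from math import log2

def average_surprisal(melody):
    # Count transitions between successive notes.
    transitions = defaultdict(Counter)
    for prev, nxt in zip(melody, melody[1:]):
        transitions[prev][nxt] += 1
    # Score each transition by -log2 P(next | previous).
    total, count = 0.0, 0
    for prev, nxt in zip(melody, melody[1:]):
        counts = transitions[prev]
        p = counts[nxt] / sum(counts.values())
        total += -log2(p)
        count += 1
    return total / count

# A wholly repetitive line scores 0 bits; a more varied one scores higher.
print(average_surprisal(["C", "C", "C", "C", "C", "C"]))       # 0.0
print(average_surprisal(["C", "D", "E", "C", "G", "A", "C"]))  # ~0.33
```

On this kind of measure, Stent’s ‘staircase of ever-loosening constraints’ amounts to the claim that each stylistic period drifts toward higher average surprisal until its stock of novelty is exhausted.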
A related way in which Darwinism encourages the evolution of knowledge forms has been suggested by Robert Wright. As he puts it, the various ways of conceiving the world – moral, political, artistic, literary, scientific – are ‘by Darwinian lights, raw power struggles. A winner will emerge, but there’s often no reason to expect that winner to be truth.’ Wright calls this approach ‘Darwinian cynicism,’ which he equates to the postmodern sensibility that views all modes of human communication as ‘discourses of power,’ where ‘ironic self-consciousness is the order of the day,’ where ideals can’t be taken seriously because one cannot avoid ‘self-serving manipulation.’32 On this analysis, postmodernism has itself evolved and, as with music, poetry, and painting, has reached the end as a way of looking at the world. Fukuyama didn’t know what he was starting when he wrote about the end of history.
Yet another reason why many of the arts must rate as unsatisfactory forms of knowledge in the twentieth century stems from the modernist reliance on the theories of Sigmund Freud. Here I agree with Britain’s Nobel Prize-winning doctor Sir Peter Medawar, who in 1972 described psychoanalysis as ‘one of the saddest and strangest of all landmarks in the history of twentieth-century thought.’33 Freud unveiled the unconscious to the world in 1900, at much the same time that the electron, the quantum, and the gene were identified. But whereas they have been confirmed by experiment after experiment, developing and proliferating, Freudianism has never found unequivocal empirical support, and the very idea of a systematic unconscious, and the tripartite division of the mind into the id, ego, and superego, have seemed increasingly far-fetched. This is crucial in my view, for the consequences of the failure of Freudianism have not been thought through, and a re-evaluation of psychoanalysis is now urgently needed. For example, if Freud was so wrong, as I and many others believe, where does that leave any number of novels and virtually the entire corpus of surrealism, Dada, and certain major forms of expressionism and abstraction, not to mention Richard Strauss’s ‘Freudian’ operas such as Salomé and Elektra, and the iconic novels of numerous writers, including D. H. Lawrence, Franz Kafka, Thomas Mann, and Virginia Woolf? It doesn’t render these works less beautiful or pleasurable, necessarily, but it surely dilutes their meaning. They don’t owe their entire existence to psychoanalysis. But if they are robbed of a large part of their meaning, can they retain their intellectual importance and validity? Or do they become period pieces? I stress the point because the novels, paintings, and operas referred to above have helped to popularise and legitimise a certain view of human nature, one that is, all evidence to the contrary lacking, wrong. The overall effect of this is incalculable. All of us now harbor the view, for example, that our adult selves bear a certain relation to our childhood experiences, and to conflicts with our parents. Yet in 1998 Judith Rich Harris, a psychologist who had been dismissed from her Ph.D. course at Harvard, caused consternation among the psychological profession in America and elsewhere by arguing in her book The Nurture Assumption that parents have much less influence on their children than has traditionally been supposed; what matters instead is the child’s peer group – other children. She produced plenty of evidence to support her claim, which turned a century of Freudian jargoneering on its head.34 As a result of Freud, there has been a strain of thought in the twentieth century that holds, rather as in primitive societies, that the mad have an alternative view of the human condition. There is no evidence for this; moreover, it damages the fortunes of the mentally ill.
Robert Wright has described still other ways in which evolutionary thinking has been used to sow further doubt about Freudianism. As he wrote in The Moral Animal: Why We Are the Way We Are: The New Science of Evolutionary Psychology (1994), ‘Why would people have a death instinct (“thanatos”) [as Freud argued]? Why would girls want male genitals (“penis envy”)? Why would boys want to have sex with their mothers and kill their fathers (“the Oedipus complex”)? Imagine genes that specifically encourage any of these impulses, and you’re imagining genes that aren’t exactly destined to spread through a hunter-gatherer population overnight.’35
The muddle over Freud, and psychoanalysis, was shown starkly by an exhibition scheduled for the mid-1990s at the Library of Congress in Washington, D.C. The exhibition was designed to celebrate the centenary of the birth of psychoanalysis.36 However, when word of the planned exhibition was released, a number of scholars, including Oliver Sacks, objected, arguing that the planning committee was packed with Freud ‘loyalists’ and that the exhibition threatened to become mere propaganda and hagiography, ‘ignoring the recent tide of revisionist writings about Freud.’37 When the book of the exhibition appeared, in 1998, no mention of this controversy was made, either by the Librarian of Congress, who wrote the foreword, or by the editor. Even so, the book could not avoid completely the doubts about Freud that had grown as the centenary of The Interpretation of Dreams approached. Two authors wrote papers describing Freud’s ideas as unstable and untestable, ‘on a par with flying saucers,’ while two others, including Peter Kramer, author of Listening to Prozac, described them as unconvincing but conceded that Freud has been influential. It is noticeable, too, that a great deal of the book was given over to talk of Freud’s ‘industry,’ ‘courage,’ and ‘genius,’ and to arguing that he should be judged ‘less as a scientist than as an imaginative artist.’38 Even psychoanalysts now concede that his ideas about women, early societies of hunter-gatherers, and the ‘Primal Crime’ are both fanciful and embarrassing. And so we are left in the paradoxical situation that, as the critic Paul Robinson says, the dominant intellectual presence of our century was, for the most part, wrong.
Nor did this revisionism stop with Freud. In 1996 Richard Noll, an historian of science at Harvard, published The Jung Cult and, two years later, The Aryan Christ.39 These books provoked a controversy no less bitter than the one over Freud, for Noll argued that Jung had lied about his early research and had actually fabricated dates in his notes to make it appear that patients’ memories of such things as fairy tales were part of the ‘collective unconscious’ and had not been learned as children. Noll also documented Jung’s anti-Semitism in detail and criticised present-day Jungians for not wanting to put his ideas to the test, lest they scare away potential clients.
The commercial side of Jungianism need not concern us. More important is that, when this is taken together with Freud’s shortcomings, we can see that psychology in the twentieth century is based on theories – myths almost – that are not supported by observation, and is characterised by fanciful, idiosyncratic, and at times downright fraudulent notions. Psychology has been diverted for too long by Freud and Jung. The very plausibility of Freud’s theories is their most problematical feature. It has taken an entire century to get out from under their shadow. Until we can rid ourselves of our Freudian mindset, the Freudian ‘climate of opinion,’ as Auden called it, it is highly unlikely that we can look upon ourselves in the new way that is required. Darwin provides the only hope at the moment, together with the latest advances being made in neuroscience.
A related trend regarding the evolution of knowledge may be seen by juxtaposing Russell Jacoby’s The Last Intellectuals (1987) alongside John Brockman’s The Third Culture (1995).40 Jacoby described the fall of the ‘public intellectual’ in American life. Until the early 1960s, he said, figures like Daniel Bell, Jane Jacobs, Irving Howe, and J. K. Galbraith had lived in urban bohemias and written for the public, raising and keeping alive issues common to all – but especially educated people.41 Since then, however, they had disappeared, or at least no new generation of public intellectuals had followed them, and by the late 1980s, when his book appeared, the big names were still Bell, Galbraith, et al.42 Jacoby attributed this to several factors: the decline in bohemia, which had been taken ‘on the road’ by the Beats, then lost in suburbia; the removal of urban Jews from their marginal position with the decline in anti-Semitism; the fall of the left with the revelations about Stalin’s atrocities; but above all the expansion of the universities, which sucked in the intellectuals and then broke them on the rack of tenure and careerism.43 This thesis was a little unfair to the later generation of intellectuals like Christopher Lasch, Andrew Hacker, Irving Louis Horowitz, or Francis Fukuyama, but Jacoby nonetheless had a point. In reply, however, as was referred to in the introduction, John Brockman argued that this function has now been taken over by the scientists, since science has more policy and philosophical ramifications than ever before. Jacoby describes the complete triumph of analytic philosophy in U.S. and U.K. universities, but for Brockman’s scientists it is their philosophy of science that is now more advanced, and more useful. This is the evolution of ideas, and knowledge forms, in action.
Finally, in considering this evolution of knowledge forms, think back to the link between science, free-market economics, and liberal democracy which was mentioned earlier in this conclusion. The relevance and importance of that link is brought home in this book by an interesting absence that readers may have noticed. I refer to the relative dearth of non-Western thinkers. When this book was conceived, it was my intention (and the publishers’) to make the text as international and multicultural as possible. The book would include not just European and North American – Western – ideas, but would delve into the major non-Western cultures to identify their important ideas and their important thinkers, be they philosophers, writers, scientists, or composers. I began to work my way through scholars who specialised in the major non-Western cultures: India, China, Japan, southern and central Africa, the Arab world. I was shocked (and that is not too strong a word) to find that they all (I am not exaggerating, there were no exceptions) came up with the same answer, that in the twentieth century, the non-Western cultures have produced no body of work that can compare with the ideas of the West. In view of the references throughout the book to racism, I should make it clear that a good proportion of these scholars were themselves members of those very non-Western cultures. More than one made the point that the chief intellectual effort of his or her own (non-Western) culture in the twentieth century has been a coming to terms with modernity, learning how to cope with or respond to Western ways and Western patterns of thought, chiefly democracy and science. This underlines Frantz Fanon’s point, and James Baldwin’s, discussed in chapter 30, that for many groups, the struggle is their culture for the present. I was astounded by this response, which was all the more marked for being made in near-identical terms by specialists from different backgrounds and in different disciplines.
Of course, there are important Chinese writers and painters of the twentieth century, and we can all think of important Japanese film directors, Indian novelists, and African dramatists. Some of them are in this book. We have examined the thriving school of revisionist Indian historiography. Distinguished scholars from a non-Western background are very nearly household names – one thinks of Edward Said, Amartya Sen, Anita Desai, or Chandra Wickramasinghe. But, it was repeatedly put to me, there is no twentieth-century Chinese equivalent of, say, surrealism or psychoanalysis, no Indian contribution to match logical positivism, no African equivalent of the Annales school of history. Whatever list you care to make of twentieth-century innovations, be it plastic, antibiotics and the atom or stream-of-consciousness novels, vers libre or abstract expressionism, it is almost entirely Western.
One person who may offer a clue to this discrepancy is Sir Vidia (V. S.) Naipaul. In 1981 Naipaul visited four Islamic societies – Iran, Pakistan, Malaysia, and Indonesia. Iran he found confused and angry, ‘the confusion of a people of high mediaeval culture awakening to oil and money, a sense of power and violation, and a knowledge of a great new encircling civilisation.’44 ‘That civilisation couldn’t be mastered. It was to be rejected; at the same time it was to be depended upon.’45 Pakistan, he found, was a fragmented country, economically stagnant, ‘its gifted people close to hysteria.’46 The failure of Pakistan as a society, he said, ‘led back again and again to the assertion of the faith.’47 As with Iran there was an emotional rejection of the West, especially its attitudes to women. He found no industry, no science, the universities stifled by fundamentalism, which ‘provides an intellectual thermostat, set low.’48 The Malays, he found, had an ‘inability to compete’ (he meant with the Chinese, who constituted half of Malaysia’s population and dominated the country economically). The Islam of Indonesia Naipaul described as ‘stupefaction’; community life was breaking down, and the faith was the inevitable response.49 In all four places, he said, Islam drew its strength from a focus on the past that prevented development, and that very lack of development meant that the peoples of the Islamic nations could not cope with the West. The ‘rage and anarchy’ induced by this kept them locked into the faith – and so the circle continues. Not for nothing did Naipaul quote Bertrand Russell in his book: ‘History makes one aware that there is no finality in human affairs; there is not a static perfection and an unimprovable wisdom to be achieved.’50
Naipaul was even harder on India. He visited the country three times to write books about it – An Area of Darkness (1967), India: A Wounded Civilisation (1977), and India: A Million Mutinies Now (1990).51 ‘The crisis of India,’ he wrote in 1967, ‘… is that of a decaying civilisation, where the only hope lies in further swift decay.’ In 1977 things didn’t look so black, though that could have meant that the swift decay was already overtaking the country. Though not unsympathetic to India, Naipaul pulled no punches in his second book. Phrases taken at random: ‘The crisis of India is not only political or economic. The larger crisis is of a wounded old civilisation that has at last become aware of its inadequacies and is without the intellectual means to move ahead’;52 ‘Hinduism … has exposed [Indians] to a thousand years of defeat and stagnation. It has given men no idea of contract with other men, no idea of the state…. Its philosophy of withdrawal has diminished men intellectually and not equipped them to respond to challenge; it has stifled growth.’53
Octavio Paz, Mexico’s Nobel Prize-winning poet, was twice attached to Mexico’s embassy in India, the second time as ambassador. His In Light of India, published in 1995, is much more sympathetic to the subcontinent, celebrating in particular its poetry, its music, its sculpture.54 At the same time, Paz is not blind to India’s misfortunes: ‘The most remarkable aspect of India, and the one that defines it, is neither political nor economic, but religious: the coexistence of Hinduism and Islam. The presence of the strictest and most extreme form of monotheism alongside the richest and most varied polytheism is, more than a historical paradox, a deep wound. Between Islam and Hinduism there is not only an opposition, but an incompatibility’;55 ‘Hindu thought came to a halt, the victim of a kind of paralysis, toward the end of the thirteenth century, the period when the last of the great temples were erected. This historical paralysis coincides with two other important phenomena: the extinction of Buddhism and the victory of Islam in Delhi and other places’;56 ‘The great lethargy of Hindu civilisation began, a lethargy that persists today…. India owes to Islam some sublime works of art, particularly in architecture and, to a lesser degree, in painting, but not a single new or original thought.’57
Naipaul’s third book on the subcontinent, India: A Million Mutinies Now, published in 1990, was very different in tone, altogether sunnier, a collection of vivid reportage, looking at filmmakers, architects, scientists, newspaper people, charity workers, with far fewer – hardly any – of the great sobering generalisations that had characterised the earlier books. When he did sum up, right at the end, it was to conclude, ‘People everywhere have ideas now of who they are and what they owe themselves…. The liberation of spirit that has come to India could not come as release alone…. It had to come as rage and revolt…. But there was in India now what didn’t exist 200 years before: a central will, a central intellect, a national idea.’58 India, he reflected, was growing again, on its way to restoration.59
I draw attention to the issue, since I found it so surprising, because these later encomiums of Naipaul cannot quite wash away the large thoughts he raised in his earlier works, about the links between religion and politics on the one hand, and creativity and intellectual and social progress on the other, and because it helps explain the shape of this book and why there isn’t more about non-Western intellectual developments. I can’t attempt a full answer here, because I haven’t done the work. Nor has anyone else, so far as I am aware, though David Landes comes close in his Wealth and Poverty of Nations (1998), referred to earlier. He pulls no punches either, frankly labelling the Arab nations, the Indians, the Africans, and the South Americans as ‘losers.’60 Quoting figures to show that not even colonialism was all bad, Landes settles on the fact that intellectual segregation is the chief burden of religious fundamentalism, producing technological lag. Landes’s book is best seen as an heroic attempt at being cruel to be kind, to shock and provoke ‘failing’ cultures into reality. There is much more to be said.
The issues just discussed are partly psychological, partly sociological. In The Decomposition of Sociology (1994), Irving Louis Horowitz, Hannah Arendt Distinguished Professor of Sociology at Rutgers University, and president of Transaction/Society, a sociological publishing house, laments both the condition and direction of the discipline to which he has given his life.61 His starting point, and the reason why his book appeared when it did, was the news in February 1992 that the sociology departments in three American universities had been closed down and the one at Yale cut back by more than half. At the same time, the number of students graduating in sociology was 14,393, well down on the 35,996 in 1973. Horowitz is in no doubt about the cause of this decline, a decline that, he notes, is not confined to the United States: ‘I firmly believe that a great discipline has turned sour if not rancid.’62 Strong words, but that all-important change, he said, has been brought about by the injection of ideology into the discipline – to wit, a belief that a single variable can explain human behaviour: ‘Thus, sociology has largely become a repository of discontent, a gathering of individuals who have special agendas, from gay and lesbian rights to liberation theology’;63 ‘Any notion of a common democratic culture or a universal scientific base has become suspect. Ideologists masked as sociologists attack it as a dangerous form of bourgeois objectivism or, worse, as imperialist pretension…. That which sociology once did best of all, support the humanistic disciplines in accurately studying conditions of the present to make the future a trifle better, is lost. Only the revolutionary past and the beatific future are seen as fit for study, now that the aim of sociology has become to retool human nature and effect a systematic overhaul of society.’64 The result, he said, has been the departure from sociology of all those scholars for whom social science is linked to public policy – social planners, penologists, demographers, criminologists, hospital administrators, and international development specialists.65 Sociology, rather than being the study of ideology, has become ideology itself – in particular, Marxist ideology. ‘Every disparity between ghetto and suburb is proof that capitalism is sick. Every statistic concerning increases in homicide and suicide demonstrates the decadence of America or, better, resistance to America. Every child born out of wedlock is proof that “the system” has spun out of control.’66
For Horowitz, the way to rehabilitate and reinvent sociology is for it to tackle some big sympathetic issues, to describe those issues in detail and without bias, and to offer explanation. The Holocaust is the biggest issue, he wrote, still – amazingly – without a proper sociological description or a proper sociological explanation. Other areas where sociology should seek to offer help – to government and public alike – are in drug abuse, AIDS, and an attempt to define ‘the national interest,’ which would help foreign policy formulation. He also outlines a sociological ‘canon,’ a list of authors with whom, he said, any literate sociologist should be familiar. Finally, he makes a point very germane to the thesis of this chapter, that the positive hour, or ‘positive bubble’ as he put it, might not always last, or produce a vision of society that we can live with.67 It is, above all, he said, the sociologist’s job to help us see past this bubble, to explore how we might live together. Horowitz’s book finishes up far more positive in tone than it starts out, but it cannot be said that sociology has changed much as a result; its decomposition is still its dominant feature.
Horowitz’s thoughts bring us back to the Introduction, and to the fact that, in this book, I have sought to shift the focus away from political and military events. Of course, as was said at the beginning, this is an artificial division, a convenience merely for the sake of exploring significant and interesting issues often sidelined in more conventional histories. Yet one of the more challenging aspects of politics lies in the attempt to adapt such findings as those reported here to the governance of peoples. Whole books could be written about both the theory and practicalities of such adaptation, and while there is certainly no space to attempt such an exercise in the present work, it is necessary to acknowledge such a limit, and to make (as I see it) one all-important point.
This is that neither side of the conventional political divide (left versus right) holds all the virtues when it comes to dealing with intellectual and social problems. From the left, the attempted marriage of Marx and Freud has failed, as it was bound to do, being based on two rigid and erroneous theories about human nature (Freud even more so than Marx). The postmodern tradition is more successful as a diagnosis and description than as a prognosis for a way forward, except in one respect – that it cautions us to be wary of ‘big’ ideas that work for all people, in all places, at all times.
Looking back over the century, and despite the undoubted successes of the free-market system, one wonders whether the theorists of the right have any more reason to feel satisfied. Too often a substantial part of what they have offered is a directive to do nothing, to allow matters to take their ‘natural’ course, as if doing nothing is somehow more natural than doing something. The theories of Milton Friedman or Charles Murray, for example, seem very plausible, until one thinks of the writings of George Orwell. Had Friedman and Murray been writing in the 1930s, they would probably have still been arguing for the status quo, for economics to take its ‘natural’ course, for no intervention. Yet who can doubt that Orwell helped bring about a shift in sensibility that, combined with the experience of war, wrought a major change in the way the poor were regarded? However unsatisfactory the welfare state is now, it certainly improved living conditions for millions of people across the world. This would not have happened if left to laisser-faire economists.
Perhaps Karl Popper had it about right when he said that politics is like science, in that it is – or ought to be – endlessly modifiable. Under such a system, a welfare state might be a suitable response to a certain set of circumstances. But, once it has helped to create a healthier, wealthier population in which far greater numbers survive into old age, with all the implications that has for disease and the economic profile of an entire people, surely a different set of responses is called for? We should know by now – it is one of the implicit messages of this book – that in a crowded world, the world of mass society (a twentieth-century phenomenon), every advance is matched by a corresponding drawback or problem. In this regard, we should never forget that science teaches us two lessons, one just as important as the other. While it has revealed to us some of the fundamentals of nature, science has also taught us that the pragmatic, piecemeal approach to life is by far the most successful way of adapting. We should beware grand theories.
As the century drew to its close, the shortcomings and failures first recognised by Gunther Stent and John Horgan began to grow in importance – in particular the idea that there are limits to what science can tell us and what, in principle, we can know. John Barrow, professor of astronomy at the University of Sussex, put these ideas together in his 1998 book Impossibility: The Limits of Science and the Science of Limits.68 ‘Science,’ said Barrow in his concluding chapter, ‘exists only because there are limits to what Nature permits. The laws of Nature and the unchanging “constants” of Nature define the borders that distinguish our Universe from a host of other conceivable worlds where all things are possible…. On a variety of fronts we have found that growing complexity ultimately leads to a situation that is not only limited, but self-limiting. Time and again, the development of our most powerful theories has followed this path: they are so successful that it is believed that they can explain everything…. The concept of a “theory of everything” occasionally rears its head. But then something unexpected happens. The theory predicts that it cannot predict: it tells us that there are things it cannot tell us.’69 In particular, Barrow says, taking as his starting point Kurt Gödel’s 1931 incompleteness theorem, there are things mathematics cannot tell us; there are limits that arise from our humanity and the evolutionary heritage we all share, which determine our biological nature and, for instance, our size. There are limits to the amount of information we can process; the great questions about the nature of the universe turn out to be unanswerable, because for one thing the speed of light is limited. Chaoplexity and randomness may well be beyond us in principle. ‘Whether it be an election, a bank of linked computers, or the “voting” neurones inside our head, it is impossible to translate individual rational choices into collective rationality.’70
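For reference, the Gödel result Barrow starts from can be stated in one line. This is the standard textbook formulation of the first incompleteness theorem, not Barrow’s own wording:

```latex
% Gödel's first incompleteness theorem (1931): for any consistent,
% effectively axiomatizable theory T that interprets elementary
% arithmetic, there is a sentence G_T that T can neither prove
% nor refute (\nvdash requires amssymb):
\[
  T \nvdash G_T \qquad \text{and} \qquad T \nvdash \lnot G_T .
\]
```

The closing quotation about elections and ‘voting’ neurones appears to allude to a parallel impossibility result, Kenneth Arrow’s theorem that no rank-order voting scheme can satisfy a handful of minimal fairness conditions simultaneously.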
Not everyone agrees with Barrow, but if he is right, then the end of the century has brought with it yet another change in sensibility, perhaps the most important since Galileo and Copernicus: we are living near the end of the positive hour, and a ‘post-scientific age’ awaits us. For many, this can’t come soon enough, but it is important not to overstate the case – as John Maddox has shown, there is still plenty of science to be done. Nevertheless, science has always promised, however far down the road, an ultimate explanation of the universe. If, as Barrow and others tell us, that now looks like a theoretical impossibility, who can tell what the consequences will be? Where will the evolution of knowledge forms next lead?
One thing seems clear: as Eliot said, there’s no going back. The arch-critics of science, with their own brand of secular zealotry, while they often skilfully describe why science can never be a complete answer to our philosophical condition, usually have little to add to it or replace it with. They tend either to look back to an age of religion or to recommend some sort of Heideggerean ‘submission’ to nature, to just ‘be.’ They lament the ‘disenchantment’ that has set in as we have turned away from God, but are unclear as to whether ‘reenchantment’ could ever be meaningful.
The British philosopher Roger Scruton is one of the most articulate of such thinkers. His An Intelligent Person’s Guide to Modern Culture (1998) brilliantly punctures the pretensions, postures, and vacuities of modernist and popular culture, its failure to provide the ‘experience of membership’ that was possible in an age of shared religious high culture, and asks how we can ever learn to judge ‘in a world that will not be judged.’ His view of science is sceptical: ‘The human world is a world of significances, and no human significance can be fully grasped by science.’ For Scruton, fiction, the imagination, the world of enchantment, is the highest calling, for it evokes sympathy for our condition, toleration, shared feelings, a longing that cannot be fulfilled, and ‘processes’ that, like Wagner’s operas, lie deeper than words.71
Scruton is nostalgic for religion but does not make the most of its possibilities. Perhaps the most sophisticated religious postscientific argument has come from John Polkinghorne. A physicist by training, Polkinghorne studied with Paul Dirac, Murray Gell-Mann, and Richard Feynman, became professor of mathematical physics at Cambridge and therefore a close colleague of Stephen Hawking, and in 1982 was ordained as a priest in the Anglican Church. His thesis in Beyond Science (1996) has two elements: one, that ‘our scientific, aesthetic, moral and spiritual powers greatly exceed what can convincingly be claimed to be needed in the struggle for survival, and to regard them as merely a fortunate but fortuitous by-product of that struggle is not to treat the mystery of their existence with adequate seriousness’;72 and two, that ‘the evolution of conscious life seems the most significant thing that has happened in cosmic history and we are right to be intrigued by the fact that so special a universe is required for its possibility.’73 In fact, Polkinghorne’s main argument for his belief in a creator is the anthropic principle – that our universe is so finely tuned, providing laws of physics that allow for our existence, that a creator must be behind it all. This is an updated argument as compared with those of the bishop of Birmingham and Dean Inge in the 1930s, but Polkinghorne’s case for God still lies in the details that we don’t – and maybe can’t – grasp. In that sense it is no different from any of the arguments about religion and science that have gone before.74
In his intellectual autobiography, Confessions of a Philosopher (1997), Bryan Magee writes as follows: ‘Not being religious myself, yet believing that most of reality is likely to be permanently unknowable to human beings, I see a compelling need for the demystification of the unknowable. It seems to me that most people tend either to believe that all reality is in principle knowable or to believe that there is a religious dimension to things. A third alternative – that we can know very little but have equally little ground for religious belief – receives scant consideration, and yet seems to me to be where the truth lies.’75 I largely share Magee’s views as expressed here, and I also concur with the way he describes ‘the main split in western philosophy.’ There is, he says, the analytic approach, mainly identified with the logical positivists and British and American philosophers, who are fascinated by science and its implications and whose main aim is ‘explanation, understanding, insight.’76 In contrast to them are what are known in Britain and America as the ‘continental’ school of philosophers, led by such figures as Husserl and Heidegger but including Jacques Lacan, Louis Althusser, Hans-Georg Gadamer, and Jürgen Habermas, and looking back to German philosophy – Kant, Hegel, Marx, and Nietzsche. These philosophers are not so interested in science as the analytic ones are, but they are interested in Freudian (and post-Freudian) psychology, in literature, and in politics. Their approach is rhetorical and partisan, more interested in comment than in understanding.77 This is an important distinction, I think, because it divides some of our deepest thinkers between science, on the one hand, and Freud, literature, and politics on the other. Whatever we do, it seems we cannot get away from this divide, these ‘two cultures,’ and yet if I am right the main problems facing us require us to do so. In the twentieth century, what we may characterise as scientific/analytic reason has been a great success, by and large; political, partisan, and rhetorical reason, on the other hand, has been a catastrophe. The very strengths of analytic, positive/passive reason have lent political rhetorical reason an authority it does not deserve. George Orwell, above and before everyone, saw this and sought to bring home the point. Oswald Spengler and Werner Sombart’s distinction between heroes and traders is recast as one between heroes and scientists.
Out of all this, however, it seems to me that we might still make something, at the very least an agenda for the way ahead. It is clear from the narrative of this book that the failures of science, as mentioned earlier in this chapter – in particular the failure of sociology and psychology – have been crucial to the past century, and have been intimately associated with the political disasters. The combined effects stemming from the successes of science, liberal democracy, free-market economics, and the mass media have produced an era of personal freedom and a realised individuality unrivalled in the past. This is no mean accomplishment, but it still leaves a lot to be achieved. Look at America’s inability to deal with the race issue, which has cast its shadow down the century. Look at the ethnic cleansing in Rwanda and, more recently, in Kosovo, so reminiscent of both the Holocaust and Conrad’s Heart of Darkness. Look at the figures for crime, drug abuse, illegitimacy, and abortion. All of these reflect, at some level, a breakdown in relations between different groups – different nations, different races, different tribes, different sexes, different families, different ages. The developments of the twentieth century have taught us more and more about ourselves as individuals, but they have not taught us much about ourselves as members of groups, interlocking groups, with shared responsibilities as well as rights. In sociology, the dominant influence of Marx has been to stress the way some groups (the middle classes, management) dominate and exploit others. This has caused massive neglect in the study of the other ways in which groups relate to one another. In psychology, Freud’s emphasis on individual development, again allegedly based on self-interest, hostility, and competition, has put personal realisation above everything else.
The task before science is, therefore, as clear as it is urgent. It is to turn its attention to groups, groups of people, the psychology and sociology of groups, to explore how they relate to each other, how individuals relate to the different groups of which they are members (families, sexes, generations, races, nations), in the hope that we shall one day be able to understand and control such phenomena as racism, rape, and child and drug abuse.78 As Samuel Huntington argued in The Clash of Civilisations and the Remaking of World Order (1996), the critical distinctions between people are not primarily ideological any more – they are cultural, i.e., group-based.79 There is no question but that these are the most critical issues for sociology and psychology in the future.
One final observation about science, free-market economics, and the mass media. The vast majority of ideas in this book were produced in universities, away from the hurly-burly of the market. The people who had the ideas reported in these pages did not, for the most part, do what they did for the money, but because they were curious. Figures like Peter Brook and Pierre Boulez have deliberately avoided the market system, so that their work could develop in ways not constrained by market considerations. By the same token, the mass medium that has made the greatest contribution to our intellectual and communal life – the BBC – is again deliberately removed from the raw market. We should be aware that the production of knowledge – particularly basic science, ethical philosophy, and social commentary – appears to be a human activity that does not lend itself to market economics, though it clearly flourishes in protected environments within such a system. Universities have evolved into highly tolerant communities, for the most part, where people of different ages and backgrounds, with different outlooks, interests, and skills, can explore varied ways of living together. We should never forget how precious universities are, and that with our current problems, as discussed in the pages above, and notwithstanding anything else said in the rest of this epilogue, universities may offer a way forward, a lead, out of the impasse facing psychology and sociology.
The New Humanities and a New Canon
Science apart, the major division in Western thought today, which affects philosophy, literature, religion, architecture, even history, is between the postmodernists, who are happy with the fragmented, disparate ‘carnival’ of culture (to use Stanley Fish’s phrase), and those traditionalists who genuinely feel that this sells us short (the young particularly), that this approach involves an ethical betrayal, avoids judging what is better and what is less good in human achievement and, in so doing, hinders people in raising their game. Postmodernism and relativism are still in the ascendant, but for how much longer? While the cultures of Africa, Bali, and other third-world countries have been recovered, to an extent, and given a much-needed boost, none has so far found the widespread resonance that the classical civilisations of the Middle East once enjoyed. No one doubts that jewels of art, learning, and science have occurred in all places and at all times, and the identification and extension of this wide range has been a major achievement of twentieth-century scholarship. In particular, the vast body of knowledge concerning the early, pre-Columbian Native Americans has revealed a very rich set of interlocking cultures. But have these discoveries produced any body of written material, say, which causes us to rethink the way that we live? Have they revealed any body of law, or medicine, or technology which leads us to change our ways either of thinking or doing? Have they revealed a completely new literature or philosophy with a new vision? The blunt answer is no.
The possibility – one might almost say the probability – arises, then, that some time in the twenty-first century we shall enter a post-postmodern world, in which the arguments of Jean-François Lyotard, Clifford Geertz, Fredric Jameson, David Harvey, and their colleagues are still accepted, but only up to a point. We shall have reached a stage where, even after all the cultures of the world have been recovered and described, there will still be a hierarchy of civilisations, in the sense that a few of them were vastly more important in shaping our world than others. It should be said that, at the end of the twentieth century, the traditional hierarchy (which implies the traditional ‘metanarrative’), despite various attempts to destabilise it, is not much changed.
Hiram Bingham’s re-discovery of Machu Picchu, or Basil Davidson’s ‘recovery’ of Mapungubwe, or Clifford Geertz’s own ‘thick description’ of Balinese cockfights may, each in its own way, rival Plato’s Republic, or Shakespeare’s Falstaff, or Planck’s quantum. But – and this is surely the main point – though they are all part of the emerging ‘one story’ that is the crucial achievement of twentieth-century scholarship, Machu Picchu, Mapungubwe and Bali did not help shape the one story anywhere near as directly as the more traditional ideas did.
It is not racist or ethnocentrist to insist on this. As Richard Rorty has correctly pointed out, thick descriptions of Balinese cockfights are themselves an achievement of Western anthropology. But I think the differences between the postmodernists and the traditionalists (for want of a better term) can be reconciled, at least partly. Neil Postman drew my attention to the fact that at the beginning of our century William James said that any subject, treated historically, can become a ‘humanity.’80 ‘You can give humanistic value to almost anything by teaching it historically. Geology, economics, mechanics, are humanities when taught with reference to the successive achievements of the geniuses to which these sciences owe their being. Not taught thus, literature remains grammar, art a catalogue, history a list of dates, and natural science a sheet of formulas and weights and measures.’ The narrative form, properly realised, brings with it a powerful authority, showing not only where we are at any point but how we arrived there. In the case of this narrative, the grand narrative that has emerged in the course of the twentieth century, the story is so overwhelming that I believe it can provide, or begin to provide, an antidote to some of the problems that have plagued our educational institutions in recent years – in particular, the so-called ‘culture wars’ and the battles over the Western canon.
As was mentioned earlier, many avenues of thought, many disciplines, are coming together to tell one story. The most powerful advocate of this idea has been E. O. Wilson, who even resurrected the term consilience to describe the process. In his 1998 book of that name, Consilience: The Unity of Knowledge, Wilson offered the arch-reductionist view of the world, not only describing the way scientific knowledge has come together but also putting forward the idea that one day science will be able to ‘explain’ art, religion, ethics, kinship patterns, forms of government, etiquette, fashion, courtship, gift-giving patterns, funeral rites, population policy, penal sanctions, and if that’s not enough, virtually everything else.81 At its most basic, he argued that colour preferences are largely innate, that the arts are innately focused toward certain themes, that metaphors are the consequence of spreading activation in the brain during learning and are therefore ‘the building blocks of creative thought.’82 Among the innate impulses that go to make up art are imitation, making things geometrical, and intensification. Good artists instinctively know what patterns arouse the brain most.83 In myth and fiction ‘as few as two dozen’ plots cover most epic stories that make up the classical corpus of many societies. These include emigration of the tribe, meeting the forces of evil, apocalypse, sexual awakening. ‘The dominating influence that spawned the arts was the need to impose order on the confusion caused by intelligence.’84 ‘We are entering a new era of existentialism,’ says Wilson, ‘not the old absurdist existentialism of Kierkegaard and Sartre, giving complete autonomy to the individual, but the concept that only unified learning, universally shared, makes accurate foresight and wise choice possible…. In the course of all of it we are learning the fundamental principle that ethics is everything. Human social existence, unlike animal sociality, is based on the genetic propensity to form long-term contracts that evolve by culture into moral precepts and law.’85
In other words, for Wilson the arts also become part of one story. And it is this story, I suggest, which ought to become the basis of a new canon. Understanding this narrative, and the way it was arrived at, involves a good appreciation of all the important sciences, the significant phases of history, the rise and fall of civilisations, and the reasons for the underlying patterns. Great works of religion, literature, music, painting, and sculpture fit into this narrative, this system of understanding, in the sense that all cultures have been attempts to come to terms with both the natural and the supernatural world, to create beauty, produce knowledge, and get at the truth. The significance of language, the way languages are related and have evolved, and yet remain very different, fits in here. Evolution enables us to place the world of culture within the world of nature with as comfortable a fit as possible. It shows how groups are related to one another. In addition, this narrative shows how mankind is moving on, where the old ways of thought are being superseded. Many people will disagree with this argument, replying that there is no teleological direction in evolution. Even more won’t like, or will be sceptical of, the thrust of what I have to say. But I think the evidence speaks for itself.
That evidence, at the end of the century, suggests that we are already living in what may be called a crossover culture. While people lament the effects of the mass media on our intellectual life generally, an inspection of the shelves in any good bookstore more or less anywhere in the Western world shows that, on the other hand, one of the greatest growth areas is in what is called popular science. That phrase is in fact misleading, to the extent that many of these books are relatively difficult, examining, for example, the nature of matter, abstruse mathematics (Fermat’s last theorem, longitude), the minutiae of evolution, the byways of palaeontology, the origin of time, the philosophy of science. But a growing number of people now accept that one cannot call oneself educated unless one is up to date on these issues. The numbers are small, relatively speaking, but it remains true that both this category of book, and its shelf space on bookshop walls, barely existed twenty years ago.
To my mind this is very encouraging, not least because it will prevent too big a division opening up in our society between scientists and the rest. If – a big ‘if’ perhaps – the superstring revolution really does come to something, that something may prove very difficult for scientists to share with the rest of us. They are already at the limit of what metaphor can explain, and we must face at least the possibility that, some day, the secrets of the universe will only be truly available to those with an above-average grasp of mathematics. It is no use the rest of us saying that we don’t like the way knowledge is going. That is where the advances are being made, which is an added reason why I am arguing for this particular new canon, taught – as James said – as a humanity, so that it is attractive to as wide a cross-section of people as possible.
Evolution is the story of us all. Physics, chemistry, and biology are international in a way that literature, art, or religion can never be. Although science may have begun in the West, there are now distinguished Indian, Arab, Japanese, and Chinese scientists in great numbers. (In July 1999 China announced its capability to produce a neutron bomb, an intellectual triumph of sorts.) This is not to provide a framework for avoiding difficult judgements: science and liberal democracy are, or were, Western ideas. Nor is it a way of evading debate over the Western literary canon. But studying twentieth-century thought, as a narrative, provides a new kind of humanity and a canon for life as it is now lived. In offering something common to us all, a sketch of an historical/intellectual canon, it also begins to address our remaining problems. It is something we can all share.