“They were apes only yesterday.
Give them time.”
“Once an ape-always an ape.”…
“No, it will be different… Come back here in
an age or so and you shall see…”
The gods, discussing the Earth, in the motion
picture version of H. G. Wells’ The Man Who
Could Work Miracles (1936)
IT WAS A MUSEUM, in a way like any other, this Musée de l’Homme, Museum of Man, situated on a pleasant eminence with, from the restaurant plaza in back, a splendid view of the Eiffel Tower. We were there to talk with Yves Coppens, the able associate director of the museum and a distinguished paleoanthropologist. Coppens had studied the ancestors of mankind, whose fossils had been found at Olduvai Gorge and Lake Turkana, in Kenya and Tanzania and Ethiopia. Two million years ago there were four-foot-high creatures, whom we call Homo habilis, living in East Africa, shearing and chipping and flaking stone tools, perhaps building simple dwellings, their brains in the course of a spectacular enlargement that would lead one day-to us.
Institutions of this sort have a public and a private side. The public side includes the exhibits in ethnography, say, or cultural anthropology: the costumes of the Mongols, or bark cloths painted by Native Americans, some perhaps prepared especially for sale to voyageurs and enterprising French anthropologists. But in the innards of the place there are other things: people engaged in the construction of exhibits; vast storerooms of items inappropriate, because of subject matter or space, for general exhibition; and areas for research. We were led through a warren of dark, musty rooms, ranging from cubicles to rotundas. Research materials overflowed into the corridors: a reconstruction of a Paleolithic cave floor, showing where the antelope bones had been thrown after eating. Priapic wooden statuary from Melanesia. Delicately painted eating utensils. Grotesque ceremonial masks. Assagai-like throwing spears from Oceania. A tattered poster of a steatopygous woman from Africa. A dank and gloomy storeroom filled to the rafters with gourd woodwinds, skin drums, reed panpipes and innumerable other reminders of the indomitable human urge to make music.
Here and there could be found a few people actually engaged in research, their sallow and deferential demeanors contrasting starkly with the hearty bilingual competence of Coppens. Most of the rooms were evidently used for storage of anthropological items, collected from decades to more than a century ago. You had the sense of a museum of the second order, in which were stored not so much materials that might be of interest as materials that had once been of interest. You could feel the presence of nineteenth-century museum directors engaged, in their frock coats, in goniométrie and craniologie, busily collecting and measuring everything, in the pious hope that mere quantification would lead to understanding.
But there was another area of the museum still more remote, a strange mix of active research and virtually abandoned cabinets and shelves. A reconstructed and articulated skeleton of an orangutan. A vast table covered with human skulls, each neatly indexed. A drawer full of femurs, piled in disarray, like the erasers in some school janitor’s supply closet. A province dedicated to Neanderthal remains, including the first Neanderthal skull, reconstructed by Marcellin Boule, which I held cautiously in my hands. It felt lightweight and delicate, the sutures starkly visible, perhaps the first compelling piece of evidence that there once were creatures rather like us who became extinct, a disquieting hint that our species likewise might not survive forever. A tray filled with the teeth of many hominids, including the great nutcracker molars of Australopithecus robustus, a contemporary of Homo habilis. A collection of Cro-Magnon skull cases, stacked like cordwood, scrubbed white and in good order. These items were reasonable and in a way expected, the necessary shards of evidence for reconstructing something of the history of our ancestors and collateral relatives.
Deeper in the room were more macabre and more disturbing collections. Two shrunken heads reposing on a cabinet, sneering and grimacing, their leathery lips curled back to reveal rows of sharp, tiny teeth. Jar upon jar of human embryos and fetuses, pale white, bathed in a murky greenish fluid, each jar competently labeled. Most specimens were normal, but occasionally an anomaly could be glimpsed, a disconcerting teratology-Siamese twins joined at the sternum, say, or a fetus with two heads, the four eyes tightly shut.
There was more. An array of large cylindrical bottles containing, to my astonishment, perfectly preserved human heads. A red-mustachioed man, perhaps in his early twenties, originating, so the label said, from Nouvelle Calédonie. Perhaps he was a sailor who had jumped ship in the tropics only to be captured and executed, his head involuntarily drafted in the cause of science. Except he was not being studied; he was only being neglected, among the other severed heads. A sweet-faced and delicate little girl of perhaps four years, her pink coral earrings and necklace still perfectly preserved. Three infant heads, sharing the same bottle, perhaps as an economy measure. Men and women and children of both sexes and many races, decapitated, their heads shipped to France only to moulder-perhaps after some brief initial study-in the Musée de l’Homme. What, I wondered, must the loading of the crates of bottled heads have been like? Did the ship’s officers speculate over coffee about what was down in the hold? Were the sailors heedless because the heads were, by and large, not those of white Europeans like themselves? Did they joke about their cargo to demonstrate some emotional distance from the little twinge of horror they privately permitted themselves to feel? When the collections arrived in Paris, were the scientists brisk and businesslike, giving orders to the draymen on the disposition of severed heads? Were they impatient to unseal the bottles and embrace the contents with calipers? Did the man responsible for this collection, whoever he might be, view it with unalloyed pride and zest?
And then in a still more remote corner of this wing of the museum was revealed a collection of gray, convoluted objects, stored in formalin to retard spoilage-shelf upon shelf of human brains. There must have been someone whose job it was to perform routine craniotomies on the cadavers of notables and extract their brains for the benefit of science. Here was the cerebrum of a European intellectual who had achieved momentary renown before fading into the obscurity of this dusty shelf. Here a brain of a convicted murderer. Doubtless the savants of earlier days had hoped there might be some anomaly, some telltale sign in the brain anatomy or cranial configuration of murderers. Perhaps they had hoped that murder was a matter of heredity and not society. Phrenology was a graceless nineteenth-century aberration. I could hear my friend Ann Druyan saying, “The people we starve and torture have an unsociable tendency to steal and murder. We think it’s because their brows overhang.” But the brains of murderers and savants-the remains of Albert Einstein’s brain are floating wanly in a bottle in Wichita-are indistinguishable. It is, very probably, society and not heredity that makes criminals.
As I scanned the collection amid such ruminations, my eye was caught by a label on one of the many low cylindrical bottles. I took the container from the shelf and examined it more closely. The label read P. Broca. In my hands was Broca’s brain.
PAUL BROCA was a surgeon, a neurologist and an anthropologist, a major figure in the development of both medicine and anthropology in the mid-nineteenth century. He performed distinguished work on cancer pathology and the treatment of aneurisms, and made a landmark contribution to understanding the origins of aphasia-an impairment of the ability to articulate ideas. Broca was a brilliant and compassionate man. He was concerned with medical care for the poor. Under cover of darkness, at the risk of his own life, he successfully smuggled out of Paris in a horse-drawn cart 73 million francs, stuffed into carpetbags and hidden under potatoes, the treasury of the Assistance Publique which-he believed, at any rate-he was saving from pillage. He was the founder of modern brain surgery. He studied infant mortality. Toward the end of his career he was created a senator.
He loved, as one biographer said, mainly serenity and tolerance. In 1848 he founded a society of “freethinkers.” Almost alone among French savants of the time, he was sympathetic to Charles Darwin’s idea of evolution by natural selection. T. H. Huxley, “Darwin’s Bulldog,” remarked that the mere mention of Broca’s name filled him with a sense of gratitude, and Broca was quoted as saying, “I would rather be a transformed ape than a degenerate son of Adam.” For these and other views he was publicly denounced for “materialism” and, like Socrates, for corrupting the young. But he was made a senator nevertheless.
Earlier, Broca had encountered great difficulty in establishing a society of anthropology in France. The Minister of Public Instruction and the Prefect of Police believed that anthropology must, as the free pursuit of knowledge about human beings, be innately subversive to the state. When permission was at last and reluctantly granted for Broca to talk about science with eighteen colleagues, the Prefect of Police held Broca personally responsible for all that might be said in such meetings “against society, religion, or the government.” Even so, the study of human beings was considered so dangerous that a police spy in plain clothes was assigned to attend all meetings, with the understanding that authorization to meet would be withdrawn immediately if the spy was offended by anything that was said. In these circumstances the Society of Anthropology of Paris gathered for the first time on May 19, 1859, the year of the publication of The Origin of Species. In subsequent meetings an enormous range of subjects was discussed-archaeology, mythology, physiology, anatomy, medicine, psychology, linguistics and history-and it is easy to imagine the police spy nodding off in the corner on many an occasion. Once, Broca related, the spy wished to take a small unauthorized walk and asked if he might leave without anything threatening to the state being said in his absence. “No, no, my friend,” Broca responded. “You must not go for a walk: sit down and earn your pay.” Not only the police but also the clergy opposed the development of anthropology in France, and in 1876 the Roman Catholic political party organized a major campaign against the teaching of the subject in the Anthropological Institute of Paris founded by Broca.
Paul Broca died in 1880, perhaps of the very sort of aneurism that he had studied so brilliantly. At the moment of his death he was working on a comprehensive study of brain anatomy. He had established the first professional societies, schools of research, and scientific journals of modern anthropology in France. His laboratory specimens became incorporated into what for many years was called the Musée Broca. Later it merged to become a part of the Musée de l’Homme.
It was Broca himself, whose brain I was cradling, who had established the macabre collection I had been contemplating. He had studied embryos and apes, and people of all races, measuring like mad in an effort to understand the nature of a human being. And despite the present appearance of the collection and my suspicions, he was not, at least by the standards of his time, more of a jingoist or a racist than most, and certainly not that standby of fiction and, more rarely, of fact: the cold, uncaring, dispassionate scientist, heedless of the human consequences of what he does. Broca very much cared.
In the Revue d’Anthropologie of 1880 there is a complete bibliography of Broca’s writings. From the titles I could later glimpse something of the origins of the collection I had viewed: “On the Cranium and Brain of the Assassin Lemaire,” “Presentation of the Brain of a Male Adult Gorilla,” “On the Brain of the Assassin Prévost,” “On the Supposed Heredity of Accidental Characteristics,” “The Intelligence of Animals and the Rule of Humans,” “The Order of the Primates: Anatomical Parallels between Men and Apes,” “The Origin of the Art of Making Fire,” “On Double Monsters,” “Discussion on Microcephalics,” “Prehistoric Trepanning,” “On Two Cases of a Supernumerary Digit Developing at an Adult Age,” “The Heads of Two New Caledonians” and “On the Skull of Dante Alighieri.” I did not know the present resting place of the cranium of the author of The Divine Comedy, but the collection of brains and skulls and heads that surrounded me clearly began in the work of Paul Broca.
BROCA WAS a superb brain anatomist and made important investigations of the limbic region, earlier called the rhinencephalon (the “smell brain”), which we now know to be profoundly involved in human emotion. But Broca is today perhaps best known for his discovery of a small region in the third convolution of the left frontal lobe of the cerebral cortex, a region now known as Broca’s area. Articulate speech, it turns out, as Broca inferred on only fragmentary evidence, is to an important extent localized in and controlled by Broca’s area. It was one of the first discoveries of a separation of function between the left and right hemispheres of the brain. But most important, it was one of the first indications that specific brain functions exist in particular locales in the brain, that there is a connection between the anatomy of the brain and what the brain does, an activity sometimes described as “mind.”
Ralph Holloway is a physical anthropologist at Columbia University whose laboratory I imagine must bear some resemblance to Broca’s. Holloway makes rubber-latex casts of the insides of skulls of human and related beings, past and present, to attempt a reconstruction, from slight impressions on the interior of the cranium, of what the brain must have been like. Holloway believes that he can tell from a creature’s cranium whether Broca’s area is present, and he has found evidence of an emerging Broca’s area in the brain of Homo habilis some two million years ago-just the time of the first constructions and the first tools. To this limited extent there is something to the phrenological vision. It is very plausible that human thought and industry went hand in hand with the development of articulate speech, and Broca’s area may in a very real sense be one of the seats of our humanity, as well as a means for tracing our relationships with our ancestors on their way toward humanity.
And here was Broca’s brain floating, in formalin and in fragments, before me. I could make out the limbic region which Broca had studied in others. I could see the convolutions on the neocortex. I could even make out the gray-white left frontal lobe in which Broca’s own Broca’s area resided, decaying and unnoticed, in a musty corner of a collection that Broca had himself begun.
It was difficult to hold Broca’s brain without wondering whether in some sense Broca was still in there-his wit, his skeptical mien, his abrupt gesticulations when he talked, his quiet and sentimental moments. Might there be preserved in the configuration of neurons before me a recollection of the triumphant moment when he argued before the combined medical faculties (and his father, overflowing with pride) on the origins of aphasia? A dinner with his friend Victor Hugo? A stroll on a moonlit autumn evening, his wife holding a pretty parasol, along the Quai Voltaire and the Pont Royal? Where do we go when we die? Is Paul Broca still there in his formalin-filled bottle? Perhaps the memory traces have decayed, although there is good evidence from modern brain investigations that a given memory is redundantly stored in many different places in the brain. Might it be possible at some future time, when neurophysiology has advanced substantially, to reconstruct the memories or insights of someone long dead? And would that be a good thing? It would be the ultimate breach of privacy. But it would also be a kind of practical immortality, because, especially for a man like Broca, our minds are clearly a major aspect of who we are.
From the character of this neglected storeroom in the Musée de l’Homme I had been ready to attribute to those who had assembled the collection-I had not known it was Broca at the time-a palpable sexism and racism and jingoism, a profound resistance to the idea of the relatedness of human beings and the other primates. And in part it was true. Broca was a humanist of the nineteenth century, but unable to shake the consuming prejudices, the human social diseases, of his time. He thought men superior to women, and whites superior to blacks. Even his conclusion that German brains were not significantly different from French ones was in rebuttal to a Teutonic claim of Gallic inferiority. But he concluded that there were deep connections in brain physiology between gorillas and men. Broca, the founder of a society of freethinkers in his youth, believed in the importance of untrammeled inquiry and had lived his life in pursuit of that aim. His falling short of these ideals shows that someone as unstinting in the free pursuit of knowledge as Broca could still be deflected by endemic and respectable bigotry. Society corrupts the best of us. It is a little unfair, I think, to criticize a person for not sharing the enlightenment of a later epoch, but it is also profoundly saddening that such prejudices were so extremely pervasive. The question raises nagging uncertainties about which of the conventional truths of our own age will be considered unforgivable bigotry by the next. One way to repay Paul Broca for this lesson which he has inadvertently provided us is to challenge, deeply and seriously, our own most strongly held beliefs.
These forgotten jars and their grisly contents had been collected, at least partly, in a humanistic spirit; and perhaps, in some era of future advance in brain studies, they would prove useful once again. I would be interested in knowing a little more about the red-mustachioed man who had been, in part, returned to France from New Caledonia.
But the surroundings, the sense of a chamber of horrors, evoked unbidden other unsettling thoughts. At the very least, we feel in such a place a pang of sympathy for those-especially those who died young or in pain-who are in so unseemly a way thus memorialized. Cannibals in northwestern New Guinea employ stacked skulls for doorposts, and sometimes for lintels. Perhaps these are the most convenient building materials available, but the architects cannot be entirely unaware of the terror that their constructions evoke in unsuspecting passers-by. Skulls have been used by Hitler’s SS, Hell’s Angels, shamans, pirates, and even those who label bottles of iodine, in a conscious effort to elicit terror. And it makes perfectly good sense. If I find myself in a room filled with skulls, it is likely that there is someone nearby, perhaps a pack of hyenas, perhaps some gaunt and dedicated decapitator, whose occupation or hobby it is to collect skulls. Such fellows are almost certainly to be avoided, or, if possible, killed. The prickle of the hairs on the back of my neck, the increased heartbeat and pulse rate, that strange, clammy feeling are designed by evolution to make me fight or flee. Those who avoid decapitation leave more offspring. Experiencing such fears bestows an evolutionary advantage. Finding yourself in a room full of brains is still more horrifying, as if some unspeakable moral monster, armed with ghastly blades and scooping tools, were shuffling and drooling somewhere in the attics of the Musée de l’Homme.
But all depends, I think, on the purpose of the collection. If its objective is to find out, if it has acquired human parts post mortem-especially with the prior consent of those to whom the parts once belonged-then little harm has been done, and perhaps in the long run some significant human good. But I am not sure the scientists are entirely free of the motives of those New Guinea cannibals; are they not at least saying, “I live with these heads every day. They don’t bother me. Why should you be so squeamish?”
LEONARDO AND VESALIUS were reduced to bribery and stealth in order to perform the first systematic dissections of human beings in Europe, although there had been a flourishing and competent school of anatomy in ancient Greece. The first person to locate, on the basis of neuroanatomy, human intelligence in the head was Herophilus of Chalcedon, who flourished around 300 B.C. He was also the first to distinguish the motor from the sensory nerves, and performed the most thorough study of brain anatomy attempted until the Renaissance. Undoubtedly there were those who objected to his gruesome experimental predilections. There is a lurking fear, made explicit in the Faust legend, that some things are not “meant” to be known, that some inquiries are too dangerous for human beings to make. And in our own age, the development of nuclear weapons may, if we are unlucky or unwise, turn out to be a case of precisely this sort. But in the case of experiments on the brain, our fears are less intellectual. They run deeper into our evolutionary past. They call up images of the wild boars and highwaymen who would terrorize travelers and rural populations in ancient Greece, by Procrustean mutilation or other savagery, until some hero-Theseus or Hercules-would effortlessly dispatch them. These fears have served an adaptive and useful function in the past. But I believe they are mostly emotional baggage in the present. I was interested, as a scientist who has written about the brain, to find such revulsions hiding in me, to be revealed for my inspection in Broca’s collection. These fears are worth fighting.
All inquiries carry with them some element of risk. There is no guarantee that the universe will conform to our predispositions. But I do not see how we can deal with the universe-both the outside and the inside universe-without studying it. The best way to avoid abuses is for the populace in general to be scientifically literate, to understand the implications of such investigations. In exchange for freedom of inquiry, scientists are obliged to explain their work. If science is considered a closed priesthood, too difficult and arcane for the average person to understand, the dangers of abuse are greater. But if science is a topic of general interest and concern-if both its delights and its social consequences are discussed regularly and competently in the schools, the press, and at the dinner table-we have greatly improved our prospects for learning how the world really is and for improving both it and us. That is an idea, I sometimes fancy, that may be sitting there still, sluggish with formalin, in Broca’s brain.
Nothing is rich but the inexhaustible wealth
of nature. She shows us only surfaces,
but she is a million fathoms deep.
RALPH WALDO EMERSON
SCIENCE IS A WAY of thinking much more than it is a body of knowledge. Its goal is to find out how the world works, to seek what regularities there may be, to penetrate to the connections of things-from subnuclear particles, which may be the constituents of all matter, to living organisms, the human social community, and thence to the cosmos as a whole. Our intuition is by no means an infallible guide. Our perceptions may be distorted by training and prejudice or merely because of the limitations of our sense organs, which, of course, perceive directly but a small fraction of the phenomena of the world. Even so straightforward a question as whether in the absence of friction a pound of lead falls faster than a gram of fluff was answered incorrectly by Aristotle and almost everyone else before the time of Galileo. Science is based on experiment, on a willingness to challenge old dogma, on an openness to see the universe as it really is. Accordingly, science sometimes requires courage-at the very least the courage to question the conventional wisdom.
Beyond this the main trick of science is to really think of something: the shape of clouds and their occasional sharp bottom edges at the same altitude everywhere in the sky; the formation of a dewdrop on a leaf; the origin of a name or a word-Shakespeare, say, or “philanthropic”; the reason for human social customs-the incest taboo, for example; how it is that a lens in sunlight can make paper burn; how a “walking stick” got to look so much like a twig; why the Moon seems to follow us as we walk; what prevents us from digging a hole down to the center of the Earth; what the definition is of “down” on a spherical Earth; how it is possible for the body to convert yesterday’s lunch into today’s muscle and sinew; or how far is up-does the universe go on forever, or if it does not, is there any meaning to the question of what lies on the other side? Some of these questions are pretty easy. Others, especially the last, are mysteries to which no one even today knows the answer. They are natural questions to ask. Every culture has posed such questions in one way or another. Almost always the proposed answers are in the nature of “Just So Stories,” attempted explanations divorced from experiment, or even from careful comparative observations.
But the scientific cast of mind examines the world critically as if many alternative worlds might exist, as if other things might be here which are not. Then we are forced to ask why what we see is present and not something else. Why are the Sun and the Moon and the planets spheres? Why not pyramids, or cubes, or dodecahedra? Why not irregular, jumbly shapes? Why so symmetrical, worlds? If you spend any time spinning hypotheses, checking to see whether they make sense, whether they conform to what else we know, thinking of tests you can pose to substantiate or deflate your hypotheses, you will find yourself doing science. And as you come to practice this habit of thought more and more you will get better and better at it. To penetrate into the heart of the thing-even a little thing, a blade of grass, as Walt Whitman said-is to experience a kind of exhilaration that, it may be, only human beings of all the beings on this planet can feel. We are an intelligent species and the use of our intelligence quite properly gives us pleasure. In this respect the brain is like a muscle. When we think well, we feel good. Understanding is a kind of ecstasy.
But to what extent can we really know the universe around us? Sometimes this question is posed by people who hope the answer will be in the negative, who are fearful of a universe in which everything might one day be known. And sometimes we hear pronouncements from scientists who confidently state that everything worth knowing will soon be known-or even is already known-and who paint pictures of a Dionysian or Polynesian age in which the zest for intellectual discovery has withered, to be replaced by a kind of subdued languor, the lotus eaters drinking fermented coconut milk or some other mild hallucinogen. In addition to maligning both the Polynesians, who were intrepid explorers (and whose brief respite in paradise is now sadly ending), and the inducements to intellectual discovery provided by some hallucinogens, this contention turns out to be trivially mistaken.
Let us approach a much more modest question: not whether we can know the universe or the Milky Way Galaxy or a star or a world. Can we know, ultimately and in detail, a grain of salt? Consider one microgram of table salt, a speck just barely large enough for someone with keen eyesight to make out without a microscope. In that grain of salt there are about 10¹⁶ sodium and chlorine atoms. This is a 1 followed by 16 zeros, 10 million billion atoms. If we wish to know a grain of salt, we must know at least the three-dimensional positions of each of these atoms. (In fact, there is much more to be known-for example, the nature of the forces between the atoms-but we are making only a modest calculation.) Now, is this number more or less than the number of things which the brain can know?
How much can the brain know? There are perhaps 10¹¹ neurons in the brain, the circuit elements and switches that are responsible in their electrical and chemical activity for the functioning of our minds. A typical brain neuron has perhaps a thousand little wires, called dendrites, which connect it with its fellows. If, as seems likely, every bit of information in the brain corresponds to one of these connections, the total number of things knowable by the brain is no more than 10¹⁴, one hundred trillion. But this number is only one percent of the number of atoms in our speck of salt.
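Both estimates are easy to reproduce. Here is a minimal back-of-the-envelope sketch in Python; Avogadro’s number and the molar mass of NaCl are standard constants supplied for the calculation, not figures taken from the paragraphs above:

```python
# Atoms in a microgram of table salt versus bits storable in a brain.
AVOGADRO = 6.022e23        # atoms per mole
MOLAR_MASS_NACL = 58.44    # grams per mole of NaCl

grams = 1e-6               # one microgram of salt
# Each NaCl unit contributes two atoms, one sodium and one chlorine.
atoms = 2 * (grams / MOLAR_MASS_NACL) * AVOGADRO
print(f"atoms in the grain: {atoms:.1e}")        # about 2e16, i.e. ~10^16

# ~10^11 neurons, ~10^3 connections each, one bit per connection.
brain_bits = 1e11 * 1e3
print(f"brain capacity: {brain_bits:.0e} bits")  # 10^14

# About half a percent with these constants; rounding the atom count
# down to 10^16 gives the one percent quoted above.
print(f"ratio: {brain_bits / atoms:.1%}")
```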
So in this sense the universe is intractable, astonishingly immune to any human attempt at full knowledge. We cannot on this level understand a grain of salt, much less the universe.
But let us look a little more deeply at our microgram of salt. Salt happens to be a crystal in which, except for defects in the structure of the crystal lattice, the position of every sodium and chlorine atom is predetermined. If we could shrink ourselves into this crystalline world, we would see rank upon rank of atoms in an ordered array, a regularly alternating structure-sodium, chlorine, sodium, chlorine, specifying the sheet of atoms we are standing on and all the sheets above us and below us. An absolutely pure crystal of salt could have the position of every atom specified by something like 10 bits of information. [1] This would not strain the information-carrying capacity of the brain.
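To see why so few bits suffice: in a perfect rock-salt lattice, every atom’s species and position follow from a simple rule plus one measured number, so the information lives in the rule rather than in a list of 10¹⁶ coordinates. A small illustrative sketch (the lattice spacing is a standard value, supplied here for illustration):

```python
# Reconstructing any atom of a perfect NaCl crystal from a rule.
NA_CL_SPACING = 2.82e-8  # cm between adjacent Na and Cl atoms

def atom_at(i, j, k):
    """Species and position of the atom at integer lattice indices (i, j, k)."""
    species = "Na" if (i + j + k) % 2 == 0 else "Cl"
    return species, (i * NA_CL_SPACING, j * NA_CL_SPACING, k * NA_CL_SPACING)

print(atom_at(0, 0, 0))  # ('Na', (0.0, 0.0, 0.0))
print(atom_at(1, 0, 0))  # ('Cl', (2.82e-08, 0.0, 0.0))
```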
If the universe had natural laws that governed its behavior to the same degree of regularity that determines a crystal of salt, then, of course, the universe would be knowable. Even if there were many such laws, each of considerable complexity, human beings might have the capability to understand them all. Even if such knowledge exceeded the information-carrying capacity of the brain, we might store the additional information outside our bodies-in books, for example, or in computer memories-and still, in some sense, know the universe.
Human beings are, understandably, highly motivated to find regularities, natural laws. The search for rules, the only possible way to understand such a vast and complex universe, is called science. The universe forces those who live in it to understand it. Those creatures who find everyday experience a muddled jumble of events with no predictability, no regularity, are in grave peril. The universe belongs to those who, at least to some degree, have figured it out.
It is an astonishing fact that there are laws of nature, rules that summarize conveniently-not just qualitatively but quantitatively-how the world works. We might imagine a universe in which there are no such laws, in which the 10⁸⁰ elementary particles that make up a universe like our own behave with utter and uncompromising abandon. To understand such a universe we would need a brain at least as massive as the universe. It seems unlikely that such a universe could have life and intelligence, because beings and brains require some degree of internal stability and order. But even if in a much more random universe there were such beings with an intelligence much greater than our own, there could not be much knowledge, passion or joy.
Fortunately for us, we live in a universe that has at least important parts that are knowable. Our common-sense experience and our evolutionary history have prepared us to understand something of the workaday world. When we go into other realms, however, common sense and ordinary intuition turn out to be highly unreliable guides. It is stunning that as we go close to the speed of light our mass increases indefinitely, we shrink toward zero thickness in the direction of motion, and time for us comes as near to stopping as we would like. Many people think that this is silly, and every week or two I get a letter from someone who complains to me about it. But it is a virtually certain consequence not just of experiment but also of Albert Einstein’s brilliant analysis of space and time called the Special Theory of Relativity. It does not matter that these effects seem unreasonable to us. We are not in the habit of traveling close to the speed of light. The testimony of our common sense is suspect at high velocities.
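All three effects follow from a single quantity, the Lorentz factor; in standard textbook notation (supplied here for illustration, not quoted from Einstein’s papers):

```latex
\gamma = \frac{1}{\sqrt{1 - v^{2}/c^{2}}}, \qquad
m = \gamma\, m_{0}, \qquad
L = \frac{L_{0}}{\gamma}, \qquad
\Delta t = \gamma\, \Delta t_{0}
```

As the velocity v approaches c, γ grows without bound: the mass m increases indefinitely, the length L in the direction of motion shrinks toward zero, and the interval Δt of a moving clock stretches toward a standstill.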
Or consider an isolated molecule composed of two atoms shaped something like a dumbbell-a molecule of salt, it might be. Such a molecule rotates about an axis perpendicular to the line connecting the two atoms. But in the world of quantum mechanics, the realm of the very small, not all orientations of our dumbbell molecule are possible. It might be that the molecule could be oriented in a horizontal position, say, or in a vertical position, but not at many angles in between. Some rotational positions are forbidden. Forbidden by what? By the laws of nature. The universe is built in such a way as to limit, or quantize, rotation. We do not experience this directly in everyday life; we would find it startling, as well as awkward in sitting-up exercises, to find arms outstretched from the sides or pointed up to the skies permitted but many intermediate positions forbidden. We do not live in the world of the small, on the scale of 10⁻¹³ centimeters, in the realm where there are twelve zeros between the decimal place and the one. Our common-sense intuitions do not count. What does count is experiment-in this case observations from the far infrared spectra of molecules. They show molecular rotation to be quantized.
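The standard quantum-mechanical result behind this (a textbook formula, supplied here for illustration) is that a rigid rotor with moment of inertia I may take on only the discrete rotational energies

```latex
E_{J} = \frac{\hbar^{2}}{2I}\, J(J+1), \qquad J = 0, 1, 2, \ldots
```

with nothing permitted in between; for light molecules the spacing between adjacent levels corresponds to photons in the far infrared, which is why those spectra reveal the quantization.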
The idea that the world places restrictions on what humans might do is frustrating. Why shouldn’t we be able to have intermediate rotational positions? Why can’t we travel faster than the speed of light? But so far as we can tell, this is the way the universe is constructed. Such prohibitions not only press us toward a little humility; they also make the world more knowable. Every restriction corresponds to a law of nature, a regularization of the universe. The more restrictions there are on what matter and energy can do, the more knowledge human beings can attain. Whether in some sense the universe is ultimately knowable depends not only on how many natural laws there are that encompass widely divergent phenomena, but also on whether we have the openness and the intellectual capacity to understand such laws. Our formulations of the regularities of nature are surely dependent on how the brain is built, but also, and to a significant degree, on how the universe is built.
For myself, I like a universe that includes much that is unknown and, at the same time, much that is knowable. A universe in which everything is known would be static and dull, as boring as the heaven of some weak-minded theologians. A universe that is unknowable is no fit place for a thinking being. The ideal universe for us is one very much like the universe we inhabit. And I would guess that this is not really much of a coincidence.
To punish me for my contempt for authority,
Fate made me an authority myself.
EINSTEIN
ALBERT EINSTEIN was born in Ulm, Germany, in 1879, just a century ago. He is one of the small group of people in any epoch who remake the world through a special gift, a talent for perceiving old things in new ways, for posing deep challenges to conventional wisdom. For many decades he was a saintly and honored figure, the only scientist the average person could readily name. In part because of his scientific accomplishments, at least dimly grasped by the public; in part because of his courageous positions on social issues; and in part because of his benign personality, Einstein was admired and revered throughout the world. For scientifically inclined children of immigrant parents, or those growing up in the Depression, like me, the reverence accorded Einstein demonstrated that there were such people as scientists, that a scientific career might not be totally beyond hope. One major function he involuntarily served was as a scientific role model. Without Einstein, many of the young people who became scientists after 1920 might never have heard of the existence of the scientific enterprise. The logic behind Einstein’s Special Theory of Relativity could have been developed a century earlier, but, although there were some premonitory insights by others, relativity had to wait for Einstein. Yet fundamentally the physics of special relativity is very simple, and many of the essential results can be derived from high school algebra and pondering a boat paddling upstream and downstream. Einstein’s life was rich in genius and irony, in a passion for the issues of his time, and in insights into education and the connection between science and politics; it was a demonstration that individuals can, after all, change the world.
As a child Einstein gave little indication of what was to come. “My parents,” he recalled later, “were worried because I started to talk comparatively late, and they consulted the doctor because of it… I was at that time… certainly not younger than three.” He was an indifferent student in elementary school, where he said the teachers reminded him of drill sergeants. In Einstein’s youth, a bombastic nationalism and intellectual rigidity were the hallmarks of European education. He rebelled against the dull, mechanized methods of teaching. “I preferred to endure all sorts of punishment rather than learn to gabble by rote.” Einstein was always to detest rigid disciplinarians, in education, in science and in politics.
At five he was stirred by the mystery of a compass. And, he later wrote, “at the age of 12 I experienced a second wonder of a totally different nature in a little book dealing with Euclidean plane geometry… Here were assertions, as for example the intersection of the three altitudes of a triangle in one point, which-though by no means evident-could nevertheless be proved with such certainty that any doubt appeared to be out of the question. This lucidity and certainty made an indescribable impression upon me.” Formal schooling provided only a tedious interruption to such contemplations. Einstein wrote of his self-education: “At the age of 12 to 16 I familiarized myself with the elements of mathematics together with the principles of differential and integral calculus. In doing so I had the good fortune of finding books which were not too particular in their logical rigor, but which made up for this by permitting the main thoughts to stand out clearly and synoptically… I also had the good fortune of getting to know the essential results and methods of the entire field of the natural sciences in an excellent popular exposition, which limited itself almost throughout to qualitative aspects… a work which I read with breathless attention.” Modern popularizers of science may take some comfort from these words.
Not one of his teachers seems to have recognized his talents. At the Munich Gymnasium, the city’s leading secondary school, one of the teachers told him, “You’ll never amount to anything, Einstein.” At age fifteen it was strongly suggested that he leave school. The teacher observed, “Your very presence spoils the respect of the class for me.” He accepted this suggestion with gusto and spent many months wandering through northern Italy, a high school dropout in the 1890s. Throughout his life he preferred informal dress and manner. Had he been a teen-ager in the 1960s or 1970s rather than the 1890s, conventional people would almost certainly have called him a hippie.
Yet his curiosity about physics and his wonder about the natural universe soon overcame his distaste for formal education, and he found himself applying, with no high school diploma, to the Federal Institute of Technology in Zurich, Switzerland. He failed the entrance examination, enrolled himself in a Swiss high school to remedy his deficiencies, and was admitted to the Federal Institute the following year. But he was still a mediocre student. He resented the prescribed curriculum, avoided the lecture room and tried to pursue his true interests. He later wrote: “The hitch in this was, of course, the fact that you had to cram all this stuff into your mind for the examination, whether you liked it or not.”
He managed to graduate only because his close friend Marcel Grossmann assiduously attended classes and shared his notes with Einstein. On Grossmann’s death many years later, Einstein wrote: “I remember our student days. He the irreproachable student, I myself disorderly and a dreamer. He, on good terms with the teachers and understanding everything; I a pariah, discontented and little loved… Then the end of our studies-I was suddenly abandoned by everyone, standing at a loss on the threshold of life.” By immersing himself in Grossmann’s notes, he managed to graduate from college. But, he recalled, studying for the final examinations “had such a deterring effect on me that… I found the consideration of any scientific problem distasteful to me for an entire year… It is little short of a miracle that modern methods of instruction have not already completely strangled the holy curiosity of inquiry, because what this delicate little plant needs most, apart from initial stimulation, is freedom; without that it is surely destroyed… I believe that one could even deprive a healthy beast of prey of its voraciousness, if one could force it with a whip to eat continuously whether it were hungry or not…” His remarks should be sobering to those of us engaged in higher education in science. I wonder how many potential Einsteins have been permanently discouraged through competitive examinations and the forced feeding of curricula.
After supporting himself with odd jobs, and being passed over for positions he considered desirable, Einstein accepted an offer as an examiner of applications at the Swiss Patent Office in Berne, an opportunity made available through the intervention of Marcel Grossmann’s father. About the same time he rejected his German nationality and became a Swiss citizen. Three years later, in 1903, he married his college sweetheart. Almost nothing is known about which patent applications Einstein approved and which he rejected. It would be interesting to know whether any of the proposed patents stimulated his thinking in physics.
One of his biographers, Banesh Hoffman, writes that at the Patent Office, Einstein “soon learned to do his chores efficiently and this let him snatch precious morsels of time for his own surreptitious calculations, which he guiltily hid in a drawer when footsteps approached.” Such were the circumstances attending the birth of the great relativity theory. But Einstein later nostalgically recalled the Patent Office as “that secular cloister where I hatched my most beautiful ideas.”
On several occasions he was to suggest to colleagues that the occupation of lighthouse keeper would be a suitable position for a scientist-because the work would be comparatively easy and would allow the contemplation necessary to do scientific research. “For Einstein,” said his collaborator Leopold Infeld, “loneliness, life in a lighthouse, would be most stimulating, would free him from so many of the duties which he hates. In fact it would be for him the ideal life. But nearly every scientist thinks just the opposite. It was the curse of my life that for a long time I was not in a scientific atmosphere, that I had no one with whom to talk physics.”
Einstein also believed that there was something dishonest about making money by teaching physics. He argued that it was far better for a physicist to support himself by some other simple and honest labor, and do physics in his spare time. When making a similar remark many years later in America, Einstein mused that he would have liked to be a plumber, and was promptly awarded honorary membership in the plumbers’ union.
In 1905 Einstein published four research papers, the product of his spare time at the Swiss Patent Office, in the leading physics journal of the time, the Annalen der Physik. The first demonstrated that light has particle as well as wave properties, and explained the previously baffling photoelectric effect in which electrons are emitted by solids when irradiated by light. The second explored the nature of molecules by explaining the statistical “Brownian motion” of suspended small particles. And the third and fourth introduced the Special Theory of Relativity and for the first time expressed the famous equation, E = mc², which is so widely quoted and so rarely understood.
The equation expresses the convertibility of matter into energy, and vice versa. It extends the law of the conservation of energy into a law of conservation of energy and mass, stating that energy and mass can be neither created nor destroyed-although one form of energy or matter can be converted into another form. In the equation, E stands for the energy equivalent of the mass m. The amount of energy that could, under ideal circumstances, be extracted from a mass m is mc², where c is the velocity of light = 30 billion centimeters per second. (The velocity of light is always written as lower-case c, never as upper-case.) If we measure m in grams and c in centimeters per second, E is measured in a unit of energy called ergs. The complete conversion of one gram of mass into energy thus releases 1 × (3 × 10¹⁰)² = 9 × 10²⁰ ergs, which is the equivalent of the explosion of roughly twenty thousand tons of TNT. Thus enormous energy resources are contained in tiny amounts of matter, if only we knew how to extract the energy. Nuclear weapons and nuclear power plants are common terrestrial examples of our halting and ethically ambiguous efforts to extract the energy that Einstein showed was present in all of matter. A thermonuclear weapon, a hydrogen bomb, is a device of terrifying power-but even it is capable of extracting less than one percent of mc² from a mass m of hydrogen.
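The arithmetic can be checked in a few lines. A minimal sketch in CGS units; the conversion of about 4.2 × 10¹⁶ ergs per ton of TNT is a standard figure supplied here for the comparison:

```python
# E = mc^2 for one gram of matter, in CGS units.
m = 1.0        # mass in grams
c = 3.0e10     # speed of light in centimeters per second
E = m * c**2   # energy in ergs

print(f"E = {E:.1e} ergs")                           # 9.0e+20 ergs

ERGS_PER_TON_TNT = 4.2e16
print(f"= {E / ERGS_PER_TON_TNT:,.0f} tons of TNT")  # roughly 21,000 tons
```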
Einstein’s four papers published in 1905 would have been an impressive output for the full-time research work of a physicist over a lifetime; for the spare-time work of a twenty-six-year-old Swiss patent clerk in a single year it is nothing short of astonishing. Many historians of science have called 1905 the Annus Mirabilis, the miracle year. There had been, with uncanny resemblances, only one previous such year in the history of physics-1666, when Isaac Newton, aged twenty-four, in enforced rural isolation (because of an epidemic of bubonic plague) produced an explanation for the spectral nature of sunlight, invented differential and integral calculus, and devised the universal theory of gravitation. Together with the General Theory of Relativity, first formulated in 1915, the 1905 papers represent the principal output of Einstein’s scientific life.
Before Einstein, it was widely held by physicists that there were privileged frames of reference, such things as absolute space and absolute time. Einstein’s starting point was that all frames of reference-all observers, no matter what their locale, velocity or acceleration-would see the fundamental laws of nature in the same way. It seems likely that Einstein’s view on frames of reference was influenced by his social and political attitudes and his resistance to the strident jingoism he found in late-nineteenth-century Germany. Indeed, in this sense the idea of relativity has become an anthropological commonplace, and social scientists have adopted the idea of cultural relativism: there are many different social contexts and world views, ethical and religious precepts, expressed by various human societies, most of them of comparable validity.
Special relativity was at first by no means widely accepted. Attempting once again to break into an academic career, Einstein submitted his already published relativity paper to Berne University as an example of his work. He evidently considered it a significant piece of research. It was rejected as incomprehensible, and he was to remain at the Patent Office until 1909. But his published work did not go unnoticed, and it slowly began to dawn on a few of the leading European physicists that Einstein might well be one of the greatest scientists of all time. Still, his work on relativity remained highly controversial. In a letter of recommendation for Einstein for a position at the University of Berlin, a leading German scientist suggested that relativity was a hypothetical excursion, a momentary aberration, and that, despite it, Einstein really was a first-rate thinker. (His Nobel Prize for 1921, which he learned about the following year during a visit to the Orient, was awarded for his paper on the photoelectric effect and “other contributions” to theoretical physics. Relativity was still considered too controversial to be mentioned explicitly.)
Einstein’s views on religion and politics were connected. His parents were of Jewish origin, but they did not observe religious ritual. Nevertheless, Einstein came to a conventional religiosity “by way of the traditional education machine, the State and the schools.” But at age twelve this came to an abrupt end: “Through the reading of popular scientific books I soon reached the conviction that much of the stories of the Bible could not be true. The consequence was a positively fanatic free thinking coupled with the impression that youth is intentionally being deceived by the State through lies; it was a crushing impression. Suspicion against every kind of authority grew out of this experience, a skeptical attitude towards the convictions which were alive in any specific social environment-an attitude which has never again left me, even though later on, because of a better insight into the causal connections, it lost some of its original poignancy.”
Just before the outbreak of World War I, Einstein accepted a professorship at the well-known Kaiser Wilhelm Institute in Berlin. The desire to be at the leading center of theoretical physics was momentarily stronger than his antipathy to German militarism. The outbreak of World War I caught Einstein’s wife and two sons in Switzerland, unable to return to Germany. A few years later this enforced separation led to divorce, but on receiving the Nobel Prize in 1921, Einstein, although since remarried, donated the full $30,000 to his first wife and their children. His eldest son later became a significant figure in civil engineering, holding a professorship at the University of California, but his second son, who idolized his father, accused him-in later years, and to Einstein’s great anguish-of having ignored him during his youth.
Einstein, who described himself as a socialist, became convinced that World War I was largely the result of the scheming and incompetence of “the ruling classes,” a conclusion with which many contemporary historians agree. He became a pacifist. When other German scientists enthusiastically supported their nation’s military enterprises, Einstein publicly condemned the war as “an epidemic delusion.” Only his Swiss citizenship prevented him from being imprisoned, as indeed happened to his friend the philosopher Bertrand Russell in England at the same time and for the same reason. Einstein’s views on the war did not increase his popularity in Germany.
However, the war did, indirectly, play a role in making Einstein’s name a household word. In his General Theory of Relativity Einstein explored the proposition-an idea still astonishing in its simplicity, beauty and power-that the gravitational attraction between two masses comes about by those masses distorting or bending ordinary Euclidean space nearby. The quantitative theory reproduced, to the accuracy to which it had been tested, Newton’s law of universal gravitation. But in the next decimal place, so to speak, general relativity predicted significant differences from Newton’s views. This is in the classic tradition of science, in which new theories retain the established results of the old but make a set of new predictions which permits a decisive distinction to be drawn between the two outlooks.
The three tests of general relativity that Einstein proposed concerned anomalies in the motion of the orbit of the planet Mercury, the red shifts in the spectral lines of light emitted by a massive star, and the deflection of starlight near the Sun. Shortly after the Armistice of 1918, and before a formal peace had been concluded, British expeditions were mustered to Brazil and to the island of Principe off West Africa to observe, during a total eclipse of the Sun, whether the deflection of starlight was in accord with the predictions of general relativity. It was. Einstein’s views were vindicated; and the symbolism of a British expedition confirming the work of a German scientist when the two countries were still technically at war appealed to the better instincts of the public.
But at the same time, a well-financed public campaign against Einstein was launched in Germany. Mass meetings with anti-Semitic overtones were staged in Berlin and elsewhere to denounce the relativity theory. Einstein’s colleagues were shocked, but most of them, too timid for politics, did nothing to counter it. With the rise of the Nazis in the 1920s and early 1930s, Einstein, against his natural inclination for a life of quiet contemplation, found himself speaking up-courageously and often. He testified in German courts on behalf of academics on trial for their political views. He appealed for amnesty for political prisoners in Germany and abroad (including Sacco and Vanzetti and the Scottsboro “boys” in the United States). When Hitler became chancellor in 1933, Einstein and his second wife fled Germany.
The Nazis burned Einstein’s scientific works, along with other books by anti-Fascist authors, in public bonfires. An all-out assault was launched on Einstein’s scientific stature. Leading the attack was the Nobel laureate physicist Philipp Lenard, who denounced what he called the “mathematically botched-up theories of Einstein” and the “Asiatic spirit in Science.” He went on: “Our Führer has eliminated this same spirit in politics and national economy, where it is known as Marxism. In natural science, however, with the overemphasis on Einstein, it still holds sway. We must recognize that it is unworthy of a German to be the intellectual follower of a Jew. Natural science, properly so-called, is of completely Aryan origin… Heil Hitler!”
Many Nazi scholars joined in warning against the “Jewish” and “Bolshevik” physics of Einstein. Ironically, in the Soviet Union at about the same time, prominent Stalinist intellectuals were denouncing relativity as “bourgeois physics.” Whether or not the substance of the theory being attacked was correct was, of course, never considered in such deliberations.
Einstein’s identification of himself as a Jew, despite his profound estrangement from traditional religions, was due entirely to the upsurge of anti-Semitism in Germany in the 1920s. For this reason he also became a Zionist. But according to his biographer Philipp Frank, not all Zionist groups welcomed him, because he demanded that the Jews make an effort to befriend the Arabs and to understand their way of life-a devotion to cultural relativism made more impressive by the difficult emotional issues involved. However, he continued to support Zionism, particularly as the increasing desperation of European Jews became known in the late 1930s. (In 1948 Einstein was offered the presidency of Israel, but politely declined. It is interesting to speculate what differences in the politics of the Near East, if any, might have been produced by Albert Einstein as the president of Israel.)
After leaving Germany, Einstein learned that the Nazis had placed a price of 20,000 marks on his head. (“I didn’t know it was worth so much.”) He accepted an appointment at the recently founded Institute for Advanced Study in Princeton, New Jersey, where he was to remain for the rest of his life. When asked what salary he thought fair, he suggested $3,000. Seeing a look of astonishment pass over the face of the representative of the Institute, he concluded he had proposed too much and mentioned a smaller amount. His salary was set at $16,000, a goodly sum for the 1930s.
Einstein’s prestige was so high that it was natural for other émigré European physicists in the United States to approach him in 1939 to write a letter to President Franklin D. Roosevelt, proposing the development of an atomic bomb to outstrip a likely German effort to acquire nuclear weapons. Although Einstein had not been working in nuclear physics and later played no role in the Manhattan Project, it was his initial letter that led to the project’s establishment. It is likely, however, that the bomb would have been developed by the United States regardless of Einstein’s urging. Even without E = mc², the discovery of radioactivity by Antoine Becquerel and the investigation of the atomic nucleus by Ernest Rutherford-both done entirely independently of Einstein-would very likely have led to the development of nuclear weapons. Einstein’s dread of Nazi Germany had long since caused him to abandon, although with considerable pain, his pacifist views. But when it later transpired that the Nazis had been unable to develop nuclear weapons, Einstein expressed remorse: “Had I known that the Germans would not succeed in developing an atomic bomb, I would have done nothing for the bomb.”
In 1945 Einstein urged the United States to break its relations with Franco Spain, which had supported the Nazis in World War II. John Rankin, a conservative congressman from Mississippi, attacked Einstein in a speech to the House of Representatives, declaring that “this foreign-born agitator would have us plunge into another war in order to further the spread of Communism throughout the world… It is about time the American people got wise to Einstein.”
Einstein was a powerful defender of civil liberties in the United States during the darkest period of McCarthyism in the late 1940s and early 1950s. Watching the rising tide of hysteria, he had the disturbing feeling that he had seen something similar in Germany in the 1930s. He urged defendants to refuse to testify before the House Un-American Activities Committee, saying that every person should be “prepared for jail and economic ruin… for the sacrifice of his personal welfare in the interest of… his country.” He held that there was “a duty in refusing to cooperate in any undertaking that violates the Constitutional rights of the individual. This holds in particular for all inquisitions that are concerned with the private life and the political affiliations of the citizens…” For taking this position, Einstein was widely attacked in the press. And Senator Joseph McCarthy stated in 1953 that anyone who proffered such advice was “himself an enemy of America.” In his later years it became fashionable in some circles to couple an acknowledgment of Einstein’s scientific genius with a patronizing dismissal of his political views as “naive.” But times have changed. I wonder if it is not more reasonable to argue in quite a different direction: in a field such as physics, where ideas can be quantified and tested with great precision, Einstein’s insights stand unrivaled, and we are astonished that he could see so clearly where others were lost in confusion. Is it not worth considering that in the much murkier field of politics his insights might also have some fundamental validity?
In his Princeton years Einstein’s passion remained, as always, the life of the mind. He worked long and hard on a Unified Field Theory which would combine gravitation, electricity and magnetism on a common basis, but his attempt is widely considered to have been unsuccessful. He lived to see his General Theory of Relativity incorporated as the principal tool for understanding the large-scale structure and evolution of the universe, and would have been delighted to witness the vigorous application of general relativity occurring in astrophysics today. He never understood the reverence in which he was held, and indeed complained that his colleagues and Princeton graduate students would not drop in on him unannounced for fear of disturbing him.
But he wrote: “My passionate interest in social justice and social responsibility has always stood in curious contrast to a marked lack of desire for direct association with men and women. I am a horse for single harness, not cut out for tandem or team work. I have never belonged wholeheartedly to country or State, to my circle of friends or even to my own family. These ties have always been accompanied by a vague aloofness, and the wish to withdraw into myself increases with the years. Such isolation is sometimes bitter, but I do not regret being cut off from the understanding and sympathy of other men. I lose something by it, to be sure, but I am compensated for it in being rendered independent of the customs, opinions and prejudices of others and am not tempted to rest my peace of mind upon such shifting foundations.”
His principal recreations throughout his life were playing the violin and sailing. In these years Einstein looked like and in some respects was a sort of aging hippie. He let his white hair grow long and preferred sweaters and a leather jacket to a suit and tie, even when entertaining famous visitors. He was utterly without pretense and, with no affectation, explained that “I speak to everyone in the same way, whether he is the garbage man or the President of the University.” He was often available to the public, sometimes being willing to help high school students with their geometry problems-not always successfully. In the best scientific tradition he was open to new ideas but required that they pass rigorous standards of evidence. He was open-minded but skeptical about claims of planetary catastrophism in recent Earth history and about experiments alleging extrasensory perception, his reservations about the latter stemming from contentions that purported telepathic abilities do not decline with increasing distance between sender and receiver.
In matters of religion, Einstein thought more deeply than many others and was repeatedly misunderstood. On the occasion of Einstein’s first visit to America, Cardinal O’Connell of Boston warned that the relativity theory “cloaked the ghastly apparition of atheism.” This alarmed a New York rabbi who cabled Einstein: “Do you believe in God?” Einstein cabled back: “I believe in Spinoza’s God, who revealed himself in the harmony of all being, not in the God who concerns himself with the fate and actions of men”-a more subtle religious view embraced by many theologians today. Einstein’s religious beliefs were very genuine. In the 1920s and 1930s he expressed grave doubts about a basic precept of quantum mechanics: that at the most fundamental level of matter, particles behave in an unpredictable way, as expressed by the Heisenberg uncertainty principle. Einstein said, “God does not play dice with the cosmos.” And on another occasion he asserted, “God is subtle, but he is not malicious.” In fact, Einstein was so fond of such aphorisms that the Danish physicist Niels Bohr turned to him on one occasion and with some exasperation said, “Stop telling God what to do.” But there were many physicists who felt that if anyone knew God’s intentions, it was Einstein.
One of the foundations of special relativity is the precept that no material object can travel as fast as light. This light barrier has proved annoying to many people who wish there to be no constraints on what human beings might ultimately do. But the light limit permits us to understand much of the world that was previously mysterious in a simple and elegant way. However, where Einstein taketh away, he also giveth. There are several consequences of special relativity that seem counterintuitive, contrary to our everyday experience, but that emerge in a detectable fashion when we travel close to the speed of light-a regime of velocity in which common sense has had little experience (Chapter 2). One of these consequences is that as we travel sufficiently close to the speed of light, time slows down-our wristwatches, our atomic clocks, our biological aging. Thus a space vehicle traveling very close to the speed of light could travel between any two places, no matter how distant, in any conveniently short period of time-as measured on board the spacecraft, but not as measured on the launch planets. We might therefore one day travel to the center of the Milky Way Galaxy and return in a time of a few decades measured on board the ship-although, as measured back on Earth, the elapsed time would be sixty thousand years, and very few of the friends who saw us off would be around to commemorate our return. A vague recognition of this time dilation was made in the motion picture Close Encounters of the Third Kind, although a gratuitous opinion was then injected that Einstein was probably an extraterrestrial. His insights were stunning, to be sure, but he was very human, and his life stands as an example of what, if they are sufficiently talented and courageous, human beings can accomplish.
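How far such time dilation can be pushed is easy to check with a little arithmetic. As a rough sketch (the 1 g acceleration figure is an illustrative assumption, not a number given above), special relativity says that a clock moving at speed v runs slow relative to Earth by the Lorentz factor:

\[
\Delta t_{\mathrm{Earth}} \;=\; \frac{\Delta\tau_{\mathrm{ship}}}{\sqrt{1 - v^{2}/c^{2}}} .
\]

For a ship holding a constant proper acceleration a, speeding up through the first half of each leg and slowing down through the second, the shipboard proper time to cross a distance d works out to

\[
\tau \;=\; \frac{2c}{a}\,\cosh^{-1}\!\left(1 + \frac{a\,d}{2c^{2}}\right).
\]

With a = 1 g ≈ 1.03 light-years per year squared and d = 30,000 light-years, roughly the distance to the galactic center, τ comes out near twenty years each way: a few decades round trip as measured on board, while some sixty thousand years elapse on Earth, just as stated.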
EINSTEIN’S LAST public act was to join with Bertrand Russell and many other scientists and scholars in an unsuccessful attempt to bring about a ban on the development of nuclear weapons. He argued that nuclear weapons had changed everything except our way of thinking. In a world divided into hostile states he viewed nuclear energy as the greatest menace to the survival of the human race. “We have the choice,” he said, “to outlaw nuclear weapons or face general annihilation… Nationalism is an infantile disease. It is the measles of mankind… Our schoolbooks glorify war and hide its horrors. They inculcate hatred in the veins of children. I would teach peace rather than war. I would inculcate love rather than hate.”
At age sixty-seven, nine years before his death in 1955, Einstein described his lifelong quest: “Out yonder there was this huge world, which exists independently of us human beings and which stands before us like a great, eternal riddle, at least partially accessible to our inspection and thinking. The contemplation of this world beckoned like a liberation… The road to this paradise was not so comfortable and alluring as the road to the religious paradise; but it has proved itself as trustworthy, and I have never regretted having chosen it.”
The cultivation of the mind is a kind of food
supplied for the soul of man.
MARCUS TULLIUS CICERO,
De Finibus Bonorum et Malorum,
V, 19 (45-44 B.C.)
To one, science is an exalted goddess;
to another it is a cow which provides him
with butter.
FRIEDRICH VON SCHILLER,
Xenien (1796)
IN THE MIDDLE of the nineteenth century, the largely self-educated British physicist Michael Faraday was visited by his monarch, Queen Victoria. Among Faraday’s many celebrated discoveries, some of obvious and immediate practical benefit, were more arcane findings in electricity and magnetism, then little more than laboratory curiosities. In the traditional dialogue between heads of state and heads of laboratories, the Queen asked Faraday of what use such studies were, to which he is said to have replied, “Madam, of what use is a baby?” Faraday had an idea that there might someday be something practical in electricity and magnetism.
In the same period the Scottish physicist James Clerk Maxwell set down four mathematical equations, based on the work of Faraday and his experimental predecessors, relating electrical charges and currents with electric and magnetic fields. The equations exhibited a curious lack of symmetry, and this bothered Maxwell. There was something unaesthetic about the equations as then known, and to improve the symmetry Maxwell proposed that one of the equations should have an additional term, which he called the displacement current. His argument was fundamentally intuitive; there was certainly no experimental evidence for such a current. Maxwell’s proposal had astonishing consequences. The corrected Maxwell equations implied the existence of electromagnetic radiation, encompassing gamma rays, X-rays, ultraviolet light, visible light, infrared and radio. They stimulated Einstein to discover Special Relativity. Faraday and Maxwell’s laboratory and theoretical work together have led, one century later, to a technical revolution on the planet Earth. Electric lights, telephones, phonographs, radio, television, refrigerated trains making fresh produce available far from the farm, cardiac pacemakers, hydroelectric power plants, automatic fire alarms and sprinkler systems, electric trolleys and subways, and the electronic computer are a few devices in the direct evolutionary line from the arcane laboratory puttering of Faraday and the aesthetic dissatisfaction of Maxwell, staring at some mathematical squiggles on a piece of paper. Many of the most practical applications of science have been made in this serendipitous and unpredictable way. No amount of money would have sufficed in Victoria’s day for the leading scientists in Britain to have simply sat down and invented, let us say, television. Few would argue that the net effect of these inventions was other than positive. I notice that even many young people who are profoundly disenchanted with Western technological civilization, often for good reason, still retain a passionate fondness for certain aspects of high technology-for example, high-fidelity electronic music systems.
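For readers who want to see the squiggles themselves, here are the four equations in a standard modern vacuum form (not Maxwell’s original notation), with the term he added marked:

\[
\nabla\cdot\mathbf{E} = \frac{\rho}{\varepsilon_{0}}, \qquad
\nabla\cdot\mathbf{B} = 0,
\]
\[
\nabla\times\mathbf{E} = -\frac{\partial\mathbf{B}}{\partial t}, \qquad
\nabla\times\mathbf{B} = \mu_{0}\mathbf{J} \;+\; \underbrace{\mu_{0}\varepsilon_{0}\frac{\partial\mathbf{E}}{\partial t}}_{\text{displacement current}} .
\]

In empty space, where the charges ρ and currents J vanish, the two curl equations combine into a wave equation whose disturbances travel at speed 1/√(μ₀ε₀), a number that turned out to equal the measured speed of light.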
Some of these inventions have fundamentally changed the character of our global society. Ease of communication has deprovincialized many parts of the world, but cultural diversity has been likewise diminished. The practical advantages of these inventions are recognized in virtually all human societies; it is remarkable how infrequently emerging nations are concerned with the negative effects of high technology (environmental pollution, for example); they have clearly decided that the benefits outweigh the risks. One of Lenin’s aphorisms was that socialism plus electrification equals communism. But there has been no more vigorous or inventive pursuit of high technology than in the West. The resulting rate of change has been so rapid that many of us find it difficult to keep up. There are many people alive today who were born before the first airplane flew and have lived to see Viking land on Mars, and Pioneer 10, the first interstellar spacecraft, be ejected from the solar system, or who were raised in a sexual code of Victorian severity and now find themselves immersed in substantial sexual freedom, brought about by the widespread availability of effective contraceptives. The rate of change has been disorienting for many, and it is easy to understand the nostalgic appeal of a return to an earlier and simpler existence.
But the standard of living and conditions of work for the great bulk of the population in, say, Victorian England, were degrading and demoralizing compared to industrial societies today, and the life-expectancy and infant-mortality statistics were appalling. Science and technology may be in part responsible for many of the problems that face us today-but largely because public understanding of them is desperately inadequate (technology is a tool, not a panacea), and because insufficient effort has been made to accommodate our society to the new technologies. Considering these facts, I find it remarkable that we have done as well as we have. Luddite alternatives can solve nothing. More than one billion people alive today owe the margin between barely adequate nutrition and starvation to high agricultural technology. Probably an equal number have survived, or avoided disfiguring, crippling or killing diseases because of high medical technology. Were high technology to be abandoned, these people would also be abandoned. Science and technology may be the cause of some of our problems, but they are certainly an essential element in any foreseeable solution to those same problems-both nationally and planetwide.
I do not think that science and technology have been pursued as effectively, with as much attention to their ultimate humane objectives and with as adequate a public understanding as, with a little greater effort, could have been accomplished. It has, for example, gradually dawned on us that human activities can have an adverse effect on not only the local but also the global environment. By accident a few research groups in atmospheric photochemistry discovered that halocarbon propellants from aerosol spray cans will reside for very long periods in the atmosphere, circulate to the stratosphere, partially destroy the ozone there, and let ultraviolet light from the sun leak down to the Earth’s surface. Increased skin cancer for whites was the most widely advertised consequence (blacks are well adapted to increased ultraviolet flux). But very little public attention has been given to the much more serious possibility that microorganisms, occupying the base of an elaborate food pyramid at the top of which is Homo sapiens, might also be destroyed by the increased ultraviolet light. Steps have finally, although reluctantly, been taken to ban halocarbons from spray cans (although no one seems to be worrying about the same molecules used in refrigerators) and as a result the immediate dangers are probably slight. What I find most worrisome about this incident is how accidental was the discovery that the problem existed at all. One group approached this problem because it had written the appropriate computer programs, but in quite a different context: they were concerned with the chemistry of the atmosphere of the planet Venus, which contains hydrochloric and hydrofluoric acids. A broad and diverse set of research teams, working on a great variety of problems in pure science, is clearly required for our continued survival. But what other problems, even more severe, exist which we do not know about because no research group happens as yet to have stumbled on them? For each problem we have uncovered, such as the effect of halocarbons on the ozonosphere, might there not be another dozen lurking around the corner? It is therefore an astonishing fact that nowhere in the federal government, major universities or private research institutes is there a single highly competent, broadly empowered and adequately funded research group whose function it is to seek out and defuse future catastrophes resulting from the development of new technologies.
The establishment of such research and environmental assessment organizations will require substantial political courage if they are to be effective at all. Technological societies have a tightly knit industrial ecology, an interwoven network of economic assumptions. It is very difficult to challenge one thread in the network without causing tremors in all. Any judgment that a technological development will have adverse human consequences implies a loss of profit for someone. The DuPont Company, the principal manufacturer of halocarbon propellants, for example, took the curious position in public debates that all conclusions about halocarbons destroying the ozonosphere were “theoretical.” They seemed to be implying that they would be prepared to stop halocarbon manufacture only after the conclusions were tested experimentally-that is, when the ozonosphere was destroyed. There are some problems where inferential evidence is all that we will have; where once the catastrophe arrives it is too late to deal with it.
Similarly, the new Department of Energy can be effective only if it can maintain a distance from vested commercial interests, if it is free to pursue new options even if such options imply loss of profits for selected industries. The same is clearly true in pharmaceutical research, in the pursuit of alternatives to the internal-combustion engine, and in many other technological frontiers. I do not think that the development of new technologies should be placed in the control of old technologies; the temptation to suppress the competition is too great. If we Americans live in a free-enterprise society, let us see substantial independent enterprise in all of the technologies upon which our future may depend. If organizations devoted to technological innovation and its boundaries of acceptability are not challenging (and perhaps even offending) at least some powerful groups, they are not accomplishing their purpose.
There are many practical technological developments that are not being pursued for lack of government support. For example, as agonizing a disease as cancer is, I do not think it can be said that our civilization is threatened by it. Were cancer to be cured completely, the average life expectancy would be extended by only a few years, until some other disease-which does not now have its chance at cancer victims-takes over. But a very plausible case can be made that our civilization is fundamentally threatened by the lack of adequate fertility control. Exponential increases of population will dominate any arithmetic increases in the availability of food and resources, even those brought about by heroic technological initiatives, as Malthus long ago realized. While some industrial nations have approached zero population growth, this is not the case for the world as a whole.
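The arithmetic behind Malthus’s point is worth making explicit (the growth rates below are illustrative assumptions, not measured values):

\[
P(t) = P_{0}\,e^{rt} \quad\text{(population)}, \qquad
F(t) = F_{0} + kt \quad\text{(food and resources)}.
\]

No matter how heroic the linear term k is made, the exponential eventually overtakes it. At r = 2 percent per year, for instance, population doubles every ln 2 / 0.02 ≈ 35 years, while any arithmetic increase merely adds the same fixed amount each year.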
Minor climatic fluctuations can destroy entire populations with marginal economies. In many societies where the technology is meager and reaching adulthood an uncertain prospect, having many children is the only possible hedge against a desperate and uncertain future. Such a society, in the grip of a consuming famine, for example, has little to lose. At a time when nuclear weapons are proliferating unconscionably, when an atomic device is almost a home handicraft industry, widespread famine and steep gradients in affluence pose serious dangers to both the developed and the underdeveloped worlds. The solution to such problems certainly requires better education, at least a degree of technological self-sufficiency, and, especially, fair distribution of the world’s resources. But it also cries out for entirely adequate contraception-long-term, safe birth-control pills, available for men as well as for women, perhaps to be taken once a month or over even longer intervals. Such a development would be very useful not just abroad but also here at home, where considerable concern is being expressed about the side effects of the conventional estrogen oral contraceptives. Why is there no major effort for such a development?
Many other technological initiatives are being proposed and ought to be examined very seriously. They range from the very cheap to the extremely expensive. At one end is soft technology-for example, the development of closed ecological systems involving algae, shrimp and fish which could be maintained in rural ponds and provide a highly nutritious and extremely low-cost dietary supplement. At the other is the proposal of Gerard O’Neill of Princeton University to construct large orbital cities that would, using lunar and asteroidal materials, be self-propagating-one city being able to construct another from extraterrestrial resources. Such cities in Earth orbit might be used in converting sunlight into microwave energy and beaming power down to Earth. The idea of independent cities in space-each perhaps built on differing social, economic or political assumptions, or having different ethnic antecedents-is appealing, an opportunity for those deeply disenchanted with terrestrial civilizations to strike out on their own somewhere else. In its earlier history, America provided such an opportunity for the restless, ambitious and adventurous. Space cities would be a kind of America in the skies. They also would greatly enhance the survival potential of the human species. But the project is extremely expensive, costing at minimum about the same as one Vietnam war (in resources, not in lives). In addition, the idea has the worrisome overtone of abandoning the problems on the Earth-where, after all, self-contained pioneering communities can be established at much less cost.
Clearly, there are more technological projects now possible than we can afford. Some of them may be extremely cost-effective but may have such large start-up costs as to remain impractical. Others may require a daring initial investment of resources but could work a benevolent revolution in our society. Such options have to be considered extremely carefully. The most prudent strategy calls for combining low-risk/moderate-yield and moderate-risk/high-yield endeavors.
For such technological initiatives to be understood and supported, significant improvements in public understanding of science and technology are essential. We are thinking beings. Our minds are our distinguishing characteristic as a species. We are not stronger or swifter than many other animals that share this planet with us. We are only smarter. In addition to the immense practical benefit of having a scientifically literate public, the contemplation of science and technology permits us to exercise our intellectual faculties to the limits of our capabilities. Science is an exploration of the intricate, subtle and awesome universe we inhabit. Those who practice it know, at least on occasion, a rare kind of exhilaration that Socrates said was the greatest of human pleasures. It is a communicable pleasure. To facilitate informed public participation in technological decision making, to decrease the alienation too many citizens feel from our technological society, and for the sheer joy that comes from knowing a deep thing well, we need better science education, a superior communication of its powers and delights. A simple place to start is to undo the self-destructive decline in federal scholarships and fellowships for science researchers and science teachers at the college, graduate and postdoctoral levels.
The most effective agents to communicate science to the public are television, motion pictures and newspapers-where the science offerings are often dreary, inaccurate, ponderous, grossly caricatured or (as with much Saturday-morning commercial television programming for children) hostile to science. There have been astonishing recent findings on the exploration of the planets, the role of small brain proteins in affecting our emotional lives, the collisions of continents, the evolution of the human species (and the extent to which our past prefigures our future), the ultimate structure of matter (and the question of whether there are elementary particles or an infinite regress of them), the attempt to communicate with civilizations on planets of other stars, the nature of the genetic code (which determines our heredity and makes us cousins to all the other plants and animals on our planet), and the ultimate questions of the origin, nature and fate of life, worlds and the universe as a whole. Recent findings on these questions can be understood by any intelligent person. Why are they so rarely discussed in the media, in schools, in everyday conversation?
Civilizations can be characterized by how they approach such questions, how they nourish the mind as well as the body. The modern scientific pursuit of these questions represents an attempt to acquire a generally accepted view of our place in the cosmos; it requires open-minded creativity, tough-minded skepticism and a fresh sense of wonder. These questions are different from the practical issues I discussed earlier, but they are connected with such issues and-as in the example of Faraday and Maxwell-the encouragement of pure research may be the most reliable guarantee available that we will have the intellectual and technical wherewithal to deal with the practical problems facing us.
Only a small fraction of the most able youngsters enter scientific careers. I am often amazed at how much more capability and enthusiasm for science there is among elementary school youngsters than among college students. Something happens in the school years to discourage their interest (and it is not mainly puberty); we must understand and circumvent this dangerous discouragement. No one can predict where the future leaders of science will come from. It is clear that Albert Einstein became a scientist in spite of, not because of, his schooling (Chapter 3). In his Autobiography, Malcolm X describes a numbers runner who never wrote down a bet but carried a lifetime of transactions perfectly in his head. What contributions to society, Malcolm asked, would such a person have made with adequate education and encouragement? The most brilliant youngsters are a national and a global resource. They require special care and feeding.
Many of the problems facing us may be soluble, but only if we are willing to embrace brilliant, daring and complex solutions. Such solutions require brilliant, daring and complex people. I believe that there are many more of them around-in every nation, ethnic group and degree of affluence-than we realize. The training of such youngsters must not, of course, be restricted to science and technology; indeed, the compassionate application of new technology to human problems requires a deep understanding of human nature and human culture, a general education in the broadest sense.
We are at a crossroads in human history. Never before has there been a moment so simultaneously perilous and promising. We are the first species to have taken our evolution into our own hands. For the first time we possess the means for intentional or inadvertent self-destruction. We also have, I believe, the means for passing through this stage of technological adolescence into a long-lived, rich and fulfilling maturity for all the members of our species. But there is not much time to determine to which fork of the road we are committing our children and our future.