DISCUSSIONS AND OPINIONS

FACT AND/OR/PLUS FICTION

In 1998 the editors of the interesting litcrit magazine Paradoxa asked me to contribute to an issue on “the future of narrative,” and this was the result. I have edited and fiddled with it here and there.

In earlier times, when we divided narrative into the secular and the sacred, factuality and invention were both considered to be properties of the former, and Truth the quality of the latter. With the decline of a consensus opinion concerning Truth, the difference between fact and fiction began to take on more importance, and we took to dividing narrative into fiction and nonfiction.

This division, maintained by publishers, librarians, booksellers, teachers, and most writers, I find to be fundamental to my own concept of narrative and its uses. The file in my computer that I’m using now is labelled “Nonfiction in Progress,” as distinct from the “Fiction in Progress” file. But, perhaps as part of the postmodern boundary breakdown, some files are coalescing; a lot of fiction seems to be getting into certain types of nonfiction. I like genre transgression, but this may involve more than genre. To start thinking about it, I called as usual on the OED.

FICTION:

[1, 2—obsolete usages]

3.a. The action of ‘feigning’ or inventing imaginary incidents, existences, states of things, etc., whether for the purpose of deception or otherwise. […] Bacon, 1605: “… so great an affinitie hath fiction and beleefe.” […]

b. That which, or something that, is imaginatively invented; feigned existence, event, or state of things; invention as opposed to fact. [First citation 1398.]

4. The species of literature which is concerned with the narration of imaginary events and the portraiture of imaginary characters; fictitious composition. Now, usually, prose novels and stories collectively; the composition of works of this class. [First citation 1599.]

(Definitions 5 and after concern nonliterary and derogatory uses of the word—deliberate falsehood, moonshine, yarn spinning, and so on.)

As for the word nonfiction, it isn’t in the OED. Probably if I went to a contemporary American dictionary I’d find it, but not having one, and having found the thesaurus in my Macintosh a nice source of current usage, I asked it for its synonyms and antonyms to “fiction.” It gave me “story” as the principal synonym, then “unreality,” and then “drama, fantasy, myth, novel, romance, legend, tale.” All these synonyms except “unreality” have to do with the literary use of the word.

The principal antonym is “actuality,” then “authenticity, biography, certainty, circumstance, event, face [?], fact, genuineness, happening, history, incident, occurrence, reality.” Only two of the antonyms refer to literature: history and biography.

The antonyms didn’t include “nonfiction,” which I thought a quite common word by now. I tried the thesaurus with “nonfiction.” All it could give me was what it calls a Close Word—“fiction.”

Is my Macintosh telling me the words “fiction” and “nonfiction” are so close in meaning they can be used interchangeably?

Possibly this is what is happening.

A good deal has been said and written here and there about this blurring of definition or melding of modes, though I don’t know of a methodical or scholarly study. Most of what I’ve read on the subject has been by nonfiction writers defending their use of techniques and freedoms that have been seen as pertaining properly or only to fiction. Their arguments include the following: Since total accuracy is impossible, invention in a purportedly factual report is inevitable; since nobody perceives the same event the same way, factuality is always in question; artistic license may reach a higher form of authenticity than mere accuracy; and (therefore? anyhow?) writers have the right to write a story the way they want to.

The journalist Janet Malcolm, sued by her interviewee Jeffrey Masson for deliberate and defamatory misquotation, defended her form of journalism in a New Yorker article with such arguments. Perhaps she was inspired by Truman Capote, who called his In Cold Blood (also published in the New Yorker) a “nonfiction novel,” apparently to elevate it above mere reportage and incidentally defend himself from accusations of playing a bit fast and loose with facts. Some nonfiction writers vigorously defend their use of invented elements in their work. Others take it for granted and are surprised by objections.

In conversation, I have heard that “nature writing” often contains a good deal of invention, and that some well-known nature writers admit without shame to faking observations and relating experiences that didn’t occur. But the principal entryway of fiction into nonfiction seems to be via autobiographical writing—the memoir or “personal essay.” Two relevant quotes from reviewers (for which I thank Sara Jameson, who sent them to me): W. S. Di Piero, in the New York Times Book Review of March 8, 1998:

Remembering is an act of the imagination. Any account we make of our experience is an exercise in reinventing the self. Even when we think we’re accurately reporting past events, persons, objects, places, and their sequence, we’re theatricalizing the self and its world.

I find the term “reinventing the self” interesting. Who did the original invention? Is the implication that of an eternal self-invention, the relationship of which to experience or reality is unimportant? The word “theatricalizing” is also interesting; theatrical isn’t a neutral word, but loaded with connotations of exaggeration and emotional falsehood.

In the same issue, Paul Levy wrote: “All autobiographers have a problem conjuring with the truth. My own strategy is to regard writing about oneself as inadvertent fiction.”

“Conjuring” has the same ring to it as “theatricalizing”—autobiography as sleight of hand, doves out of thin air. The phrase “inadvertent fiction” not only disclaims the writer’s responsibility, but offers irresponsibility as a strategy. This approach certainly could slide an autobiographer over the difficulties faced by writers unwilling to regard their art as inadvertent.

A related argument concerns objectivity, famed cornerstone of the scientific method, which many scientists now consider illusory as a realistic criterion of even the most painstakingly factual report of an experiment or observation. Feminists add that, as an ideal, it is in many ways undesirable.

Anthropologists have generally come to admit that accounts of ethnographical observations from which the observer is omitted contain a profound element of falsification. Ethnography these days is full of postmodern uncertainties, ellipses, and self-reflexivities, sometimes to the point of appearing to be less about the natives’ behavior than the ethnographer’s soul. Claude Lévi-Strauss’s Tristes Tropiques, the founding classic of this subtle and risky genre, exhibits its value when performed by a truly searching, skillful subjectivity.

In writing this essay I consciously include my subjective reactions and partialities as part of the process and lay no claim to either objectivity or authority. This is, to put it mildly, not how I was taught to write at Harvard in the forties. To me it seems perfectly appropriate to what I am doing, which is mostly speculation and opinion (as were most of the authoritatively phrased and apparently egoless papers produced at Harvard in the forties).

If, however, I were an eyewitness journalist charged with describing an event, if I were writing a biography, or an autobiography, could I not claim a genuine authority, based not only on knowledge (research), perceptivity, and inclusiveness but also on a strenuous attempt to be objective?

When scientists come out and state that they cannot achieve objectivity, and historians follow suit, a certain demoralisation may follow. Objectivity was an ideal to journalists, too. If the scientists abandon it, why should a poor stiff working part-time for the local foreign-corporation-owned rag even try for it?

Yet most journalists still profess the ideal of objective reporting, even when it comes to highly subjective matters. No proper journalist has ever admitted that anybody who does or suffers anything that brings them into public attention, intentionally or not, has any right to privacy. But in practice a journalist can respect privacy by describing objective actions and speech and leaving subjective motives, thoughts, and feelings to be deduced from the description; and outside the tabloids, most journalists do that. Serious journalism defines itself by the avoidance of speculation presented as fact.

Though they may have abandoned the claim to objectivity, serious history and biography define themselves in the same way. As soon as the writer tells us what Napoleon murmured to Josephine in bed and how Josephine’s heart went pitpat, we know we’re nearer Oz than Paris.

Many readers, of course, want Oz, not Paris. They’re reading for the story, and don’t care if the story is inaccurate or if the characters make a travesty of the historical figures they’re based on.

Why, then, are they reading history rather than a novel? Is it because they distrust the novel as being “made up,” while the narrative that calls itself history or biography, however dishonest, is “real”?

Such a bias, reflecting the Puritan judgmentalism so common in American minds, turns up in many and unlikely places. I hear a ring of such absolutism in both the New York Times Book Review quotations above, with their emphasis on “theatricality” and “conjuring.” You can’t tell the whole truth; nothing less will do; so you fake it.

But it’s equally possible that many or most American readers are genuinely indifferent to the distinction of fiction from nonfiction. These categories mean little or nothing in preliterate cultures, and even now, when the written word is the word that counts, ever more so as we increasingly communicate via electronic media, perhaps they are not generally seen as carrying any great intellectual or ethical significance.

This perception may be in part connected to the increasing electronicisation of writing. In so far as writing becomes electronic, surely its categories and genres will change. So far, the new technology has influenced fiction only by opening to the novelist the garden of forking paths accessible through hypertext. Genuinely interactive fiction, where the reader would control the text equally with the writer, remains hype or a promise (or, to some, a threat). As for nonfiction, it seems that scant care for accuracy and fact checking, along with wide tolerance of hearsay and opinion, characterise a lot of what passes for information on the Internet. The transitory nature of Net communication encourages a freedom like that of private conversation. Rumormongering, gossip, pontification, unverified quotation, and backchat all flow freely through cyberspace, shortcutting the skills and/or self-restraints of both fiction and factual writing. The pseudo-oral, pseudonymous, transitory character of electronic writing encourages an easy abdication of the responsibility that accrues to print. But that responsibility may be truly out of place in the Net. A new form of writing has to develop its own aesthetic and ethic. That’s to come. In this essay I’m talking about print, the essence of which is that it gives writing reproducible permanence. All permanence in human terms involves responsibility.

A group I belong to that gives annual awards to writers got a letter recently asking us to divide our nonfiction prize into two—one for historical nonfiction and one for creative nonfiction. The first term was new to me, the second was familiar.

Writing workshops and programs all over the country now offer courses in “creative nonfiction.” The arts of scientific, historical, and biographical narrative are rarely if ever taught in such programs (or anywhere else). Autobiography, however, has been increasingly popular in the writing programs. It may be taught as journal writing or as therapy through self-expression. When it has more literary goals, it is called creative nonfiction, personal essay, and memoir.

The writer of a memoir, like the responsible biographer, ethnographer, or journalist, used to describe what other people did and said, leaving what they may have felt and thought as implications to be drawn by the reader or as authorial speculation identified as such. The autobiographer limited her account to her own memory of how her uncle Fred looked as he ate the grommet, what she heard him say when he’d swallowed it, and what she thought about it. The only sensations and emotions she described were her own.

According to those who defend the use of fictional devices and elements in nonfiction, the memoirist is justified in telling us how, as he swallowed it, Fred vividly recalled the slightly oily taste of the first grommet he ever ate, fifty years ago in Indiana, and how bittersweet the memory was to him.

Many writers and readers of creative nonfiction hold that such ascription of inward thought or feeling, if it’s based on a knowledge of Fred’s character, is legitimate. It does no harm to Fred (who died in 1980 of a surfeit of grommets), and no harm to the reader, who after all will almost certainly know Fred only in and through the story, just as if he were a character in a novel.

But who is to certify the writer’s knowledge of her uncle’s character as accurate, unbiased, reliable? Possibly her aunt, but we’re not likely to have the chance to consult her aunt. The memoirist’s responsibility seems to me to be exactly that of the ethnographer: not to pretend to objectivity, but also not to pretend to be able to speak for anybody but oneself. To assign oneself the power to tell us what another person thought or felt is, to my mind, co-optation of a voice: an act of extreme disrespect. The reader who accepts the tactic colludes in the disrespect.

Characters “come alive” in a story, fictive or factual, they “seem real,” not, of course, through the mere report of their actions and words, but by selection, suppression, rearrangement, and interpretation of that material. I take it this is what Mr. Di Piero, quoted above, meant by “Remembering is an act of the imagination.” (It may be what Genly Ai, in my novel The Left Hand of Darkness, meant by saying that he was taught on his home world that “Truth is a matter of the imagination”; but Genly, of course, wasn’t real.)

The most cogent argument in support of the use of invention in nonfiction is, then: as fiction involves the arrangement, manipulation, and interpretation of inventions, so creative nonfiction involves the arrangement, manipulation, and interpretation of actual events. A short story is an invention, a memoir is a reinvention, and the difference between them is negligible.

I accept the terms, but the conclusion makes me uneasy.

It’s not just that many readers evidently don’t know whether a story they just read is factual, invented, or a mixture, and don’t care. They do care, in the sense that I discussed above: American readers tend to value factuality over invention, reality over imagination. They’re uncomfortable with the fictivity of fiction.

Perhaps this is why they beg novelists to tell them, “Where do you get your ideas from?” The only honest answer is of course “I make them up,” but that’s not the answer they want. They want specific sources. In my experience, most readers vastly exaggerate the dependence of fiction on research and immediate experience. They assume that characters in a story are “taken” from somebody the author knows, “based on” a specific person used as “copy,” and believe that a story or novel is necessarily preceded by “research.”

(This latter illusion may rise from the necessity most writers are under of writing applications for grants. You can’t tell the guys with the money that you don’t actually need to spend six months in the Library of Congress doing research for your novel. You’ve been drawing maps of Glonggo ever since you were ten, you worked out the curious mores and social structure of the Glonggovians when you were twenty, the plot and characters of Thunder-Lords of Glonggo are ready and waiting in your mind, and all you need is the six months to write the story and some peanut butter to live on. But peanut butter and made-up stories aren’t what grants are given for. Grants are for serious things, like research.)

The notion that fictional characters are all portraits of actual people probably arises from natural vanity and paranoia, and is encouraged by the power fantasies of some fiction writers (you’re nothing to me but copy). Tracing back elements of great novel characters—Jane Eyre, Natasha, Mrs. Dalloway—to this or that element of real people the writer knew is an entertaining and sometimes revealing criticobiographical game. But involved in all such searches for the nonfiction in the fiction is, I suspect, a distrust of the fictive, a resistance to admitting that novelists make it up—that fiction is not reproduction, but invention.

If invention is so much distrusted, why is it admitted where it doesn’t belong?

Maybe this insistence that fiction is “really” not made up but derived immediately from fact is what has established the confusion of modes that, as if reciprocally, permits the entry of fictional data into purported nonfiction.

Nothing comes from nothing. The novelist’s “ideas” do come from somewhere. The poet Gary Snyder’s finely unpoetic image of composting is useful here. Stuff goes into the writer, a whole lot of stuff, not notes in a notebook but everything seen and heard and felt all day every day, a lot of garbage, leftovers, dead leaves, eyes of potatoes, artichoke stems, forests, streets, rooms in slums, mountain ranges, voices, screams, dreams, whispers, smells, blows, eyes, gaits, gestures, the touch of a hand, a whistle in the night, the slant of light on the wall of a child’s room, a fin in a waste of waters. All this stuff goes down into the novelist’s personal compost bin, where it combines, recombines, changes; gets dark, mulchy, fertile, turns into ground. A seed falls into it, the ground nourishes the seed with the richness that went into it, and something grows. But what grows isn’t an artichoke stem and a potato eye and a gesture. It’s a new thing, a new whole. It’s made up.

That’s how I understand the process of using fact, experience, memory, in fictional narrative.

It seems to me the process of using fact, experience, memory in nonfiction is entirely different. In a memoir, the artichoke stem remains itself. The remembered light that slants across the wall can be placed and dated: a room in a house in Berkeley in 1936. These memories are immediate to the writer’s mind. They weren’t composted, but saved.

Memory is an active and imperfect process. Memories are shaped and selected, often profoundly, in that process. Like souls in heaven, they are saved, but changed. When the writer comes to make them into a coherent story, in the interests of clarity, comprehensibility, impetus, and other aims of narrative art, they’ll be selected from, emphasised, omitted, interpreted, and thoroughly worked over.

Nothing in these processes makes them fictional. They’re still, to the best of the author’s ability, genuine memories.

But if the remembered facts are deliberately changed or rearranged, they become false. If the artichoke stem is made a zinnia because the writer finds the zinnia more aesthetically effective, if the light falls aslant on the wall in 1944 because that date fits more conveniently into the narrative, they’re no longer facts or memories of facts. They are fictional elements in a piece that calls itself nonfiction. And when in reading a memoir I suspect or identify such elements, they cause me intense discomfort.

I’ll let Tolstoy tell me what Napoleon thought and felt, because, although his novel is full of well-researched historical facts, that’s not why I’m reading it. I’m reading it for the values proper to the novel, as a work of invention. If certain aspects of the author’s uncle Fred get into a short story where he’s called Cousin Jim and eats washers, I’ll accept their rubbery taste without a qualm, because it’s a story, and I take Cousin Jim to be a fictional character. It’s when I’m not quite sure what I’m reading that qualms arise.

It can happen even when there is a surfeit of fact in what calls itself fiction.

Reading for a jury for a fiction award, I fretted to a fellow juror about one of the books: was it really a novel? It read like a pure relation of the author’s boyhood, an honest, accurate, touching memoir barely disguised with a few name changes. How could we tell? “The author calls it a novel,” said my friend, “and so I read it as fiction and judge it as such.” Dealer’s call. If the writer calls it nonfiction, read it as fact; if the writer calls it a novel, read it as fiction.

I tried. I couldn’t do it. Fiction involves invention; fiction is invention. I can’t read a book in which nothing is invented as a novel. I couldn’t give a fiction award to a book that contains only facts. Any more than I could give a prize for journalism to The Lord of the Rings.

A real novel, an entirely fictive and imaginative tale, can contain vast amounts of fact without being any less fictional for it. Historical fiction and science fiction (which, by the way, often really does require research) may be full of solid, useful information concerning an era or a body of knowledge. The ploy of the whole realistic genre is to put invented characters into a framework of reproduced actuality—imaginary toads in a real garden, to twist Marianne Moore. All fiction serves later generations as descriptive evidence of its time, place, society; for keen observation and recording of ordinary people’s lives, very little ethnography has ever equalled the novel.

But it doesn’t work the other way. The historian, biographer, anthropologist, autobiographer, nature writer, have to use real gardens and real toads. Therein lies their proper creativity: not in inventing, but in making recalcitrant reality into a story without faking it.

Anything written contains an implicit contract, which can be honored or broken in the writing, or in the reading, or in the presentation by the publisher.

The first and most tenuous and intangible contract is between the writer and his or her conscience, and goes something like this: In this piece I will try to tell my story truly, using the means I find appropriate to the form, whether fiction or nonfiction.

Then there’s a more verifiable agreement between the writer and the reader, the terms of which vary immensely, depending, in the first place, on the sophistication of both. An experienced reader may follow a sophisticated writer through a whole gallery of tricks and illusions with perfect confidence that there will be no aesthetic betrayal. For more naive readers, however, the terms of the contract depend largely on how the writer—and publisher—present the work: as factual, imaginative, or a mixture of the two.

Reader as well as writer can twist the terms of this contract, reading a novel as if it were an account of actual events, or a piece of reportage as if it were pure invention.

Despite the great affinitie of fiction and beleefe, only the very innocent believe what novelists tell them. But an attitude of distrust towards nonfiction may well be the result of experience. One has been disappointed so often.

For though a whole swarm of facts in a novel doesn’t in the least invalidate the invention as a whole, every fictive or even inaccurate element in a narrative that presents itself as factual puts the whole thing at risk. To pass a single invention off as a fact is to damage the credibility of the rest of the narrative. To keep doing so is to disauthenticate it entirely.

Lincoln’s aphorism about fooling people applies, as usual. The writer who reports inaccurately or presents invention as fact is, consciously or not, exploiting the reader’s ignorance. Only the informed reader is aware that the contract has been violated. If amused enough, this reader may privately rewrite the contract, reading the so-called nonfiction as mere entertainment, hokum—fiction in the OED’s fifth definition.

Perhaps the terms of the contract are currently being rewritten by the writers. Perhaps the whole idea of a contract is hopelessly prepostmodern, and readers are coming to accept false data in nonfiction as calmly as they accept factual information in fiction.

Certainly we’ve become so numbed by the quantity of unverifiable information poured out upon us that we admit factoids as more or less equivalent to facts. And with the same numbness, we’re generally acceptant of hype of all kinds—advertising, stories about celebrity figures, political “leaks,” patriotic and moralistic declarations, and so on—reading it without much caring if the material is credible or that we’re being treated as objects of manipulation.

If this nondistinction of the fictive and the factual is a general trend, maybe we should celebrate it as a victory of creativity over unimaginative, indiscriminate factualism. I worry about it, however, because it seems to me that by not distinguishing invention from lying it puts imagination itself at risk.

Whatever “creative” means, I don’t think the term can fairly be applied to falsification of data and memories, whether intentional or “inevitable.”

Excellence in nonfiction lies in the writer’s skills in observing, organising, narrating, and interpreting facts—skills entirely dependent on imagination, used not to invent, but to connect and illuminate observation.

Writers of nonfictional narrative who “create” facts, introduce inventions, for the sake of aesthetic convenience, wishful thinking, spiritual solace, psychic healing, vengeance, profit, or anything else, aren’t using the imagination, but betraying it.

AWARD AND GENDER

This was given as a talk and a handout at the Seattle Book Fair in 1999.

In 1998 I was on a jury of three choosing a literary prize. From 104 novels, we selected a winner and four books for the shortlist, arriving at consensus with unusual ease and unanimity. We were three women, and the books we chose were all written by women. The eldest and wisest of us said, Ouch! If a jury of women picks only women finalists, people will call us a feminist cabal and dismiss our choices as prejudiced, and the winning book will suffer for it.

I said, But if we were men and picked all books by men, nobody would say a damn thing about it.

True, said our Wise Woman, but we want our winner to have credibility, and the only way three women can have credibility as a jury is to have some men on the short list.

Against my heart and will, I agreed. And so two women who should have been there got bumped from our shortlist, and the two men whose books we had placed sixth and seventh got on it.

Literary awards used to be essentially literary events. Though a prize such as the Pulitzer certainly influenced the sale of the book, that wasn’t all it was valued for. Since the takeover of most publishing houses by their accounting departments, the financial aspect of the literary award has become more and more important.

These days, literary prizes carry a huge weight in fame, money, and shelf longevity.

But only some of them. Certain awards are newsworthy and success-assuring: most of them are not. The selection of which prize is sure to hit the headlines and which is ignored seems to be almost totally arbitrary. The media follow habit without question. Hysteria about the Booker Prize is assured; general indifference to the PEN Western States Award is certain.

Most writers who have served on award juries agree that the field of finalists is often so qualitatively even that selection of a single winner is essentially arbitrary. Many also agree that the field of finalists often contains books so various in nature and intent that the selection of a single winner is, again, essentially arbitrary. But a single winner is what is demanded of them, so they provide it. Then publishers capitalise on it, bookstores fawn on it, libraries stock their shelves with it, while the shortlist books are forgotten.

I feel that the competitive, single-winner pattern is suited to sports events but not to literature, that the increasingly exaggerated dominance of the “big” awards in the field of fiction is pernicious, and that the system inevitably perpetuates cronyism, geographical favoritism, gender favoritism, and big-name syndrome.

Of these, gender favoritism particularly irks me. It is so often and so indignantly denied that I began to wonder if I was irked over nothing. I decided to try to find if my impression that the great majority of literary awards went to men had any foundation in fact. To establish my facts, I limited my study to fiction.

If more men than women publish fiction, that would of course justify an imbalance towards male prizewinners. So to start with I did some gender sampling of authors of novels and story collections published in various periods from 1996 to 1998. My time was limited and my method was crude. The numbers (only about a thousand writers in all) may not be large enough to be statistically significant. My author-gender count covers only four recent years, while my figures on the awards go back decades. (A study on author gender in fiction in the whole twentieth century would be a very interesting subject for a thesis.) My sources were Publishers Weekly for general fiction, What Do I Read Next? for genre fiction, and the Hornbook for children’s books. I counted authors by sex, omitting collaborations and any names that were not gender identifiable. (My genre sources identified aliases. Rumor has it that many romances are written by men under female pen names, but I found only one transgenderer—a woman mystery writer who used a male name.)

AUTHOR GENDER

Summations

(see Details of the Counts and Awards, below)

— General fiction: 192 men, 167 women: slightly more men than women.

— Genre fiction: 208 men, 250 women: more women than men.

— Children’s books and young adult: 83 men, 161 women: twice as many women as men.

— All genres: 483 men, 578 women: about 5 women to 4 men.

Eighty of the authors in my Genre category were romance writers, all women; if you consider them as probably balanced by predominantly male-written genres such as sports, war, and porn, which I did not have figures for, you might arrive at parity. It looks as if, overall, as many women as men, perhaps slightly more women than men, write and publish novels and stories.

Author gender in fiction is pretty near 1:1.
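As a check on the summations, the counts can be totted up in a few lines (a minimal sketch in Python; the figures are simply those reported above):

```python
# Author-gender counts reported above, as (men, women) pairs
counts = {
    "general fiction": (192, 167),
    "genre fiction": (208, 250),
    "children's and young adult": (83, 161),
}

men = sum(m for m, w in counts.values())
women = sum(w for m, w in counts.values())

print(men, women)             # 483 578
print(round(women / men, 2))  # about 1.2, roughly 5 women to 4 men
```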

Now for the gender counts and ratios for literary prizes. Ideally I would have listed the shortlists or runners-up where available, but given the shortness of time in which I had to prepare this paper, and the shortness of life, I list only winners. (Information on most awards, including shortlists, winners, and sometimes jurors, is accessible at libraries and on the Net.)

The years covered are the years the prize has been given, up to 1998—these spans of course vary greatly. The oldest is the Nobel Prize in Literature.

I did not try to find out the gender composition of the juries of any of these awards, though many are on record. I wish I had the time to go into this and find out whether juries are gender balanced or not, whether the balance has changed over time, and whether gender composition influences their choices. One might well assume that men tend to pick men and women women, but if juries are even moderately balanced between men and women, my figures do not support this assumption. It looks as if men and women tend to pick men.

Most awards are chosen by a judge or panel of judges, but some genre prizes are voted by readers or (in the case of the Nebula Award) fellow writers in the genre.

(In this context I want to point out that the MacArthur “genius awards” are nominated by “experts” chosen by the MacArthur Foundation, and the winners are selected by a board chosen by the Foundation—a permanently secret board, whose members are therefore, in the true meaning of the word, irresponsible. In all the arts awards given by the MacArthur Foundation, I find the 3:1 gender ratio—three men to one woman—so consistent that I must assume it is the result of deliberate policy.)

GENDER RATIO OF LITERARY PRIZES, MALE TO FEMALE
(in order of most extreme imparity to nearest parity)

— Nobel Prize in Literature, 10:1

— PEN/Faulkner Award for Fiction, 8.5:1

— Edgar Grand Master Award (mystery), 7:1

— National Book Award (now American Book Award), 6:1

— World Fantasy Lifetime Achievement Award, 6:1

— Pulitzer Prize for Literature, since 1943, 5:1

— Edgar Award for Best Novel, since 1970 (mystery), 5:1

— Hugo Award (science fiction) (reader vote), 3:1

— World Fantasy Best Novel Award, 3:1

— Newbery Award (juvenile), 1:2 (the one award favoring women)

— Nebula Award (science fiction) (voted by fellow writers), 2.4:1

— Pulitzer Prize for Literature, till 1943, 2:1

— Edgar Award for Best Novel, till 1970 (mystery), 2:1

— Booker Prize, 2:1

SOME OBSERVATIONS

Though the number of men and women writing literary fiction is nearly equal, the “big” literary awards, Nobel, National Book Award, Booker, PEN, Pulitzer, give 5.5 prizes to men for every 1 to a woman. Genre awards average 4 to 1, so a woman stands a better chance of getting a prize if she writes genre fiction.

Among all the prizes I counted, the ratio is 4.5:1—for every woman who gets a fiction prize, four and a half men do; or, to avoid the uncomfortable idea of half-men, you can say that nine men get a prize for every two women who do.

Except in the Nobel, which gave three women prizes in the nineties, there was no gain in gender parity in these prizes during the twentieth century, and in some cases a drastic decline. I broke the figures for the Pulitzer into before and after 1943, and the Edgar Best Novel into before and after 1970, to demonstrate the most notable examples of this decline. There would have to have been a massive change in author gender, a great increase in the number of men writing fiction in these fields, to explain or justify the increasing percentage of male award winners. I do not have the figures, but my impression is that there has not been any such great increase; my guess is that the fifty-fifty ratio of men and women writing fiction has been fairly constant through the twentieth century.

In children’s literature, where by my rough count women authors substantially outnumber men, women also win about twice as many prizes as men; the Newbery is the one award I counted whose results roughly reflect author gender.

Nearly two-thirds of mystery writers are women, but men get three times as many prizes as women, and since 1970, five times as many.

The inescapable conclusion is that prize juries, whether they consist of readers, writers, or pundits, through conscious or unconscious prejudice, reward men four and a half times more than women.

The escapable conclusion is that men write fiction four and a half times better than women. This conclusion appears to be acceptable to many people, so long as it goes unspoken.

Those of us who do not find it acceptable have to speak.

Literary juries and the sponsors of awards need to have their prejudices queried and their consciousness raised. The perpetuation of gender prejudice through literary prizes should be challenged by fair-minded writers: through discussions such as this, through further and better research, and through letters of comment and protest to the awarding bodies, to literary publications, and to the press.

DETAILS OF THE COUNTS AND AWARDS

This appendix is for people who enjoy details and want to see how my system of determining author gender and gender parity worked, or suggest how it might be improved, enlarged, and updated—a job I would gladly hand on to anybody who wants to undertake it…. And I have made some notes and observations on various outcomes and oddities.

Author Gender (Novels and Story Collections)

(MF indicates male to female)

“Literary” Fiction

— Hardcover: men 128, women 98. MF ratio 1.3:1

— Trade paperback: men 64, women 69. MF ratio near parity

“Genre” Fiction

— Mystery: men 52, women 72. MF ratio 0.7:1

— Romance: men 0, women 80. MF ratio 0:1

— Western: men 60, women 22. MF ratio 3:1

— Fantasy: men 39, women 40. MF ratio near parity.

— Science fiction: men 57, women 35. MF ratio 1.6:1

“Juvenile” Fiction

— Children, age 6–12: men 80, women 117. MF ratio 0.7:1

— Young adult: men 23, women 44. MF ratio 1:2

Summary

— “Literary” fiction: men 192, women 167

— “Genre” fiction: men 208, women 249

— “Juvenile” fiction: men 103, women 161

These categories, derived from my reference sources, should be taken with extreme distrust, which is why I put them in quotes. Genre, as generally understood, is itself a suspect concept. Many of the books could well have been listed in two or even three categories.


Total Count of Gender of Authors of Novels and Story Collections

— Total authors: 1,080

— Men 503, women 577

— Approximate MF ratio 5:6
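
The arithmetic behind these totals can be checked directly from the per-category counts listed above. A minimal sketch in Python (the category labels are mine; the numbers are the counts given in this appendix):

```python
# Per-category author counts (men, women) from the lists above.
counts = {
    "hardcover literary": (128, 98),
    "trade paperback literary": (64, 69),
    "mystery": (52, 72),
    "romance": (0, 80),
    "western": (60, 22),
    "fantasy": (39, 40),
    "science fiction": (57, 35),
    "children 6-12": (80, 117),
    "young adult": (23, 44),
}

men = sum(m for m, w in counts.values())
women = sum(w for m, w in counts.values())

print(men, women, men + women)  # 503 577 1080
print(round(men / women, 2))    # 0.87, close to the approximate 5:6 above
```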


Author Gender in Awards Given to Novels and Story Collections

The Nobel Prize in Literature (voted by a special board)

Between 1901 and 1998, the prize was given 91 times (it was not given 7 times, notably during World War Two). It has been split twice between two men and once between a man and a woman, so that the totals have decimals.

Men 85.5, women 8.5. MF ratio almost exactly 10:1

The years women were given the Nobel for Literature were 1909, 1926, 1928, 1938, 1945, 1966, 1991, 1993, 1996: pretty much one woman per decade, till the nineties when three women were given prizes.


The Pulitzer Prize for Literature (voted by a jury of writers)

Given since 1918, with six no-award years.

Men 50, women 23. MF ratio just over 2:1

The ratio has declined severely from parity since 1943. Of the 23 awards to women, 12 were given in the 26 years 1918–1943, but only 11 in the 55 years 1944–1998. Since 1943, though half or more of the shortlist authors are often women, 5 out of 6 winners have been men (MF ratio 5:1).


The Booker Prize (voted by a jury of writers and critics)

Given since 1969.

Men 21, women 11. MF ratio 2:1

This ratio has been pretty steady over 30 years, remaining the nearest to parity of the prizes I examined.


The National Book Award/American Book Award

Given since 1950, with various types of jury, various sponsors, and several changes of category in fiction, so it is hard to count. As well as I can determine, the “Best Novel” award (excluding genre and juvenile) has been as follows:

Men 43, women 7. MF ratio 6:1


The PEN/Faulkner Award for Fiction (voted by a jury)

Given since 1981.

Men 17, women 2. MF ratio 8.5:1

As there are always women on the shortlist for the PEN/Faulkner, I was startled, in fact shocked, to discover how few have been given the award. This prize is almost as male oriented as the Nobel.


The Nebula Award (science fiction and fantasy; voted by public nomination and secret ballot of members of the Science Fiction and Fantasy Writers of America)

Given since 1965.

Men 24, women 10. MF ratio 2.4:1


The Hugo Award (science fiction; voted by ballot of members of the World Science Fiction Convention)

Given since 1953.

Men 36, women 11. MF ratio 3:1

I find it interesting that these two balloted awards, the Nebula selected by writers and the Hugo by fans, are nearer parity than several juried awards, and far nearer parity than the similarly balloted Edgar.


The World Fantasy Award (given by a jury, plus an anonymous decision)

Best Novel (split awards cause decimals):

Men 18.5, women 5.5. MF ratio 3:1

Lifetime Achievement (16 awards plus a 5-way split):

Men 17, women 3. MF ratio 6:1


The Edgar Award

Best Novel (mystery; voted by the members of the Mystery Writers of America)

Given since 1946.

Men 39, women 13. MF ratio 3:1

This ratio is for the whole 52 years. From 1946 to 1970, 16 men and 8 women were given the prize, making the ratio 2:1. But in the 28 years since 1970, despite the fact that considerably more women than men write mysteries, only 5 women have won “Best Novel,” making the MF ratio almost 5:1.


Grand Master

First given in 1955, to Agatha Christie. For the next 15 years, only men were made Grand Masters. By 1998, of the 46 Grand Masters, 35 were men, 8 women—but 3 of those 8 women shared a single award. No men have been asked to share their Grand Mastery. Counting the 3-in-1 as a single award, the MF ratio is 7:1.


The Newbery Award (for excellence in children’s literature; voted by a “panel of experts”)

Given since 1922.

1922–1930, all the awards went to men; 1931–1940, all to women. From 1941 to 1998: men 16, women 40. As about 1 out of 3 authors of books for children and young adults is a man, the prize is a pretty fair reflection of author gender.[1]

ON GENETIC DETERMINISM

I wrote this piece as a reader’s personal response to a text. Finding myself troubled by many of E. O. Wilson’s sweeping statements, I tried to figure out what was troubling me. I did it in writing because I think best in writing. An amateur responding to a professional is likely to make a fool of herself, and no doubt I’ve done just that; but I decided to publish the piece. I am not pitting my opinions against scientific observation; I am pitting my opinions against a scientist’s opinions. Opinions and assumptions, when presented by a distinguished scientist, are likely to be mistaken for scientific observations—for fact. And that was what troubled me.

In his very interesting autobiography, Naturalist, E. O. Wilson summarises the statement of the biological foundations of human behavior made in his book Sociobiology:

Genetic determinism, the central objection raised against [Sociobiology], is the bugbear of the social sciences. So what I said that can indeed be called genetic determinism needs saying here again. My argument ran essentially as follows: Human beings inherit a propensity to acquire behavior and social structures, a propensity that is shared by enough people to be called human nature. The defining traits include division of labor between the sexes, bonding between parents and children, heightened altruism toward closest kin, incest avoidance, other forms of ethical behavior, suspicion of strangers, tribalism, dominance orders within groups, male dominance overall, and territorial aggression over limiting [limited?] resources. Although people have free will and the choice to turn in many directions, the channels of their psychological development are nevertheless—however much we might wish otherwise—cut more deeply by the genes in certain directions than in others. So while cultures vary greatly, they inevitably converge toward these traits…. The important point is that heredity interacts with environment to create a gravitational pull toward a fixed mean. It gathers people in all societies into the narrow statistical circle that we define as human nature. (E. O. Wilson, Naturalist, pp. 332, 333)

That human beings inherit a propensity to acquire behavior and that the construction of society is one of these behaviors, I agree. Whether anything worth the risk is to be gained by calling this propensity “human nature,” I wonder. Anthropologists have excellent justification for avoiding the term human nature, for which no agreed definition exists, and which all too easily, even when intended as descriptive, is applied prescriptively.

Wilson states that the traits he lists constitute a “narrow statistical circle that we define as human nature.” Like Tonto, I want to ask, “Who ‘we,’ white man?” The selection of traits is neither complete nor universal, the definitions seem sloppy rather than narrow, and the statistics are left to the imagination. More statistics and completer definitions are to be found in Sociobiology, of course; but Wilson’s own statement of what the book says is as accurate and complete as it is succinct, so that I think it fair to address my arguments to it.

Taking it, then, phrase by phrase:


Division of labor between the sexes:

This phrase means only that in all or most known societies men and women do different kinds of work; but since it is very seldom understood in that strict meaning, it is either ingenuous or disingenuous to use it in this context without acknowledging its usual implications. Unless those implications are specifically denied, the phrase “division of labor between the sexes” is understood by most readers in this society to imply specific kinds of gender-divided work, and so to imply that these are genetically determined: our genes ensure that men hunt, women gather; men fight, women nurse; men go forth, women keep the house; men do art, women do domestic work; men function in the “public sphere,” women in the “private,” and so on.

No anthropologist or person with an anthropological conscience, knowing how differently work is gendered in different societies, could accept these implications. I don’t know what implications, if any, Wilson intended. But as this kind of unstated extension of reductionist statements does real intellectual and social damage, reinforcing prejudices and bolstering bigotries, it behooves a responsible scientist to define his terms more carefully.

As some gendered division of labor exists in every society, I would fully agree with Wilson if he had used a more careful phrasing, such as “some form of gender construction, including gender-specific activities.”


Bonding between parents and children; heightened altruism toward closest kin; suspicion of strangers:

All these behaviors are related, and can be defined as forms of “selfish gene” behavior; I think they have been shown to be as nearly universal among human beings as among other social animals. But in human beings such behavior is uniquely, and universally, expressed in so immense a range of behaviors and social structures, of such immense variety and complexity, that one must ask if this range and complexity, not present in any animal behavior, is not as genetically determined as the tendencies themselves.

If my question is legitimate, then Wilson’s statement is unacceptably reductive. To focus on a type of human behavior shared with other animals, but to omit from the field of vision the unique and universal character of such behavior among humans, is to beg the question of how far genetic determination of behavior may extend. Yet that is a question that no sociobiologist can beg.


Tribalism:

I understand tribalism to mean an extension of the behavior just mentioned: social groups are extended beyond immediate blood kin by identifying nonkin as “socially kin” and strangers as nonstrangers, establishing shared membership in constructs such as clan, moiety, language group, race, nation, religion, and so on.

I can’t imagine what the mechanism would be that made this kind of behavior genetically advantageous, but I think it is as universal among human groups as the behaviors based on actual kinship. If universality of a human behavior pattern means that it is genetically determined, then this type of behavior must have a genetic justification. I think it would be a rather hard one to establish, but I’d like to see a sociobiologist try.


Incest avoidance:

Here I’m uncertain about the evolutionary mechanism that enables the selfish gene to recognise a selfish gene that is too closely akin, and so determines a behavior norm. If there are social mechanisms preventing incest among the other primates, I don’t know them. (Driving young males out of the alpha male’s group is male-dominant behavior serving only incidentally and ineffectively as incest prevention; the alpha male does mate with his sisters and daughters, though the young males have to go find somebody else’s.)

I’d like to know whether Wilson knows what the general incidence of incest among mammals is, and whether he believes that incest is “avoided” among humans any more than it is among apes, cats, wild horses, and so on. Do all human societies ban incest? I don’t know; it was an open question, last I heard. That most human societies have cultural strictures against certain types of incest is true; that many human societies fail as often as not to implement them is also true. I think in this case Wilson has confused a common cultural dictum or desideratum with actual behavior; or else he is saying that our genes program us to say we must not do something, but do not prevent us from doing it. Now those are some fancy genes.


Dominance orders within groups:

Here I suspect Wilson’s anthropology is influenced by behaviorists’ experiments with chickens and primatologists’ observations of apes more than by anthropologists’ observation of human behavior in groups. Dominance order is very common in human societies, but so are other forms of group relationship, such as maintaining order through consensus; there are whole societies in which dominance is not the primary order, and groups in most societies in which dominance does not function at all, difficult as this may be to believe at Harvard. Wilson’s statement is suspect in emphasising one aspect of behavior while omitting others. Once again, it is reductive. It would be more useful if phrased more neutrally and more accurately: perhaps, “tendency to establish structured or fluid social relationships outside immediate kinship.”


Male dominance overall:

This is indeed the human social norm. I take it that the genetic benefit is that which is supposed to accrue in all species in which the male displays to the female to attract her choice and/or drives away weaker males from his mate or harem, thus ensuring that his genes will dominate in the offspring (the male selfish gene). Species in which this kind of behavior does not occur (including so close a genetic relative as the bonobo) are apparently not considered useful comparisons or paradigms for human behavior.

That male aggressivity and display behavior extend from sexuality to all forms of human social and cultural activity is indubitable. Whether this extension has been an advantage or a liability to our genetic survival is certainly arguable, probably unprovable. It certainly cannot simply be assumed to be of genetic advantage in the long run to the human, or even the male human, gene. The “interaction of heredity with environment” in this case has just begun to be tested, since only in the last hundred years has there been a possibility of unlimited dominance by any subset of humanity, along with unlimited, uncontrollable aggressivity.


Territorial aggression over limiting [sic] resources:

This is evidently a subset of “male dominance overall.” As I understand it, women’s role in territorial aggression has been subsidiary, not institutionalised, and seldom even recognised socially or culturally. So far as I know, all organised and socially or culturally sanctioned aggression over resources or at territorial boundaries is entirely controlled and almost wholly conducted by men.

It is flagrantly false to ascribe such aggression to scarcity of resources. Most wars in the historical period have been fought over quite imaginary, arbitrary boundaries. It is my impression of warlike cultures such as the Sioux or the Yanomamo that male aggression has no economic rationale at all. The phrase should be cut to “territorial aggression,” and attached to the “male dominance” item.


Other forms of ethical behavior:

This one’s the big weasel. What forms of ethical behavior? Ethical according to whose ethics?

Without invoking the dreaded bogey of cultural relativism, I think we have a right to ask anybody who asserts that there are universal human moralities to list and define them. If he asserts that they are genetically determined, he should be able to specify the genetic mechanism and the evolutionary advantage they involve.

Wilson appends these “other forms” to “incest avoidance,” which is thus syntactically defined as ethical behavior. Incest avoidance certainly involves some genetic advantage. If there are other behaviors that involve genetic advantage and are universally recognised as ethical, I want to know what they are.

Not beating up old ladies might be one. Grandmothers have been proved to play a crucial part in the survival of grandchildren in circumstances of famine and stress. Their genetic interest of course is clear. I doubt, however, that Wilson had grandmothers in mind.

Mother-child bonding might be one of his “other forms of ethical behavior.” It is tendentious, if not hypocritical, to call it “bonding between parents and children” as Wilson does, since it is by no means a universal cultural expectation that the male human parent will, or should, bond with his child. The biological father is replaced in many cultures by the mother’s brother, or serves only as authority figure, or (as in our culture) is excused from responsibility for children he sired with women other than his current wife. A further danger in this context is that the mother-child bond is so often defined as “natural” as to be interpreted as subethical. A mother who does not bond with her child is defined less as immoral than as inhuman. This is an example of why I think the whole matter of ethics, in this context, is a can of actively indefinable worms far better left unopened.

Perhaps that’s why Wilson left these “other ethical behaviors” so vague. Also, if he had specified as ethical such behavior as, say, cooperation and mutual aid between individuals not blood kin, he would have risked his credibility with his fellow biologists still trained to interpret behavior strictly in the mechanistic mode.

Finally, I wonder if genetic determinism as such really is “the bugbear of the social sciences.” Academics are the very model of territorialism, and some social scientists certainly responded with fear and fury to what they saw as Wilson’s aggression when he published Sociobiology. But on the whole, Wilson’s statement seems a little paranoid, or a little boastful.

The controversy and animus aroused by Sociobiology might have been avoided if the author had presented his determinism in more precise, more careful, less tendentious, less anthropologically naive terms. If in fact his theory is not a bugbear to the social sciences, it’s because it has not proved itself useful or even relevant to them.

I’d find his arguments far more interesting if he had genuinely taken pains to extend his reductive theory to explain specifically human behavior, including the elaboration of the gender-based, kin-based repertory of behaviors we share with animals into the apparently infinite varieties of human social structures and the endless complexities of culture. But he has not done so.

There are social scientists and humanists, as well as determinists, who would argue that it’s the vast range and complexity of human behavioral options, in origin genetically determined, that gives us what may ultimately be the illusion of free will. But Wilson, having raised this question in Naturalist, ducks right under it with a flat statement of belief that “people have free will.” The statement as such is meaningless. I am not interested in his beliefs. He is not a religious thinker or a theologian but a scientist. He should speak as such.

ABOUT FEET

Watching a ballroom dancing competition on television, I was fascinated by the shoes the women wore. They were dancing in strapped stiff shoes with extremely high heels. They danced hard, heel and toe, kicking and prancing, clapping their feet down hard and fast with great precision. The men wore flat-heeled shoes that conformed to the normal posture of the foot. One of them had flashing jewels on the outer side of each shoe. His partner’s shoes were entirely covered with the flashing jewels, which must have made the leather quite rigid, and the heels were so high that her own heels were at least three inches above the balls of her feet, on which all her weight was driven powerfully again and again. Imagining my feet in those shoes, I cringed and winced, like the Little Mermaid walking on her knives.

The question has been asked before but I haven’t yet got an answer that satisfies me: why do women cripple their feet while men don’t?

It’s not a very long step to China, where women broke the bones of their daughters’ feet and strapped the toes under the ball of the foot to create a little aching useless ball of flesh, stinking of pus and exudations trapped in the bindings and folds of skin: the Lotus Foot, which was, we are told, sexually attractive to men, and so increased the marriageability and social value of the woman.

Such attraction is comprehensible to me as a perversity. A gendered perversity. How many women would be attracted by a man’s feet deliberately deformed, dwarfed, and smelling of rot?

So there is the question again. Why? Why do we and why don’t they?

Well, I wonder, did some Chinese women find other women’s Lotus Feet sexually attractive?

Certainly both men and women may find cruelty and suffering erotic. One person hurts the other so that one or both can feel a sexual thrill that they wouldn’t feel if neither was frightened or in pain. As in having a child’s foot broken and bound into a rotting lump and then getting an erection from fondling the rotting lump. Sadism and masochism: a sexuality dependent on pain and cruelty.

To let sexual feeling be aroused by pain and cruelty may be better—we are often told it is better—than not having any sexual feeling at all. I’m not sure. For whom is it better?

I’d like to think Chinese women looked with pity, with terror, at one another’s Lotus Feet, that they flinched and cringed when they smelled the smell of the bindings, that children burst into tears when they saw their mother’s Lotus Feet. Girl children, boy children. But what do I know?

I can understand why a mother would “give” her daughter Lotus Feet, would break the bones and knot the bindings; it’s not hard at all to understand, to imagine the circumstances that would lead a mother to make her daughter “marriageable,” that is, saleable, acceptable to her society, by torturing and deforming her.

Love and compassion, deformed, act with immense cruelty. How often have Christians and Buddhists thus deformed a teaching of compassion?

And fashion is a great power, a great social force, to which men may be even more enslaved than the women who try to please them by obeying it. I have worn some really stupid shoes myself in the attempt to be desirable, the attempt to be conventional, the attempt to follow fashion.

But that another woman would desire her friend’s Lotus Feet, find them erotic, can I imagine that? Yes, I can; but I learn nothing from it. The erotic is not the sum of our being. There is pity, there is terror.

I look at the ballroom dancer’s rigid glittering shoes with dagger heels that will leave her lame at fifty, and find them troubling and fascinating. Her partner’s flat shiny shoes are boring. His dancing may be thrilling, but his feet aren’t. And male ballet dancers’ feet certainly aren’t attractive, bundled into those soft shoes like big hotdog buns. The uncomfortable fascination comes only when the women get up on their pointes with their whole body weight on the tips of their toes, or prance in their dagger heels, and suffer.

Of course this is a sexual fascination, eroticism explains everything…. Well, does it?

Bare feet are what I find sexy—the supple, powerful arch, the complex curves and recurves of the dancer’s naked foot. Male or female.

I don’t find shod feet erotic. Or shoes, either. Not my fetish, thanks. It’s the sense of what dancers’ shoes are doing to the dancer’s feet that fascinates me. The fascination is not erotic, but it is physical. It is bodily, it is social, ethical. It is painful. It troubles me.

And I can’t get rid of the trouble, because my society denies that it is troubling. My society says it’s all right, nothing is wrong, women’s feet are there to be tortured and deformed for the sake of fashion and convention, for the sake of eroticism, for the sake of marriageability, for the sake of money. And we all say yes, certainly, all right, that is all right. Only something in me, some little nerves down in my toes that got bent awry by the stupid shoes I wore when I was young, some muscles in my instep, some tendon in my heel, all those bits of my body say No no no no. It isn’t all right. It’s all wrong.

And because my own nerves and muscles and tendons respond, I can’t look away from the dancer’s dagger heels. They pierce me.

Our mind, denying our cruelty, is trapped in it. It is in our body that we know it, and so perhaps may see how there might be an end to it. An end to fascination, an end to obedience, a beginning of freedom. One step towards it. Barefoot?

DOGS, CATS, AND DANCERS: Thoughts About Beauty

An earlier version of this piece was published in 1992 in the “Reflections” section of Allure magazine, where it was retitled “The Stranger Within.” I have fiddled around with it a good bit since then.

Dogs don’t know what they look like. Dogs don’t even know what size they are. No doubt it’s our fault, for breeding them into such weird shapes and sizes. My brother’s dachshund, standing tall at eight inches, would attack a Great Dane in the full conviction that she could tear it apart. When a little dog is assaulting its ankles the big dog often stands there looking confused—“Should I eat it? Will it eat me? I am bigger than it, aren’t I?” But then the Great Dane will come and try to sit in your lap and mash you flat, under the impression that it is a Peke-a-poo.

My children used to run at sight of a nice deerhound named Teddy, because Teddy was so glad to see them that he wagged his whiplash tail so hard that he knocked them over. Dogs don’t notice when they put their paws in the quiche. Dogs don’t know where they begin and end.

Cats know exactly where they begin and end. When they walk slowly out the door that you are holding open for them, and pause, leaving their tail just an inch or two inside the door, they know it. They know you have to keep holding the door open. That is why their tail is there. It is a cat’s way of maintaining a relationship.

Housecats know that they are small, and that it matters. When a cat meets a threatening dog and can’t make either a horizontal or a vertical escape, it’ll suddenly triple its size, inflating itself into a sort of weird fur blowfish, and it may work, because the dog gets confused again—“I thought that was a cat. Aren’t I bigger than cats? Will it eat me?”

Once I met a huge, black, balloonlike object levitating along the sidewalk making a horrible moaning growl. It pursued me across the street. I was afraid it might eat me. When we got to our front steps it began to shrink, and leaned on my leg, and I recognised my cat, Leonard; he had been alarmed by something across the street.

Cats have a sense of appearance. Even when they’re sitting doing the wash in that silly position with one leg behind the other ear, they know what you’re sniggering at. They simply choose not to notice. I knew a pair of Persian cats once; the black one always reclined on a white cushion on the couch, and the white one on the black cushion next to it. It wasn’t just that they wanted to leave cat hair where it showed up best, though cats are always thoughtful about that. They knew where they looked best. The lady who provided their pillows called them her Decorator Cats.

A lot of us humans are like dogs: we really don’t know what size we are, how we’re shaped, what we look like. The most extreme example of this ignorance must be the people who design the seats on airplanes. At the other extreme, the people who have the most accurate, vivid sense of their own appearance may be dancers. What dancers look like is, after all, what they do.

I suppose this is also true of fashion models, but in such a limited way—in modeling, what you look like to a camera is all that matters. That’s very different from really living in your body the way a dancer does. Actors must have a keen self-awareness and learn to know what their body and face are doing and expressing, but actors use words in their art, and words are great illusion makers. A dancer can’t weave that word screen around herself. All a dancer has to make her art from is her appearance, position, and motion.

The dancers I’ve known have no illusions or confusions about what space they occupy. They hurt themselves a lot—dancing is murder on feet and pretty tough on joints—but they never, ever step in the quiche. At a rehearsal I saw a young man of the troupe lean over like a tall willow to examine his ankle. “Oh,” he said, “I have an owie on my almost perfect body!” It was endearingly funny, but it was also simply true: his body is almost perfect. He knows it is, and knows where it isn’t. He keeps it as nearly perfect as he can, because his body is his instrument, his medium, how he makes a living, and what he makes art with. He inhabits his body as fully as a child does, but much more knowingly. And he’s happy about it.

I like that about dancers. They’re so much happier than dieters and exercisers. Guys go jogging up my street, thump thump thump, grim faces, glazed eyes seeing nothing, ears plugged by earphones—if there was a quiche on the sidewalk, their weird gaudy running shoes would squish right through it. Women talk endlessly about how many pounds last week, how many pounds to go. If they saw a quiche they’d scream. If your body isn’t perfect, punish it. No pain no gain, all that stuff. Perfection is “lean” and “taut” and “hard”—like a boy athlete of twenty, a girl gymnast of twelve. What kind of body is that for a man of fifty or a woman of any age? “Perfect”? What’s perfect? A black cat on a white cushion, a white cat on a black one… A soft brown woman in a flowery dress… There are a whole lot of ways to be perfect, and not one of them is attained through punishment.

Every culture has its ideal of human beauty, and especially of female beauty. It’s amazing how harsh some of these ideals are. An anthropologist told me that among the Inuit people he’d been with, if you could lay a ruler across a woman’s cheekbones and it didn’t touch her nose, she was a knockout. In this case, beauty is very high cheekbones and a very flat nose. The most horrible criterion of beauty I’ve yet met is the Chinese bound foot: feet dwarfed and crippled to be three inches long increased a girl’s attractiveness, therefore her money value. Now that’s serious no pain no gain.

But it’s all serious. Ask anybody who ever worked eight hours a day in three-inch heels. Or I think of when I was in high school in the 1940s: the white girls got their hair crinkled up by chemicals and heat so it would curl, and the black girls got their hair mashed flat by chemicals and heat so it wouldn’t curl. Home perms hadn’t been invented yet, and a lot of kids couldn’t afford these expensive treatments, so they were wretched because they couldn’t follow the rules, the rules of beauty.

Beauty always has rules. It’s a game. I resent the beauty game when I see it controlled by people who grab fortunes from it and don’t care who they hurt. I hate it when I see it making people so self-dissatisfied that they starve and deform and poison themselves. Most of the time I just play the game myself in a very small way, buying a new lipstick, feeling happy about a pretty new silk shirt. It’s not going to make me beautiful, but it’s beautiful itself, and I like wearing it.

People have decorated themselves as long as they’ve been people. Flowers in the hair, tattoo lines on the face, kohl on the eyelids, pretty silk shirts—things that make you feel good. Things that suit you. Like a white pillow suits a lazy black cat…. That’s the fun part of the game.

One rule of the game, in most times and places, is that it’s the young who are beautiful. The beauty ideal is always a youthful one. This is partly simple realism. The young are beautiful. The whole lot of ’em. The older I get, the more clearly I see that and enjoy it.

But it gets harder and harder to enjoy facing the mirror. Who is that old lady? Where is her waist? I got resigned, sort of, to losing my dark hair and getting all this limp grey stuff instead, but now am I going to lose even that and end up all pink scalp? I mean, enough already. Is that another mole or am I turning into an Appaloosa? How large can a knuckle get before it becomes a kneejoint? I don’t want to see, I don’t want to know.

And yet I look at men and women my age and older, and their scalps and knuckles and spots and bulges, though various and interesting, don’t affect what I think of them. Some of these people I consider to be very beautiful, and others I don’t. For old people, beauty doesn’t come free with the hormones, the way it does for the young. It has to do with bones. It has to do with who the person is. More and more clearly it has to do with what shines through those gnarly faces and bodies.

I know what worries me most when I look in the mirror and see the old woman with no waist. It’s not that I’ve lost my beauty—I never had enough to carry on about. It’s that that woman doesn’t look like me. She isn’t who I thought I was.

My mother told me once that, walking down a street in San Francisco, she saw a blonde woman coming towards her in a coat just like hers. With a shock, she realised she was seeing herself in a mirrored window. But she wasn’t a blonde, she was a redhead!—her hair had faded slowly, and she’d always thought of herself, seen herself, as a redhead… till she saw the change that made her, for a moment, a stranger to herself.

We’re like dogs, maybe: we don’t really know where we begin and end. In space, yes; but in time, no.

All little girls are supposed (by the media, anyhow) to be impatient to reach puberty and to put on “training bras” before there’s anything to train, but let me speak for the children who dread and are humiliated by the changes adolescence brings to their body. I remember how I tried to feel good about the weird heavy feelings, the cramps, the hair where there hadn’t been hair, the fat places that used to be thin places. They were supposed to be good because they all meant that I was Becoming a Woman. And my mother tried to help me. But we were both shy, and maybe both a little scared. Becoming a woman is a big deal, and not always a good one.

When I was thirteen and fourteen I felt like a whippet suddenly trapped inside a great lumpy Saint Bernard. I wonder if boys don’t often feel something like that as they get their growth. They’re forever being told that they’re supposed to be big and strong, but I think some of them miss being slight and lithe. A child’s body is very easy to live in. An adult body isn’t. The change is hard. And it’s such a tremendous change that it’s no wonder a lot of adolescents don’t know who they are. They look in the mirror—that is me? Who’s me?

And then it happens again, when you’re sixty or seventy.

Cats and dogs are smarter than us. They look in the mirror, once, when they’re a kitten or a puppy. They get all excited and run around hunting for the kitten or the puppy behind the glass… and then they get it. It’s a trick. A fake. And they never look again. My cat will meet my eyes in the mirror, but never his own.

Who I am is certainly part of how I look and vice versa. I want to know where I begin and end, what size I am, and what suits me. People who say the body is unimportant floor me. How can they believe that? I don’t want to be a disembodied brain floating in a glass jar in a sci-fi movie, and I don’t believe I’ll ever be a disembodied spirit floating ethereally about. I am not “in” this body, I am this body. Waist or no waist.

But all the same, there’s something about me that doesn’t change, hasn’t changed, through all the remarkable, exciting, alarming, and disappointing transformations my body has gone through. There is a person there who isn’t only what she looks like, and to find her and know her I have to look through, look in, look deep. Not only in space, but in time.

I am not lost until I lose my memory.

There’s the ideal beauty of youth and health, which never really changes, and is always true. There’s the ideal beauty of movie stars and advertising models, the beauty-game ideal, which changes its rules all the time and from place to place, and is never entirely true. And there’s an ideal beauty that is harder to define or understand, because it occurs not just in the body but where the body and the spirit meet and define each other. And I don’t know if it has any rules.

One way I can try to describe that kind of beauty is to think of how we imagine people in heaven. I don’t mean some literal Heaven promised by a religion as an article of belief; I mean just the dream, the yearning wish we have that we could meet our beloved dead again. Imagine that “the circle is unbroken,” you meet them again “on that beautiful shore.” What do they look like?

People have discussed this for a long time. I know one theory is that everybody in heaven is thirty-three years old. If that includes people who die as babies, I guess they grow up in a hurry on the other side. And if they die at eighty-three, do they have to forget everything they’ve learned for fifty years? Obviously, one can’t get too literal with these imaginings. If you do, you run right up against that old, cold truth: you can’t take it with you.

But there is a real question there: How do we remember, how do we see, a beloved person who is dead?

My mother died at eighty-three, of cancer, in pain, her spleen enlarged so that her body was misshapen. Is that the person I see when I think of her? Sometimes. I wish it were not. It is a true image, yet it blurs, it clouds, a truer image. It is one memory among fifty years of memories of my mother. It is the last in time. Beneath it, behind it is a deeper, complex, ever-changing image, made from imagination, hearsay, photographs, memories. I see a little red-haired child in the mountains of Colorado, a sad-faced, delicate college girl, a kind, smiling young mother, a brilliantly intellectual woman, a peerless flirt, a serious artist, a splendid cook—I see her rocking, weeding, writing, laughing—I see the turquoise bracelets on her delicate, freckled arm—I see, for a moment, all that at once, I glimpse what no mirror can reflect, the spirit flashing out across the years, beautiful.

That must be what the great artists see and paint. That must be why the tired, aged faces in Rembrandt’s portraits give us such delight: they show us beauty not skin-deep but life-deep. In Brian Lanker’s album of photographs I Dream a World, face after wrinkled face tells us that getting old can be worth the trouble if it gives you time to do some soul making. Not all the dancing we do is danced with the body. The great dancers know that, and when they leap, our soul leaps with them—we fly, we’re free. And the poets know that kind of dancing. Let Yeats say it:

O chestnut tree, great-rooted blossomer,

Are you the leaf, the blossom or the bole?

O body swayed to music, O brightening glance,

How can we know the dancer from the dance?

COLLECTORS, RHYMESTERS, AND DRUMMERS

Some thoughts on beauty and on rhythm, written for my own entertainment early in the 1990s, and revised for this book.

COLLECTORS

People collect things. So do some birds and small mammals. The vizcacha, or bizcacha, is a little rodent that digs holes in Patagonia and the pampa and looks like a very round prairie dog with rabbity ears. Charles Darwin says:

The bizcacha has one very singular habit: namely, dragging every hard object to the mouth of its burrow: around each group of holes many bones of cattle, stones, thistle-stalks, hard lumps of earth, dry dung, etc., are collected into an irregular heap…. I was credibly informed that a gentleman, when riding on a dark night, dropped his watch; he returned in the morning, and by searching the neighborhood of every bizcacha hole in the line of road, as he expected, he soon found it. This habit of picking up whatever may be lying on the ground anywhere near its habitation, must cost much trouble. For what purpose it is done, I am quite unable to form even the most remote conjecture: it cannot be for defence, because the rubbish is chiefly placed above the mouth of the burrow…. No doubt there must exist some good reason; but the inhabitants of the country are quite ignorant of it. The only fact which I know analogous to it is the habit of that extraordinary Australian bird the Calodera maculata, which makes an elegant vaulted passage of twigs for playing in, and which collects near the spot, land and sea-shells, bones, and the feathers of birds, especially bright colored ones. (The Voyage of the Beagle, chapter 7)

Anything that left Charles Darwin unable to form even the most remote conjecture has got to be worth thinking about.

Pack rats and some magpies and crows are, I gather, more selective than bizcachas. They too take hard objects, but keep them in their nest, not outside the front door; and the objects are generally notable in being shiny, or shapely, or in some way what we would call pretty—like the gentleman’s watch. But, like the bizcacha’s clods and bits of dung, they are also notable in being absolutely useless to the collector.

And we have no idea what it is they see in them.

The male bowerbird’s collection of playpretties evidently serves to attract the female bowerbird, but has anyone observed crows or magpies using their buttons, spoons, rings, and can-pulls to enhance their allure? It seems rather that they hide them where nobody else can see them. I don’t believe anyone has seen a female pack rat being drawn to the male pack rat by the beauty of his collection (hey, honey, wanna come down and see my bottletops?).

My father, an anthropologist with interests that ranged from biology to aesthetics, kept a semipermanent conversation going—like the famous thirty-year-long poker game in Telluride—on the subject of what beauty is. Hapless visiting scholars would find themselves at our dinner table hotly discussing the nature of beauty. An aspect of the question of particular interest to anthropology is whether such concepts as beauty, or gender, are entirely constructed by each society, or whether we can identify an underlying paradigm, a universal agreement, throughout most or all societies, of what is man, what is woman, what is beautiful. Somewhere in the discussion, as it gathered weight, my father would get sneaky, cross species, and bring in the pack rat.

It is curious that evidence for what looks like an aesthetic sense—a desire for objects because they are perceived as desirable in themselves, a willingness to expend real energy acquiring something that has no practical end at all—seems to turn up only among us, some lowly little rodents, and some rowdy birds. One thing we three kinds of creature have in common is that we are nest builders, householders, therefore collectors. People, rats, and crows all spend a good deal of time gathering and arranging building materials, and bedding, and other furniture for our residences.

But there are many nesters in the animal kingdom, far closer to us genetically than birds or rodents. What about the great apes? Gorillas build a nest every night. Zoo orangs drape themselves charmingly with favorite bits of cloth or sacking. If we shared any collecting tastes with our closest relatives, it might indicate a “deep grammar” of beauty—a “deep aesthetic”?—in all us primates, or at least in the big fancy ones.

But alas I know no evidence of wild apes collecting or prizing objects because they seem to find them pretty. They examine objects of interest with interest, but that’s not quite the same as stealing something because it’s small and shiny and hiding it away as a treasure. Intelligence and the sense of beauty may overlap, but they aren’t the same thing.

Chimpanzees have been taught or allowed to paint, but their motivation seems to be interactive rather than aesthetic: they appreciate color and evidently enjoy the act of whacking the paint on the canvas, but they don’t initiate anything remotely like painting on their own in the wild; and they don’t prize their own paintings. They don’t hide them, hoard them. It appears that they’re motivated to paint because people they like want them to paint. Their reward is less the painting than the approval of these people. But a crow or a pack rat will risk its life to steal something that offers no reward of any kind except its own shiny self. And it will hoard that stolen object of beauty, treasuring and rearranging it in its collection, as if it were as precious as an egg or an infant.

The interplay of the aesthetic with the erotic is complex. The peacock’s tail is beautiful to us, sexy to the peahen. Beauty and sexual attractiveness overlap, coincide. They may be deeply related. I think they should not be confused.

We find the bowerbird’s designs exquisite, the perfume of the rose and the dance of the heron wonderful; but what about such sexual attractors as the chimp’s swollen anus, the billy goat’s stink, the slime trail a slug leaves for another slug to find so that the two slugs can couple, dangling from a slime thread, on a rainy night? All these devices have the beauty of fitness, but to define beauty as fitness would be even more inadequate than most reductionist definitions.

Darwin was never reductionist. It is like him to say that the bowerbird makes its elegant passage “for playing in”—thus leaving the bowerbird room to play, to enjoy his architecture and his treasures and his dance in his own mysterious fashion. We know that the bower is attractive to female bowerbirds, that they are drawn to it, thus becoming sexually available to the male. What attracts the females to the bower is evidently its aesthetic qualities—its architecture, its orderliness, the brightness of the colors—because the stronger these qualities are, the greater the observable attraction. But we do not know why. Least of all, if the sole end and purpose of the bower is to attract female bowerbirds, do we know why we perceive it as beautiful. We may be the wrong sex, and are certainly the wrong species.

So: What is beauty?

Beauty is small, shapely, shiny things, like silver buttons, which you can carry home and keep in your nest/box.

That’s certainly not a complete answer, but it’s an answer I can accept completely—as far as it goes. It’s a beginning.

And I think it interesting, puzzling, important that my appreciation of small, hard, shapely, shiny things is something I share with bizcachas, pack rats, crows, and magpies, of both sexes.

RHYMESTERS

Humpback whales sing. The males sing mostly in breeding season, which implies that their songs play a role in courtship. But both sexes sing; and each humpback population or nation has its distinctive song, shared by all citizens. A humpback song, which may last as much as half an hour, has a complex musical organisation, structured by phrases (groups of notes that are the same or nearly the same in each repetition) and themes (groups of repeated similar phrases).

While the humpbacks are in northern waters they don’t sing very much, and the song remains the same. When they regroup in the south, they all sing more, and the national anthem begins changing. Both the song and the changes in it may well serve to confirm community (like street slang, or any group jargon, or dialect). Every member of the community learns the current version, even when it is changing rapidly. After several years the whole tune has been radically altered. “We will sing to you a new song.”

Writing in Natural History, in March 1991, Katharine B. Payne asks two questions of the whales: How do you remember your song, and why do you change it? She suggests that rhyme may help in remembering. Whale songs with a complex set of themes include “rhymes”—phrases that end similarly—and these rhymes link one theme to the next. As for the second question, why they keep changing and transforming their communal song, she says, “Can we speculate about this, and about whales’ use of rhymes, without thinking of human beings and wondering about the ancient roots in nature of even our aesthetic behavior?”

Payne’s article reminded me irresistibly of the poet/linguist Dell Hymes’s work on oral narratives in his book In Vain I Tried to Tell You and other books and articles. One such observation (summarised very crudely) is of the value of the repetitive locutions that mark divisions in Native American oral narratives. Such locutions often begin a sentence, and if translated appear as something like “So, then…” or “Now, next it happened…” or just “And.” Often discarded as meaningless, as noise, by translators intent on getting the story and its “meaning,” these locutions serve a purpose analogous to rhyme in English poetry: they signal the line, which, when there is no regular meter, is a fundamental rhythmic element; and they may also cue the larger, structural rhythmic units that shape the composition.

Following such cues, what was heard, translated, and presented as a “primitive,” purely didactic, moralising story, given what shape it has merely by the events it relates, now can be appreciated as subtly formal art, in which the form shapes the material, and in which the seemingly utilitarian narrative may actually be the means towards an essentially aesthetic end.

In oral performance, repetition does not serve only to help the performer remember the text. It is a, perhaps the, fundamental structuring element of the piece: whether it takes the form of the repetitive beat of meter, or the regular sound-echo of rhyme, or the use of refrain and other repeated structures, or the long and subtle rhythm of the lines in unmetered poetry and formal oral narrative. (To these latter are related the even longer and more elusive rhythms of written prose.)

All these uses of repetition do seem to be akin to the whales’ rhymes.

As for why the whales sing, it is certainly significant that they sing most, or the males sing most, in mating season. But if you can say a song lasting half an hour performed by a hundred individuals in chorus is a mating call, then you can say a Beethoven symphony is a mating call.

Sometimes Freud sounds as if that’s what he thought. If (as he said) the artist is motivated to make art by the desire for “fame, money, and the love of beautiful women,” then indeed Beethoven wrote the Ninth because it was mating season. Beethoven was marking his territory.

There is plenty of sexuality in Beethoven’s music, which as a woman one may sometimes be rather edgily aware of—thump, thump, thump, BANG!—but testosterone goes only so far. The Ninth Symphony reaches way, way beyond it.

The male song sparrow sings when his little gonads swell as the light grows in the spring. He sings useful information, didactically and purposefully: I am a song sparrow, this is my territory, I rule this roost, my loud sweet voice indicates my youth and health and wonderful capacity to breed, come live with me and be my love, teediddle weetoo, iddle iddle iddle! And we hear his song as very pretty. But for the crow in the next tree, “caw,” said in several different tones, serves exactly the same function. Yet to us, “caw” has negative aesthetic value. “Caw” is ugly. The erotic is not the beautiful, nor vice versa. The beauty of birdsong is incidental to its sexual or informational function.

So why do songbirds go to such elaborate, formalised, repetitive trouble, learning and passing songs down from generation to generation as they do, when they could say “caw” and be done with it?

I propose an anti-utilitarian, nonreductionist, and of course incomplete answer. The bowerbird builds his bower to court his lady, but also, in Darwin’s lovely phrase, “for playing in.” The song sparrow sings information, but plays with it as he does so. The functional message becomes complicated with a lot of “useless noise” because the pleasure of it—the beauty of it, as we say—is the noise: the trouble taken, the elaboration and repetition, the play. The selfish gene may be using the individual to perpetuate itself, and the sparrow obeys; but, being an individual not a germ cell, he values individual experience, individual pleasure, and to duty adds delight. He plays.

After all, sex, mere sex, may or may not be pleasurable. There’s no way to check with slugs or squids, and judging by the hangdog expression on the faces of dogs having sex, and the awful things cats say while having sex, and the experience of the male black widow spider, I should say that if sex is bliss sometimes it doesn’t much look like it. But sex is inarguably our duty to our genes or our species. So maybe, to make the duty more enjoyable, you play with it. You fancy it up, you add bells and whistles, tails and bowers, pleasurable complications and formalities. And if these become an end in themselves, as pleasures are likely to do, you end up singing for the joy of singing. Any useful, dutifully sexual purpose of the song has become secondary.

We don’t know why the great whales sing. We don’t know why pack rats hoard bottlecaps. We do know that young children love to sing and to be sung to, and love to see and possess pretty, shiny things. Their pleasure in such things precedes sexual maturation and seems to be quite unconnected to courtship, sexual stimulation, or mating.

And while song may affirm and confirm community, stealing silver watches certainly does not. We cannot assume that beauty is in the service of either sexuality or solidarity.

I wonder if complication and uselessness are not key words in this meditation. The pack rat seems like a little museum curator, because she has complicated her nest-building instinct with “meaningless noise”—collecting perfectly useless objects for the pleasure of it. The humpback whales can be mentioned along with Beethoven because by adding “meaningless noise” to simple mating calls and statements of community, they elaborated them into symphonies.

My husband’s Aunt Pearle employed a useful craft, crochet, with the useful purpose of making a bedspread. By making useless, highly rhythmic variations on plain crochet stitch, she complicated the whole act enormously, because she enjoyed doing so. After months of pleasurable work, she completed a beautiful thing: a “Spiderweb” coverlet, which she gave us. Although it does indeed cover a bed, it isn’t, as we women say, for everyday. It is useful, but not simply useful. It is much more than useful. It was made to put on the bed when guests are coming, to give them the pleasure of seeing its complex elegance, and the compliment of being given more than is strictly necessary—a surplus, a treat. We take what’s useful and play with it—for the beauty of it.

SILENT DRUMMERS

When people are talking about beauty in art they usually take their examples from music, the fine arts, dance, and poetry. They seldom mention prose.

When prose is what’s being talked about, the word beauty is seldom used, or it’s used as mathematicians use it, to mean the satisfying, elegant resolution of a problem: an intellectual beauty, having to do with ideas.

But words, whether in poetry or in prose, are as physical as paint and stone, as much a matter of voice and ear as music, as bodily as dancing.

I think it is a major error in criticism ever to ignore the words. Literally, the words: the sound of the words—the movement and pace of sentences—the rhythmic structures that the words establish and are controlled by.

A pedagogy that relies on the “Cliff Notes” sort of thing travesties the study of literature. To reduce the aesthetic value of a narrative to the ideas it expresses, to its “meaning,” is a drastic impoverishment. The map is not the landscape.

In poetry, the auditory and rhythmic reality of language has stayed alive all through the centuries of the Gutenberg Hegemony. Poetry has always been said or read aloud. Even in the inaudible depths of modernism, T. S. Eliot was persuaded to mumble into a microphone. And ever since Dylan Thomas wowed ’em in New York, poetry has reclaimed its proper nature as an audible art.

But prose narrative has been silent for centuries. Printing made it so.

Book-circuit readings by novelists and memoirists are popular now, and recorded readings of books have gone some way towards restoring aurality to prose; but it is still generally assumed, by writer and by critic, that prose is read in silence.

Reading is performance. The reader—the child under the blanket with a flashlight, the woman at the kitchen table, the man at the library desk—performs the work. The performance is silent. The readers hear the sounds of the words and the beat of the sentences only in their inner ear. Silent drummers on noiseless drums. An amazing performance in an amazing theater.

What is the rhythm the silent reader hears? What is the rhythm the prose writer follows?

While she was writing her last novel, Pointz Hall, which she refers to below as PH, and which when it was published became Between the Acts, Virginia Woolf wrote in her diary:

It is the rhythm of a book that, by running in the head, winds one into a ball: and so jades one. The rhythm of PH (the last chapter) became so obsessive that I heard it, perhaps used it, in every sentence I spoke. By reading the notes for memoirs I broke this up. The rhythm of the notes is far freer and looser. Two days of writing in that rhythm has completely refreshed me. So I go back to PH tomorrow. This I think is rather profound. (Virginia Woolf, Diary, 17 November 1940)

Fourteen years before this diary notation made near the end of her life, Woolf wrote the passage I used to open this book and for its title, where she speaks of prose rhythm and the wave that “breaks and tumbles in the mind.” In it she also, lightly, called her remarks on the rhythm of narrative “profound.” In both these passing notes on the rhythm of narrative, she knew, I think, that she was onto something big. I only wish she’d gone on with it.

In a letter in 1926, Woolf said that what you start with, in writing a novel, “is a world. Then, when one has imagined this world, suddenly people come in.” (Letter 1618) First comes the place, the situation, then the characters arrive with the plot…. But telling the story is a matter of getting the beat—of becoming the rhythm, as the dancer becomes the dance.

And reading is the same process, only far easier, not jading: because instead of having to discover the rhythm beat by beat, you can let yourself follow it, be taken over by it, you can let the dance dance you.

What is this rhythm Woolf talks about? Prose scrupulously avoids any clear regular beat or recurrent cadence. Are there, then, deeply syncopated patterns of stress? Or does the rhythm occur in and among the sentences—in the syntax, linkage, paragraphing? Is that why punctuation is so important to prose (whereas it often matters little in poetry, where the line replaces it)? Or is prose narrative rhythm established as well in even longer phrases and larger structures, in the occurrence of events and recurrence of themes in the story, the linkage and counterpoint of plot and chapter?

All these, I think. There are a whole lot of rhythms going in a well-written novel. Together, in their counterpoint and syncopation and union, they make the rhythm of that novel, which is unlike any other, as the rhythms of a human body in their interplay make up a rhythm unique to that body, that person.

Having made this vast, rash statement, I thought I should try to see if it worked. I felt I should be scientific. I should do an experiment.

It is not very rash to say that in a sentence by Jane Austen there is a balanced rhythm characteristic of all good eighteenth-century narrative prose, and also a beat, a timing, characteristic of Jane Austen’s prose. Following what Woolf said about the rhythm of Pointz Hall, might one also find a delicate nuancing of the beat that is characteristic of that particular Jane Austen novel?

I took down my Complete Austen and, as in the sortes Vergilianae or a lazy consultation of the I Ching, I let the book open where it wanted. First in Pride and Prejudice, where I copied out the first paragraph my eyes fell on; then again in Persuasion.

From Pride and Prejudice:

More than once did Elizabeth in her ramble within the Park, unexpectedly meet Mr Darcy.—She felt all the perverseness of the mischance that should bring him where no one else was brought: and to prevent its ever happening again, took care to inform him at first, that it was a favourite haunt of hers.—How it could occur a second time therefore was very odd!—Yet it did, and even a third.

From Persuasion:

To hear them talking so much of Captain Wentworth, repeating his name so often, puzzling over past years, and at last ascertaining that it might, that it probably would, turn out to be the very same Captain Wentworth whom they recollected meeting, once or twice, after their coming back from Clifton:—a very fine young man: but they could not say whether it was seven or eight years ago,—was a new sort of trial to Anne’s nerves. She found, however, that it was one to which she must enure herself.

Probably I’m fooling myself, but I was quite amazed at the result of this tiny test.

Pride and Prejudice is a brilliant comedy of youthful passions, while Persuasion is a quiet story about a misunderstanding that ruins a life and is set right only when it’s almost too late. One book is April, you might say, and the other November.

Well, the four sentences from Pride and Prejudice, separated rather dramatically by a period and a dash in each case, with a colon breaking the longest one in two, are all quite short, with a highly varied, rising rhythm, a kind of dancing gait, like a well-bred young horse longing to break out into a gallop. All are entirely from young Elizabeth’s point of view, in her own mental voice, which on this evidence is lively, ironical, and naive.

Though the paragraph from Persuasion is longer, it is in only two sentences; the long first one is full of hesitations and repetitions, marked by eight commas, two colons, and two dashes. Its abstract subject (“to hear them”) is separated from its verb (“was”) by several lines, all having to do with other people’s thoughts and notions. The protagonist of the sentence, Anne, is mentioned only in the next-to-last word. The sentence that follows, wholly in her own mental voice, has a brief, strong, quiet cadence.

I do not offer this little analysis and comparison as proof that any paragraph from Pride and Prejudice would have a different rhythm from any paragraph in Persuasion; but as I said, it surprised me—the rhythms were in fact so different, and each was so very characteristic of the mood of the book and the nature of the central character.

But of course I am already persuaded that Woolf was right, that every novel has its characteristic rhythm. And that if the writer hasn’t listened for that rhythm and followed it, the sentences will be lame, the characters will be puppets, the story will be false. And that if the writer can hold to that rhythm, the book will have some beauty.

What the writer has to do is listen for that beat, hear it, keep to it, not let anything interfere with it. Then the reader will hear it too, and be carried by it.

A note on rhythms that I was aware of in writing two of my books:

Writing the fantasy novel Tehanu, I thought of the work as riding the dragon. In the first place, the story demanded that I be outdoors while writing it—which was lovely in Oregon in July, but inconvenient in November. Cold knees, wet notebook. And the story came not steadily, but in flights—durations of intense perception, sometimes tranquil and lyrical, sometimes frightening—which most often occurred while I was waking, early in the morning. There I would lie and ride the dragon. Then I had to get up, and go sit outdoors, and try to catch that flight in words. If I could hold to the rhythm of the dragon’s flight, the very large, long wingbeat, then the story told itself, and the people breathed. When I lost the beat, I fell off, and had to wait around on the ground until the dragon picked me up again.

Waiting, of course, is a very large part of writing.

Writing “Hernes,” a novella about ordinary people on the Oregon coast, involved a lot of waiting. Weeks, months. I was listening for voices, the voices of four different women, whose lives overlapped throughout most of the twentieth century. Some of them spoke from a long time ago, before I was born, and I was determined not to patronise the past, not to take the voices of the dead from them by making them generalised, glib, quaint. Each woman had to speak straight from her center, truthfully, even if neither she nor I knew the truth. And each voice must speak in the cadence characteristic of that person, her own voice, and also in a rhythm that included the rhythms of the other voices, since they must relate to one another and form some kind of whole, some true shape, a story.

I had no dragon to carry me. I felt diffident and often foolish, listening, as I walked on the beach or sat in a silent house, for these soft imagined voices, trying to hear them, to catch the beat, the rhythm, that makes the story true and the words beautiful.

I do think novels are beautiful. To me a novel can be as beautiful as any symphony, as beautiful as the sea. As complete, true, real, large, complicated, confusing, deep, troubling, soul enlarging as the sea with its waves that break and tumble, its tides that rise and ebb.

TELLING IS LISTENING

An unpublished piece in which I return to and go on from some of the themes and speculations of the essay “Text, Silence, Performance” in my previous nonfiction collection Dancing at the Edge of the World.

MODELS OF COMMUNICATION

In this Age of Information and Age of Electronics, our ruling concept of communication is a mechanical model, which goes like this:

Box A and box B are connected by a tube. Box A contains a unit of information. Box A is the transmitter, the sender. The tube is how the information is transmitted—it is the medium. And box B is the receiver. They can alternate roles. The sender, box A, codes the information in a way appropriate to the medium, in binary bits, or pixels, or words, or whatever, and transmits it via the medium to the receiver, box B, which receives and decodes it.

A and B can be thought of as machines, such as computers. They can also be thought of as minds. Or one can be a machine and the other a mind.

If A is a mind and B a computer, A may send B information, a message, via the medium of its program language: let’s say A sends the information that B is to shut down; B receives the information and shuts down. Or let’s say I send my computer a request for the date Easter falls on this year: this request requires the computer to respond, to take the role of box A, which sends that information, via its code and the medium of the monitor, to me, who now take the role of box B, the receiver. And so I go buy eggs, or don’t buy eggs, depending on the information I received.

This is supposed to be the way language works. A has a unit of information, codes it in words, and transmits to B, who receives it, decodes it, understands it, and acts on it.

Yes? This is how language works?

As you can see, this model of communication as applied to actual people talking and listening, or even to language written and read, is at best inadequate and most often inaccurate. We don’t work that way.

We only work that way when our communication is reduced to the most rudimentary information. “STOP THAT!” in a shout from A is likely to be received and acted on by B—at least for a moment.

If A shouts, “The British are coming!” the information may serve as information—a clear message with certain clear consequences concerning what to do next.

But what if the communication from A is, “I thought that dinner last night was pretty awful.”

Or, “Call me Ishmael.”

Or, “Coyote was going there.”

Are those statements information? The medium is the speaking voice, or the written word, but what is the code? What is A saying?

B may or may not be able to decode, or “read,” those messages in a literal sense. But the meanings and implications and connotations they contain are so enormously complex and so utterly contingent that there is no one right way for B to decode or to understand them. Their meaning depends almost entirely on who A is, who B is, what their relationship is, what society they live in, their level of education, their relative status, and so on. They are full of meaning and of meanings, but they are not information.

In such cases, in most cases of people actually talking to one another, human communication cannot be reduced to information. The message not only involves, it is, a relationship between speaker and hearer. The medium in which the message is embedded is immensely complex, infinitely more than a code: it is a language, a function of a society, a culture, in which the language, the speaker, and the hearer are all embedded.

“Coyote was going there.” Is the information being transmitted by this sentence—does it “say”—that an actual coyote actually went somewhere? Actually, no. The speaker is not talking about a coyote. The hearer knows that.

What would be the primary information obtained by a hearer who heard those words spoken, in their original language and in the context where they might have been spoken? Probably something like: Ah, Grandfather is going to tell us a story about Coyote. Because “Coyote was going there” is a cultural signal, like “Once upon a time”: a ritual formula, the implications of which include the fact that a story’s about to be told, right here, right now; that it won’t be a factual story but will be myth, or true story; in this case a true story about Coyote. Not a coyote but Coyote. And Grandfather knows that we understand the signal, we understand what he’s saying when he says, “Coyote was going there,” because if he didn’t expect us to at least partly understand it, he wouldn’t or couldn’t say it.

In human conversation, in live, actual communication between or among human beings, everything “transmitted”—everything said—is shaped as it is spoken by actual or anticipated response.

Live, face-to-face human communication is intersubjective. Intersubjectivity involves a great deal more than the machine-mediated type of stimulus-response currently called “interactive.” It is not stimulus-response at all, not a mechanical alternation of precoded sending and receiving. Intersubjectivity is mutual. It is a continuous interchange between two consciousnesses. Instead of an alternation of roles between box A and box B, between active subject and passive object, it is a continuous intersubjectivity that goes both ways all the time.

“There is no adequate model in the physical universe for this operation of consciousness, which is distinctively human and which signals the capacity of human beings to form true communities.” So says Walter Ong, in Orality and Literacy.

My private model for intersubjectivity, or communication by speech, or conversation, is amoebas having sex. As you know, amoebas usually reproduce by just quietly going off in a corner and budding, dividing themselves into two amoebas; but sometimes conditions indicate that a little genetic swapping might improve the local crowd, and two of them get together, literally, and reach out to each other and meld their pseudopodia into a little tube or channel connecting them. Thus:

Then amoeba A and amoeba B exchange genetic “information,” that is, they literally give each other inner bits of their bodies, via a channel or bridge which is made out of outer bits of their bodies. They hang out for quite a while sending bits of themselves back and forth, mutually responding each to the other.

This is very similar to how people unite themselves and give each other parts of themselves—inner parts, mental not bodily parts—when they talk and listen. (You can see why I use amoeba sex not human sex as my analogy: in human hetero sex, the bits only go one way. Human hetero sex is more like a lecture than a conversation. Amoeba sex is truly mutual because amoebas have no gender and no hierarchy. I have no opinion on whether amoeba sex or human sex is more fun. We might have the edge, because we have nerve endings, but who knows?)

Two amoebas having sex, or two people talking, form a community of two. People are also able to form communities of many, through sending and receiving bits of ourselves and others back and forth continually—through, in other words, talking and listening. Talking and listening are ultimately the same thing.

It is literacy that confuses this whole issue of communication by language. I don’t want to get into what literacy does to the human mind, though I highly recommend Walter Ong’s books on the subject. All I want to emphasise at this point is that literacy is very recent, and still not at all universal. Most people during most of the history of mankind have been, and still are, oral/aural people: people who speak and listen. Most people, most of the time, do not put words in writing, do not read, are not read to. They speak and they listen to speech.

Long, long after we learned how to talk to each other, millennia or hundreds of millennia later, we learned to write down our words. That was only about thirty-five hundred years ago, in certain restricted parts of the world.

Writing existed for three millennia, important to powerful people, seemingly unimportant to most people. Its use and uses spread. Then came printing.

With printing, literacy quite soon developed from a special craft, useful to privileged men to increase their knowledge and power, into a basic tool, a necessity for ordinary existence for ordinary people, particularly if they were seeking not to be poor and powerless.

And so effective is printed writing as a tool that those of us who use it have tended to privilege it as the most valid form of human communication. Writing has changed us, the way all our tools change us, till we have come to take it for granted that speech doesn’t matter; words don’t count till they’re written down. “I give you my word” doesn’t count for much until I’ve signed the contract. And we judge an oral culture, a culture that does not use writing, as essentially inferior, calling it “primitive.”

Belief in the absolute superiority of literacy to orality is ingrained in us literates—not without cause. Illiterates in a literate culture are terribly disadvantaged. We have arranged our North American society over the last couple of centuries so that literacy is a basic requirement for full membership.

If we compare literate and nonliterate societies, it appears that literate societies are powerful in ways nonliterate societies aren’t. Literate culture is durable in ways nonliterate culture is not. And literate people may have more breadth and variety of knowledge than nonliterate people. They are better informed. They are not necessarily wiser. Literacy does not make people good, intelligent, or wise. Literate societies are superior in some ways to nonliterate societies, but literate people are not superior to oral people.

What do anthropologists, who ought to know better, mean when they speak of “the primitive mind,” or La Pensée Sauvage (how should Lévi-Strauss’s title be translated—“How Savages Think”?)? What is a “savage,” what does “primitive” mean? Almost inevitably it means “preliterate.” “Primitives” are people who haven’t learned to write—yet. They can only talk. They are therefore inferior to anthropologists and others who can read and can write.

And indeed literacy confers such power on its owners that they can dominate illiterates, as the literate priestly and noble castes dominated illiterate medieval Europe; as literate men dominated women as long as women were kept illiterate; as literate businessmen dominate illiterate inner-city people; as English-literate corporations dominate illiterate or non-English-literate workers. If might makes right, orality is wrong.

These days, not only do we have literacy to confuse this whole issue of human communication by language, we also have what Ong calls “secondary orality.”

Primary orality refers to people who talk but don’t write—all the people we refer to as primitive, illiterate, preliterate, and so on. Secondary orality comes long after literacy, and derives from it. It is less than a hundred years old. Secondary orality is radio, TV, recordings, and such: in general, what we call “the media.”

A good deal of media presentation has a script and is therefore primarily written and secondarily oral; but these days, its most meaningful distinction from primary orality is that the speaker has no present audience.

If instead of writing this, I were giving a speech, your being in the same room listening to me would be a necessary condition of my talking. That’s primary orality: a relationship of speaker and listeners.

President Lincoln stands up and begins, “Fourscore and seven years ago,” to a crowd of more or less interested people at Gettysburg. His voice (said to have been rather thin and soft) makes a relationship between him and them, establishing community. Primary orality.

Grandfather tells a Coyote tale to a circle of grown-ups and kids on a winter evening. The story affirms and explains their community as a people and among other living beings. Primary orality.

The anchorman on the six o’clock news stares out of the box, not at us, because he can’t see us, because we aren’t where he is, or even when he is; he is in Washington, D.C., two hours ago, reading what he says off a running tape. He can’t see us or hear us, nor can we see or hear him. We see and hear an image, a simulacrum of him. There is no relationship between us and him. There is no interchange, no mutuality, between us and him. There is no intersubjectivity. His communication goes one way and stops there. We receive it, if we choose to. Our behavior, even our presence or absence, makes absolutely no difference to what he says or how he says it. If nobody was listening he would not know it and would go right on talking exactly the same way (until his sponsors found out, eventually, from the Nielsen ratings, and fired him). Secondary orality.

I read this speech into a recorder and it is taped; you buy it and listen to it. You hear the sound of my voice, but we have no actual relationship, any more than we would if you were reading the piece in print. Secondary orality.

Like the telephone, private writing, the personal letter, the private e-mail, is direct communication—conversation—mediated by technology. Amoeba A extends a pseudopodium and sends little bits of itself out to a distant amoeba B, who incorporates the material sent out and may respond to it. The telephone made immediate conversation at a distance possible; in written letters, there is an interval between messages; e-mail allows both interval and immediate exchange.

My model of printed public writing and of secondary orality is a box A shooting information out into a putative spacetime that may or may not contain many box Bs to receive it—maybe nobody—possibly an Audience of Millions (see figure 3).

Conversation is a mutual exchange or interchange of acts. Transmission via print and the media is one-way; its mutuality is merely virtual or hopeful.

Yet local, immediate community can be built upon both literacy and secondary orality. Schools and colleges are centers of the printed word, whether on paper or electronic, and are genuine if limited communities. Bible-study groups, reading clubs, fan clubs, are small printed-word-centered subcommunities, where, as in colleges, people talk about what they read. Newspapers and magazines create and foster opinion groups and facilitate communities based on information, such as sports fans comparing scores.

As for the audience of secondary orality—aside from that factitious entity the “studio audience,” which is actually part of the performance—many people watch certain TV programs not because they particularly like them but because they can talk about them with other people at work next day: they use these programs as social bonding material. But the media audience is for the most part a tenuous, widely scattered semicommunity or pseudocommunity, which can be estimated and gauged only by market research and opinion polls, and becomes actual only in political situations such as a polling place on election day, or in the response to a terrible event.

The community created by printing and by secondary orality is not immediate; it is virtual. It can be enormous—the size of America. Indeed it may be literacy more than any other factor that has enabled or coerced us to live in huge nation-states instead of tribes and city-states. Possibly the Internet will allow us to outgrow the nation-state. Although the Global Village McLuhan dreamed of is at present a City of Night, a monstrous force for cultural reductionism and internationally institutionalised greed, who knows? Perhaps we shall soar electronically to some arrangement that works better than capitalism.

But so vast a community must remain more concept than tangible fact. Written word, printed word, reproduced speech, filmed speech, the telephone, e-mail: each medium links people, but it does not link them physically, and whatever community it creates is essentially a mental one.

Let me not to the marriage of true minds admit impediments. It is marvelous that we can talk to living people ten thousand miles away and hear them speak. It is marvelous that by reading their words, or seeing a film of them, we may feel communion even with the dead. It is a marvelous thought that all knowledge might be accessible to all minds.

But marriage is not of minds only; and the living human community that language creates involves living human bodies. We need to talk together, speaker and hearer here, now. We know that. We feel it. We feel the absence of it.

Speech connects us so immediately and vitally because it is a physical, bodily process, to begin with. Not a mental or spiritual one, wherever it may end.

If you mount two clock pendulums side by side on the wall, they will gradually begin to swing together. They synchronise each other by picking up tiny vibrations they each transmit through the wall.

Any two things that oscillate at about the same interval, if they’re physically near each other, will gradually tend to lock in and pulse at exactly the same interval. Things are lazy. It takes less energy to pulse cooperatively than to pulse in opposition. Physicists call this beautiful, economical laziness mutual phase locking, or entrainment.

All living beings are oscillators. We vibrate. Amoeba or human, we pulse, move rhythmically, change rhythmically; we keep time. You can see it in the amoeba under the microscope, vibrating in frequencies on the atomic, the molecular, the subcellular, and the cellular levels. That constant, delicate, complex throbbing is the process of life itself made visible.

We huge many-celled creatures have to coordinate millions of different oscillation frequencies, and interactions among frequencies, in our bodies and our environment. Most of the coordination is effected by synchronising the pulses, by getting the beats into a master rhythm, by entrainment.

Internally, a sterling example is the muscle cells of the heart, every single one of them going lub-dub, lub-dub, together with all the others, for a lifetime.

Then there are the longer body rhythms, circadian cycles, that take a day to happen: hunger, eating, digesting, excreting; sleeping and waking. Such rhythms entrain all the organs and functions of body and mind.

And the really long bodily rhythms, which we may not even recognise, are connected with our environment, length of daylight, season, the moon.

Being in sync—internally and with your environment—makes life easy. Getting out of sync is always uncomfortable or disastrous.

Then there are the rhythms of other human beings. Like the two pendulums, though through more complex processes, two people together can mutually phase-lock. Successful human relationship involves entrainment—getting in sync. If it doesn’t, the relationship is either uncomfortable or disastrous.

Consider deliberately synchronised actions like singing, chanting, rowing, marching, dancing, playing music; consider sexual rhythms (courtship and foreplay are devices for getting into sync). Consider how the infant and the mother are linked: the milk comes before the baby cries. Consider the fact that women who live together tend to get onto the same menstrual cycle. We entrain one another all the time.

How does entrainment function in speech? William Condon did some lovely experiments which show, on film, that when we talk our whole body is involved in many tiny movements, establishing a master rhythm that coordinates our body movements with the speech rhythms. Without this beat, the speech becomes incomprehensible. “Rhythm,” he says, is “a fundamental aspect of the organisation of behavior.” To act, we have to have the beat.

Condon went on to photograph people listening to a speaker. His films show listeners making almost the same micromovements of lips and face as the speaker is making, almost simultaneously—a fiftieth of a second behind. They are locked into the same beat. “Communication,” he says, “is like a dance, with everyone engaged in intricate, shared movements across many subtle dimensions.”

Listening is not a reaction, it is a connection. Listening to a conversation or a story, we don’t so much respond as join in—become part of the action.

We can entrain without seeing the speaker; we entrain with each other when talking on the telephone. Most people feel that telephoning is less satisfactory than being with one another, that communication through hearing alone is less fully mutual, but we do it quite well; teenagers, and people with cell phones in BMWs in heavy traffic, can keep it up indefinitely.

Researchers believe that some autism may be connected with difficulty in entraining—a delayed response, a failure to catch the rhythm. We listen to ourselves as we speak, of course, and it’s very hard to speak if we can’t find the beat: this might help explain autistic silence. We can’t understand other people if we can’t get in sync with the rhythm of their speaking: this might explain autistic rage and loneliness.

Rhythm differences between dialects lead to failures in understanding. You need practice, you need training to entrain with a way of speech you aren’t familiar with.

But when you can and do entrain, you are synchronising with the people you’re talking with, physically getting in time and tune with them. No wonder speech is so strong a bond, so powerful in forming community.

I do not know to what extent people watching movies and TV entrain with speakers; since no mutual response is possible, it seems likely that the intense involvement characteristic of conversation would be much weakened.

ORAL SPACE AND ORAL TIME

Seeing is analytical, not integrative. The eye wants to distinguish objects. The eye selects. Seeing is active, outgoing. We look at. We focus on. We make distinctions easily so long as the field is clear. The visual ideal is clarity. That’s why glasses are so satisfactory. Seeing is yang.

Hearing is integrative; it unifies. Being on opposite sides of the head, ears are pretty good at telling where a sound comes from, but though the mind, the attention, can focus hearing, can listen to, the ear essentially hears from: it can’t focus narrowly and can select only with effort. The ear can’t stop hearing; we have no earlids; only sleep can shut off our reception. While we are awake our ears accept what comes. As this is likely to be noise, the auditory ideal is harmony. That’s why hearing aids, which increase noise, are so often unsatisfactory. Hearing is yin.

Light may come from vast distances, but sound, which is only vibrations in air, doesn’t travel far. Starlight carries a thousand light-years; a human voice can carry a mile or so at most. What we hear is almost always quite local, quite nearby. Hearing is an immediate, intimate sense, not quite as close as touch, smell, taste, proprioception, but much more intimate than sight.

Sound signifies event. A noise means something is happening. Let’s say there’s a mountain out your window. You see the mountain. Your eyes report changes, snowy in winter, brown in summer, but mainly just report that it’s there. It’s scenery. But if you hear that mountain, then you know it’s doing something. I see Mount St. Helens out my study window, about eighty miles north. I did not hear it explode in 1980: the sound wave was so huge that it skipped Portland entirely and touched down in Eugene, a hundred miles to the south. Those who did hear that noise knew that something had happened. That was a word worth hearing. Sound is event.

Speech, the most specifically human sound, and the most significant kind of sound, is never just scenery, it’s always event.

Walter Ong says, “Sound exists only when it is going out of existence.” This is a very complicated simple statement. You could say it also about life. Life exists only as it is going out of existence.

Consider the word existence, printed on a page of a book. There it sits, all of it at once, nine letters, black on white, maybe for years, for centuries, maybe in thousands of copies all over the world.

Now consider the word as you speak it: “existence.” As soon as you say “tence,” “exis” is already gone, and now the whole thing’s gone. You can say it again, but that is a new event.

When you speak a word to a listener, the speaking is an act. And it is a mutual act: the listener’s listening enables the speaker’s speaking. It is a shared event, intersubjective: the listener and speaker entrain with each other. Both the amoebas are equally responsible, equally physically, immediately involved in sharing bits of themselves. The act of speaking happens NOW. And then is irrevocably, unrepeatably OVER.

Because speaking is an auditory event, not a visual one, it uses space and time differently from anything visual, including words read on paper or on a monitor.

“Auditory space has no point of favored focus. It is a sphere without fixed boundaries, space made by the thing itself, not space containing the thing.” (Ong)

Sound, speech, creates its own, immediate, instantaneous space. If we shut our eyes and listen, we are contained within that sphere.

We read, printed on a page, “She shouted.” The page is durable, visible space containing the words. It is a thing not an act. But an actor shouts, and the shout is an act. It makes its own, local, momentary space.

The voice creates a sphere around it, which includes all its hearers: an intimate sphere or area, limited in both space and time.

Creation is an act. Action takes energy.

Sound is dynamic. Speech is dynamic—it is action.

To act is to take power, to have power, to be powerful.

Mutual communication between speakers and listeners is a powerful act. The power of each speaker is amplified, augmented, by the entrainment of the listeners. The strength of a community is amplified, augmented by its mutual entrainment in speech.

This is why utterance is magic. Words do have power. Names have power. Words are events, they do things, change things. They transform both speaker and hearer; they feed energy back and forth and amplify it. They feed understanding or emotion back and forth and amplify it.

ORAL PERFORMANCE

Oral performance is a particular kind of human speech. It is to an oral culture what reading is to a literate culture.

Reading is not superior to orality, and orality is not superior to reading. The two behaviors are different and have extremely different social effects. Silent reading is an implacably private activity, which while it is occurring separates the reader bodily and psychically from the people nearby. Oral performance is a powerful bonding force, which while it is occurring bonds people physically and psychically.

In our literate culture oral performance is seen as secondary, marginal. Only readings by poets of their own works and theatrical performance by actors may be perceived as having literary power comparable to written work read in silence. But oral performance in an oral culture is recognised as a powerful act, and for that reason it is always formal.

The formality is on both sides. The orator or storyteller tries to meet and fulfill certain definite expectations in the audience, gives formal cues to the audience, and may respond to formal cues from the audience. The audience will show attentiveness by certain expected behaviors: by keeping a posture of attention; in some cases, by total silence; more often, by formulaic responses—Yes, Lord! Hallelujah!—or formulaic words or affirmations: ah—hai—hah—enh…. In poetry readings, little quiet gasps. In comic performances, laughter.

Oral performance uses time and space in a particular way of its own. It creates its own, temporary, physical, actual spacetime, a sphere containing a speaking voice and listening ears, a sphere of entrained vibration, a community of body and mind.

This might be the sphere that holds a woman telling her children the tale of the Three Bears—a small, quiet, deeply intimate event.

It might be the smoky sphere that holds a stand-up comedian extemporising to an audience in a bar—a seemingly informal but, if successful, intensely and genuinely interactive event.

It could be the sphere holding a revivalist preacher speaking his hellfire sermon to a tent revival—a big, noisy, yet highly formalised, powerfully rhythmic event.

It could be the sphere that held Martin Luther King Jr. and the people who heard him say “I have a dream.”

That formal oratorical event can be echoed, can be shadowed, can be recollected, by films and recordings. Images of it can be reproduced. But it cannot. An event does not happen twice. We do not step twice into the same river.

Oral performance is irreproducible.

It takes place in a time and place set apart: cyclic time, ritual time, or sacred time. Cyclical time is heartbeat, body-cycle time; lunar, seasonal, annual time: recurrent time, musical time, dancing time, rhythmic time. An event does not happen twice, yet regular recurrence is the essence of cyclic time. This year’s spring is not last year’s spring, yet spring returns always the same. A rite is performed anew, every year, at the same time, in the same way. A story is told again and again, and yet each new telling is a new event.

Each oral performance is as unique as a snowflake, but, like a snowflake, it will very likely be repeated; and its principal internal organisational device is repetition. Rhythm is basic to oral performance, and it is chiefly obtained by recurrence, by repetition.

From now on I am going to be repeating myself about repetition. One reason there is a lot of repetition in oral performance, as in ordinary speech, is the need for redundancy. The reading eye can turn back and reread and make certain; therefore, in writing you need only say a thing once, if you say it well. So we writers are taught to be afraid of repeating ourselves, to shun even the appearance of repetition. But in speaking, words go by very quickly and are gone; they fly away, they are wingéd words. Speakers know that they may need to bring the whole flock back round again more than once. Orators, reciters, storytellers shamelessly say the same thing several times, maybe varying the words maybe not. Redundancy is not a sin in oral performance, as it has become in writing, but a virtue.

Speakers also use repetition because it is the best device they have to organise, to shape and structure, what they are saying. Experienced listeners in an oral culture—such as a three-year-old who gets read to or told stories a lot—expect repetition. They wait for it. Repetition both raises expectations and fulfills them. Minor variation is expected, but extreme variation, though it adds surprise, which may be welcomed, more likely will be rejected as frivolous or corrupt. Tell it the right way, Mama!

Repetition may be of a single word; of a phrase or sentence; of an image; of an event or action in the story; of a character’s behavior; of a structural element of the piece.

Words and phrases are the most likely to be repeated verbatim. The simplest example of this is starter words, words used to begin a sentence. In the King James Bible, it’s And. And the Lord smote the idolaters. And the idols were destroyed. And the people lamented in the streets.—In a Paiute story, a lot of sentences begin with Then—yaisi in Paiute. Then Coyote did this. Then Grey Wolf said this. Then they went in.—And and Yaisi are key sounds, cues to the listener that a new sentence, a new event, is under way; also they may provide a tiny mental resting place for the teller or reader of the story. These repeated starter words provide a beat, not a regular, metric beat, because this isn’t poetry, it’s narrative prose, but just the same a beat at intervals: a pulse that follows a pause, a sound that follows a silence.

In spoken narrative, silence plays a huge active part. Without silence, pauses, rests, there is no rhythm. Only noise. Noise is by definition meaningless, sound without significance. Significance is born of the rhythmic alternation of void and event—pause and act—silence and word. Repeated words are markers of this rhythm, drumbeats to which the story dances.

For centuries, those huge poems the Iliad and the Odyssey did not exist in writing but only in oral performance. The version we have is the one that happened to get written down. We know now that a tremendous proportion of the language of the epics consists of stock phrases, repeatable terms, used where they were needed to fill out the meter or to take up slack while the performer thought of what Achilles or Odysseus did next. No performer could possibly remember the whole thing verbatim. Every performance was half recital and half improvisation, using that vast stock of ready-made phrases. So the wine-dark sea and rosy-fingered dawn are little metrical bricks, fitted in wherever the hexameter fell short. They are also, of course, beautiful images. Does it lessen them that they are repeated where the meter needs them? Do we not in fact greet their repetition with pleasure, as we do the repetition of a musical phrase or motif in a sonata or symphony?

Repeated actions in oral narrative are essential structural elements. They are usually varied, partial repetitions, building up expectation towards fulfillment. The first son of the king goes out and behaves badly to a wolf and the dragon eats him. The second son of the king goes out and behaves badly to a deer and the dragon eats him. The third son of the king goes out and rescues the wolf from a trap, frees the deer from a snare, and the wolf and the deer tell him how to kill the dragon and find the princess, and he does, and they get married and live happily ever after.

As for repeated behavior of characters, contemporary novelists are likely to consider predictability to be a fault, a flaw, in their invention. Repeated or predictable behavior, however, is what constitutes a character—in life or novels. If it’s highly, obviously predictable, the character is a stereotype or caricature; but the gradations are endless. Some people find all Dickens’s characters mere stereotypes. I don’t. When Mr. Micawber says “Something is certain to turn up,” the first time, it’s insignificant; the second time, it’s revealing; by the third or fourth time he’s said it in the teeth of total financial disaster, it’s significant and funny; and by the end of the book, when all his hopes have been savagely defeated, “Something is certain to turn up” is both funny and profoundly sad.

I use an example from literature, not from oral texts, because Dickens’s relationship to orality and oral performance is very close, maybe closer than any other novelist since 1800 except, possibly, Tolkien. The repetitive behavior of Dickens’s characters is more characteristic of oral narrative than of the novel in general. Delicate probings into the convolutions of the private psyche in a unique situation aren’t well suited to tales told aloud. Characters of oral narratives may be vivid, powerful, worthy of a great deal of thinking about: Achilles, Hector, Odysseus, Roland and Oliver, Cinderella, the Queen and Snow White, Raven, Br’er Rabbit, Coyote. They are not one-dimensional; their motivations may be profoundly complex; the moral situations they are in are of wide and deep human relevance. But as a rule, they can be summed up in a few words, as characters in novels cannot. Their name may even be exemplary of a certain kind of behavior. And they can be summoned into the hearer’s imagination by the mere mention of characteristic behavior: Then said wily Odysseus, thinking how to save himself… Coyote was going along and he saw some girls by the river…. We’ve heard about Odysseus being wily. We’ve heard about Coyote seeing some girls. We know, in general, what to expect. Odysseus will get away with it, but at a cost; he will be damaged. Coyote won’t get away with it, will be made a complete fool of, and will trot away perfectly unashamed. The storyteller says the name Odysseus, or the name Coyote, and we the listeners await the fulfillment of our expectations, and that waiting is one of the great pleasures life offers us.

Genre literature offers us that pleasure. That is perhaps the central reason for the obstinate popularity of the romance, the mystery, science fiction, and the western, despite decades of critical and academic ignorance and contempt. A genre novel fulfills certain generic obligations. A mystery provides some kind of puzzle and its resolution; a fantasy breaks the rules of reality in a significant way; a romance offers the frustration and fulfillment of a love story. On the lowest plane, genre offers the kind of reliability hamburger chains offer: If you pick up a Louis L’Amour western or the eighteenth mystery in a series, you know what you’re going to get. But if you pick up Molly Gloss’s The Jump-Off Creek, a western, or Tolkien’s The Lord of the Rings, a fantasy, or Philip K. Dick’s The Man in the High Castle, a science fiction novel, although each reliably fulfills the obligations of its genre, it is also utterly unpredictable, a novel, a work of art.

Above the level of the merely commercial, in the realm of art, whether it’s called mainstream or genre fiction, we can fulfill our expectations only by learning which authors disappoint and which authors offer the true nourishment for the soul. We find out who the good writers are, and then we look or wait for their next book. Such writers—living or dead, whatever genre they write in, critically fashionable or not, academically approved or not—are those who not only meet our expectations but surpass them. That is the gift the great storytellers have. They tell the same stories over and over (how many stories are there?), but when they tell them they are new, they are news, they renew us, they show us the world made new.

It does not matter, on this level, whether the story is told and heard, or written and read.

But if it is written and read in silence by the reader, there is some awareness in many of us that a dimension of the experience of story has been lost: the aural dimension, the whole aspect of the telling of the story and the hearing of it in a certain time and space, by a certain person, now—and maybe over again in times to come. Sound recordings, popular as they have become, supply the sound of the words and sentences, the telling voice, but it is not a living voice, it is a reproduction—a photograph not a living body. So people seek the irreproducible moment, the brief, fragile community of story told among people gathered together in one place. So children gather at the library to be read to: look at the little circle of faces, blazing with intensity. So the writer on a book tour, reading in the bookstore, and her group of listeners reenact the ancient ritual of the teller at the center of the circle. The living response has enabled that voice to speak. Teller and listener, each fulfills the other’s expectations. The living tongue that tells the word, the living ear that hears it, bind and bond us in the communion we long for in the silence of our inner solitude.

THE OPERATING INSTRUCTIONS

I wrote this piece in 2000 as a talk to a group of people interested in local literacy and literature.

A poet has been appointed ambassador. A playwright is elected president. Construction workers stand in line with office managers to buy a new novel. Adults seek moral guidance and intellectual challenge in stories about warrior monkeys, one-eyed giants, and crazy knights who fight windmills. Literacy is considered a beginning, not an end.

…Well, maybe in some other country, but not this one. In America the imagination is generally looked on as something that might be useful when the TV is out of order. Poetry and plays have no relation to practical politics. Novels are for students, housewives, and other people who don’t work. Fantasy is for children and primitive peoples. Literacy is so you can read the operating instructions.

I think the imagination is the single most useful tool humankind possesses. It beats the opposable thumb. I can imagine living without my thumbs, but not without my imagination.

I hear voices agreeing with me. “Yes, yes!” they cry—“the creative imagination is a tremendous plus in business! We value creativity, we reward it!” In the marketplace, the word creativity has come to mean the generation of ideas applicable to practical strategies to make larger profits. This reduction has gone on so long that the word creative can hardly be degraded further. I don’t use it any more, yielding it to capitalists and academics to abuse as they like. But they can’t have imagination.

Imagination is not a means of making money. It has no place in the vocabulary of profit making. It is not a weapon, though all weapons originate from it, and the use, or nonuse, of all weapons depends on it: as do all tools and their uses. The imagination is a fundamental way of thinking, an essential means of becoming and remaining human. It is a tool of the mind.

Therefore we have to learn to use it. Children have imagination to start with, as they have body, intellect, the capacity for language: all things essential to their humanity, things they need to learn how to use, how to use well. Such teaching, training, and practice should begin in infancy and go on throughout life. Young human beings need exercises in imagination as they need exercise in all the basic skills of life, bodily and mental: for growth, for health, for competence, for joy. This need continues as long as the mind is alive.

When children are taught to hear and learn the central literature of their people, or, in literate cultures, to read and understand it, their imagination is getting a very large part of the exercise it needs.

Nothing else does as well, not even the other arts. We are a wordy species. Words are the wings both intellect and imagination fly on. Music, dance, visual arts, crafts of all kinds, all are central to human development and well-being, and no art or skill is ever useless learning; but to train the mind to take off from immediate reality and return to it with new understanding and new strength, there is nothing like poem and story.

Through story, every culture defines itself and teaches its children how to be people and members of their people—Hmong, !Kung, Hopi, Quechua, French, Californian…. We are those who arrived at the Fourth World…. We are Joan’s nation…. We are the sons of the Sun…. We came from the sea…. We are the people who live at the center of the world.

A people that doesn’t live at the center of the world, as defined and described by its poets and storytellers, is in a bad way. The center of the world is where you live. You can breathe the air there. You know how things are done there, how things are done rightly, done well.

A child who doesn’t know where the center is—where home is, what home is—that child is in a very bad way.

Home isn’t Mom and Dad and Sis and Bud. Home isn’t where they have to let you in. It’s not a place at all. Home is imaginary.

Home, imagined, comes to be. It is real, realer than any other place, but you can’t get to it unless your people show you how to imagine it—whoever your people are. They may not be your relatives. They may never have spoken your language. They may have been dead for a thousand years. They may be nothing but words printed on paper, ghosts of voices, shadows of minds. But they can guide you home. They are your human community.

All of us have to learn how to invent our lives, make them up, imagine them. We need to be taught these skills; we need guides to show us how. If we don’t, our lives get made up for us by other people.

Human beings have always joined in groups to imagine how best to live and help one another carry out the plan. The essential function of human community is to arrive at some agreement on what we need, what life ought to be, what we want our children to learn, and then collaborate in learning and teaching so that we and they can go on the way we think is the right way.

Small communities with strong traditions are usually clear about the way they want to go, and good at teaching it. But tradition may crystallise imagination to the point of fossilising it as dogma and forbidding new ideas. Larger communities, such as cities, open up room for people to imagine alternatives, learn from people of different traditions, and invent their own ways to live.

As alternatives proliferate, however, those who take the responsibility of teaching find little social and moral consensus on what they should be teaching—what we need, what life ought to be. In our time of huge populations exposed continuously to reproduced voices, images, and words used for commercial and political profit, there are too many people who want to and can invent us, own us, shape and control us through seductive and powerful media. It’s a lot to ask of a child to find a way through all that, alone.

Nobody can do anything very much, really, alone.

What a child needs, what we all need, is to find some other people who have imagined life along lines that make sense and allow some freedom, and listen to them. Not hear passively, but listen.

Listening is an act of community, which takes space, time, and silence.

Reading is a means of listening.

Reading is not as passive as hearing or viewing. It’s an act: you do it. You read at your pace, your own speed, not the ceaseless, incoherent, gabbling, shouting rush of the media. You take in what you can and want to take in, not what they shove at you so fast and hard and loud that you’re overwhelmed. Reading a story, you may be told something, but you’re not being sold anything. And though you’re usually alone when you read, you are in communion with another mind. You aren’t being brainwashed or co-opted or used; you’ve joined in an act of the imagination.

I know no reason why the media could not create a similar community of the imagination, as theater has often done in societies of the past, but they’re not doing it. They are so controlled by advertising and profiteering that the best people who work in them, the real artists, if they resist the pressure to sell out, get drowned out by the endless rush for novelty, by the greed of the entrepreneurs.

Much of literature remains free of such co-optation simply because a lot of books were written by dead people, who by definition are not greedy.

And many living poets and novelists, though their publishers may be crawling abjectly after bestsellers, continue to be motivated less by the desire for gain than by the wish to do what they’d probably do for nothing if they could afford it, that is, practice their art—make something well, get something right. Books remain comparatively, and amazingly, honest and reliable.

They may not be “books,” of course, they may not be ink on wood pulp but a flicker of electronics in the palm of a hand. Incoherent and commercialised and worm-eaten with porn and hype and blather as it is, electronic publication offers those who read a strong new means of active community. The technology is not what matters. Words are what matter. The sharing of words. The activation of imagination through the reading of words.

The reason literacy is important is that literature is the operating instructions. The best manual we have. The most useful guide to the country we’re visiting, life.

“A WAR WITHOUT END”

Some thoughts, written down at intervals, about oppression, revolution, and imagination.

SLAVERY

My country came together in one revolution and was nearly broken by another.

The first revolution was a protest against galling, stupid, but relatively mild social and economic exploitation. It was almost uniquely successful.

Many of those who made the first revolution practiced the most extreme form of economic exploitation and social oppression: they were slave owners.

The second American revolution, the Civil War, was an attempt to preserve slavery. It was partially successful. The institution was abolished, but the mind of the master and the mind of the slave still think a good many of the thoughts of America.

RESISTANCE TO OPPRESSION

Phillis Wheatley, poet and manumitted slave, wrote in 1774: “In every human Breast, God has implanted a principle, which we call Love of Freedom; it is impatient of Oppression, and pants for Deliverance.”

I would no more deny the truth of that than I would deny that the sun shines. All that is good in the institutions and politics of my country rests on it.

And yet I see that though we love freedom we are mostly patient of oppression, and even refuse deliverance.

I see a danger in insisting that our love of freedom always outweighs whatever force or inertia keeps us from resisting oppression and seeking deliverance.

If I deny that strong, intelligent, capable people will and do accept oppression, I’m identifying the oppressed as weak, stupid, and inept.

If it were true that superior people refuse to be treated as inferiors, it would follow that those low in the social order are truly inferior, since, if they were superior, they’d protest; since they accept an inferior position, they are inferior. This is the comfortably tautological argument of the slave owner, the social reactionary, the racist, and the misogynist.

It is an argument that still bedevils consideration of the Hitlerian holocaust: Why did the Jews “just get into the trains,” why didn’t they “fight back”? A question which—as asked—is unanswerable, and so can be used by the anti-Semite to imply the inferiority of the Jews.

But the argument appeals also to the idealist. Many liberal and humanely conservative Americans cherish the conviction that all oppressed people suffer intolerably from their oppression, must be ready and eager to rebel, and are morally weak, morally wrong, if they do not rebel.

I categorically judge as wrong any person who considers himself or herself racially or socially superior to another or enforces inferior status on another. But it is a different matter to pass categorical judgment against people who accept inferior status. If I say that they are wrong, that morality demands that they rebel, it behooves me to consider what real choice they have, whether they act in ignorance or through conviction, whether they have any opportunity to lessen their ignorance or change their conviction. Having so considered, how can I say they are at fault? Is it they, and not the oppressors, who do wrong?

The ruling class is always small, the lower orders large, even in a caste society. The poor always vastly outnumber the rich. The powerful are fewer than those they hold power over. Adult men hold superior status in almost all societies, though they are always outnumbered by women and children. Governments and religions sanction and uphold inequality, social rank, gender rank, and privilege, wholly or selectively.

Most people, in most places, in most times, are of inferior status.

And most people, even now, even in “the free world,” even in “the home of the free,” consider this state of affairs, or certain elements of it, as natural, necessary, and unchangeable. They hold it to be the way it has always been and therefore the way it must be. This may be conviction or it may be ignorance; often it is both. Over the centuries, most people of inferior status have had no way of knowing that any other way of ordering society has existed or could exist—that change is possible. Only those of superior status have ever known enough to know that; and it is their power and privilege that would be at stake if the order of things were changed.

We cannot trust history as a moral guide in these matters, because history is written by the superior class, the educated, the empowered. But we have only history to go on, and observation of current events. On that evidence, revolt and rebellion are rare things, revolution extremely rare. In most times, in most places, most women, slaves, serfs, low-castes, outcastes, peasants, working-class people, most people defined as inferior—that is, most people—have not rebelled against being despised and exploited. They resist, yes; but their resistance is likely to be passive, or so devious, so much a part of daily behavior, as to be all but invisible.

When voices from the oppressed and the underclasses are recorded, some are cries for justice, but most are expressions of patriotism, cheers for the king, vows to defend the fatherland, all loyally supporting the system that disenfranchises them and the people who profit from it.

Slavery would not have existed all over the world if slaves had often risen against their masters. Most slavemasters are not murdered. They are obeyed.

Working men watch their company’s CEOs get paid three hundred times what they are paid, and grumble, but do nothing.

Women in most societies uphold the claims and institutions of male supremacy, deferring to men, obeying them (overtly), and defending the innate superiority of men as natural fact or religious dogma.

Low-status males—young men, poor men—fight and die for the system that keeps them under. Most of the countless soldiers killed in the countless wars waged to uphold the power of a society’s rulers or religion have been men considered inferior by that society.

“You have nothing to lose but your chains,” but we prefer to kiss them.

Why?

Are human societies inevitably constructed as a pyramid, with the power concentrating at the top? Is a hierarchy of power a biological imperative that human society is bound to enact? The question is almost certainly wrongly phrased and so impossible to answer, but it keeps getting asked and answered, and those who ask it usually answer it in the affirmative.

If such an inborn, biological imperative exists, is it equally imperative in both sexes? We have no incontrovertible evidence of innate gender difference in social behavior. Essentialists on both sides of the argument maintain that men are innately disposed to establish a power hierarchy while women, though they do not initiate such structures, accept or imitate them. According to the essentialists, the male program is thus certain to prevail, and we should expect to find the chain of command, the “higher” commanding the “lower,” with power concentrated in a few, a nearly universal pattern of human society.

Anthropology provides some exceptions to this supposed universality. Ethnologists have described societies that have no fixed chain of command; in them power, instead of being locked into a rigid system of inequality, is fluid, shared differently in different situations, operating by checks and balances tending always towards consensus. They have described societies that do not rank one gender as superior, though there is always some gendered division of labor, and male pursuits are those most likely to be celebrated.

But these are all societies that we describe as “primitive”—tautologically, since we have already established a value hierarchy: primitive = low = weak, civilised = high = powerful.

Many “primitive” and all “civilised” societies are rigidly stratified, with much power assigned to a few and little or no power to most. Is the perpetuation of institutions of social inequality in fact the engine that drives civilisation, as Lévi-Strauss suggests?

People in power are better fed, better armed, and better educated, and therefore better able to stay that way, but is that sufficient to explain the ubiquity and permanence of extreme social inequality? Certainly the fact that men are slightly larger and more muscular (though somewhat less durable) than women is not sufficient to explain the ubiquity of gender inequality and its perpetuation in societies where size and muscularity do not make much difference.

If human beings hated injustice and inequality as we say we do and think we do, would any of the Great Empires and High Civilisations have lasted fifteen minutes?

If we Americans hate injustice and inequality as passionately as we say we do, would any person in this country lack enough to eat?

We demand a rebellious spirit of those who have no chance to learn that rebellion is possible, but we the privileged hold still and see no evil.

We have good reason to be cautious, to be quiet, not to rock the boat. A lot of peace and comfort is at stake. The mental and moral shift from denial of injustice to consciousness of injustice is often made at very high cost. My contentment, stability, safety, personal affections, may become a sacrifice to the dream of the common good, to the idea of a freedom that I may not live to share, an ideal of justice that nobody may ever attain.

The last words of the Mahabharata are, “By no means can I attain a goal beyond my reach.” It is likely that justice, a human idea, is a goal beyond human reach. We’re good at inventing things that can’t exist.

Maybe freedom cannot be attained through human institutions but must remain a quality of the mind or spirit not dependent on circumstances, a gift of grace. This (if I understand it) is the religious definition of freedom. My problem with it is that its devaluation of work and circumstance encourages institutional injustices which make the gift of grace inaccessible. A two-year-old child who dies of starvation or a beating or a firebombing has not been granted access to freedom, nor any gift of grace, in any sense in which I can understand the words.

We can attain by our own efforts only an imperfect justice, a limited freedom. Better than none. Let us hold fast to that principle, the love of Freedom, of which the freed slave, the poet, spoke.

THE GROUND OF HOPE

The shift from denial of injustice to recognition of injustice can’t be unmade.

What your eyes have seen they have seen. Once you see the injustice, you can never again in good faith deny the oppression and defend the oppressor. What was loyalty is now betrayal. From now on, if you don’t resist, you collude.

But there is a middle ground between defense and attack, a ground of flexible resistance, a space opened for change. It is not an easy place to find or live in. Peacemakers trying to get there have ended up scuttling in panic to Munich.

Even if they reach the middle ground, they may get no thanks for it. Harriet Beecher Stowe’s Uncle Tom is a slave who, for his courageous effort to persuade his owner to change his heart and his steadfast refusal to beat other slaves, is beaten to death. We insist on using him as a symbol of cringing capitulation and servility.

Admiring heroically useless defiance, we sneer at patient resistance.

But the negotiating ground, where patience makes change, is where Gandhi stood. Lincoln got there, painfully. Bishop Tutu, having lived there for years in singular honor, saw his country move, however awkwardly and uncertainly, towards that ground of hope.

THE MASTER’S TOOLS

Audre Lorde said you can’t dismantle the master’s house with the master’s tools. I think about this powerful metaphor, trying to understand it.

By radicals, liberals, conservatives, and reactionaries, education in the masters’ knowledge is seen as leading inevitably to consciousness of oppression and exploitation, and so to the subversive desire for equality and justice. Liberals support and reactionaries oppose universal free education, public schools, uncensored discussion at the universities for exactly the same reason.

Lorde’s metaphor seems to say that education is irrelevant to social change. If nothing the master used can be useful to the slave, then education in the masters’ knowledge must be abandoned. Thus an underclass must entirely reinvent society, achieve a new knowledge, in order to achieve justice. If they don’t, the revolution will fail.

This is plausible. Revolutions generally fail. But I see their failure beginning when the attempt to rebuild the house so everybody can live in it becomes an attempt to grab all the saws and hammers, barricade Ole Massa’s toolroom, and keep the others out. Power not only corrupts, it addicts. Work becomes destruction. Nothing is built.

Societies change with and without violence. Reinvention is possible. Building is possible. What tools have we to build with except hammers, nails, saws—education, learning to think, learning skills?

Are there indeed tools that have not been invented, which we must invent in order to build the house we want our children to live in? Can we go on from what we know now, or does what we know now keep us from learning what we need to know? To learn what people of color, the women, the poor, have to teach, to learn the knowledge we need, must we unlearn all the knowledge of the whites, the men, the powerful? Along with the priesthood and phallocracy, must we throw away science and democracy? Will we be left trying to build without any tools but our bare hands? The metaphor is rich and dangerous. I can’t answer the questions it raises.

ONLY IN UTOPIAS

In the sense that it offers a glimpse of some imagined alternative to “the way we live now,” much of my fiction can be called utopian, but I continue to resist the word. Many of my invented societies strike me as an improvement in one way or another on our own, but I find Utopia far too grand and too rigid a name for them. Utopia, and Dystopia, are intellectual places. I write from passion and playfulness. My stories are neither dire warnings nor blueprints for what we ought to do. Most of them, I think, are comedies of human manners, reminders of the infinite variety of ways in which we always come back to pretty much the same place, and celebrations of that infinite variety by the invention of still more alternatives and possibilities. Even the novels The Dispossessed and Always Coming Home, in which I worked out more methodically than usual certain variations on the uses of power, which I preferred to those that obtain in our world—even these are as much efforts to subvert as to display the ideal of an attainable social plan which would end injustice and inequality once and for all.

To me the important thing is not to offer any specific hope of betterment but, by offering an imagined but persuasive alternative reality, to dislodge my mind, and so the reader’s mind, from the lazy, timorous habit of thinking that the way we live now is the only way people can live. It is that inertia that allows the institutions of injustice to continue unquestioned.

Fantasy and science fiction in their very conception offer alternatives to the reader’s present, actual world. Young people in general welcome this kind of story because in their vigor and eagerness for experience they welcome alternatives, possibilities, change. Having come to fear even the imagination of true change, many adults refuse all imaginative literature, priding themselves on seeing nothing beyond what they already know, or think they know.

Yet, as if it feared its own troubling powers, much science fiction and fantasy is timid and reactionary in its social invention, fantasy clinging to feudalism, science fiction to military and imperial hierarchy. Both usually reward their hero, whether a man or woman, only for doing outstandingly manly deeds. (I wrote this way for years myself. In The Left Hand of Darkness, my hero is genderless but his heroics are almost exclusively manly.) In science fiction particularly, one also often meets the idea I discussed above, that anyone of inferior status, if not a rebel constantly ready to seize freedom through daring and violent action, is either despicable or simply of no consequence.

In a world so morally simplified, if a slave is not Spartacus, he is nobody. This is merciless and unrealistic. Most slaves, most oppressed people, are part of a social order which, by the very terms of their oppression, they have no opportunity even to perceive as capable of being changed.

The exercise of imagination is dangerous to those who profit from the way things are because it has the power to show that the way things are is not permanent, not universal, not necessary.

Having that real though limited power to put established institutions into question, imaginative literature has also the responsibility of power. The storyteller is the truthteller.

It is sad that so many stories that might offer a true vision settle for patriotic or religious platitude, technological miracle working, or wishful thinking, the writers not trying to imagine truth. The fashionably noir dystopia merely reverses the platitudes and uses acid instead of saccharine, while still evading engagement with human suffering and with genuine possibility. The imaginative fiction I admire presents alternatives to the status quo which not only question the ubiquity and necessity of extant institutions, but enlarge the field of social possibility and moral understanding. This may be done in as naively hopeful a tone as the first three Star Trek television series, or through such complex, sophisticated, and ambiguous constructions of thought and technique as the novels of Philip K. Dick or Carol Emshwiller; but the movement is recognisably the same—the impulse to make change imaginable.

We will not know our own injustice if we cannot imagine justice. We will not be free if we do not imagine freedom. We cannot demand that anyone try to attain justice and freedom who has not had a chance to imagine them as attainable.

I want to close and crown these inconclusive meditations with the words of a writer who never spoke anything but truth, and always spoke it quietly, Primo Levi, who lived a year in Auschwitz, and knew what injustice is.

“The ascent of the privileged, not only in the Lager but in all human coexistence, is an anguishing but unfailing phenomenon: only in utopias is it absent. It is the duty of righteous men to make war on all undeserved privilege, but one must not forget that this is a war without end.”