LIVING

VARIATIONS ON DESIRE: A Mouse, a Dog, Buber, and Bovary

DESIRE APPEARS AS A FEELING, a flicker or a bomb in the body, but it’s always a hunger for something, and it always propels us somewhere else, toward the thing that is missing. Even when this motion takes place on the inner terrain of fantasy, it has a quickening effect on the daydreamer. The object of desire — whether it’s a good meal, a beautiful dress or car, another person, or something abstract, such as fame, learning, or happiness — exists outside of us and at a distance. Whatever it is, we don’t have it now. Although they often overlap, desires and needs are semantically distinct. I need to eat, but I may not have much desire for what is placed in front of me. While a need is urgent for bodily comfort or even survival, a desire exists at another level of experience. It may be sensible or irrational, healthy or dangerous, fleeting or obsessive, weak or strong, but it isn’t essential to life and limb. The difference between need and desire may be behind the fact that I’ve never heard anyone talk of a rat’s “desire”—instincts, drives, behaviors, yes, but never desires. The word seems to imply an imaginative subject, someone who thinks and speaks. In Webster’s, the second definition for the noun desire is: “an expressed wish, a request.” One could argue about whether animals have “desires.” They certainly have preferences. Dogs bark to signal they wish to go outside, ravenously consume one food but leave another untouched, and make it known that the vet’s door is anathema. Monkeys express their wishes in forms sophisticated enough to rival those of their cousins, the Homo sapiens. Nevertheless, human desire is shaped and articulated in symbolic terms not available to animals.

When my sister Asti was three years old, her heart’s desire, repeatedly expressed, was a Mickey Mouse telephone, a Christmas wish that sent my parents on a multi-city search for a toy that had sold out everywhere. As the holiday approached, the tension in the family grew. My sister Liv, then seven, and I, nine, had been brought into the emotional drama of the elusive toy and began to fear that the object our younger sister craved would not be found. As I remember it, my father tracked the thing down in the neighboring city of Faribault, late in the afternoon that Christmas Eve, only hours before the presents were to be opened. I recall his triumphant arrival through the garage door, stamping snow from his boots, large garish box in hand — and our joy. My youngest sister, Ingrid, is missing from the memory, probably because she was too young to have participated in what had become a vicarious wish for the rest of us. Asti knows the story, because it took on mythical proportions in the family, and she remembers the telephone, which remained part of the toy collection for some time, but the great unwrapping on the living room floor that I watched with breathless anticipation isn’t part of her memory.

This little narrative of the Mickey Mouse telephone opens an avenue into the peculiarities of human desire. Surely the telephone’s luminous and no doubt aggrandized image on the television screen whetted Asti’s desire and triggered fantasies of possession. The Disney rodent himself must have played a role. She may have imagined having conversations with the real mouse. I don’t know, but the object took on the shine of glamour, first for her, and then for the rest of us, because it wasn’t gained easily. It had to be fought for, always an augmenting factor in desire. Think of the troubadours. Think of Gatsby. Think of literature’s great, addled Knight Errant on Rocinante. A three-year-old’s desire infected four other family members who loved her because her wish became ours through intense identification, not unlike the sports fan’s hope that his team will win. Desire can be contagious. Indeed, the churning wheels of capitalism depend upon it.

Asti’s “Mickey Mouse” desire presupposes an ability to hold an object in the mind and then imagine its acquisition at some other time, a trick the great Russian neurologist A. R. Luria (1902–1977) explicitly connected to language with its roaming I and the labile quality of linguistic tenses: was, is, will be. Narrative is a mental movement in time, and longing for an object very often takes on at least a crude narrative: P is lonely and longs for company. He dreams of meeting Q. He imagines that he is talking to Q in a bar, her head nestled on his shoulder. She smiles. He smiles. They stand up. He imagines her lying in his bed naked, and so on. I have always felt intuitively that conscious remembering and imagining are powerfully connected, that they are, in fact, so similar as to be at times difficult to disentangle from each other, and that they both are bound to places. It’s important to anchor the people or objects you remember or imagine in a mental space — or they begin to float away, or worse, disappear. The idea that memory is rooted in location goes back to the Greeks and exerted a powerful influence on medieval thought. The scholastic philosopher Albertus Magnus wrote, “Place is something the soul itself makes for laying up images.”1

Scientists have recently given new force to this ancient knowledge in a study of amnesia patients with bilateral hippocampal damage. The hippocampus, in connection with other medial temporal lobe areas of the brain, is known to be vital to the processing and storage of memory, but it also appears to be essential to imagining. When asked to visualize a specific scene, the brain-damaged patients found it difficult to provide a coherent spatial context for their fantasies. Their reports were far more fragmented than those of their healthy counterparts (or “controls,” as scientists like to call them). This insight does not, of course, affect desire itself. People with hippocampal damage don’t lack desire — but fully imagining what they long for is impaired. Other forms of amnesia, however, would make it impossible to keep the image of a Mickey Mouse telephone or the phantom Ms. Q in the mind for more than seconds. This form of desire lives only in the moment, outside narrative, an untraceable eruption of feeling that could be acted upon only if a desirable object popped up in the same instant and the amnesiac reached out and grabbed it.

But desire can be aimless, too. It happens to me from time to time that I wonder what it is I am wanting. A vague desire makes itself felt before I can name the object — a restlessness in my body, possibly hunger, possibly the faintest stirring of erotic appetite, possibly a need to write again or read again or read something else, but there it is — a push in me toward a satisfaction I can’t identify. What is that? Jaak Panksepp, a neuroscientist, writes in his book, Affective Neuroscience: The Foundations of Human and Animal Emotions, about what he calls “the SEEKING system.” Other scientists have given drabber names to the same circuit: “behavioral activation system” or “behavioral facilitation system.” Panksepp writes:

Although the details of human hopes are surely beyond the imagination of other creatures, the evidence now clearly indicates that certain intrinsic aspirations of all mammalian minds, those of mice as well as men, are driven by the same ancient neurochemistries. These chemistries lead our companion creatures to set out energetically to investigate and explore their worlds, to seek available resources and make sense of the contingencies in their environments. These same systems give us the impulse to become actively engaged with the world and to extract meaning from our various circumstances.2

Curiosity, that need to go out into the world, appears to be hardwired in all mammals. As Panksepp articulates it: it’s “a goad without a goal.”3 The “extraction of meaning” from those investigations, however, requires higher cortical areas of the brain unique to human beings. My dear departed dog Jack, when unleashed in the Minnesota countryside, would move eagerly from stump to thistle to cow pie, nostrils quivering, inhaling each natural marvel, and then, once he had mastered the lay of the land, he would burst into a run and race back and forth across the territory like a demented conquering hero. Through his superlative nose, he remembered and recognized the place, but I don’t think that when he was back home in Brooklyn he carried about with him a mental image of the wide flat land where he could romp freely or that he actively longed to return to it. Nor do I think he lay on his bed and imagined an ideal playground of myriad odors. And yet, he missed his human beings when we were gone. He grieved, in fact. Attachment and separation anxiety are primitive evolutionary mechanisms shared by all mammals. Once, when my sister Ingrid cared for Jack in our absence, she was sitting in a room of the house and, feeling a chill, went to the closet and put on a sweater of mine. When she returned, the poor dog was seized with a fit of joy, jumping up on her, turning circles in the air, and licking whatever part of her he could reach. Jack’s nose was spot-on; what he lacked was a human sense of time and context, which might have prevented him from believing in my sudden materialization out of nowhere.

There is a beautiful passage in Martin Buber’s book Between Man and Man, in which he describes stroking a beloved horse on his grandparents’ estate when he was eleven years old. He tells of the immense pleasure it gave him, his tactile experience of the animal’s vitality beneath its skin, and his happiness when the horse greeted him by lifting its head.

But once — I do not know what came over the child, at any rate it was childlike enough — it struck me about the stroking, what fun it gave me, and suddenly I became conscious of my hand. The game went on as before, but something had changed, it was no longer the same thing. And the next day, after giving him a rich feed, when I stroked my friend’s head he did not raise his head. A few years later, when I thought back to the incident, I no longer supposed that the animal had noticed my defection. But at the time I considered myself judged.4

Buber’s story is meant to illustrate the withdrawal from a life of dialogue with the Other into a life of monologue or “reflexion.” For Buber, this self-reflective or mirroring quality disrupts true knowledge of the Other because he then exists as “only part of myself.” It’s notable that Buber shifts to the third person in the early part of the passage and then resumes in the first, because his experience is of a sudden, intrusive self-consciousness that alters the character of his desire. He has become another to himself, a third person he sees in his mind’s eye petting the horse and enjoying it, rather than an active “I” with a “you.” This self-theater of the third person is, I think, uniquely human and is forever invading our desires and fantasies. Celebrity culture demonstrates the extreme possibilities of this position because it runs on the idea of a person seen from the outside as spectacle, and the possibility that lesser mortals, with some luck, can rise to the ranks of the continually photographed and filmed. With the Internet and sites like Facebook, the intense longing to live life in the third person seems to have found its perfect realization. But all of us, whether we are Internet voyeurs of our own dramas or not, are infected by Buber’s “reflexion,” his description of narcissism, in which the self is trapped in an airless hall of mirrors.

Buber’s condemnation of the monologue position is profound, and yet self-consciousness itself is born in “mirroring” and the acquisition of symbols through which we are able to represent ourselves as an “I,” a “he,” or a “she.” It is this distance from the self that makes narrative movement and autobiographical memory possible. Without it, we couldn’t tell ourselves the story of ourselves. Living solely in reflection, however, creates a terrible machinery of insatiable desire, the endless pursuit of the thing that will fill the emptiness and feed a starved self-image. Emma Bovary dreams of Paris: “She knew all the latest fashions, where to find the best tailors, the days for going to the Bois or the Opera. She studied descriptions of furniture in Eugene Sue, and sought in Balzac and George Sand a vicarious gratification of her own desires.”5

It is no secret that, once gained, the objects of desire often lose their sweetness. The real Paris cannot live up to the dream city. The high-heeled pumps displayed in a shop window that glow with the promise of beauty, urbanity, and wealth are just shoes once they find their way into the closet. After a big wedding, which in all its pomp and circumstance announces marriage as a state of ultimate arrival, there is life with a real human being, who is inevitably myopic, weak, and idiosyncratic. The revolutionary eats and sleeps the revolution, the grand cleansing moment when a new order will triumph, and then, once it has happened, he finds himself wandering among corpses and ruins. Only human beings destroy themselves by ideas. Emma Bovary comes to despair: “And once again the deep hopelessness of her plight came back to her. Her lungs heaved as though they would burst. Then in a transport of heroism which made her almost gay, she ran down the hill and across the cow-plank, hurried along the path, up the lane, through the market-place and arrived in front of the chemist’s shop.”6 It is the phrase “a transport of heroism” that is most poignant to me, the absurd but all too human desire to inflate the story of oneself, to see it reflected back as heroic, beautiful, or martyred.

Desire is the engine of life, the yearning that goads us forward with stops along the way, but it has no destination, no final stop, except death. The wondrous fullness after a meal or sex or a great book or conversation is inevitably short-lived. By nature, we want and we wish, and we assign content to that emptiness as we narrate our inner lives. For better and for worse, we bring meaning to it, one inevitably shaped by the language and culture in which we live. Meaning itself may be the ultimate human seduction. Dogs don’t need it, but for us to go on, it is essential, and this is true despite the fact that most of what happens to us is beneath our awareness. The signifying, speech-making, willful, consciously perceiving circuits of our brains are minute compared to the vast unconscious processes that lie beneath.

Almost twenty years ago, I gave birth to my daughter. Actually, “I” did nothing. My water broke. Labor happened. After thirteen hours of it, I pushed. I liked this time of pushing. It was active, not passive, and I finally expelled from between my legs a bloody, wet, awe-inspiring stranger. My husband held her, and I must have, too, but I don’t remember her in my arms until later. What I do recall is that as soon as I knew the baby was healthy, I lapsed into a state of unprecedented satisfaction. A paradisaical torpor seemed to flood my body, and I went limp and still. I was wheeled away to a dim room, and after some minutes, my obstetrician appeared, looked down at me, and said, “I’m just checking on you. How are you?” It was an effort to speak, not because I had any pain or even a feeling of exhaustion, but because speech seemed unnecessary. I did manage to breathe out the words that described my condition: “I’m fine, fine. I’ve never felt like this. I have no desire, no desire of any kind.” I remember that she grinned and patted my arm, but after she left, I lay there for some time, luxuriating in the sated quiet of my body, accompanied only by the awed repetition of the same words: I have no desire, none, no desire of any kind. I am sure that I was under the sway of the hormone oxytocin, released in quantities I had never experienced before, and which had turned me into a happy lump of flesh. Birth was a wholly animal experience; its brutal corporeal paroxysms left reflection behind. The executive, thinking, narrative “I” lost itself entirely in the ultimate creative act: one body being born of another. After the birth, it returned as a stunned commentator, similar to a voice-over in a movie that noted the novelty of my situation to an audience of one: me. Of course, the stupefaction didn’t last. It couldn’t last. I had to take care of my child, had to hold her, feed her, look at her, want her with my whole being. There is nothing more ordinary than this desire, and yet to be gripped by it feels miraculous.

Martin Buber doesn’t treat mothers and infants in his I/Thou dialectic, but the ideal dialogue he describes of openness to the other, of communication that is not dependent on speech, but which can happen in silence “sacramentally,” is perhaps most perfectly realized in the mother/child couple. Especially in the first year, a mother opens herself up to her baby. As D. W. Winnicott writes in The Family and Individual Development, she is able to “drain interest from her self onto the baby.” A mother, he adds, in his characteristically lucid way, has “a special ability to do the right thing. She knows what the baby could be feeling like. No one else knows. Doctors and nurses know a lot about psychology, and of course they know a lot about body health and disease. But they do not know what a baby feels like from minute to minute because they are outside this area of experience.”7 Imagining what your baby feels like by reading her carefully and responding to her is a mother’s work; it is a first/second-person business, and it brings with it ongoing gratification for both sides of the dyad. It is also, as Allan Schore makes clear in his book Affect Regulation and the Origin of the Self, essential to the neurobiological development of the infant.

Maternal desire is a subject fraught with ideology. From the screaming advocates of “family values” to those whose agenda makes it necessary to replace the word “mother” with “caregiver” at every opportunity, popular culture trumpets its competing narratives. In a country where human relationships are seen as entities to be “worked on,” as if they were thousand-piece puzzles that only take time to complete, the pleasure to be found in one’s children, the desire we have for them, falls outside the discussion. It is not my intention to be a Romantic. Parenthood can be grueling, boring, and painful, but most people want their children and love them. As parents, they are, as Winnicott said about mothers: “good enough.” This “good enough” is not perfection but a form of dialogue, a receptiveness that doesn’t impose on the child the monologic desires of the parents, but recognizes his autonomy, his real separateness.

Every week, I teach a writing class to inpatients at the Payne Whitney psychiatric clinic. My students are all people who find themselves in the hospital because life outside it had become unbearable, either to themselves or to other people. It is there that I’ve witnessed what it looks like to have no desire or very little desire for anything. Psychotic patients can be electrifying and filled with manic, creative energy, but severely depressed patients are strangely immobile. The people who come to my class have already put one foot in front of the other and found their way into a chair, which is far more than some of the others can do — the ones who remain in their rooms, inert on their beds like the living dead. Some people come to class but do not speak. Some come but do not write. They look at the paper and pencil and are able to say they cannot do it, but will stay and listen. One woman who sat rigidly in her chair, hardly moving except for the hand that composed her piece, wrote of a morgue where the bodies were laid out on slabs, their mouths opened to reveal black, cankerous tongues. “That’s why we’re here,” she said after she had finished reading it aloud, “because we’re dead. We’re all dead.” As I listened to her words, I felt cut and hurt. This was more than sadness, more than grief. Grief, after all, is desire for the dead or for what’s been lost and can never come again. Grief is longing. This was stasis without fulfillment. This was the world stopped, meaning extinguished. And yet, she had written it, had bothered to record this bleak image, which I told her frightened me. I said I had pictured it in my mind the way I might remember some awful image in a movie, and I tried to hold her with my eyes, keep her looking at me, which I did for several seconds. When I think of it now, bringing up film might have been defensive on my part, a way of keeping some distance between me and that morgue (where I’ll end up sooner or later). Nevertheless, I’ve come to understand that what I say is often less important to the students than my embodied attention, my rapt interest in what is happening among us, that they know I am listening, concentrated, and open. I have to imagine what it feels like to be in such a state without coming unglued myself.

I don’t know what that woman’s particular story was or why she landed in the hospital. Some people come wearing the bandages of their suicide attempts, but she didn’t. Everybody has a story, and each one is unique, and yet now that I’ve been going to the hospital for a year, I’ve seen many variations of a single narrative. One man encompassed it beautifully in a short poem. I can’t remember his exact wording but have retained the images it brought to mind. He is a child again, wandering alone in an apartment, longing for “someone” to be there. He finds a door. It swings open, and the room is empty. I can’t think of a better metaphor for unrequited longing than that vacant room. My student understood the essence of what he was missing: the responsive presence of another, and he knew that this absence had both formed and damaged him.

I seem to have come far from the Mickey Mouse telephone, but like so many objects of desire, the telephone was more than a telephone, and the story of searching for it and finding it at last to fulfill a child’s wish is a small parable of genuine dialogue: I have heard you and I’m coming with my answer.

2007

MY MOTHER, PHINEAS, MORALITY, AND FEELING

“DON’T DO ANYTHING YOU DON’T really want to do,” my mother said as she drove me home from some class, meeting, or friend’s house I have long forgotten. I don’t remember anything else my mother said during our conversation, and I can’t say why she offered me this piece of advice just then. I do remember the stretch of Highway 19 just outside my hometown, Northfield, Minnesota, that is now forever associated with those words. It must have been summer, because the grass was green and the trees were in full leaf. I also distinctly recall that as soon as she had spoken, I felt guilty. Was I doing things I didn’t really want to do? I was fifteen years old, in the middle of my adolescence, a young person filled with private longings, confusions, and torments. My mother’s words gave me pause, and I have never stopped thinking about them.

Hers is a curious sentence when you look at it closely, with its two don’ts framing the highly positive phrase “anything you really want to do.” I knew my mother wasn’t offering me a prescription for hedonism or selfishness, and I received this bit of wisdom as a moral imperative about desire. The don’ts in the sentence were a warning against coercion, probably sexual. Notably, my mother did not say, “Don’t have sex, take drugs, or go wild.” She cautioned me to listen to my moral feelings — but what exactly are they? Feeling, empathy in particular, inevitably plays a crucial role in our moral behavior.

That day she spoke to me as if I were an adult, a person beyond looking to her parents for direction. This both flattered and scared me a little. Hiding behind the sentence was the clear implication that she would not tell me what to do anymore. Because my own daughter is now twenty, I understand my mother’s position more vividly. As a toddler, Sophie wanted to stick her fingers into outlets, grab toys from other children, and take off her clothes at every opportunity. When her father and I interfered with these desires, she howled, but our six-year-old girl was another person altogether. Even a mild reprimand from either her father or me would make her eyes well up with tears. Guilt, an essential social emotion, had emerged in her and become part of a codified moral world of rights and wrongs, dos and don’ts.

The journey from naked savage to modest, empathetic little person to independent adult is also a story of brain development. From birth to around the age of six, a child’s prefrontal cortex develops enormously, and how it develops depends on her environment — which includes everything from poisons in the atmosphere to how her parents care for her. It is now clear from research that an adolescent’s brain also undergoes crucial changes and that emotional trauma and deprivation, especially when repeated, can leave lasting, detrimental imprints on the developing brain. The prefrontal cortex is far more developed in human beings than in other animals and is often referred to as the “executive” area of the brain, a region involved in evaluating and controlling our feelings and behavior.

Twenty years ago, I stumbled across the story of Phineas Gage in a neurology textbook. In 1848, the railroad foreman suffered a bizarre accident. A four-foot iron rod rammed into his left cheek, blasted through his brain, and flew out through the top of his head. Miraculously, Gage recovered. He could walk, talk, and think, but along with a few cubic centimeters of the ventromedial region of his frontal lobe, he lost his old self. The once considerate, highly competent foreman became impulsive, aggressive, and callous with other people. He made plans, but could never carry them out. He was fired from one job after another, his life deteriorated, and he wandered aimlessly until he died in San Francisco in 1861. This story haunted me because it suggested an awful thing: moral life could be reduced to a chunk of brain meat.

I remember asking a psychoanalyst about this story not long after I had read it. She shook her head: It wasn’t possible. From her point of view, the psyche had nothing to do with the brain — ethics simply don’t vanish with gray matter. But I now think of the Phineas story differently. Gage lost what he had gained earlier in his life — the capacity to feel the higher emotions of empathy and guilt, both of which inhibit our actions in the world. After his injury he turned into a kind of moral infant. He could no longer imagine how his actions would affect others or himself, no longer feel compassion, and without this feeling, he was fundamentally handicapped, even though his cognitive capacities remained untouched. He behaved like the classic psychopath who acts on impulse and feels no remorse.

In Descartes’ Error, the neurologist Antonio Damasio retells the story of Phineas Gage and compares his case to that of one of his patients, Elliot, who, after surgery for a malignant brain tumor, suffered damage to his frontal lobes. Like Gage before him, Elliot could no longer plan ahead and his life fell apart. He also became strangely cold. Although his intellectual faculties appeared to work well, he lacked feeling, both for himself and for others. Damasio writes: “I found myself suffering more when listening to Elliot’s stories than Elliot himself seemed to be suffering.”1 After doing a series of experiments on his patient, Damasio theorizes about what my mother took for granted: emotion not only enhances decision-making in life, it is essential to it.

Sometimes, however, I don’t know what I really want. I have to search myself, and that search involves both a visceral sense of what I feel and a projection of myself into the future. Will I regret having accepted that invitation? Am I succumbing to pressure from another person that will fill me with resentment later? I feel furious after reading this e-mail now, but haven’t I learned that waiting a couple of days before I respond is far wiser than sending off a splenetic answer right now? The future is, of course, imaginary — an unreal place that I create from my expectations, which are made from my remembered experiences, especially repeated experiences. Patients with prefrontal lesions exhibit the same curious deficits. They can pass all kinds of cognitive tests, but something crucial is still missing. As A. R. Luria notes in Higher Cortical Functions in Man (1962), “… clinicians have invariably observed that, although the ‘formal intellect’ is intact, these patients show marked changes in behavior.”2 They lose the critical faculty to judge their own behaviors, and lapse into a bizarre indifference about themselves and others. I would argue that something has gone awry with their emotional imaginations.

A couple of years after that conversation with my mother in the car, I was skiing with my cousin at a resort in Aspen, Colorado. Early one evening, I found myself alone at the top of a steep slope made more frightening by the mini-mountains on its surface known as moguls. I wasn’t a good enough skier to take that hill, but I had boarded the wrong chairlift. There was only one way out for me and that was down. As I stood there at the summit looking longingly at the ski chalet far below, I had a revelation: I understood then and there that I didn’t like skiing. It was too fast, too cold. It scared me. It had always scared me. One may wonder how it is possible for a young woman of seventeen not to have understood this simple fact about her existence until faced with a crisis. I come from a Norwegian family. My mother was born and raised in that northern country and my father’s grandparents emigrated from Norway. In Norway, people say that children ski before they walk, an overstatement that nevertheless brings the point home. The idea that skiing might not be fun, might not be for everyone, had never occurred to me. Where I come from, the sport signified pleasure, nature, family happiness. As these thoughts passed through my mind, I noticed that the chairlifts were closing and the sky was darkening. I took a breath, gave myself a push with my poles, and went over the edge. About half an hour later, a patrol on a snowmobile discovered me lying in a heap under a mogul, minus a ski, but otherwise intact.

Ridiculous as the story is, its implications are far reaching. We sometimes imagine we want what we don’t really want. A way of thinking about something can become so ingrained, we fail to question it, and that failure may involve more than a tumble on a ski slope. The friend who returns repeatedly to a man who mistreats her is in the grip of a familiar, self-defeating desire in which the imagined future has been forgotten. When I was an impoverished graduate student, I would sometimes spend twenty or thirty dollars on a T-shirt or accessory I didn’t need or even particularly want. What I craved was the purchase, not the thing itself. Of course, a sense of not being deprived may fill an emotional void without ruinous consequences. On the other hand, if you can’t pay your electric bill, you’re stuck. I found myself in a bad spot on the ski slope because I was doing something I didn’t really want to do. My poor judgment was the result of both an alienation from my feelings and a lack of sympathy for myself. The latter observation is vital. Because, like all human beings, I can objectify myself — see myself as one person among others in the social world — I am able not only to plan ahead by imagining how what I do now will affect what happens to me later, I gain the distance needed to recognize myself as a being who deserves compassion.

During the first year of my marriage, I was nervous. I worried in an abstract way about losing my freedom, about domestic life in general, about how to be “a wife.” When I confronted my new husband with these worries, he looked at me and said, “Why, Siri, do whatever you want to do.” I hadn’t told my husband what my mother said to me on Highway 19 twelve years earlier, but his words created an undeniable echo. I understood that he wasn’t giving me license to hurl myself into the arms of another man. He released me to my desires because, like my mother, he trusted my moral feeling. The effect was one of immediate liberation. A burden fell off my shoulders, and I went about doing what I wanted to do, which included being married to the particular man I loved.

My marriage thinking wasn’t all that different from my skiing thinking. I adopted an externalized, rigid, heartless view of both: skiing is supposed to be fun and marriage is an institution of constriction. I didn’t ask myself what I really wanted, because I was in the grip of a received idea, one I had to interrogate and feel for myself before I could discard or embrace it. Unlike those of Phineas and Elliot, my frontal lobes are intact. I know, however, that the mysteries of my personal neurology are, like everybody else’s, a synthetic combination of my innate genetic temperament and my life experience over time, a thought that takes me back to my mother, a person central to that story. When I told her that I was writing about the advice she gave me years ago, she said, “Well, you know, I couldn’t have said that to just anyone.” Unlike some hackneyed phrase lifted from the pages of a parenting guide, my mother’s sentence was addressed directly to me, and it was given with knowledge, empathy, and love. No doubt, that’s why her words have stayed with me. I felt them.

2007

SEARCH FOR A DEFINITION

AMBIGUITY: NOT QUITE ONE THING, not quite the other. Ambiguity resists category. It won’t fit into the pigeonhole, the neat box, the window frame, the encyclopedia. It is a formless object or a feeling that can’t be placed. And there is no diagram for ambiguity, no stable alphabet, no arithmetic. Ambiguity asks: Where is the border between this and that?

There is comfort in saying the word chair and pointing into the room where the chair sits on the floor. There is comfort in seeing the chair and saying the word chair softly to one’s self, as if that were the end of the matter, as if the world and the word had met. Naïve realism. In English, I can add a single letter to word and get world. I put a small l between the r and the d and close the chasm between the two, and the game gives me some satisfaction.

Ambiguity does not obey logic. The logician says, “To tolerate contradiction is to be indifferent to truth.” Those particular philosophers like playing games of true and false. It is either one thing or the other, never both. But ambiguity is inherently contradictory and insoluble, a bewildering truth of fogs and mists and the unrecognizable figure or phantom or memory or dream that can’t be contained or held in my hands or kept because it is always flying away, and I cannot tell what it is or if it is anything at all. I chase it with words even though it won’t be captured, and every once in a while I come close to it.

That feeling of nearness to the shapeless ghost, Ambiguity, is what I want most, what I want to put inside a book, what I want the reader to sense. And because it is at once a thing and a no-thing, the reader will have to find it, not only in what I have written, but also in what I have not written.

2009

MY STRANGE HEAD: Notes on Migraine

1. ARMS AT REST

I AM A MIGRAINEUR. I use the noun with care, because after a lifetime of headaches, I have come to think of migraines as a part of me, not as some force or plague that infects my body. Chronic headaches are my fate, and I have adopted a position of philosophical resignation. I am aware that such a view is resoundingly un-American. Our culture does not encourage anyone to accept adversity. On the contrary, we habitually declare war on the things that afflict us, whether it’s drugs, terrorism, or cancer. Our media fetishizes the heartwarming stories of those who, against all odds, never lose hope and fight their way to triumph over poverty, addiction, disease. The person who lies back and says, “This is my lot. So be it,” is a quitter, a passive, pessimistic, spineless loser who deserves only our contempt. And yet, the very moment I stopped thinking of my condition as “the enemy,” I made a turn and began to get better. I wasn’t cured, wasn’t forever well, but I was better. Metaphors matter.

Although I wasn’t diagnosed with migraine until I was twenty, I can’t remember a time when I didn’t suffer from headaches. A German neurologist, Klaus Podoll, who has studied migraine auras and artists, contacted me a few years ago after he read an interview I had given, in which I mentioned a hallucination that preceded one of my headaches. In an e-mail conversation, he questioned me carefully about my history and concluded that the annual bouts of what my mother and I thought were stomach flu were probably migraine attacks. I have come to agree with him. My “flu” was always accompanied by a severe headache and violent vomiting. It didn’t occur during the flu season, and the sickness always followed exactly the same course. Two days of pain and nausea that lightened on the third day. Throughout my childhood, the attacks came with ritual regularity. In high school, I didn’t have as many “flus,” but after I returned from an intensely exciting semester abroad, spent mostly in Thailand, during my third year of college, I fell ill with what I thought was yet another flu, a siege of excruciating head pain and retching that lasted six days. On the seventh day, the pain lifted somewhat, but it didn’t go away. It didn’t go away for a year. It was better, it was worse, but my head always hurt and I was always nauseated. I refused to give in to it. Like a dutiful automaton, I studied, wrote, received the desired As, and suffered alone until I went to my family doctor, sobbed in his arms, and was diagnosed with migraine.

My young adulthood was punctuated by the headaches with their auras and abdominal symptoms, nervous storms that came and went. And then, after I married the man I was deeply in love with when I was twenty-seven, I went to Paris on my honeymoon and fell sick again. It began with a seizure: my left arm suddenly shot up into the air, and I was thrown back against the wall in an art gallery I was visiting. The seizure was momentary. The headache that followed went on and on for month after month. This time I searched for a cure. I was determined to battle my symptoms. I visited neurologist after neurologist, took innumerable drugs: Cafergot, Inderal, Mellaril, Tofranil, Elavil, and others I’ve forgotten. Nothing helped. My last neurologist, known as the Headache Czar of New York City, hospitalized me and prescribed Thorazine, a powerful antipsychotic. After eight days of stuporous sedation and an ongoing headache, I checked myself out. Panicked and desperate, I began to think that I would never be well. As a last resort, the Czar sent incurables like me to a biofeedback man. Dr. E. hooked me up to a machine via electrodes and taught me how to relax. The technique was simple. The more tense I was, the louder and faster the machine beeped. As I relaxed, the sounds grew slower until they finally stopped. For eight months, I went for a weekly visit and practiced letting go. Every day I practiced at home without the machine. I learned how to warm my cold hands and feet, to increase my circulation, to dampen the pain. I learned to stop fighting.

Migraine remains a poorly understood illness. Although new techniques, such as neuroimaging, have helped isolate some of the neural circuits involved, brain pictures won’t provide a solution. The syndrome is too various, too complex, too mixed up with external stimuli and the personality of the sufferer — aspects of migraine that can’t be seen on fMRI or PET scans with their colored highlights. I have come to understand that my headaches are cyclical and that they play a part in my emotional economy. As a child, life with my peers in school was always hard for me, and my yearly purges no doubt served a purpose. For two days a year, I suffered a cathartic dissolution, during which I was able to stay home and be close to my mother. But times of great happiness can also send me over the edge — the adventure in Thailand and falling in love and getting married. Both were followed by a collapse into pain, as if joy had strained my body to its breaking point. The migraine then became self-perpetuating. I am convinced that a state of fear, anxiety, and a continual readiness to do combat with the monster headache pushed my central nervous system into a state of continual alarm, which could only be stopped by a deep rest. I continue to cycle. Periods of obsessive and highly productive writing and reading that give me immense pleasure are often followed by a neurological crash — a headache. My swings from high to low resemble the rhythms of manic depression, or bipolar disorder, except that I fall into migraine, not depression, and my manias are less extreme than those of people who suffer from the psychiatric illness. The truth is that separating neurological from psychiatric problems is often artificial, as is the old and stubborn distinction between psyche and soma. All human states, including anger, fear, sadness, and joy, are of the body. They have neurobiological correlates, as researchers in the field would say. What we often think of as purely psychological, how we regard an illness, for example, is important. Our thoughts, attitudes, even our metaphors create physiological changes in us, which in the case of headaches can mean the difference between misery and managing. Research has shown that psychotherapy can create therapeutic brain changes, an increase of activity in the prefrontal cortex. Yes, just talking and listening can make you better.

No one ever died of a migraine. It isn’t cancer, heart disease, or a stroke. With a life-threatening disease, your attitude — whether bellicose or Buddhist — cannot keep you alive. It may simply change how you die. But with my migraines that continue to arrive and no doubt always will, I have found that capitulation is preferable to struggle. When I feel one coming on, I go to bed, and now machineless, I do my relaxation exercises. My meditations aren’t magical, but they keep the worst pain and nausea at bay. I do not welcome my headaches, but neither do I see them as alien. They may even serve a necessary regulatory function, by forcing me to lie low, a kind of penance, if you will, for those other days of flying high.


2. “CURIOUSER AND CURIOUSER”

“‘Who in the world am I?’ Ah, that’s the great puzzle!” says Lewis Carroll’s Alice after experiencing a sudden, disorienting growth spurt. While she meditates on this philosophical conundrum, her body changes again. The girl shrinks. I have asked myself the same question many times, often in relation to the perceptual alterations, peculiar feelings, and exquisite sensitivities of the migraine state. Who in the world am I? Am “I” merely malfunctioning white and gray matter? In The Astonishing Hypothesis Francis Crick (famous for his discovery of the DNA double helix with James Watson) wrote, “You, your joys and your sorrows, your memories and ambitions, your sense of personal identity and free will, are, in fact, no more than the behavior of a vast assembly of nerve cells and their associated molecules.”1 Mind is matter, Crick argued. All of human life can be reduced to neurons.

There is a migraine aura phenomenon named after Charles Lutwidge Dodgson’s (Lewis Carroll’s) story of myriad transformations: Alice in Wonderland syndrome. The afflicted person feels that she or parts of her are ballooning or diminishing in size. The neurological terms for the peculiar sensations of growing and shrinking are macropsia and micropsia. Dodgson was a migraineur. He was also known to take laudanum. It seems more than possible that he had experienced at least some of the somatic oddities that he visited upon his young heroine. These experiences are not unique to migraine. They are also seen in people who have suffered neurological damage. In The Man with a Shattered World, A. R. Luria recorded the case of a patient, Zazetsky, who suffered a terrible head injury during the Second World War. “Sometimes,” Zazetsky wrote, “when I’m sitting down I suddenly feel as though my head is the size of a table — every bit as big — while my hands, feet, and torso become very small.”2 Body image is a complex, fragile phenomenon. The changes in the nervous system wrought by an oncoming headache, the lesions caused by a stroke or a bullet, can affect the brain’s internal corporeal map, and we metamorphose.

Is Alice in Wonderland a pathological product, the result of a single man’s “nerve cells and their associated molecules” run amok? The tendency to reduce artistic, religious, or philosophical achievements to bodily ailment was aptly named by William James in The Varieties of Religious Experience. “Medical materialism,” he wrote, “finishes up Saint Paul by calling his vision on the road to Damascus a discharging lesion of the occipital cortex, he being an epileptic. It snuffs out Saint Teresa as an hysteric, Saint Francis of Assisi as a hereditary degenerate.”3 And, I might add, Lewis Carroll as an addict or migraineur. We continue to live in a world of medical materialism. People pay thousands of dollars to get a peek at their genetic map, hoping to ward off disease early. They rush to embrace the latest, often contradictory, news on longevity. One study reports it’s good to be chubby. Another insists that when underfed, our close relatives the chimpanzees live longer, and we would do well to follow suit. Republicans and Democrats are subject to brain scans to see what neural networks are affected when they think about politics. The media announces that researchers have found the “God spot” in the brain. Before the genome was decoded and scientists discovered that human beings have only a few more genes than fruit flies, there were innumerable articles in the popular press speculating that a gene would be found for alcoholism, OCD, an affection for purple ties; in short, for everything.

It is human to clutch at simple answers and shunt aside ambiguous, shifting realities. The fact that genes are expressed through environment, that however vital they may be in determining vulnerability to an illness, they cannot predict it, except in rare cases, such as Huntington’s disease; that the brain is not a static but a plastic organ, which forms itself long after birth through our interactions with others; that any passionate feeling, whether it’s about politics or tuna fish, will appear on scans as activated emotional circuits in the brain; that scientific studies on weight and longevity tell us mostly about correlations, not causes; that the feelings evoked by the so-called “God spot” may be interpreted by the person having them as religious or as something entirely different — all this is forgotten or misunderstood.

The man who gave us Alice in Wonderland suffered from migraine. He was also a mathematician, a clergyman, a photographer, and a wit. He was self-conscious about a stammer and may have had sexual proclivities for young girls. It is impossible to know exactly what role migraine played in his creative work. My own experiences of the illness — scotomas, euphorias, odd feelings of being pulled upward, a Lilliputian hallucination — figure in the story of myself, a story that in the end can’t be divided into nature or nurture. Migraine runs in families, so I probably have a hereditary predisposition to headaches, but the way the illness developed, and its subsequent meaning for me, are dependent on countless factors, both internal and external, many of which I will never penetrate. Who in the world am I? is an unsolved question, but we do have some pieces to the puzzle.

As Freud argued over a century ago, most of what our brains do is unconscious, beneath or beyond our understanding. No one disputes this anymore. The human infant is born immature, and in the first six years of its life, the front part of its brain (the prefrontal cortex) develops enormously. It develops through experience and continues to do so, although less rapidly than before. Our early life, much of which never becomes part of our conscious memory because it’s lost to infantile amnesia (our brains cannot consolidate conscious memories until later), is nevertheless vital to who we become. A child who has good parental care — is stimulated, talked to, held, whose needs are answered — is materially affected by that contact, as is, conversely, the child who suffers shocks and deprivations. What happens to you is decisive in determining which neural networks are activated and kept. The synaptic circuits that aren’t used are “pruned”; they wither away. This explains why so-called wild children are unable to acquire anything but the most primitive form of language. It’s too late. It also demonstrates how nurture becomes nature and why making simple distinctions between them is absurd. A baby with a hypersensitive genetic makeup that predisposes him to anxiety can end up as a reasonably calm adult if he grows up in a soothing environment.

So Crick was technically right. What seem to be the ineffable riches of human mental life do depend on “an assembly of nerve cells.” And yet, Crick’s reductionism does not provide an adequate answer to Alice’s question. It’s rather like saying that Vermeer’s Girl Pouring Milk is a canvas with paint on it or that Alice herself is words on a page. These are facts, but they don’t explain my subjective experience of either of them or what the two girls mean to me. Science proceeds by testing and retesting its findings. It relies on many people’s work, not just a few. Its “objectivity” rests upon consensus, the shared presuppositions, principles, and methods from which it arrives at its “truths,” truths which are then modified or even revolutionized over time. It should be noted that even the late Francis Crick wasn’t able to leap out of his subjective mental apparatus and become a superhuman observer of BRAIN.

We are all prisoners of our mortal minds and bodies, vulnerable to various kinds of perceptual transfigurations. At the same time, as embodied beings we live in a world that we explore, absorb, and remember — partially, of course. We can only find the out there through the in here. And yet, what the philosopher Karl Popper called World 3, the knowledge we have inherited — the science, the philosophy, and the art — stored in our libraries and museums, the words, images, and music produced by people now dead, becomes part of us and may take on profound significance in our everyday lives. Our thinking, feeling minds are made not only by our genes but also through our language and culture. I have been fond of Lewis Carroll’s Alice since childhood. She may have started out as words on a page, but now she inhabits my inner life. (One could also say her story has been consolidated in my memory through important work done by my hippocampus.) It is possible that my headache episodes have made me particularly sympathetic to the girl’s adventures and her metaphysical riddle, but I am hardly alone in my affection. I dare say countless people have lifted her from World 3, a kind of Wonderland in itself, and taken her into their own internal landscapes, where she continues to grow and shrink and muse over who in the world she is.


3. LIFTING, LIGHTS, AND LITTLE PEOPLE

Not every migraine has a prologue or “aura,” and not every aura is followed by a headache. Nevertheless, these overtures to pain or isolated events are the most peculiar aspect of the illness and may offer insights into the nature of perception itself. As a child I had what I called “lifting feelings.” Every once in a while, I had a powerful internal sensation of being pulled upward, as if my head were rising, even though I knew my feet hadn’t left the ground. This lift was accompanied by what can only be called awe — a feeling of transcendence. I variously interpreted these elevations as divine (God was calling) or as an amazed connection to things in the world. Everything appeared strange and wondrous. The lights came later in my life — showers of stars that begin on one side, usually the right, sharp black points surrounded by shining light that cascade downward and then move toward the center of my vision, or brilliant lights surrounded by black rings or just tiny black spots swimming in air. I’ve had fogs and gray spots that make it hard to see what’s in front of me, weird holes in my vision, and a sensation that there’s a heavy cloud in my head. I’ve had feelings of euphoria that are indescribably wonderful and supernatural exhaustion — a weariness unlike any other I’ve experienced, a pull toward sleep that is irresistible. Sometimes I have fits of yawning that I can’t stop. Also, often just before I wake up with a migraine, I have an aphasia dream. I am trying to speak, but my lips won’t form the words and every utterance is terribly distorted. But my most remarkable premigraine event was hallucinatory. I was lying in bed reading a book by Italo Svevo, and for some reason looked down, and there they were: a small pink man and his pink ox, perhaps six or seven inches high. They were perfectly made creatures and, except for their color, they looked very real. They didn’t speak to me, but they walked around, and I watched them with fascination and a kind of amiable tenderness. They stayed for some minutes and then disappeared. I have often wished they would return, but they never have.

Lilliputian hallucinations before migraine are rare. There are other documented cases, however. Klaus Podoll has written about a woman who during her migraine attacks sees amusing little beetles with faces run across her floor and ceiling. Another reported case involved tiny Indians, and yet another, a dwarf. It wasn’t until after my duo had vanished that I understood I had seen a miniature version of two legendary, oversized characters from my childhood in Minnesota: Paul Bunyan and his blue ox, Babe. The giant man and his huge animal that I had read about in stories had shrunk dramatically and turned pink. It was then that I asked myself about the content of the hallucination. What did it mean that my aura took that form, rather than something else? Are these visions purely nonsensical? What memory traces are activated during these experiences? A man I met in the hospital, where I teach a writing class to psychiatric inpatients, told me that during a psychotic episode he had hallucinated little green men getting into a spaceship. This stereotypical vision of Martians appeared during his crisis, but unlike most of the migraineurs I’ve read about, he found his little aliens disturbing. Psychosis, alcoholism, dementia, epilepsy, and hallucinogens like LSD can all produce neurological disturbances that conjure tiny, life-size, or gigantic persons and animals, as can a disorder called Charles Bonnet syndrome, often but not always associated with deteriorating vision. In his book Phantoms in the Brain, V. S. Ramachandran reports that during a conversation he had with one of his patients, she told him that she saw little cartoon characters scooting up his arms. Why Paul Bunyan? Why Martians? Why cartoon characters? Oddly, all of these visions have a folkloric quality, more contemporary versions of the mythological little people around the world: leprechauns, brownies, fairies, gnomes, goblins, Nordic nisse and tomten, the Hawaiian Menehune, the Greek kalikantzari, the Cherokee yunwi. Where did all these wee folk come from? The content of hallucinations must surely be at once personal and cultural.

My dear little creatures were migrainous figments, aura products similar to other experiences of complex visual hallucinations, which although they may have various medical causes, bear a resemblance to one another and no doubt have some neurobiological connection. As Oliver Sacks points out in his book on migraine, we all hallucinate in our sleep. We generate dream images and stories that are often peculiar, violate the laws of physics, and are highly emotional. But why we dream remains a scientific mystery. Sigmund Freud proposed that dreams protect sleep. Mark Solms, a neurologist and sleep researcher, agrees: “Patients who lose the ability to dream due to brain damage suffer from sleep-maintenance insomnia — they have difficulty staying asleep.”4 We human beings may have a need to create stimulating imagery that keeps us busy while we’re in that parallel state and the waking world has vanished.

Another ordinary form of spontaneous mental imagery is the hypnagogic hallucination, which appears on the threshold between sleeping and waking. I had always believed that the brilliant mutating images I see as I drift off every night are universal, but I have since discovered that, although such images are common, not everyone falls asleep to visions. I am deeply attached to my presleep cinema of ghouls and monsters, shifting faces and bodies that grow and shrink, to my own nameless cartoon characters who flee over mountaintops or jump into lakes, to the brilliant colors that explode or bleed into gorgeous geometries, to the gyrating dancers and erotic performers who entertain me while I am still conscious but falling toward Morpheus. Except as a spectator, I play no role in this lunatic borderland. It is a world distinct from that of my dreams, in which I am always an actor, and therefore it is more closely allied to my Lilliputian experience. I watched those little creatures, but I felt no need to interact with them. They were simply there for my viewing pleasure.

It is comforting to think that visual perception is a matter of taking in what’s out there, that a clear line exists between “seeing things” and the everyday experience of looking. In fact, this is not how normal vision works. Our minds are not passive containers of external reality or experience. Evidence suggests that what we see is a combination of sensory information coming in from the outside, which has been dynamically translated or decoded in our brains through both our expectations of what it is we are looking at and our human ability to create coherent images. We don’t just digest the world; we make it. For example, we all have a blind spot in each eye at the place where the optic nerve enters the retina, but we don’t sense that hole, because our minds automatically fill it in. As V. S. Ramachandran and the philosopher Patricia Churchland have argued, “filling in” isn’t always the covering over of a blank with more of the same; there are instances when the brain provides pictures — a normal form of hallucination. Very simply, for the mind, absence can be a catalyst for presence. In his beautiful memoir, And There Was Light, Jacques Lusseyran describes his experience of the world after he went blind at age eight: “Light threw its color on things and on people. My father and mother, the people I met or ran into in the street, all had their characteristic color which I had never seen before I went blind. Yet now this special attribute impressed itself on me as part of them as definitely as any impression created by a face.”5 For Lusseyran, losing his vision became an avenue to almost mystical insight. He found himself lifted up into a world of color and light drenched with meaning.

A lot of research has been done on visual perception. Scientists have isolated cells in particular areas of the seeing parts of the brain that serve special functions — the recognition of verticality, color, and motion, for example — but mysteries remain. Philosophers, neuroscientists, and cognitive scientists argue madly over “the binding problem”—how an object can appear whole and unified to us when each of its features is channeled through disparate networks in the brain. Qualia — the subjective experiences of things — are just as controversial. I don’t see a consensus coming any time soon. Migraine auras of light, color, black holes and fogs, of high feeling and dread, and of peculiar little creatures that run or dance or just amble about, occupy a special place in the medical literature. They are anomalies, no doubt, tics of the nervous system that affect some, not all, but they could well help explain more general human qualities — who we are, what we feel, and how we see. I suspect that everyone has a few Lilliputians in hiding. It may be just a question of whether they pop out or not.

2008

PLAYING, WILD THOUGHTS, AND A NOVEL’S UNDERGROUND

PSYCHOANALYSIS PROPOSES THAT WE ARE strangers to ourselves. There were precursors to Freud’s idea of a psychic unconscious in both philosophy and science. Schopenhauer and Nietzsche each had a version of it, as did the scientists William Benjamin Carpenter in nineteenth-century England and Gustav Fechner and Hermann von Helmholtz in Germany. All of them believed that much of what we are is hidden from us, not only our automatic biological processes but also memories, thoughts, and ideas. Pierre Janet, Jean-Martin Charcot’s younger colleague at the Salpêtrière Hospital in Paris, pursued a psychobiological notion of the self. Ideas, he argued, can split off from consciousness, travel elsewhere, and appear as hysterical symptoms. Theories never bloom in nothingness. What is certain is that Sigmund Freud and his followers, both the faithful and the revisionist, have altered the way we think of ourselves. But the question here is about the novel. Has psychoanalysis changed the novel? Does putting a psychoanalyst in a novel affect its form, its sense of time, its essence?

The novel is a chameleon. That is its glory as a genre. It can be an enormous waddling monster or a fast, lean sprite. It can take everything in or leave most things out. It is Tolstoy and Beckett. There are no rules for writing novels. Those who believe there are rules are pedants and poseurs and do not deserve a minute of our time. Modes of writing and various schools come and go: Grub Street, Naturalism, the nouveau roman, magical realism. The novel remains. The modern novel was born a hybrid, to borrow the Russian theorist M. M. Bakhtin’s word for the genre’s mingling, contradictory voices that shout and murmur from every level and corner of society. When psychoanalysis appeared on the horizon, the novel welcomed it into itself as it welcomes all discourses.

When the “I” of the book is an analyst, does it fundamentally alter the way the novel works? Laurence Sterne’s Tristram Shandy (1759) has a structure far more radical and, I would say, more akin to the associative workings of the human mind and memory, than Simone de Beauvoir’s far more conventional book The Mandarins (1954), which has a narrating analyst, Anne. But to address this question, I cannot remain outside it, looking down at it from a third-person view. In life there is no omniscient narrator. Making a work of fiction is playing, playing in deadly earnest, perhaps, but playing nevertheless. D. W. Winnicott, the English psychoanalyst and pediatrician, argued that play is universal, part of every human being’s creativity and the source of a meaningful life. Making art is a form of play.

In fact, I have discovered that a novel can be written only in play: an open, relaxed, responsive, permissive state of being that allows a work to grow freely. The Sorrows of an American was generated by an unbidden mental image that came to me while I was daydreaming. In a room that looked very much like the tiny living room in my grandparents’ farmhouse, I saw a table. On the table was an open coffin, and in the coffin lay a girl. Then, as I watched, she sat up. My father was dying then, and despite the familiar setting — my father grew up in that house — and the undisguised wish to wake the dead that must have been at the heart of the fantasy, I did not interpret it. Not long afterwards, my father died. There are no miracles in the book, but the farmhouse is there, and a girl child who wakes up, and all through it the dead return to the living. Sections of the book came directly from a memoir my father had written at the end of his life for his family and friends. I now know I used those passages as a way to revive him, if only as a ghost.

And where did my storyteller come from, my forty-seven-year-old, divorced, lonely, grieving psychiatrist/psychoanalyst, Erik Davidsen? Some time in the early eighties, I saw a drawing by Willem de Kooning called “Self-Portrait with Imaginary Brother” at the Whitney Museum in New York. I love de Kooning’s work, but in this case it was the artist’s title that hit me. As one of four sisters, I knew this was the only kind of brother I could ever have. After I finished my Ph.D. in 1986, I considered training to earn my living as a psychoanalyst, but I was too poor for more schooling. Nevertheless, when I began writing the story, my imaginary brother-analyst-self was waiting for me. And I began to play.

The truth about unconscious processes is that the book can know more than the writer knows, a knowing that comes in part from the body, rising up from a preverbal, rhythmic, motor place in the self, what Maurice Merleau-Ponty called schéma corporel. When I cannot find words, a walk helps. My feet jog the sentence loose from that secret underground. Images lurk in that cellar, too, along with half-formed phrases, and whole sentences that belong to no one. Wilfred Bion, an English psychoanalyst, said, “If a thought without a thinker comes along, it may be what is a stray thought, or it could be a thought with the owner’s name and address upon it, or it could be a ‘wild thought.’”1 Sometimes when I’m writing, wild thoughts appear. They fly ahead of me. I have to run after them to understand what is happening.

I discovered the novel’s music as I went along, as well as its gaps and silences. There are always things that are unsaid — significant holes. I was aware that I was writing about memory. Freud’s notion of Nachträglichkeit haunted the book. We remember, and we tell ourselves a story, but the meanings of what we remember are reconfigured over time. Memory and imagination cannot be separated. Remembering is always also a form of imagining. And yet some memories remain outside sequence, story, and felt human time: the involuntary flashbacks of trauma. These timeless bits and pieces of images and sensory shocks subvert and interrupt narration. They resist plot. The real secrets of this particular novel are not revealed through the plot. Many of them never come to light at all.

Surely, what I have learned about psychoanalysis over the years has shaped my work, because it has altered my thoughts, both wild and tame. But so have philosophy, linguistics, neurobiology, paintings, poems, and other novels, not to speak of my lived experiences, both remembered and forgotten. As Winnicott knew, long before there was psychoanalysis, there was play.

2009

SLEEPING/NOT SLEEPING

1. FAILING TO FALL

THE NARRATOR OF CHAUCER’S POEM The Book of the Duchess cannot sleep. As his fitful thoughts come and go, he lies awake. He hasn’t slept for so long, he fears he may die of insomnia. But what is the reason for his sleeplessness? “Myselven can not telle why,”1 he says. The English expression “to fall asleep” is apt because the transition between waking and sleeping is a gradual drop from one state of being into another, a giving up of full self-consciousness for unconsciousness or for the altered consciousness of dreams. Except in cases of exhaustion or with the aid of drugs, the movement from one world to another is not instantaneous; it takes a little time. Full waking self-consciousness begins to loosen and unravel.

During this interval, I have often had the illusion that I am walking. I feel my foot slip off a curb and fall, but before I hit the pavement, I feel a jerk and am fully awake again. I also watch brilliant mutating spectacles on my closed eyelids, hypnagogic hallucinations, that usher me into sleep. Sometimes I hear voices speak a single word or a short emphatic sentence. In Speak, Memory, Vladimir Nabokov tells about his own visual and auditory semi-oneiric phenomena. “Just before falling asleep I often become aware of … a neutral, detached anonymous voice which I catch saying words of no importance to me whatever — an English or Russian sentence.” He, too, had visions, often “grotesque,” that preceded sleep. Although hypnagogic hallucinations are poorly studied except in relation to narcolepsy, many people without that affliction report seeing pictures or just colors and shadows when they linger at the threshold of sleep. What distinguishes these experiences from dreams proper is awareness, a kind of double reality. As Nabokov writes of his images, “They come and go without the drowsy observer’s participation, but are essentially different from dream pictures for he is still master of his senses.”2

When I have insomnia, I cannot drop into this peculiar zone between waking and sleeping — this half-dreaming, half-aware state of words and pictures does not arrive. As Jorge Luis Borges observes in his poem “Insomnia”: “In vain do I await/ the disintegration, the symbols that come before sleep.”3 My internal narrator, the one who is speaking in my head all day long, refuses to shut up. The day-voice of the self-conscious thinker races along heedless of my desire to stop it and relax. Chaucer’s narrator seems to have a similar problem: “Suche fantasies ben in myn hede, / So I not what is best to doo.”4 And so, like many insomniacs before and after him, he picks up a book and begins to read.

I was thirteen when I had my first bout of sleeplessness. My family was in Reykjavík, Iceland, for the summer, and day never really became night. I couldn’t sleep, and so I read, but the novels I was reading only stimulated me more, and I would find myself wandering around the house with rushing fragments of Dickens, Austen, or the Brontës whirring in my head. It is tempting to think of this form of insomnia, the inability to fall asleep, as a disease of agency and control, the inability to relinquish high self-reflexive consciousness for the vulnerable, ignorant regions of slumber when we know not what we do. In On the Generation of Animals, Aristotle regards sleep as a between-world: “… the transition from being to not-being to being is effected through the intermediate state, and sleep would appear to be by its nature a state of this sort, being as it were a borderland between living and not-living: a person who is asleep would appear to be neither completely non-existent nor completely existent…”5 Sleep as nearer to death than waking or, as Macduff calls it in Macbeth, “death’s counterfeit.”

In sleep we leave behind the sensory stimulation of the outside world. A part of the brain called the thalamus, involved in the regulation of sleeping and waking, plays a crucial role in shutting out somatosensory stimuli and allowing the cortex to enter sleep. One theory offered to explain hypnagogic hallucinations is that the thalamus deactivates before the cortex in human beings, so the still active cortex manufactures images, but this is just a hypothesis. What is clear is that going to sleep involves making a psychobiological transition. Anxiety, guilt, excitement, a racing bedtime imagination, fear of dying, and pain or illness can keep us from toppling into the oneiric underworld. Depression often involves sleep disturbances, especially waking up early in the morning and not being able to get back to sleep. Weirdly enough, keeping a depressed patient awake for a couple of nights in the hospital can alleviate his symptoms temporarily. They return as soon as he begins to sleep normally again. On the rare occasions when I have had both a migraine headache and a whole night of insomnia, I have found that the insomnia appears to cure the migraine. No one understands how either depression or migraine is related to, or overlaps with, the sleep cycle.

Chaucer’s insomniac reads Ovid’s Metamorphoses. It does not put him to sleep. He gets very interested in it and spends many lines reporting on his reading. I read in the afternoons now, never at night, because books enliven the internal narrator to one vivid thought after another. No doubt my obsessive reading kept me up that summer long ago, but the permanent daylight of Reykjavík in June must have played havoc with my circadian rhythms, my normal twenty-four-hour wake/sleep cycle, and, without darkness, my body never fell into the borderland that would carry me into slumber. When I look back on it, I think I was more anxious about not sleeping than about anything else I can name, and this is still often the case when I am seized with a fit of wakefulness. I am lucky it doesn’t happen often. It is bitter to hear the birds.


2. WHY SLEEP?

Waking and sleeping are the two sides of being. Aristotle put it this way: “It is necessary that every creature which wakes must also be capable of sleeping, since it is impossible that it should be always actualizing its powers.”6 This makes sense. We know that we have to sleep. We know sleeplessness makes us cranky, stupid, and sad. And yet, why we sleep, why we dream, and even why we are wakeful — conscious — remain mysteries. In The Meditations, René Descartes asked if he could be certain he was even awake. “How often, asleep at night, am I convinced that I am here in my dressing gown, sitting by the fire when in fact I am lying undressed in bed! … I see plainly that there are never any sure signs by means of which being awake can be distinguished from being asleep.”7 Dreamless sleep gave Descartes a further problem. It followed from his cogito ergo sum that the apparent thoughtlessness of deep imageless sleep would mean an end to human existence. He was forced to postulate that both waking and sleeping states are conscious, even those periods when we don’t dream at all. John Locke found this ridiculous and came to the opposite conclusion: dreamless sleep is not part of the self, because there is nothing to remember, and personal identity is made of memories. Most of us accept the fact that although we may believe our dreams to be real events when asleep, upon waking in the morning in the same place, we can tell the difference between nocturnal hallucination and reality. But what is sleep and why do we need it? Who are we when we sleep? What exactly does the insomniac crave?

Until the middle of the twentieth century, most researchers agreed that fatigue led to reduced brain activity in sleep, that sleep was, by and large, a dormant state of mind. But this was proved wrong. In REM (rapid eye movement) sleep, brain activity compares to that of full wakefulness. Indeed, sometimes neural firing is more intense than when we’re awake. The old answer was: that’s because we’re dreaming. But the hard and fast equation between REM sleep and dreaming has been overturned, although the debates go on about exactly what this means. There are dreams during the non-REM phase of sleep as well. What waking consciousness is and what it’s for is also a mystery, although there are many competing theories. Does sleeping help consolidate our memories? Some scientists say yes. Some say no. Do dreams mean anything? There are people involved in dream research who say yes, Freud was essentially right, or at least right about some aspects of dreaming; others who say no, dreams are mental refuse; and still others who say dreams have meanings, but not the ones Freud thought they did. What is the evolutionary purpose of consciousness, of sleep, of dreams? There is no agreement.

If you keep rats awake, they die within two to four weeks. Of course, in order to prevent the poor creatures from sleeping, the scientists make it impossible for them to drop off, and whether they actually die of sleep deprivation or stress isn’t clear. Fruit flies and cockroaches perish without sleep. The putative record for a human being intentionally staying awake belongs to Randy Gardner, a seventeen-year-old, who remained awake for eleven days in 1964 for a science fair. He survived just fine but was a cognitive wreck by the end of his ordeal. As a volunteer writing teacher at the Payne Whitney psychiatric clinic in New York, I had many bipolar patients in my classes who had been admitted to the hospital during bouts of mania. A number of them told me that they had stayed awake for days, flying high as they had sex, shopped, danced, and even wrote. One woman reported she had written thousands of pages during her most recent manic phase. A strange illness called Morvan’s syndrome can cause people to remain essentially sleepless for long periods of time. In 1974, Michel Jouvet, a French researcher, studied a young man with the disorder who remained awake for several months. He was entirely cogent and suffered no memory impairment or anxiety. He did, however, have visual, auditory, tactile, and olfactory hallucinations every night for a couple of hours. He dreamed awake. Depending on their location, brain lesions can make people sleepy or prevent them from sleeping. They can also cause exceedingly vivid dreams or the cessation of dreaming altogether. Then again, people with no brain injury can experience all of these symptoms as well.

These admittedly random examples of sleeping and sleeplessness are all suggestive, and each one could feasibly become part of a larger argument about why we sleep and dream. Various understandings of both sleeping and waking consciousness depend on how the lines are drawn between and among the various states. Ernest Hartmann of Tufts University School of Medicine proposes a model he calls a “focused-waking-to-dreaming-continuum,” which moves from highly self-conscious, logical, category-bound, sequential wakefulness, to daydreaming and reverie with their more fragmented, less logical, and more metaphorical thoughts, to dreaming that is highly visual and much less self-conscious. This makes a lot of sense to me. The insomniac remains on the focused-waking or less-focused daydreaming side of the continuum, unable for any number of reasons to let go. Hartmann shares with other researchers the conviction that dreaming is more emotional than waking life and that we make connections in dreams, often metaphorical ones, that are more creative than when we’re wide awake and working at some task. He does not believe dreams are random nonsense. Dreaming is another form of mental activity.

There is no place for dreamless sleep on Hartmann’s continuum, but that blankness might reside around the dreaming state. Gottfried Leibniz answered Descartes and Locke by arguing that not all thoughts are conscious. Some perceptions are too unfocused and confused to enter our self-reflective awareness. He argued for a continuum of perception from unconsciousness to full self-consciousness; therefore even deep, dreamless sleep is part of what and who we are. Leibniz died in 1716, but his insight remains startling. We still may not know why we sleep or wake up, but we know that both states are part of a dynamic, changing organism. Long after Descartes, Locke, and Leibniz, the French philosopher Maurice Merleau-Ponty wrote in The Phenomenology of Perception (1945): “The body’s role is to ensure metamorphosis.”8 Surely, that is exactly what we do when we move through the various stages of being wide awake and concentrated to the piecemeal musings of reverie, to sinking drowsiness, to sleep and dreaming, or to sleep with no dreams at all.


3. GOING UNDER

I remember a lamp that stood on the floor in the opened doorway to the bedroom where my sister and I slept. My mother put it there every night so the darkness would never be total. This is an old memory and around it are the usual fogs that dim recollection, but the light offered the hope that blackness would not snuff out the visible world entirely during my anxious transition to sleep. Bedtime rituals for children ease the way to the elsewhere of slumber — teeth brushing and pajamas, the voice of a parent reading, the feel and smell of the old blanket or toy, the nightlight glowing in a corner. For the child, bedtime means double separation, not only from wakefulness but also from Mother and Father. I wonder how many glasses of water have been fetched, how many extra stories have been read and lullabies sung, how many small backs and arms and heads have been rubbed in the past week alone in New York City.

In the “Metapsychological Supplement to the Theory of Dreams,” Sigmund Freud wrote,

We are not in the habit of devoting much thought to the fact that every night human beings lay aside the wrappings in which they have enveloped their skin as well as anything which they may use as a supplement to their bodily organs … for instance their spectacles, their false hair and teeth and so on. We may add that they carry on an entirely analogous undressing of their minds and lay aside most of their psychical acquisitions. Thus on both counts they approach remarkably close to the situation in which they began life.9

Children are even closer to that beginning than we adults are. Night looms larger because time moves more slowly — a child’s day represents a much larger percentage of life lived than it does for the middle-aged parent. The mental capacities of little children do not include the rationalizations grown-ups use to explain to themselves that a fear is unjustified. The three-year-old does not yet live in a world of Newtonian physics. Not long ago, I saw a film of a psychology experiment in which young children worked hard to get their oversized bodies into toy cars.

Sleep resistance, bouts of insomnia, nightmares and night terrors, crawling into bed with parents in the middle of the night are so common among children it seems fair to call them “normal.” Infants, of course, are notorious for refusing to sleep and wake on command. The exasperated parent can now call a counselor who, for a fee, will come to your house and address your baby’s “sleep issues.” As far as I can tell, these interventions are directed more at exhausted parents than at the welfare of children. They consist of behaviorist techniques that “teach” the offspring to give up hope for comfort at times inconvenient for her progenitors. The message here is an early-life version of self-help. The truth is that a baby develops through the reflective exchanges he has with his mother (or what is now called the “primary caregiver”). Essential brain development that regulates emotion takes place after birth, and it happens through the back and forth of maternal-infant relations — looking, touching, comforting. But there is also an intrinsic alarm system in the brain that the neuroscientist Jaak Panksepp calls the PANIC system. All mammals exhibit “distress vocalizations” when they are separated from their caretakers. They cry when they’re left alone. As Panksepp writes, “When these circuits are aroused [PANIC system] animals seek reunion with individuals who help create the feeling of a ‘secure neurochemical base’ in the brain.”10 Harry Harlow’s famous experiments with rhesus monkeys demonstrated that isolated baby monkeys preferred inanimate “terry cloth mothers” to hard wire ones that provided them with food. The animals raised in isolation became anxious, timid, maladjusted adults.

I am not saying that “sleep training” creates psychiatric problems. No doubt many sleep-trained children grow up just fine, but I am saying that sleep training is counterintuitive. When your baby cries, you want to go to her, pick her up, and rock her back to sleep. If anything has become clear to me, it is how quickly advice about raising children changes. In the early twentieth century when the dictates of behaviorism reigned supreme, experts on child care advocated strict feeding and sleeping regimens and discouraged parents from playing with their children.

I couldn’t bear to let my baby cry in the night, so I didn’t. For years I read to my daughter while she drifted off to sleep, her fingers in my hair. As she grew older, I continued to read to her and, after I had said good night, she would lean over and switch on a tape of Stockard Channing reading one of the Ramona books by Beverly Cleary. The tape had become a transitional object—a bridge between me and sleep. D. W. Winnicott coined this term for the things children cling to — bits of blanket or stuffed animals or their own fingers or thumb — that occupy a space between the subjective inner world and the outside world. These objects are especially necessary at bedtime when, as Winnicott writes, “From waking to sleeping, the child jumps from a perceived world to a self-created world. In between there is need for all kinds of transitional phenomena — neutral territory.”11 I vividly remember my sister Asti’s ragged blanket she called her “nemene.” One of my nieces used three pacifiers— one to suck and two to twirl. How she loved her “fires.”

There is no reason we should expect young children to enter the nocturnal darkness of sleep and dreams without help. Parental rituals and transitional objects serve as vehicles for making the passage and, indeed, contribute to a child’s ability eventually to comfort himself. Freud was surely right about the strangeness of preparing for bed and about the fact that the human mind is undressed in sleep. The so-called executive part of the brain — the bilateral prefrontal cortex — is largely quiet, which probably accounts for the disinhibition and high emotion of many dreams. It is not always easy to go to the region that lies beneath wakefulness, to relinquish the day and its vivid sensory reality. And for a small child the most vital part of that reality is Mother and Father, the beloveds she must leave behind as she drops into the very private land of sleep.

2010

OUTSIDE THE MIRROR

IT IS A PECULIAR TRUTH that I see far less of myself than other people do. I can see my fingers typing when I look down at them. I can examine my shoes, the details of a shirt cuff, or admire a pair of new tights on my legs while I am sitting down, but the mirror is the only place where I am whole to myself. Only then do I see my body as others see it. But does my mirror-self really represent my persona in the world? Is that woman who gives herself the once-over, who checks for parsley in incisors to avoid a green smile, who leans close to study new wrinkles or the red blotches that sometimes appear on her rapidly aging countenance a reasonable approximation of what others see? I do not witness myself as I talk and gesture emphatically to make absolutely sure my point has been made. I do not see myself as I stride down the street, dance, or stumble, nor do I know what I look like when I laugh, grimace, cry, or sneer. This is no doubt a blessing. Were I to see myself in medias res, my critical faculties might never shut down, and I would barely be able to lift a finger without crippling self-consciousness.

Instead of actually seeing ourselves, we walk around with an idea about ourselves. We have a body image or a body identity. This is the conscious notion of what we look like. I’m pretty or ugly, fat or thin, feminine or masculine, old or young. Everyone knows that we can be wrong about our body image. We have all met thin people who believe they are fat and old people who think they have the bodies of thirty-year-olds and dress accordingly. I confess I am sometimes surprised when I regard my own face in photographs. “Good heavens!” I say to myself. “Is that what you look like now? Are you really so old?” At other times, I find myself aging admirably. “You’re not so bad for fifty-six. You’re hanging in there.” But then photographs, those documents of an instant, don’t capture a person in motion. They are static, and we are not. Nevertheless, I think my body image sometimes lags behind my real body.

If body image is what we think we look like, style is meant to express who we think we are, and since we spend most of our lives dressed, not naked, clothes can efficiently announce something about a person’s character. Whether sober and sleek, humorous, sweet, modest, loud, or dangerous — they serve as an indication of personality. When I put on my clothes, I hope that the dresses and trousers and blouses and coats and shoes and boots and scarves and purses and all the rest of the sartorial paraphernalia I select will speak for me, will suggest to the world an idea I have about myself. It is interesting to ask how these ideas come about. I have learned after almost thirty years of marriage that my husband regards any shirt with a shiny fabric (even the barest sheen) as anathema to his true character. My sister Liv wears a lot of jewelry and she looks wonderful in it. I have sometimes tried to imitate her, but inevitably take it all off and leave on what I always wear — earrings. A lot of jewelry just isn’t “me.” But what is this me-ness about? Where does it come from?

If every person has an idea about garments that are him or her, most of us also have an ideally dressed self. When it comes to wearing clothes, idea and ideal intersect; the real and the imaginary inevitably come together. In my life, I have mostly found my ideal garments in the movies. I have a great weakness for those gleaming images of manufactured glamour and sophistication filmed on glorious Hollywood sets with monumental white staircases, billowing draperies, and sparkling chandeliers. How I have loved sitting in the dark and watching a world in which every suitcase is weightless and even poor shopgirls are as astutely attired as a chic Frenchwoman on the Champs-Élysées.

I believe it all started with a Walt Disney film, Pollyanna, starring Hayley Mills. Based on a revoltingly saccharine best-selling novel published in 1913, this movie captivated me entirely. I was only five when it was first released in 1960, so I think I must have seen it some time later, but not much later. In all events, I identified myself completely with Hayley Mills in her white sailor suit with navy blue trim. (I had no idea then that my mother had spent a good part of her childhood in southern Norway dressed in the very same fashion, de rigueur for middle-class children in the 1920s.) In my young mind, the sailor dress must have been emblematic of the story: a relentlessly cheerful girl sweetens up one sourpuss after another until she has won over an entire town. I pined for a dress like that. I must have believed in it as a vehicle of transformation: in that dress, I too might become like the heroine, adored, simply adored, by every single resident of my own small town.

My fixation on marine garb ended, but my cinematic identifications did not. When I first saw Marlene Dietrich slouching down a stairway in a tuxedo in the 1930 film Morocco, I thought I would never wear a dress again, only men’s suits and smoking jackets. The sight of Lana Turner in a white turban in The Postman Always Rings Twice made me consider that form of headwear. Although turbans never worked, I do own a tuxedo. I am well aware that I look nothing like Dietrich in it, but I credit her for the inspiration. Perhaps my favorite films are Hollywood comedies of the thirties and forties. In those movies, the heroes and heroines are not only capable of finding their way to the end of an English sentence, they know how to banter. They know how to deliver a barb, fire off a witticism, and send a wry offhanded compliment. Their crackling dialogues are inseparable from their characters, characters that are also expressed, at least in part, by their clothes.

In the movies, I like to watch clothes in action — the flow of a dress as an actress moves across the floor, dances, or, better yet, runs. Near the end of It Happened One Night, Claudette Colbert, no longer a spoiled heiress, but leavened by her adventures on the road with that man of the people, Clark Gable, stands before the minister who is going to marry her to a frivolous playboy, the perfect movie sap, and, when asked if she will take this man to be her lawfully wedded husband, she vigorously shakes her head no-she-won’t, hitches up her immense train, and runs, a mile-long veil streaming behind her on the grass. It’s a great shot, one that has the punch of a vividly remembered image from a dream.

Like countless others before and after me, I fell for Katharine Hepburn. I fell for her style. It is hardly news that she was a woman who did not bow to conventional standards, and the way she dressed was a sign of rebellion. There was a masculine quality to whatever she wore, even when she was draped in an evening gown. I remember her in a pair of wide trousers, striding down a golf course with Spencer Tracy in Pat and Mike. And I remember her in Holiday, wearing a gloriously simple black dress with a high neck. No froufrou or silliness for her, no peekaboo blouses or big bows or ridiculous shoes.

I was nineteen when I first saw Holiday, the 1938 romantic comedy directed by George Cukor and based on the play by Philip Barry. As a freshman in college, I had my second moment of profound identification with a celluloid being: Hepburn’s character Linda. What did it matter that she was the offspring of a fantastically wealthy man of business and I the daughter of a not very well paid professor? What difference did it make that she inhabited a mansion on Park Avenue with an elevator, and I had grown up in a modest house with scenic views of corn and alfalfa fields? Wasn’t she misunderstood just as I was? Didn’t she wish desperately to escape all that luxury and superficial nonsense? And even if I had no luxury to flee from, didn’t I, too, fantasize about another life? This kind of thinking, of course, is an essential aspect of what is referred to as “the magic of the movies.”

As I watched entranced, I did not see myself sitting in my seat in my old jeans and sweater, both of which had undoubtedly been purchased on sale. I was not in Minnesota anymore. Some ideal self had been embodied onscreen, a character with whom I shared nothing except an emotional reality: a feeling of being trapped and unhappy. I participated in the fable unfolding before me, and as I participated, I imagined myself in those clothes, Linda’s clothes, not the ones worn by her snooty, shallow sister, whose expensive wardrobe seemed so fussy in comparison. No, I was in that black evening dress, and I was wearing it just as Linda did, wearing it as if it made no difference to me that it was supremely elegant, because I had other more pressing, more important things on my mind. I was falling in love with another free spirit, played by Cary Grant.

Few people are immune to such enchantments, and they long predate the movies. We enter characters in novels, too, and imagine ourselves into their stories and into whatever habiliments they may have on during their adventures, and it is possible for us because we do not have to look at ourselves while we are doing it. When we are invisible to ourselves, every transformation is possible. Movies give visual form to our myriad waking dreams. The marvelous people on the screen take the place of the mirror for a while, and we see ourselves in them. Mirroring is a physiological and a social phenomenon. We are born with the ability to imitate the expressions of others, but we also become creatures of our culture with its countless images of what is chic and beautiful. When we choose what to wear we don’t just choose particular pieces of clothing, we select them because they carry meanings about us, meanings we hope will be understood by other people.

These days, I often find myself buying clothes that look suspiciously like ones I already own. This may sound a little dull and perhaps it is. My body image has changed; I am not the girl of nineteen who sat in the movie theater and watched Holiday anymore. I did leave Minnesota only a few years after seeing that film, and moved to New York City. I broke away from my small town and the constraints of provincial life. It is fair to say that certain movie stars continue to haunt my wardrobe. Katharine Hepburn has been an ideal of tailored beauty whispering in my ear ever since I saw her all those years ago on the screen in that wonderful dress. I eschew frippery and excessive adornment of any kind. I like clothes with a masculine feeling that don’t make me look like a man. I like shoes that I can move and dance and even run in if necessary. Towering heels, platforms, complex straps that resemble fetters are not “me.” I like clothes that preserve and enhance my dignity, but are not so sober and serious that they make me look humorless. This is what I wish to convey when I get dressed. Whether I succeed or not in this endeavor, I honestly don’t know. I don’t see myself often enough. Before I leave the house for an evening out, I check myself in the mirror for just a moment and then I go off, happily ignorant of what I look like when I am living my life.

2011

SOME MUSINGS ON THE WORD SCANDINAVIA

AS A CHILD, I’M NOT sure I even knew what Scandinavia meant. It was shrouded in a cultural mist that somehow wafted over me and mine, but why or how wasn’t at all clear to me. I was already a person divided. A girl living in America with a Norwegian mother and a Norwegian-American father, I spoke Norwegian before I spoke English, but rural Minnesota was my everyday world; Norway was another world. As a four-year-old in 1959, I had spent five months with my mother and sister in Bergen. Until I returned for a year with my family in 1967, Norway lived inside me as a jumble of inchoate fragments — isolated memories (my hands in a gooseberry bush, an orange lying in the snow, the tears of my older cousin at the dinner table), household objects (chests and china, photographs and paintings on the wall), food (especially rice pudding, bløtkake, or cream cake, and little chocolates called Twist), my parents’ stories, and a few significant words. During the seven years between childhood visits, I mostly forgot Norwegian.

The advantage of not living in a place is that it becomes pure idea. Under the sway of a homesick mother and a father whose identification with the immigrant community in which he was raised led him to become a professor of Norwegian language and literature, I succumbed to an illusion of an ideal elsewhere, a magic kingdom of trolls and nisse and fiskeboller, of Ibsen, Hamsun, and Munch, a fantastic over there, where the children were happier and healthier, floors were cleaner, and the people kinder and more just. My parents weren’t uncritical of Norway, but they were both prone to a form of nationalism that flourishes in tiny cultures that have been shaped by the humiliations of external control. In the case of Norway, that meant Denmark, and after that, it meant Sweden. At some point, I discovered that together these three comprised Scandinavia. Despite the maniacal flag-waving that took place every May 17 to trumpet our independence from Sweden, it turned out that we were somehow in the same family with them, and not only them, but the Danes, too. It took a while to grapple with this concept, but eventually it penetrated my young mind, and I came to accept it.

The binding principle of Scandinavia is not geography, but language. If you have one of the three languages, the other two can be easily managed with a little work — at least on the page. Danish, so comprehensible to me in print, can quickly become a series of indistinguishable noises in the mouth and throat of an actual speaker, and Swedish, though easier to understand, can also trail off into pleasant music when I’m not paying close attention. And yet so many words belong to all three languages, and more than anything else, it is language that shapes perception of the world, that draws the lines and creates the boundaries that make what is out there legible. It is a legacy of my childhood that I am a Norwegian-American who doesn’t feel quite American but who doesn’t feel quite Norwegian either. If I didn’t speak Norwegian, I would most certainly feel alienated from that country’s culture in a far more fundamental way. It is the language that lures me into feeling the connection to a past that extends backward to a time long before I was born. The vocabulary and cadences of Norwegian continue to live inside me, and moreover, they haunt my English. My prose is decidedly Protestant, and despite the fact that Scandinavia is no longer exclusively Protestant, its mores and culture were profoundly influenced by that iconoclastic, stark, and lonely version of Christianity.

Unlike English, the Scandinavian languages are word poor. With William the Conqueror in 1066 and the infusion of Latinate French into Anglo-Saxon, what we now know as English evolved. And yet, it’s exactly their poverty of vocabulary that gives writers possibilities in the Scandinavian languages that English writers don’t have. A word like lys in Norwegian — which means both light and candle — allows repetitions, ambiguities, and depths that aren’t possible in English. Lys is a word heavy with the knowledge of darkness, of summer and winter, of precious long days of light opposed to long days of murk and clouds. In Bergen, where I went to gymnasium for a year, it rained so much that when the sun was shining, the authorities canceled school. Even after she had been living in Minnesota for years, my mother would turn her face to the sun and close her eyes as if the warm rays might disappear any moment. Perhaps the darkness lies behind the omnipresent candles in Scandinavian households, too, lit even during the day and shining in rooms at night. The northern experience of darkness and light is untranslatable. The contrast between them has to be lived in the body. I have often wondered what immigrants to Scandinavia must feel when they arrive from places where summer and winter aren’t so radically defined by light and dark, how strange it must be to shop in afternoon gloom or see the sun late at night in summer. I have wondered how it changes the rhythms of their lives and the meaning of the words light and dark in their native languages. My paternal grandparents and my father, none of them born in Norway, all spoke English with Norwegian accents. Their Norwegian, however, was unlike the language spoken on the other side of the Atlantic. Their speech was dense with nineteenth-century locutions and sprinkled with hybrids — nouns borrowed from English and assigned a gender — words for things that had no Norwegian equivalent.

This is a certainty: like the rest of Europe, Scandinavia is no longer homogeneous. The stereotype of the giant, pale-skinned, Lutheran blonde (the only stock character I embody perfectly) has become an anachronism and is being replaced by a variety of body types, complexions, and religions. I, for one, celebrate a changing image of Scandinavia, because migrations of people from over there always enliven the culture of here. Movements of people create new words and new ideas and inspire new art. I am the product of an immigrant culture in the Midwest, and I now live in New York City, where forty percent of my fellow inhabitants were born in another country. On one of my last trips to Oslo, I climbed into a taxi, gave an address, and began a conversation with the driver. His father had been born in Pakistan, but he was born in Norway. He needn’t have told me; his Oslo dialect was unmistakable.

While immigrants always revivify the country they enter, their presence also creates conflict. In the United States, where all of us, with the exception of Native Americans, came from somewhere else, superiority was measured in generations. The longer your family had lived in America, the better. In 1972–73, when I lived in Bergen, there were no immigrants in town, and yet with a regularity that never failed to surprise me, and despite the fact that I was a passionate supporter of civil rights, I was attacked for my country’s racism. After the wave of Pakistani immigration to Norway, I returned to find people casually using racial slurs and prey to denigrating stereotypes. In short, they said things that would have been anathema in the United States. Like their cohorts all over the world, right-wing politicians in Scandinavia are guilty of thinking in terms of us and them, of exploiting ignorance and fear to maintain a fiction of “the nation,” not as a shared geography or language, but nation as blood or background. This is always dangerous, and it inevitably stinks of the ugliest of ideas: racial purity. In the United States, one of the oddest legacies of our racist culture is that people who have very pale skin but some African ancestry, like Lena Horne or Colin Powell, are inevitably called “black” rather than “white.” The Nazi racial laws created the most ludicrous hairsplitting over percentages of Jewishness in the population, a frankly absurd notion in a country where Jews had lived for hundreds of years and had been marrying Gentiles for just as long. In the tiny world of immigrants and their children and grandchildren in which my father spent his boyhood, the Swedes and Danes in neighboring communities were not regarded as linguistic cousins with important historical and cultural links to Norwegians. They were foreigners. And yet, there is no story without change. That world my father knew as a child is gone forever. In college, I knew a person with a background he described as “part Swedish, part German, and part Sioux Indian.” My daughter refers to herself as “half Norwegian” and “half Jewish.” She likes to call herself a “Jewegian.” Scandinavia is a word whose meanings are in flux. Its myriad references and significations are being determined, and we can only hope that it will stand as a sign of inclusion, not exclusion.

2005

MY INGER CHRISTENSEN

INGER CHRISTENSEN IS DEAD. A great writer has died. I know that great is a word we often use to decorate a venerable cultural figure and then put him or her on a high shelf with the other moldering greats, but this is not my intention. Great books are the ones that are urgent, life changing, the ones that crack open the reader’s skull and heart. I was in my early twenties when I first read Det, and I felt I had been sent a revelation. This work was like no other I had ever read — its rhythms and repetitions were of my own body, my heartbeat, my breath, the motion of my legs and the swing of my arms as I walk. As I read it, I moved with its music. But inseparable from that corporeal music, embedded in the cadences themselves, was a mind as rigorous, as tough, as steely as any philosopher’s. Christensen did not compromise. Paradox upon paradox accumulated in a game of embodied thought. Logic, systems, numbers came alive and danced for me, but they did so hand in hand with ordinary things, which her voice enchanted and made strange. She made me see differently. She made me feel anew the power of incantation. I read more of her work then. I love especially her poems.

I met her twice, first at a festival in New York City. I rushed up to her, shook her hand, and babbled some words in an effort to articulate my intense admiration. She was kind. The second occasion was in Copenhagen at a dinner where I sat beside my idol, who was charming, funny, and told me she wouldn’t return to New York because nobody let you smoke there. The merry, unpretentious woman at the table and the great poet were one, and yet there is always some split at such moments between the person in the room and the person on the page. I didn’t know the woman, but the poet altered my inner world. She whispers to me in my own writing, a brilliant, fierce literary mother whom I will read and reread again and again. The last words belong to Christensen: the music of life and death. They are the last three lines of Det.

En eller anden er død og bæres ud af sit hus ved mørkets frembrud.

En eller anden er død og betragtes af nogen der omsider er blinde.

En eller anden står stille og er omsider alene med den anden døde.

Someone or other is dead and is carried out of the house as night falls.

Someone or other is dead and is seen by someone who is blind at last.

Someone or other stands still and is alone at last with the other dead person.

(my translation)

2009

MY FATHER/MYSELF

THERE IS A DISTANCE TO fatherhood that isn’t part of motherhood. In our earliest days, fathers are necessarily a step away. We don’t have an intrauterine life with our fathers, aren’t expelled from their bodies in birth, don’t nurse at their breasts. Even though our infancies are forgotten, the stamp of those days remains in us, the first exchanges between mother and baby, the back-and-forth, the rocking, the soothing, the holding and looking. Fathers, on the other hand, enter the stage from elsewhere. As the psychoanalyst Jessica Benjamin points out, fathers are exciting, and their play is usually different from that of mothers and more arousing to the infant. Fathers are often the ones who introduce “jiggling, bouncing, whooping.”1 I vividly recall my own baby’s joyous face as she straddled her father’s jumping knee. He regularly turned her into “Sophie Cowgirl,” and the two took wild rides together as my husband provided the shoot-’em-up sound effects. I cannot remember bouncing on my father’s knee, but I can recall the noise of the door opening, his footsteps in the hall, and the intense happiness that accompanied his homecoming. Every day for years, my three sisters and I greeted our father as if he were a returning hero, running to the door, shrieking, “Daddy’s home!” We were only daughters in my family. The boy never arrived, and I have often thought that in the end, his absence served us all, including my father, whose relationship with a son would have been colored by an intense identification he didn’t have with his daughters. I think that was oddly liberating. My sisters and I were born into a culture that didn’t expect great ambition from girls. The irony is that because we didn’t have to share our father with a brother, our interests were able to bloom. A boy would inevitably have felt more pressure from both his parents to become someone, but I feel sure we would have envied that pressure, nevertheless.

God the father, land of our fathers, forefathers, Founding Fathers all refer to an origin or source, to what generated us, to an authority. We fall into the paternal line. Patronymic as identity. I have my father’s name, not my mother’s. I didn’t take my husband’s name when I married, but the symbolic mark of paternity is inscribed into the signs for me: Siri Hustvedt. We were called “the Hustvedt girls” or “the Hustvedt sisters” when we were growing up, four apples from the same tree. The father’s name is the stamp of genealogy, legitimacy, and coherence. Although we know when a woman gives birth that she is the child’s mother, the father’s identity can’t be seen. It’s hidden from us in the mysteries of the bedroom where potentially clandestine unions with other men might have taken place. In Judaism, this difficulty is circumvented by establishing Jewish identity through the mother, the known origin, not through the far less certain one: the father. Doubt or confusion about paternal identity and the scourge of illegitimacy have been the stuff of literature in the West since the Greeks. In the Oedipus story, the hero commits patricide and incest accidentally, but once the crimes are known, the world’s foundations shake. The virgin birth in Christianity is the ultimate evocation of paternal mystery, for here the progenitor is God himself, the Holy Spirit, who by means beyond human understanding has impregnated a mortal woman. Edmund in King Lear bemoans his fate as illegitimate son, “Why brand they us with base? With baseness? bastardy?” But Edmund’s treachery is part and parcel of his position as an outsider, a child born from the right father but the wrong mother — a crooked line. Charles Dickens populated his books with illegitimate children and articulated and rearticulated the drama of fatherlessness as a nullity to the self and to others. The illegitimate Arthur Clennam in Little Dorrit is repeatedly referred to as “Nobody.” In another pointed passage in the same novel, when asked about the mysterious Miss Wade, Mr. Pancks answers, “I know as much about her as she knows about herself. She is somebody’s child — anybody’s — nobody’s.”2 Without an identifiable past, the route to self-knowledge has been closed. The father’s power to name fixes and defines us in a relation that allows us to become somebody, not nobody. This is the source of Jacques Lacan’s famous pun on the dual symbolic role of the father. He names and he sanctions: le nom du père and le non du père.

When I was a child, Father Knows Best was on television. This benign series evoked an orderly family, which is to say everyone in it knew his or her place in the hierarchy. Every week, the structure was rattled by a minor storm, which then passed over. I am sure that the mythical dads of that postwar era mingled with my internal fantasies about my own father. They weren’t despots, but they were in charge, and they had the last word, the ideal fathers of a period that was invested in reestablishing a familial order that had been dismantled during the war when the fathers of many American children were overseas. My father wasn’t a disciplinarian, but he had an unchallenged, unspoken authority. Even a hint of anger or irritation from him was enough to mortify me. Those occasions were rare, but the power of paternal sanction ran deep. I wanted so much to please him. I wanted to be good.

It has been said, and it is true—

And this is real pain,

Moreover. It is terrible to see the children,

The righteous little girls;

They expect to be so good …3

That is how George Oppen ends his poem “Street,” and for me the last two lines have always had the force of a blow. I was a righteous little girl.

They are so delicate, these attachments of ours, these first great passions for our parents, and I have often wondered what would have become of me had my father used his power differently. In the hospital where I teach a weekly writing class to psychiatric patients, I have listened to many stories about fathers — violent fathers, runaway fathers, seductive fathers, negligent fathers, cruel fathers, fathers who are in prison or dead of drink or drugs or suicide. Shameful fathers. These are the paternal characters that fuel the stark narratives of “abuse” that people in our culture gulp down so eagerly. It is simple then to create cause and effect, to eliminate all ambiguity, to ignore the particulars of each case, to march down the road of moral outrage. There are brutal stories. We have all heard them, but there are also subtler forms of paternal power that create misshapen lives. I think of a man like the father in Henry James’s Washington Square, Dr. Sloper. He intervenes when he understands that the young man courting his daughter, Catherine, is a fortune hunter. His assessment is by no means wrong, and his desire to protect his child is eminently reasonable, but beneath his acumen lurks not only coldness to his offspring but an ironic distance that borders on sadism. In a remarkable exchange between Sloper and his sister, they discuss Catherine’s decision not to marry her beau immediately but to wait in the hope that her intractable father will change his mind.

“I don’t see that it should be such a joke that your daughter adores you.”

“It is the point where the adoration stops that I find interesting to fix.”

“It stops where the other sentiment begins.”

“Not at all — that would be simple enough. The two things are extremely mixed up, and the mixture is extremely odd. It will produce some third element, and that’s what I’m waiting to see. I wait with suspense — with positive excitement; and that is a sort of emotion that I didn’t suppose Catherine would ever provide for me. I am really very much obliged to her.”4

Sloper’s comment that Catherine’s emotions for him and her lover are “extremely mixed up” is irrefutable as an insight, and it carries far beyond the boundaries of James’s novel. Our deepest adult attachments are all colored by our first loves. They are extremely mixed up. But it is Catherine’s love for and fear of her father that give him power. Her desire to please him holds her captive to his will. In James’s story, however, there is a further irony, the third element, which is that the struggle over the bounder, Morris Townsend, uncovers what might have remained hidden: the father’s contempt for his daughter. The revelation gives Catherine an iron will, and when Sloper insists she promise that after his death she will not marry Townsend, she refuses, not because she has any intention of marrying him, but because it is her only avenue of resistance.

My father was gentle, kind, often interested in what we did and proud of our accomplishments. The man basked in his young daughters’ love. I have understood this only in hindsight. During my childhood, I wasn’t able to put myself in his position, to imagine what that adulation must have felt like. He was a magical being then, enhanced, I think, by excitement, by the glamour of his otherness. He seemed to know the answer to every question. He was tall and strong, a carpenter, woodchopper, and builder of fires, friend to all mammals and insects, a storyteller, a smoke-ring-blower, and of course, a man who went to work, where he taught college students and engaged in various other cerebral activities, the nature of which was a little dim to me. It is ordinary for children to idealize their fathers. It is also ordinary for children to grow up and recognize that same father’s humanity, including his weaknesses and blind spots. As Winnicott said, it is good for children to have “the experience of being let down gently by a real father.”5 The transition from ideal to real isn’t always so easy, however, not for the children or for the father.

Identities, identifications, and desires cannot be untangled from one another. We become ourselves through others, and the self is a porous thing, not a sealed container. If it begins as a genetic map, it is one that is expressed over time and only in relation to the world. Americans cling desperately to their myths of self-creation, to rugged individualism, now more free-market than pioneer, and to self-help, that strange twist on do it yourself, which turns a human being into an object that can be repaired with a toolbox and some instructions. We do not author ourselves, which is not to say that we have no agency or responsibility, but rather that becoming doesn’t escape relation. “You do not stop hungering for your father’s love,”6 my husband, Paul Auster, wrote in the first part of The Invention of Solitude, “Portrait of an Invisible Man,” “even after you are grown up.” The second part, “The Book of Memory,” is told in the third person. The I becomes he:

When the father dies, he writes, the son becomes his own father and his own son. He looks at his own son and sees himself in the face of the boy. He imagines what the boy sees when he looks at him and finds himself becoming his own father. Inexplicably, he is moved by this. It is not just the sight of the boy that moves him, nor even the thought of standing inside his own father, but what he sees in the boy of his own vanished past. It is a nostalgia for his own life that he feels, perhaps, a memory of his own boyhood as a son to his father. Inexplicably, he finds himself shaking at that moment with both happiness and sorrow, if this is possible, as if he were going both forward and backward, into the future and into the past. And there are times, often there are times, when these feelings are so strong that his life no longer seems to dwell in the present.7

Here the identifications are seamless. Three generations mingle and time collapses in likeness. I am you. I have become you. But we cannot write, When the father dies, the daughter becomes her own father and then her own son. The daughter never becomes a father. The sex threshold is a thick seam, not easily crossed. It complicates identification and desire. Paul didn’t have a daughter, our daughter, when he wrote those words. She came later. Once, when she was very small, she asked us if she would grow a penis when she got older. No, we told her, that would never come to be. It wasn’t the moment to introduce the subject of sex-change operations, but one may wonder in all seriousness about Freud’s much maligned comment that “anatomy is destiny.” To what degree are we prisoners of our sex?

I, too, have felt the continuities among generations of women in my family, the maternal as an unbroken chain of feeling. I loved my maternal grandmother, whom I knew well, my mormor, mother’s mother in Norwegian. She adored my own mother, her youngest child, and she adored me. I remember her hand on my face when I said good-bye after a visit, the affection in her eyes, her mildness. My own mother’s face, her hands, her touch and voice, have resonated in me all my life and have become part of a legacy I carried with me to my own daughter, an inheritance, which is like music in my body, a wordless knowledge given and received over time. In this, I was lucky. There is little dissonance in that tune that was passed from one woman to the next. Mother love is everyone’s beginning, and its potency is overwhelming. I remember once finding myself with a group of women — it may have been at a baby shower — when one of them proposed a ghoulish choice: If your husband and child were drowning, which would you save? Every woman, one after another, said the child, and as the confessions accumulated, there were also several jokes (told by more than one woman) that fell into the no contest category, which were greeted by peals of laughter. I remember this because it spoke to the ferocity of the love most women have for their children, but also to an undisguised hostility, at least among those particular women, toward the men whom they had left to die in an imaginary deep, a feeling I honestly didn’t share.

It is impossible then to talk about fathers without talking about mothers. For both boys and girls, the mother begins as a towering figure, source of life, food, and feeling. The sentimentality that has lain thickly over motherhood in Western culture, at least since the nineteenth century, strikes me as a way to tame a two-way passion that has a threatening quality, if only by dint of its strength. Children must escape their mothers, and mothers must let them go, and separation can be a long tug-of-war. Every culture seeks to organize the mysteries of maternity — menstruation, pregnancy, birth, and separation, the initiation into adulthood. Taboos, rituals, and stories create the frames for understanding human experience by distinguishing one thing from another and creating a comprehensible order. In her discussion of various kinds of social organization, Mary Douglas makes an interesting comment in her book Purity and Danger, “I would like to suggest that those holding office in the explicit part of the [social] structure tend to be credited with consciously controlled powers, in contrast with those whose role is less explicit and who tend to be credited with unconscious, uncontrollable powers, menacing those in better defined positions.”8 Her point is that ambiguity is dangerous, and that “articulate, conscious powers” seek to protect the system as a whole “from inarticulate and unstructured areas.” She cites the witch as an example of “nonstructure.” “Witches are social equivalents of beetles and spiders who live in cracks of the walls and wainscoting.” But then she mentions another kind of ambiguous character, the legitimate intruder. “Of these Joan of Arc can be taken as a splendid prototype: a peasant at court, a woman in armour, an outsider in the councils of war; the accusation that she was a witch puts her fully in this category.”9

In classical psychoanalysis, the conscious articulate power is the father, who comes between mother and son as a kind of savior from unarticulated nonstructure, maternal engulfment, but he also thwarts the son’s desire for his mother and inspires rivalry: the Oedipal drama. Once it is resolved, the father’s law is internalized, and the boy can go on to occupy the father’s place. Using Douglas’s model, the mother in psychoanalysis comes very close to being a witch. Moreover, exactly where all this left little girls in relation to their fathers has been something of a muddle in the field. Turning the story around doesn’t work, because little girls also want to leave their mothers. In The Dissolution of the Oedipus Complex, Freud continues his observation on anatomy and destiny: “The little girl’s clitoris behaves just like a penis to begin with, but when she makes a comparison with a playfellow of the opposite sex, she perceives that she has come off badly and she feels this as a wrong done to her and a ground for inferiority. For a while she consoles herself with the expectation that later on, when she grows older, she will acquire just as big an appendage as the boy’s…”10 It isn’t strange that feminists have found the idea of penis envy uncompelling. In its place, Jessica Benjamin proposes another reading: “What Freud called penis envy, the little girl’s masculine orientation really reflects the wish of the toddler — of either sex — to identify with the father, who is perceived as representing the outside world.”11

Girls can certainly identify with their fathers. Many do. In fact, it is far more usual for a girl to admit to being like her father than for a boy to say, “I’m just like my mother,” which would impinge on his masculinity by summoning his dependency on her. What the girl cannot do is take her father’s place, which remains the articulated position of power and authority in the culture. Benjamin is not alone in her critique. The Oedipal conflict has been criticized inside psychoanalysis for some time, and it is widely recognized that Freud’s focus on the father underestimated the mother and her vital role in a child’s early life. The importance of mothers, however, doesn’t change the fact that it is still harder for girls to find a place in the sexual divide, to embrace an articulated position of power. The witch is always hiding in the background.

Of course, life never corresponds exactly to any myth. The sharp divisions erected to explain sexual difference elude the ambiguities of what it means to be a person growing up in the world with real parents. I was not a tomboy. As a small child, I liked girls’ games, and I liked dolls, and I can’t remember a time when I didn’t have love feelings for boys. I cried easily. I was extremely alert to my parents’ expectations, rather passive, and empathetic to a fault. My animistic tendencies lasted longer than most people’s. I remember personifying just about everything. My greatest happinesses were drawing, reading, and daydreaming. Most of the action in my life took place internally. This is still true. In my neck of the woods, the expression for such a person was “femmy” or “wimpy.” Virginia Woolf’s “Angel in the House,” the person she had to defeat to write, exemplifies the wimpy feminine ideal of the Victorian era:

She was intensely sympathetic. She was immensely charming. She was utterly unselfish. She excelled in the difficult arts of family life. She sacrificed herself daily. If there was a chicken, she took the leg. If there was a draught, she sat on it — in short she was so constituted that she never had a mind or a wish of her own, but preferred to sympathize always with the minds and wishes of others.12

The Angel is a person without subjectivity, a mirror held up to the desires of others. Arguably, she no longer exists as a paragon of womanhood, but there is something about her that isn’t easily dismissed, because her “sympathy with the minds and wishes of others” is part of maternal reality. What Winnicott called “good enough mothering” includes the ability to be in harmony with an infant’s needs, to answer, mirror, and calm. Allan Schore calls this the “psychobiological attunement”13 of the mother/child duo, and it is crucial to the growth of every human being. Neurobiological studies have made it clear that the interaction between mother and baby is essential to the child’s brain development. Attunement is not a selfless process on the part of the mother but rather an immersion in betweenness, into a dialectical movement that connects two organisms but which is, in fact, a single process. Woolf’s Angel is like a mother during her baby’s first year of life, when her child’s vulnerability and needs are intense and draining. The Victorian trap for women was multiple. It idealized maternal qualities, isolated them as the distinctive, rigid features of womanliness entirely separate from the qualities of the paternal, and linked feminine traits to childish ones, thereby infantilizing women. The good-enough mother is not the perfect mother. The good-enough mother is a subject with interests, thoughts, needs, and desires beyond her child. Nevertheless, the lure of the desire of others is strong, not just for women, but for men, too, and yet it may be that for many reasons — psychic, biological, social — most women have found the pressure of “the minds and wishes of others” more difficult to resist than men have. The continual suppression of the self for another will inevitably produce resentment, if not rage.

Accommodation, squeezing oneself into the expectations of others, however, is part of every childhood. Children of both sexes are dwarfed by their parents in every way. Small and powerless, they are easily crushed by parental authority. Obedience to mother’s and father’s wishes is hardly enslavement, but all children are in thrall to the people they’ve been born to, and the desire to please can easily become a form of internal tyranny. In the letter he wrote to his father, which never reached its destined reader, Franz Kafka presented a stark picture of childhood puniness:

I was, after all, weighed down by your mere physical presence. I remember, for instance, how we often undressed in the same bathing hut. There was I, skinny, weakly, slight; you, strong, tall, broad. Even inside the hut I felt a miserable specimen, and what’s more, not only in your eyes but in the eyes of the whole world, for you were for me the measure of all things. But then when we stepped out of the bathing hut before the people, you holding my hand, a little skeleton, unsteady, barefoot on the boards, frightened of the water, incapable of copying your swimming strokes, which you, with the best of intentions, but actually to my profound humiliation, always kept on showing me, then I was frantic with desperation and at such moments all my bad experiences in all spheres fitted magnificently together.14

Here the father’s body is huge, and before it the child becomes a shrinking “little skeleton.” It is a boy’s experience because a woman’s body would not be “the measure of all things” for a male child. I distinctly remember the feeling of awe and alienation I felt when I saw naked adult bodies as a child, but here Kafka’s experience of his naked father is terrifying, an impossible standard, which humiliates him and very quickly becomes bound to all his “bad experiences.” Desire, fear, and shame are extremely mixed up. The letter as a whole is one of only intermittently suppressed rage, an overt bid for dialogue that is in reality a statement of grievance. Why is it so hard to talk to fathers? Montaigne argues in his essay “Of Friendship” that there cannot be friendship between children and fathers.

From children toward fathers, it is rather respect. Friendship feeds on communication, which cannot exist between them because of their too great inequality, and might therefore interfere with the duties of nature. For neither can all the secret thoughts of fathers be communicated to children lest this beget an unbecoming intimacy, nor could the admonitions and corrections, which are one of the chief duties of friendship, be administered by children to fathers.15

Montaigne is right. Inequality engenders necessary silences. Young children don’t really want friendship from a father, but a heroic figure to look up to. Is there something in fatherhood as we know it that by its very nature blocks communication?

My father liked instructing us, liked working in the garden with us, liked to explain just about anything to us, and he listened to us, but there were distances in him that were difficult to breach, and unlike my mother, he found it hard to speak directly to his daughters about anything personal, especially as we got older and matured sexually. Sometimes he would communicate his worries about his children through his wife, which generally meant that his comments had been screened or edited by her judgments about the situation, so exactly what had alarmed him had become rather foggy once it reached us. The older I became, the more hidden I felt he was, and there were moments when he seemed unavailable to a degree that startled me. It could be difficult for him to say, so sometimes he would do. My father drove me home after I had been fitted with braces for my teeth, painful and grueling hours made worse by the fact that the orthodontist was a truly unpleasant man who gruffly told me to stop moving my feet when I squirmed in discomfort, to open my mouth wider, and to stop flinching when he hit a tender spot. I left the ordeal with tears in my eyes. My father didn’t say much, but then he stopped at a gas station, left the car, and returned with a box, which he handed to me. I looked down: chocolate-covered cherries. My father’s favorite. I was eleven years old, and even then, I felt poignancy mingle with comedy. I didn’t like chocolate-covered cherries and was in no shape to eat them even if I had liked them, but the mute gesture has stayed with me as one of infinite if somewhat wrongheaded kindness, as a token of his love.

By all accounts, my father was a good boy. He was the oldest of four as I am, upright, sensitive, intelligent, with a perfectionist streak that showed up strongly in me. My father’s sister once told me that some of the boys who attended their one-room schoolhouse in rural Minnesota teased my father for reading too much. Apparently, it was a pursuit that lacked manliness. In my father’s childhood, masculine and feminine roles were strictly defined by the kinds of labor done on the farm. My grandfather and grandmother both worked hard, but at different jobs. In his memoir, my father wrote, “Adolescence, as it is now understood, did not exist. A boy became a man when he could do a man’s work.” My father confessed to hot competition with his fellows when it came to rites of passage: “At what age had so-and-so been entrusted with a team of horses, a tractor, the family automobile, and how many cows did one milk.” But, then again, by his own admission, it was his sister who was the athlete: “She could run faster than her brothers, do cartwheels, walk on her hands, and wielded a mean bat at the softball plate.” He was proud of his sister’s physical prowess, and when two of his daughters, his oldest not among them, turned into champion horsewomen, no one was more pleased with their trophies than my father. Despite his beginnings on the farm, my father became an intellectual and worked as a professor. Reading too much took him elsewhere.

But, like all of us, he was shaped by his early experiences. He watched his parents’ farm fail during the Depression and suffered the indignities and humiliations of extreme poverty. His boyhood helplessness in the face of these terrible events became the catalyst for a life lived to repair what had been broken. The winds of chance and devastation were not going to blow down his family if he could help it. He would work himself to death if he had to. This is the old story of the good boy who becomes the duty-bound father. What he could never say was that his parents’ marriage was one of conflict and alienation. He, too, idealized and identified with his father, a tenderhearted and rather meek man, who by the time I met him seemed resigned to his fate. My grandmother, on the other hand, was indomitable and outspoken, admirable traits that sometimes veered toward the screeching and irrational. For a temperamentally sensitive boy like my father, his mother’s invective must have cut to the quick. But these were wounds he hid.

About three years after my father’s death, I had a conversation with my mother that made such an impression on me I can reproduce it almost word for word.

“He wanted his girls to marry farmers.”

“He wanted us to marry farmers?” I said. “You can’t mean that seriously.”

“Well, farmboys like him, who went on to other things.”

“Farmboys who became professors?” I said incredulously.

My mother nodded.

“But, Mamma,” I said, “how many farmboys turned college professors are there? It’s tantamount to saying that we shouldn’t marry or that we should have married him!”

My mother and I laughed, but this strange notion of my father’s reinforced the fact that it was difficult for him to let go of his daughters, to tolerate our growing up. He wanted to continue to find himself reflected in our childish eyes and see the ideal father shining back at him. It took me a long time to understand this, in part because I never stopped hungering for his love and approval, and he remained a measure for me, if not of all things, of many. But I suspect now that there was a part of him that thought he had lost me to my husband, to my work, and because real dialogue was often difficult for us and unequal to some degree — I remained a respectful daughter — there were unspoken misunderstandings between us.

I don’t remember when I began to realize that I wanted to be like my father, but it wasn’t in my earliest days. I think I became ambitious around eleven, which was just about the time I was suddenly able to read “small print” books, when I first read William Blake and Emily Dickinson. Poems and stories became an avenue for my psychic cross-dressing, or rather, discovering my masculinity. I was twelve when I first heard the story of Joan of Arc, that legitimate intruder branded as a witch. The man who told it to me was my seventh-grade history teacher at a Rudolf Steiner School in Bergen, Norway, Arne Krohn Nilsen, a tall rangy man with long whiskery eyebrows that made him look as if he were permanently surprised. He was an intense teacher, and he told Jeanne d’Arc’s tale of glory and woe with a fervor I have never forgotten. He told it to the whole class, but listening to it, I felt like the recipient of a secret gift. I could not have said that the girl warrior appealed to me because, for a while anyway, she was allowed to play a role normally prohibited to women, but I am certain that I felt it. As my teacher spoke, as his voice rose and fell, and his sweeping gestures emphasized the drama, I was Joan of Arc. In a blank book, he drew me a picture of the historical heroine in armor with a sword on a white steed. I still have it. I relate this because not only did Joan collapse the hard lines of sexual difference, but she came to me through a man who genuinely believed in my abilities, a father figure.

Portrait of the Artist as a Young Woman. “Identity and memory are crucial for anyone writing poetry,” says Susan Howe in her book My Emily Dickinson. “For women the field is still dauntingly empty. How do I, choosing messages from the code of others in order to participate in the universal theme of Language, pull SHE from all the myriad symbols and sightings of HE.”16 Emily Dickinson constantly asked this question in her poems:

In lands I never saw — they say

Immortal Alps look down—

Whose Bonnets touch the firmament—

Whose Sandals touch the town—

Meek at whose everlasting feet

A Myriad daisy play—

Which, Sir, are you and which am I

Upon an August Day?17

Dickinson stayed at home to read and write. There she inhabited the immensity of her own inner life. Her mentors lived on the page. Hundreds of fathers. But Howe takes her title from a letter Dickinson wrote to her cousin after reading in the newspaper that George Eliot had died. “The look of the words as they lay in the print I shall never forget. Not their face in the casket could have had the eternity to me. Now, my George Eliot.”18 And mine. Translator, scholar, intellectual, brilliant novelist, Mary Ann hid behind the mask of George. How well I understand that pseudonym — the need to evade the fixity that comes with the brand “woman writer.” If reading was for me the route to legitimate power under the sign of my professor father, it was nevertheless my mother who fed me books, one after another, to stave off a mounting hunger, which at times veered toward the compulsive. She had read widely, and so the idea of literature belonged to both my father and my mother, and my literature, the English-language books I read at eleven, twelve, and thirteen, were my mother’s choices for me. I read under the auspices of two polestars, one paternal and more remote, the other maternal and closer.

What did I want? More. Reading is internal action. It is the intimate ground where, as my husband says, “two consciousnesses touch.” I would add two unconsciousnesses as well. Reading in our culture has become so attenuated that all reading is now considered “good.” Children are admonished to read in general, as if all books are equal, but a brain bloated with truisms and clichés, with formulaic stories and simple answers to badly asked questions is hardly what we should aspire to. For the strange thing is that even books we can no longer actively recall are part of us, and like a lost melody, they may return suddenly. I have discovered my own borrowings from texts through rereading books. These liftings, never exact, were always done unconsciously. As a young person, I read the canon, as I perceived it. Great books signified achievement and mastery, but also apprenticeship. I wanted to know everything, to enlarge myself, to get a fat mind, and that mind, as it has turned out, is mostly made of men. “The process of literary influence,” Harold Bloom wrote in The Anxiety of Influence, “is a battle between strong equals, father and son as mighty opposites…”19 Freud’s Totem and Taboo, with its rapacious sons murdering the tyrannical father, isn’t far away from Bloom’s blanket declaration that literature is the domain of men duking it out — strong equals. There are no daughters in this narrative. And there’s the rub. It is too easy to say that the canon is patriarchal, that casting out John Milton for George Sand is a solution to the problem, but that is to ignore quality in the name of parity.

What of women who write? We, too, have literary fathers and mothers. For most of my life, I have felt that reading and writing are precisely the two places in life where I am liberated from the constraints of my sex, where the dance of being the other takes place unhindered, and the free play of identifications allows entrance into a multitude of human experiences. When I am working I feel this extraordinary freedom, my plurality. But I have discovered that out there in the world, “woman writer” is still a brand on a writer’s forehead, not easily erased, that being George remains preferable to being Mary Ann.

I am not arguing that Bloom is entirely wrong. I think he is right that many male writers struggle to overcome influence. I have seen it up close in some men I know who have had to tangle with a beloved author before they can write themselves. A classic example is Beckett’s overwhelming admiration for Joyce, an influence he had to purge before becoming the writer he became. The question is not whether women writers are influenced; every writer takes from the past. It is how it happens. I was a sponge for books, but I have never had a bellicose relation to writers I love, men or women, even those who have influenced me the most strongly. My love for Henry James doesn’t make me want to fight it out and get over him. Is this because, as a woman, I have a different relation to the paternal and the maternal? Are writers like Emily Dickinson, Jane Austen, Emily and Charlotte Brontë, George Eliot, Gertrude Stein, and Virginia Woolf not part of the pantheon of English letters? I don’t believe that their sex is what one thinks of first when one thinks of their books, is it? But didn’t they make themselves in a different way from men who write? Didn’t they have to? Notably, not one of the writers in the list above was a mother. Is the question of equality with men so fraught for women that their battle is a different one? Are we part of a crooked line outside the patrimony? “Which, Sir, are you, and which am I?” Immortal Alps or daisy? Note that Dickinson’s alps wear bonnets. Howe quotes Dickinson’s second letter to the mysterious “Master,” in which she writes,

If you saw a bullet hit a bird — and he told you he wasn’t shot — You might weep at his courtesy, but you would certainly doubt his word.

One more drop from the gash that stains your Daisy’s bosom — then would you believe?20

Is she not both alps and daisy? Wounded here. Whole elsewhere. Is she not myriad? Howe directs her reader to David Copperfield, to David, Master Davy, but also to Daisy, Steerforth’s affectionate and feminizing name for his younger friend.21 Literary mingling. Sexual mingling. Language isn’t owned by anyone. It is inside and outside; it belongs to men and to women. Does it matter that women are mostly latecomers to the table of literature? Perhaps. Perhaps not.

There have been moments in my life when I felt like the legitimate intruder — at the defense of my doctoral dissertation in English literature, for example. I sat at a table with six men, my judges. They were not unsympathetic. Most of them were admiring. The single exception was an aging pedant who had painstakingly checked my footnotes for accuracy and, finding no errors, resorted to the comment “I did find some of the editions you used egregious, however.” As I waited in the hall for their verdict, I didn’t expect them to suggest any changes. I knew what I had written was good, but for me that seven-year adventure of getting a degree was an ongoing encounter with paternity, not because nearly all my professors were men, which they were, but because the institution itself offered a fatherly stamp of approval, as Dickens would say, “three dry letters”: Ph.D. The fact that my father had also undergone those rigors no doubt haunted the entire enterprise. I have since discovered that it is much harder for young women to give up the lure of a higher degree than it is for young men. A poet who had been languishing in a Ph.D. program for years confessed to me that although she had no intention of becoming a professor, giving up the hope for a degree felt like a painful loss of stature. I understand. For women, letters after their names can be a form of armor. This is probably even more true in the sciences, where there are fewer women than in the humanities. It is in these worlds that one feels the problem of femininity most deeply, because it is here that it shouldn’t show. The awareness of sex acts as a disturbance to the collegial pursuits of the mind — those unnameable openings in the structure begin to emanate dangerous powers, and the maternal witch is back.

A physicist friend of mine told me that women in his field generally disguise their bodies in manly attire to fit in with the powers that be, but he had also noticed a trend: when a woman has reached a position of respect and acclaim, when she has secured her reputation as brilliant, her sartorial discipline begins to unravel. Colors formerly unseen, high heels, makeup, and jewelry appear in rapid succession on her body, as if these accoutrements of womanliness were the tokens of a long-restrained sexual energy, as if the poor thing has suddenly been allowed to burst into bloom. For all its strides in the right direction, the Enlightenment elevated Reason to an impossible stature, and because women were lumped with its opposite, with the irrational forces in human life, no longer inexplicable or mystical, just situated on the wrong side of the fence, women languished there until they could claim Reason and the Rights of Man as their own. But climbing into the patriarchy entails some distortion. I learned to lower my voice when I spoke at seminars in graduate school, to try to sound dispassionate, even when I was quaking with feeling. I called on masculine forms to ensure I was taken seriously, to hide the girl. Over time, those forms became me, too. We are not static beings. We age and change.

In my novels, I have written as a woman and as a man. I have written as a father. I have written as a son. A young woman dresses as a man. She puts on her armor and wanders the streets. A man paints his self-portrait as a woman. A man dresses as a woman and comes into his own. We are myriad, all of us. Daisies. Witches. Alps. Masters. And skeletal little children looking up at the enormity of Dad. Contrary to Montaigne’s statement about fathers and children, late in his life, not so long before he died, my father and I became friends. Although he was proud of me and carefully pasted my good reviews into a scrapbook, he had never said much to me about my work. I had become accustomed to brief, cryptic comments that could be construed in many different ways. My father was very ill when I finished my third novel and sent my parents the manuscript, but he was still living at home. That book was the first I wrote in the voice of a man. One afternoon, the phone rang, and to my surprise it was my father. He rarely called. I usually spoke to my mother, and she would then put my father on for a chat. Without warning, he launched into a disquisition on the book, heaping praise on my literary efforts. And I began to sob. He talked, and I sobbed. He talked more, and I sobbed more. Years of tears. I would never have predicted so violent a reaction. But then, you see, he knew. He knew how much I wanted his sanction, his approval, his admiration, and his knowing what I had mistakenly assumed he had always taken for granted became the road to each other. We were changed then, my father and I. At least some of the distance between us fell away, and when we sat together in the months before his death, we talked as friends, as strong equals, as two real people, not ideal people, who had found each other again.

2008

FLOWERS

WHEN THERE ARE FLOWERS IN a room my eyes are drawn to them. I feel their presence in a way that I do not feel chairs, sofas, coffee tables, curtains. Their fascination for me must be connected to the fact that they are alive, not dead. The attraction is prereflective — it rises up in my body before any articulated thought. Before I can name the flowers (if I can), before I can tell myself that I am attracted to the blooms, the pleasurable sensation has arrived. The color red is especially exciting. It is hard to turn away from red flowers — to not look at amaryllis at full stretch, their broad pale-green stalks erect or leaning slightly behind the glass of a vase. When snow is falling outside, my happiness is augmented — red against the white seen through a window. And in summer, I cannot resist gazing for two, three, four minutes at peonies that have spread open into fat, heavy clusters of petals with their stamens of yellow dust.

Dying flowers don’t have this power over me. In my garden, I pick off wasted blossoms, snip rosehips, pluck withered, browning leaves. I neaten up the dead, but I hover over the living blooms. I watch a bee sit on the beaded orange heart of an open daisy. Sometimes I adjust a flower’s head toward the light, careful not to bruise its petals. And I find my encounters with these quickening but senseless plants so absorbing that I do not narrate them. This is odd because I am continually putting words to living, always forming sentences that accompany me as I greet a person, sit at a dinner party, stroll on the street, but there is no inner voice that follows me in the garden. My head goes silent.

When I was a child, I lived in a house outside a small town in Minnesota. Behind the house was a steep bank that led down to a creek. In spring, after the snows, the water rose and flooded the flat bottom ground. It was on the slope above the creek that I found the first bloodroot growing in the soil. I remember the cold moisture seeping through my pants as I sat down to examine the flowers. When there were plenty of them, I gave myself permission to pick a bouquet for my mother. Their tiny white heads drooped as if in sorrow, but their true enchantment was located at their roots, in the rhizomes that contained a reddish sap that bled onto my hands. As I plucked them, I thought of wounds and of grief, and was overtaken by a satisfying melancholy. My infantile animism had a long life, but in my conscious memory, these personifications were never complete. I lived in a state of only partial belief, not true belief. Because it could bleed, the early bloodroot was more human than the bluebell that came later. I remember how much I liked to open my palms once I was home and examine the red stains the flowers had left on my skin and that I felt a kind of awe, an awe, I suppose, about the living and the dead and the injured.

2011