• CHAPTER XV •


THE BEDROOM


I

The bedroom is a strange place. There is no space within the house where we spend more time doing less, and doing it mostly quietly and unconsciously, than here, and yet it is in the bedroom that many of life’s most profound and persistent unhappinesses are played out. If you are dying or unwell, exhausted, sexually dysfunctional, tearful, racked with anxiety, too depressed to face the world, or otherwise lacking in equanimity and joy, the bedroom is the place where you are most likely to be found. It has been thus for centuries, but at just about the time that the Reverend Mr. Marsham was building his house an entirely new dimension was added to life behind the bedroom door: dread. No one has ever found more ways to be worried in a small, confined space than the Victorians in their bedrooms.

The beds themselves became a particular source of disquiet. Even the cleanest people became a steamy mass of toxins once the lights went out, it seemed. “The water given out in respiration,” explained Shirley Forster Murphy in Our Homes, and How to Make Them Healthy (1883), “is loaded with animal impurities; it condenses on the inner walls of buildings, and trickles down in foetid streams, and … sinks into the walls,” causing damage of a grave but unspecified nature. Why it didn’t cause this damage when it was in one’s body in the first place was never explained or evidently considered. It was enough to know that breathing at night was a degenerate practice.

Twin beds were advocated for married couples, not only to avoid the shameful thrill of accidental contact but also to reduce the mingling of personal impurities. As one medical authority grimly explained: “The air which surrounds the body under the bed clothing is exceedingly impure, being impregnated with the poisonous substances which have escaped through the pores of the skin.” Up to 40 percent of deaths in America, one doctor estimated, arose from chronic exposure to unwholesome air while sleeping.

Beds were hard work, too. Turning and plumping mattresses was a regular chore, and a heavy one. A typical feather bed contained forty pounds of feathers. Pillows and bolsters added about as much again, and all of these had to be emptied out from time to time to let the feathers air, for otherwise they began to stink. Many people kept flocks of geese, which they plucked for fresh bedding perhaps three times a year (a job that must have been as tiresome for the servants as it was for the geese). A plumped feather bed may have looked divine, but occupants quickly found themselves sinking into a hard, airless fissure between billowy hills. Support came from a lattice of ropes, which could be tightened with a key when they began to sag (hence the expression “sleep tight”), but at no degree of tension did they offer much comfort. Spring mattresses, invented in 1865, didn’t work reliably at first because the coils would sometimes turn, confronting the occupant with the very real danger of being punctured by his own bed.

A popular American book of the nineteenth century, Goodholme’s Cyclopedia, divided mattress types into ten levels of comfort. In descending order they were:

Down
Feathers
Wool
Wool-flock
Hair
Cotton
Wood-shavings
Sea-moss
Sawdust
Straw

When wood-shavings and sawdust make it into a top-ten list of bedding materials, you know you are looking at a rugged age. Mattresses were havens not only for bedbugs, fleas, and moths (which loved old feathers when they could get at them) but for mice and rats as well. The sound of furtive rustlings beneath the coverlet was an unhappy accompaniment to many a night’s sleep.

Children who were required to sleep in trundle beds low to the floor were likely to be especially familiar with the whiskery closeness of rats. Wherever there were people, there were rats. An American named Eliza Ann Summers reported in 1867 how she and her sister took armloads of shoes to bed each night to throw at the rats that ran across the floor. Susan Augusta Fenimore Cooper, daughter of James Fenimore Cooper, said that she never forgot, or indeed ever quite got over, the experience of rats scuttling across her childhood bed.

Thomas Tryon, author of a book on health and well-being in 1683, complained of the “Unclean, fulsom Excrement” of feathers as being attractive to bugs. He suggested fresh straw, and lots of it, instead. He also believed (with some justification) that feathers tended to be polluted with fecal matter from the stressed and unhappy birds from which they were plucked.

Historically, the most basic common filling was straw, whose pricks through the ticking were a celebrated torment, but people often used whatever they could. In Abraham Lincoln’s boyhood home, dried cornhusks were used, an option that must have been as crunchily noisy as it was uncomfortable. If one couldn’t afford feathers, wool and horsehair were cheaper alternatives, but they tended to smell. Wool often became infested with moths, too. The only certain remedy was to take the wool out and boil it, a tedious process. In poorer homes, cow dung was sometimes hung from the bedpost in the belief that it deterred moths. In hot climates, summertime insects coming through the windows were a nuisance and hazard. Netting was sometimes draped around beds, but always with a certain uneasiness, as all netting was extremely flammable. A visitor to upstate New York in the 1790s reported how his hosts, in a well-meaning stab at fumigation, filled his room with smoke just before bedtime, leaving him to grope his way through a choking fog to his bed. Wire screens to keep out insects were invented early—Jefferson had them at Monticello—but not widely used because of the expense.

For much of history a bed was, for most homeowners, the most valuable thing they owned. In William Shakespeare’s day a decent canopied bed cost £5, half the annual salary of a typical schoolmaster. Because they were such treasured items, the best bed was often kept downstairs, sometimes in the living room, where it could be better shown off to visitors or seen through an open window by passersby. Generally, such beds were notionally reserved for really important visitors but in practice were hardly used, a fact that adds some perspective to the famous clause in Shakespeare’s will in which he left his second-best bed to his wife, Anne. This has often been construed as an insult, when in fact the second-best bed was almost certainly the marital one and therefore the one with the most tender associations. Why Shakespeare singled out that particular bed for mention is a separate mystery, since Anne would in the normal course of things have inherited all the household beds, but it was by no means the certain snub that some interpretations have made it.



Privacy was a much different concept in former times. In inns, sharing beds remained common into the nineteenth century, and diaries frequently contain entries lamenting how the author was disappointed to find a late-arriving stranger clambering into bed with him. Benjamin Franklin and John Adams were required to share a bed at an inn in New Brunswick, New Jersey, in 1776, and passed a grumpy and largely sleepless night squabbling over whether to have the window open or not.

Even at home, it was entirely usual for a servant to sleep at the foot of his master’s bed, regardless of what his master might be doing within the bed. The records make clear that King Henry V’s steward and chamberlain both were present when he bedded Catherine of Valois. Samuel Pepys’s diaries show that a servant slept on the floor of his and his wife’s bedroom, and that he regarded her as a kind of living burglar alarm. In such circumstances, bed curtains provided a little privacy and cut down on drafts, too, but increasingly came to be seen as unhealthy refuges of dust and insects. Bed curtains could also be a fire hazard—no small consideration when everything in the bedroom, from the rush matting on the floor to the thatch overhead, was energetically combustible. Nearly every household book cautioned against reading by candlelight in bed, but many people did anyway.

In one of his works, John Aubrey, the seventeenth-century historian, relates an anecdote concerning the marriage of Thomas More’s daughter Margaret to a man named William Roper. In the story Roper calls one morning and tells More that he wishes to marry one of More’s daughters—either one will do—upon which More takes Roper to his bedroom, where the daughters are asleep in a truckle bed wheeled out from beneath the parental bed.* Leaning over, More deftly takes “the sheet by the corner and suddenly whippes it off,” Aubrey relates with words that all but glisten lustily, revealing the girls to be fundamentally naked. Groggily protesting at the disturbance, they roll onto their stomachs, and after a moment’s admiring reflection Sir William announces that he has seen both sides now and with his stick lightly taps the bottom of sixteen-year-old Margaret. “Here was all the trouble of the wooeing,” writes Aubrey with clear admiration.

However true or not the episode—and it is worth noting that Aubrey was writing more than a century after the fact—what is clear is that no one in his day thought it odd that More’s grown daughters would sleep beside the parental bed.



The real problem with beds, certainly by the Victorian period, was that they were inseparable from that most troublesome of activities, sex. Within marriage, sex was of course sometimes necessary. Mary Wood-Allen, in the popular and influential What a Young Woman Ought to Know, assured her young readers that it was permissible to take part in physical intimacies within marriage, so long as it was done “without a particle of sexual desire.” The mother’s moods and musings at the time of conception and throughout pregnancy were thought to affect the fetus profoundly and irremediably. Partners were advised not to have intercourse unless they were “in full sympathy” with each other at the time, for fear of producing a failed child.

To avoid arousal more generally, women were instructed to get plenty of fresh air, avoid stimulating pastimes like reading and card games, and above all never to use their brains more than was strictly necessary. Educating them was not simply a waste of time and resources but dangerously bad for their delicate constitutions. In 1865, John Ruskin opined in an essay that women should be educated just enough to make themselves practically useful to their spouses, but no further. Even the American educator Catharine Beecher, who was by the standards of the age a radical feminist, argued passionately that women should be accorded full and equal educational rights, so long as it was recognized that they would need extra time to do their hair.

For men, the principal and preoccupying challenge was not to spill a drop of seminal fluid outside the sacred bounds of marriage—and not much there either, if they could decently manage it. As one authority explained, seminal fluid, when nobly retained within the body, enriched the blood and invigorated the brain. The consequence of discharging this natural elixir illicitly was to leave a man literally enfeebled in mind and body. So even within marriage one should be spermatozoically frugal, as more frequent sex produced “languid” sperm, which resulted in listless offspring. Monthly intercourse was recommended as a safe maximum.

Self-abuse was of course out of the question at all times. The well-known consequences of masturbation covered virtually every undesirable condition known to medical science, not excluding insanity and premature death. Self-polluters—“poor creeping tremulous, pale, spindle-shanked wretched creatures who crawl upon the earth,” as one chronicler described them—were to be pitied. “Every act of self-pollution is an earthquake—a blast—a deadly paralytic stroke,” declared one expert. Case studies vividly drove home the risks. A medical man named Samuel Tissot described how one of his patients drooled continuously, dripped watery blood from his nose, and “defecated in his bed without noticing it.” It was those last three words that were particularly crushing.

Worst of all, an addiction to self-abuse would automatically be passed on to offspring, so that every incident of wicked pleasure not only softened one’s own brain but sapped the vitality of generations yet unborn. The most thorough analysis of sexual hazards, not to mention the most comprehensive title, was provided by Sir William Acton in The Functions and Disorders of the Reproductive Organs, in Childhood, Youth, Adult Age, and Advanced Life, Considered in Their Physiological, Social and Moral Relations, first published in 1857. It was he who decided that masturbation would lead to blindness. He was also responsible for the oft-quoted assertion: “I should say that the majority of women are not very much troubled with sexual feeling of any kind.”

Such beliefs held sway for an amazingly long time. “Many of my patients told me that their first masturbatory act took place while witnessing some musical show,” Dr. William Robinson reported grimly, and perhaps just a bit improbably, in a 1916 work on sexual disorders.

Fortunately, science was standing by to help. One remedy, described by Mary Roach in Bonk: The Curious Coupling of Sex and Science (2008), was the Penile Pricking Ring, developed in the 1850s, which was slipped over the penis at bedtime (or indeed anytime) and was lined with metal prongs that bit into any penis that impiously swelled beyond a very small range of permissible deviation. Other devices used electrical currents to jerk the subject into a startled but penitent wakefulness.

Not everyone agreed with these conservative views, it must be noted. As early as 1836, a French medical authority named Claude François Lallemand published a three-volume study equating frequent sex with robust health. This so impressed a Scottish medical expert named George Drysdale that he formulated a philosophy of free love and uninhibited sex called Physical, Sexual and Natural Religion. Published in 1855, it sold ninety thousand copies and was translated into eleven languages, “including Hungarian,” as the Dictionary of National Biography notes with its usual charming emphasis on pointless detail. Clearly, there was some kind of longing for greater sexual freedom in society. Unfortunately, society at large was still a century or so away from granting it.

Penile Pricking Ring (photo credit 15.1)

In such a perpetually charged and confused atmosphere, it is perhaps little wonder that for many people successful sex was an unrealizable aspiration—and in no case more resoundingly than that of John Ruskin himself. In 1848, when the great art critic married nineteen-year-old Euphemia “Effie” Chalmers Gray, things got off to a bad start and never recovered. The marriage was never consummated. As she later related, Ruskin confessed to her that “he had imagined women were quite different to what he saw I was and that the reason he did not make me his Wife was because he was disgusted with my person the first evening.”

Eventually able to take no more (or actually wanting to take a lot more, but with someone else), Effie filed a nullity suit against Ruskin, the details of which became a happy titillation for devotees of the popular press in many lands, and then ran off with the artist John Everett Millais, with whom she had a happy life and eight children. The timing of her virtual elopement with Millais was unfortunate, as Millais was at that time engaged in painting a portrait of Ruskin. Ruskin, a man of honor, continued to sit for Millais, but the two men never again spoke. Ruskin sympathizers, of whom there were many, responded to the scandal by pretending there wasn’t one. By 1900, the whole episode had been so effectively expunged from the record that W. G. Collingwood could, without a blush of embarrassment, write The Life of John Ruskin without hinting that Ruskin had ever been married, much less sent crashing from a room at the sight of female pubic hair.

Ruskin never escaped his prudish ways or gave any indication of desiring to. After the death of J. M. W. Turner, in 1851, Ruskin was given the job of going through the works left to the nation by the great artist and found several watercolors of a cheerfully erotic nature. Horrified, Ruskin decided that they could only have been drawn “under a certain condition of insanity,” and for the good of the nation destroyed almost all of them, robbing posterity of several priceless works.

Effie Ruskin’s escape from her unhappy marriage was both lucky and unusual, for nineteenth-century divorce acts, like everything else to do with marriage, were overwhelmingly biased in favor of men. To obtain a divorce in Victorian England, a man had merely to show that his wife had slept with another man. A woman, however, had to prove that her spouse had compounded his infidelity by committing incest, bestiality, or some other dark and inexcusable transgression drawn from a very small list. Until 1857, a divorcée forfeited all her property and generally lost the children, too. Indeed, in law a wife had no rights at all—no right to property, no right of expression, no freedoms of any kind beyond those her husband chose to grant her. According to the great legal theorist William Blackstone, upon marriage a woman relinquished her “very being or legal existence.” A wife had no legal personhood at all.

Some countries were slightly more liberal than others. In France, exceptionally, a woman could divorce a man on grounds of adultery alone, though only as long as the infidelity had occurred in the marital home. In England, however, standards were brutally unfair. In one well-known case, a woman named Martha Robinson was for years beaten and physically misused by a cruel and unstable husband. Eventually, he infected her with gonorrhea and then poisoned her almost to the point of death by slipping antivenereal powders into her food without her knowledge. Her health and spirit broken, she sued for divorce. The judge listened carefully to the arguments, then dismissed the case and sent Mrs. Robinson home with instructions to try to be more patient.

Even when things went well, it was difficult being a woman, for womanhood was automatically deemed to be a pathological condition. There was a belief, more or less universal, that women after puberty were either ill or on the verge of being ill almost permanently. The development of breasts, womb, and other reproductive apparatus “drained energy from the finite supply each individual possessed,” in the words of one authority. Menstruation was described in medical texts as if it were a monthly act of willful negligence. “Whenever there is actual pain at any stage of the monthly period, it is because something is wrong either in the dress, or the diet, or the personal and social habits of the individual,” wrote one (male, of course) observer.

The painful irony is that women frequently were unwell because considerations of decorum denied them proper medical care. In 1856, when a young Boston housewife from a respectable background tearfully confessed to her doctor that she sometimes found herself involuntarily thinking of men other than her husband, the doctor ordered a series of stringent emergency measures, which included cold baths and enemas, the removal of all stimulus, including spicy foods and the reading of light fiction, and the thorough scouring of her vagina with borax. Light fiction was commonly held to account for promoting morbid thoughts and a tendency to nervous hysteria. As one author gravely summarized: “Romance-reading by young girls will, by this excitement of the bodily organs, tend to create their premature development, and the child becomes physically a woman months or even years before she should.”

As late as 1892, Judith Flanders reports, a man who took his wife to have her eyes tested was told that the problem was a prolapsed womb and that until she had a hysterectomy her vision would remain impaired.

Sweeping generalizations were about as close as any medical man would permit himself to get to women’s reproductive affairs. This could have serious medical consequences, since no doctor could make a proper gynecological examination. In extremis, he might probe gently beneath a blanket in an underlit room, but this was highly exceptional. For the most part, women who had any medical complaint between neck and knees were required to point blushingly to the affected area on a dummy.

One American physician in 1852 cited it as a source of pride that “women prefer to suffer the extremity of danger and pain rather than waive those scruples of delicacy which prevent their maladies from being fully explored.” Some doctors opposed forceps delivery on the grounds that it allowed women with small pelvises to bear children, thus passing on their inferiorities to their daughters.

The inevitable consequence of all this was that ignorance of female anatomy and physiology among medical men was almost medieval. The annals of medicine hold no better example of professional gullibility than the celebrated case of Mary Toft, an illiterate rabbit breeder from Godalming, in Surrey, who for a number of weeks in the autumn of 1726 managed to convince medical authorities, including two physicians to the royal household, that she was giving birth to a series of rabbits. The matter became a national sensation. Several of the medical men attended the births and professed total amazement. It was only when yet another of the king’s physicians, a German named Cyriacus Ahlers, investigated more closely and pronounced the whole matter a hoax that Toft at last admitted the deception. She was briefly imprisoned for fraud but then sent home to Godalming, and that was the last that anyone ever heard of her.

An understanding of female anatomy and physiology was still a long way off, however. As late as 1878 the British Medical Journal was able to run a spirited and protracted correspondence on whether a menstruating woman’s touch could spoil a ham. Judith Flanders notes that one British doctor was struck off the medical register for noting in print that a change in coloration around the vagina soon after conception was a useful indicator of pregnancy. The conclusion was entirely valid; the problem was that it could be discerned only by looking. The doctor was never allowed to practice again. In America, meanwhile, James Platt White, a respected gynecologist, was expelled from the American Medical Association for allowing his students to observe a woman—with her permission—give birth.

Against this, the actions of a surgeon named Isaac Baker Brown become all the more extraordinary. In an age in which doctors normally didn’t go within an arm’s length of a woman’s reproductive zone and would have little idea of what they had found if they went there, Baker Brown became a pioneering gynecological surgeon. Unfortunately, he was motivated almost entirely by seriously disturbed notions. In particular, he grew convinced that nearly every female malady was the result of “peripheral excitement of the pudic nerve centring on the clitoris.” Put more bluntly, he thought women were masturbating and that this was the cause of insanity, epilepsy, catalepsy, hysteria, insomnia, and countless other nervous disorders. The solution was to remove the clitoris surgically and thus take away any possibility of wayward excitation. He also developed the conviction that the ovaries were mostly bad and were better off removed. Since no one had ever tried to remove ovaries before, it was an exceptionally delicate and risky operation. Baker Brown’s first three patients died on the operating table. Undaunted, he performed his fourth experimental operation on, of all people, his sister. She lived.

When it was discovered that he had for years been removing women’s clitorises without their permission or knowledge, the reaction of the medical community was swift and furious. Baker Brown was expelled from the Obstetrical Society of London, which effectively ended his ability to practice. On the plus side, doctors did at last accept that it was time to become scientifically attentive to the private parts of female patients. So ironically, by being such a poor doctor and dreadful human being, Baker Brown did more than any other person to bring the study and practice of female medicine up to modern standards.


II

There was, it must be said, one very sound reason for being fearful of sex in the premodern era: syphilis. There has never been a more appalling disease, at least for the unlucky portion who get what is known as third-stage syphilis. This is a milestone you just don’t want to experience. Syphilis gave sex an edge of real dread. To many, it seemed a clear message from God that sex outside the bounds of marriage was an invitation to divine retribution.

Syphilis, as we have seen, had been around for a long time. As early as 1495, just three years after the voyage of Christopher Columbus that introduced it to Europe, some soldiers in Italy developed pustules “like grains of millet” all over their faces and bodies, in what is thought to be the first medical reference to syphilis in Europe. It spread rapidly—so rapidly that people couldn’t agree where it had come from. The first recorded mention of it in English, as “the French pox,” is from 1503. Elsewhere it was known as the Spanish disease, the Celtic humors, the Neapolitan pox, or, perhaps most tellingly, “the Christian disease.” The word syphilis was coined in a poem by the Italian Hieronymus Fracastorius in 1530 (in the poem, Syphilis is the name of a shepherd who contracts the disease), but it did not appear in English until 1718.

Syphilis was for a long time a particularly unnerving disease because of the way it came and went in three stages, each worse than the last. The first stage usually showed itself as a genital chancre, ugly but painless. This was followed some time later by a second stage that involved anything from aches and pains to hair loss. Like first-stage syphilis, this would also resolve itself after a month or so whether it was treated or not. For two-thirds of syphilis sufferers, that was it. The disease was over. For the unfortunate one-third, however, the real dread was yet to come. The infection would lie dormant for as long as twenty years before erupting in third-stage syphilis. This was the stage nobody wanted to go through. It ate away the body, destroying bones and tissue without pause or mercy. Noses frequently collapsed and vanished. (London for a time had a “No-Nose’d Club.”) The mouth might lose its roof. The death of nerve cells could turn the victim into a stumbling wreck. Symptoms varied, but every one of them was horrible. Despite the dangers, people put up with the risks to an amazing degree. James Boswell contracted venereal diseases nineteen times in thirty years.

Treatments for syphilis were severe. In the early days a lead solution was injected into the bladder via the urethra. Then mercury became the drug of choice and remained so right up to the twentieth century and the invention of the first antibiotics. Mercury produced all kinds of toxic symptoms—bones grew spongy, teeth fell out—but there was no alternative. “A night with Venus and a lifetime with Mercury” was the axiom of the day. Yet the mercury didn’t actually cure the disease; it merely moderated the worst of the symptoms while inflicting others.



Perhaps nothing separates us more completely from the past than how staggeringly ineffectual—and often petrifyingly disagreeable—medical treatments once were. Doctors were lost in the face of all but a narrow range of maladies. Often their treatment merely made matters worse. The luckiest people in many ways were those who suffered in private and recovered without medical intervention.

The worst outcome of all, for obvious reasons, was to have to undergo surgery. In the centuries before anesthetics, many ways of ameliorating pain were tried out. One method was to bleed the patient to the point of faintness. Another was to inject an infusion of tobacco into the rectum (which, at the very least, must have given the patient something else to think about). The most common treatment was to administer opiates, principally in the form of laudanum, but even the most liberal doses couldn’t mask real pain.

During amputations, limbs were normally removed in less than a minute, so the most traumatizing agony was over quickly, but vessels still had to be tied off and the wound stitched, so there remained much scope for lingering pain. Working quickly was the trick of it. When Samuel Pepys underwent a lithotomy—the removal of a kidney stone—in 1658, the surgeon took just fifty seconds to get in and find and extract a stone about the size of a tennis ball. (That is, a seventeenth-century tennis ball, which was rather smaller than a modern one, but still a sphere of considerable dimension.) Pepys was extremely lucky, as the historian Liza Picard points out in Restoration London, because his operation was the surgeon’s first of the day and therefore his instruments were reasonably clean. Despite the quickness of the operation, Pepys needed more than a month to recover.

More complicated procedures were almost unbelievably taxing. They are painful enough to read about now, but what they must have been like to live through simply cannot be conceived. In 1806, the novelist Fanny Burney, while living in Paris, suffered a pain in her right breast, which gradually grew so severe that she could not lift her arm. The problem was diagnosed as breast cancer and a mastectomy was ordered. The job was given to a celebrated surgeon named Baron Larrey, whose fame was based not so much on his skill at saving lives as on his lightning speed. He would later become famous for conducting two hundred amputations in twenty-four hours after the Battle of Borodino in 1812.

Burney’s account of the experience is almost unbearably excruciating because of the very calmness with which she relays its horrors. Almost as bad as the event itself was the torment of awaiting it. As the days passed, the anxiety of apprehension became almost crushing, and was made worse when she learned on the morning of the appointed day that the surgeons would be delayed by several hours. In her diary she wrote: “I walked backwards and forwards till I quieted all emotions, and became, by degrees, nearly stupid—torpid, without sentiment or consciousness—and thus I remained till the clock struck three.”

At that point she heard four carriages arrive in quick succession. Moments later, seven grave men in black came into the room. Burney was given a drink to calm her nerves—she didn’t record what, but wine mixed with laudanum was the usual offering. A bed was moved into the middle of the room; old bedding was placed on it so as not to spoil a good mattress or linens.

“I now began to tremble violently,” Burney wrote, “more with distaste and horror of the preparations even than of the pain.… I mounted, therefore, unbidden, the bedstead, and M. Dubois placed me upon the mattress, and spread a cambric handkerchief upon my face. It was transparent, however, and I saw through it that the bedstead was instantly surrounded by the seven men and my nurse. I refused to be held; but when, bright through the cambric, I saw the glitter of polished steel—I closed my eyes.” Learning that they intended to remove the whole breast, she surrendered herself to “a terror that surpasses all description.” As the knife cut into her, she emitted “a scream that lasted intermittingly during the whole time of the incision—and I almost marvel that it rings not in my ears still, so excruciating was the agony. When the wound was made, and the instrument was withdrawn, the pain seemed undiminished … but when again I felt the instrument—describing a curve—cutting against the grain, if I may say so, while the flesh resisted in a manner so forcible as to oppose and tire the hand of the operator, who was forced to change from the right to the left—then, indeed, I thought I must have expired. I attempted no more to open my eyes.”

But still the operation went on. As the surgeons dug away diseased tissue, she could feel and hear the scrape of the blade on her breastbone. The entire procedure lasted seventeen and a half minutes, and it took her months to recover. But the operation saved her life. She lived another twenty-nine years and the cancer never came back.

Not surprisingly, people were sometimes driven by pain and a natural caution regarding doctors to attempt extreme remedies at home. Gouverneur Morris, one of the framers of the American Constitution, killed himself by forcing a whalebone up his penis to try to clear a urinary blockage.

The advent of surgical anesthetics in the 1840s often didn't eliminate the agony of medical treatments so much as postpone it. Surgeons still didn't wash their hands or clean their instruments, so many of their patients survived the operations only to die of a more prolonged and exquisite agony through infection. This was generally attributed to "blood poisoning." When President James A. Garfield was shot in 1881, it wasn't the bullet that killed him, but doctors sticking their unwashed fingers in the wound. And because anesthetics encouraged the growth of surgical procedures, there was in fact probably a very considerable net increase in the amount of pain and suffering after their introduction.

Even without the unnerving interventions of surgeons, there were plenty of ways to die in the premodern world. For the City of London, the death rolls—or Bills of Mortality as they were known in England—for 1758 list 17,576 deaths from more than eighty causes. Most deaths, as might be expected, were from smallpox, fever, consumption, or old age, but among the more miscellaneous causes listed (with original spellings) were:

choaked with fat — 1
Itch — 2
froze to death — 2
St Anthony's fire — 4
lethargy — 4
sore throat — 5
worms — 6
killed themselves — 30
French pox — 46
lunatick — 72
drowned — 109
mortification — 154
teeth — 644

How exactly “teeth” killed so many seems bound to remain forever a mystery. Whatever the actual causes of death, it is clear that expiring was a commonplace act and that people were prepared for it to come from almost any direction. Death rolls from Boston in the same period show people dying from such unexpected causes as “drinking cold water,” “stagnation of the fluids,” “nervous fevers,” and “fright.” It is interesting, too, that many of the more expected forms of death feature only marginally. Of the nearly 17,600 people whose deaths were recorded in London in 1758, just 14 were executed, 5 murdered, and 4 starved.

With so many lives foreshortened, marriages in the preindustrial world tended to be brief. In the fifteenth and sixteenth centuries, the average marriage lasted just ten years before one or the other of the partners expired. It is often assumed that because people died young they also married young in order to make the most of the short life that lay in front of them. In fact, that seems not to be so. For one thing, people still saw the normal span of life—one’s theoretical entitlement—as the biblical three score years and ten. It was just that not so many people made it to that point. Nearly always cited in support of the contention that people married early are the tender ages of the principal characters in Shakespeare’s Romeo and Juliet—Juliet just thirteen, Romeo a little older. Putting aside the consideration that the characters were fictitious and hardly proof of anything, what is always overlooked in this is that in the poem by Arthur Brooke on which Shakespeare based the story, the characters were actually sixteen. Why Shakespeare reduced their ages is, like most of what Shakespeare did, unknowable. In any case, Shakespeare’s youthful ages are not supported by documentary evidence in the real world.

In the 1960s, the Cambridge historian Peter Laslett did a careful study of British marriage records and found that at no time in the recorded past did people regularly marry at very early ages. Between 1619 and 1660, for instance, 85 percent of women were nineteen or older when married; just one in a thousand was thirteen or under. The median age at marriage for brides was twenty-three years and seven months, and for men it was nearly twenty-eight years—not very different from the ages of today. William Shakespeare himself was unusual in being married at eighteen, while his wife, Anne, was unusually old at twenty-six. Most really youthful marriages were formalities known as espousals de futuro, which were more declarations of future intentions than licenses to hop into bed.

What is true is that there were a lot more widowed people out there and that they remarried more frequently and more quickly after bereavement. For women, it was often an economic necessity. For men, it was the desire to be looked after. In short, it was often as much a practical consideration as an emotional one. One village surveyed by Laslett had, in 1688, seventy-two married men, of whom thirteen had been married twice, three had been married three times, three married four times, and one married five times—all as the result of widowhood. Altogether about a quarter of all marriages were remarriages following bereavement, and those proportions remained unchanged right up to the first years of the twentieth century.

With so many people dying, mourning became a central part of most people’s lives. The masters of mourning were of course the Victorians. Never have a people become more morbidly attached to death or found more complicated ways to mark it. The master practitioner was Victoria herself. After her beloved Prince Albert died in December 1861, the clocks in his bedroom were stopped at the minute of his death, 10:50 p.m., but at the Queen’s behest his room continued to be serviced as if he were merely temporarily absent rather than permanently interred in a mausoleum across the grounds. A valet laid out clothes for him each day, and soap, towels, and hot water were brought to the room at the appropriate times, then taken away again.

At all levels of society mourning rules were strict and exhaustingly comprehensive. Every possible permutation of relationship was considered and ruled on. If, for example, the dearly departed was an uncle by marriage, he was to be mourned for two months if his wife survived him, but for just one month if he was unmarried or widowed himself. So it went through the entire canon of relationships. One needn’t even have met the people being mourned. If one’s husband had been married before and widowed—a fairly common condition—and a close relative of his first wife’s died, the second wife was expected to engage in “complementary mourning”—a kind of proxy mourning on behalf of the deceased earlier partner.

Exactly how long and in what manner mourning clothes were worn was determined with equally meticulous precision by the degree of one's bereavement. Widows, already swaddled in pounds of suffocating broadcloth, had additionally to drape themselves in black crape, a type of rustling crimped silk. Crape was scratchy, noisy, and maddeningly difficult to maintain. Raindrops on crape left whitish blotches wherever they touched it, and the crape in turn ran onto fabric or skin underneath. A crape stain ruined any fabric it touched and was nearly impossible to wash off skin. The amounts of crape worn were strictly dictated by the passage of time. One could tell at a glance how long a woman had been widowed by how much crape she had at each sleeve. After two years, a widow moved into a phase known as "half mourning," when she could begin to wear gray or pale lavender, so long as they weren't introduced too abruptly.

Servants were required to mourn when their employers died, and a period of national mourning was decreed when a monarch died. Much consternation ensued when Queen Victoria expired in 1901, because it had been over sixty years since the last regal departure and no one could agree what level of mourning was appropriate to such a long-lasting monarch in such a new age.



As if Victorians didn’t have enough to worry about already, they developed some peculiar anxieties about death. Edgar Allan Poe exploited one particular fear to vivid effect in his story “The Premature Burial” in 1844. Catalepsy, a condition of paralysis in which the victim merely seemed dead while actually being fully conscious, became the dread disease of the day. Newspapers and popular magazines abounded with stories of people who suffered from its immobilizing effects.

One well-known case was that of Eleanor Markham of upstate New York, who was about to be buried in July 1894 when anxious noises were heard coming from her coffin. The lid was lifted and Miss Markham cried out: “My god, you are burying me alive!” She told her saviors: “I was conscious all the time you were making preparations to bury me. The horror of my situation is altogether beyond description. I could hear everything that was going on, even a whisper outside the door.” But no matter how much she willed herself to cry out, she said, she was powerless to utter a noise.

According to one report, of twelve hundred bodies exhumed in New York City for one reason or another between 1860 and 1880, six showed signs of thrashing or other postinterment distress. In London, when the naturalist Frank Buckland went looking for the coffin of the anatomist John Hunter at St. Martin-in-the-Fields Church, he reported coming upon three coffins that showed clear evidence of internal agitation (or so he was convinced). Anecdotes of premature burials featured in even serious publications. A correspondent to the British journal Notes and Queries offered this contribution in 1858:


A rich manufacturer named Oppelt died about fifteen years since at Reichenberg, in Austria, and a vault was built in the cemetery for the reception of the body by his widow and children. The widow died about a month ago and was taken to the same tomb; but, when it was opened for that purpose, the coffin of her husband was found open and empty, and the skeleton discovered in a corner of the vault in a sitting posture.

For at least a generation such stories became routine in even serious periodicals. So many people became morbidly obsessed with the fear of being interred before their time that a word was coined for it: taphephobia. The novelist Wilkie Collins placed on his bedside table each night a letter bearing standing instructions of the tests he wished carried out to ensure that he really had died in his sleep if he was found in a seemingly corpselike state. Others directed that their heads be cut off or their hearts removed before burial, to put the matter comfortably (if that is the right word) beyond doubt. One author proposed the construction of “Waiting Mortuaries,” where the departed could be held for a few days to ensure they really were quite dead and not just unusually still. Another more entrepreneurial type designed a device that allowed someone awaking within a coffin to pull a cord, which opened a breathing tube for air and simultaneously set off a bell and started a flag waving at ground level. An Association for Prevention of Premature Burial was established in Britain in 1899 and an American society was formed the following year. Both societies suggested a number of exacting tests to be satisfied by attending physicians before they could safely declare a person dead—holding a hot iron against the deceased’s skin to see if it blistered was one—and several of these tests were actually incorporated into medical schools’ curricula for a time.

Grave robbing was another great concern—and not without reason, for the demand for fresh bodies in the nineteenth century was considerable. London alone was home to twenty-three schools of medicine or anatomy, each requiring a steady supply of cadavers. Until the passing of the Anatomy Act in 1832, only executed criminals could be used for experiment and dissection. Yet executions in England were much rarer than is commonly supposed: in 1831, a typical year, sixteen hundred people were condemned to death in England, but only fifty-two executed. So the demand for bodies was way beyond what could be legally supplied. Grave robbery in consequence became an irresistibly tempting business, particularly as stealing a body was, thanks to a curious legal quirk, a misdemeanor rather than a felony. At a time when a well-paid working man might earn £1 in a week, a fresh corpse could fetch £8 or £10 and sometimes as much as £20, and, at least initially, without much risk as long as the culprits were careful to remove only the bodies and not shrouds, coffins, or keepsakes, for which they could be charged with a felony.

It wasn’t just a morbid interest in dissection that drove the market. In the days before anesthetics, surgeons really needed to be closely acquainted with bodies. You can’t poke thoughtfully among arteries and organs when the patient is screaming in agony and spurting blood. Speed was of the essence, and the essential part of speed was familiarity, which could only come with much devoted practice on the dead. And of course the lack of refrigeration meant that flesh began to spoil quickly, so the need for fresh supplies was constant.

To thwart robbers, the poor in particular often held on to departed loved ones until the bodies had begun to putrefy and so had lost their value. Edwin Chadwick’s Report on the Sanitary Condition of the Labouring Classes of Great Britain was full of gruesome and shocking details about the practice. In some districts, he noted, it was common for families to keep a body in the front room for a week or more while waiting for putrefaction to get a good hold. It was not unusual, he said, to find maggots dropping onto the carpet and infants playing among them. The stench, not surprisingly, was powerful.

Graveyards also improved their security, employing armed night watchmen. That severely elevated the risk of being apprehended and beaten, so some resurrection men, as they were popularly known, turned to murder as the safer option. The most notorious and devoted were William Burke and William Hare, Irish immigrants in Edinburgh, who killed at least fifteen people in a period of less than a year, beginning in November 1827. Their method was crudely effective. They befriended sad wastrels, got them drunk, and suffocated them, the stout Burke sitting on the victim's chest and Hare covering the mouth. The bodies were taken at once to Professor Robert Knox, who paid from £7 to £14 for each fresh, pink corpse. Knox must have known that something exceedingly dubious was going on—two Irish alcoholics turning up with a succession of extremely fresh bodies, each having expired in seemingly tranquil fashion—but maintained that it was not his business to ask questions. He was widely condemned for his part in the affair, but never charged or penalized. Hare escaped hanging by turning king's evidence and offering to testify against his friend and partner. This proved unnecessary, as Burke made a full confession and was swiftly hanged. His body was delivered to another anatomy school for dissection, and pieces of his skin were pickled and for years handed out as keepsakes to favored visitors.

Hare spent only a couple of months in prison before being released, though his fate was not a happy one. He took a job at a lime kiln, where his co-workers recognized him and thrust his face into a heap of quicklime, permanently blinding him. He is thought to have spent his last years as a wandering beggar. Some reports had him returning to Ireland, others place him in America, but how long he lived and where he was buried are unknown.

All this gave a great spur to an alternative way of disposing of bodies that was surprisingly controversial in the nineteenth century: cremation. The cremation movement had nothing to do with religion or spirituality. It was all about creating a practical way to get rid of a lot of bodies in a clean, efficient, and nonpolluting manner. Sir Henry Thompson, founder of the Cremation Society of England, demonstrated the efficacy of his ovens by cremating a horse at Woking in 1874. The demonstration worked perfectly but caused an outcry among those emotionally opposed to the idea of burning a horse or any other animal. In Dorset a certain Captain Hanham built his own crematorium and used it very efficiently to dispose of his wife and mother in defiance of the laws. Others, fearful of arrest, sent their loved ones to countries where cremation was legal. Charles Wentworth Dilke, the writer and politician who was one of the cofounders of the Gardeners' Chronicle with Joseph Paxton, shipped his late wife to Dresden to be cremated in 1874 after she died in childbirth. Another early exponent was Augustus Henry Lane Fox Pitt Rivers, one of the nineteenth century's leading archaeologists, who not only desired cremation for himself but insisted upon it for his wife, despite her continued objections. "Damn it, woman, you shall burn," he declared to her whenever she raised the matter. Pitt Rivers died in 1900 and was cremated, even though it wasn't yet legal. His wife outlived him, however, and was given the peaceful burial she had always longed for.

In Britain, on the whole, opposition remained entrenched for a long time. Many people thought the willful destruction of a corpse immoral. Others cited practical considerations. A point made often by opponents was that it would destroy evidence in cases of murder. The movement also wasn’t helped by the fact that one of its principal proponents was essentially mad. His name was William Price. He was a doctor in rural Wales noted for his eccentricities, which were exhaustive. He was a druid, a vegetarian, and a militant Chartist; he refused to wear socks or to touch coins. In his eighties he fathered a son by his housekeeper and named him Jesus Christ. When the baby died in early 1884, Price decided to cremate him on a pyre on his land. When villagers saw the flames and went to investigate, they found Price, dressed as a druid, dancing around the bonfire and reciting strange chants. Outraged and flustered, they stepped in to stop him and in the confusion Price snatched the half-burned baby from the fire and retired with the body to his house, where he kept it in a box under his bed until arrested a few days later. Price was brought to trial, but released when the judge decided that nothing he had done was conclusively criminal, since the baby was not actually cremated. He did, however, set back the cause of cremation very severely.

While cremation became routine elsewhere, it wasn’t formally legalized in Britain until 1902, just in time for our Mr. Marsham to exercise that option if he chose to. He didn’t.


* Truckle bed and trundle bed are two terms for the same thing. Truckle comes from the Greek trochlea, signifying something that slides, and trundle is related to the Old English words trindle and trendle, all meaning something that moves along by rolling. Truckle bed dates from 1459; trundle bed followed about a hundred years later.
