PART V OUR BODIES, OUR NATURE

The (3D-Printed) Ear He Lends Me

Lawrence Bonassar’s lab in Cornell University’s Weill Hall sits across the street from a jewel of a tiny flower garden, now blanketed in snow. Though Weill’s outside walls are white as the season, it’s one of the “greenest” buildings in the country, and has earned a rare gold LEED rating, thanks to everything from recycled building debris and materials (such as the outer skin’s white aluminum panels) to a cooling roof planted with succulents and flowers, heat-reflective sidewalks, a giant atrium with passive solar heat, and motion sensors that adjust lighting, temperature, and airflow only as needed, when people appear.

Opened five years ago as a state-of-the-art home for the Department of Life Sciences, it’s been designed as an intellectual crucible with large overlapping labs. Long open sunlit rooms, running the length of each floor, share common lounges, corridors, and microscope areas, making it impossible not to bump into postdocs in related fields. Even in winter, cross-pollination is encouraged. Just as the planners hoped, many collaborations have ensued, the new field of regenerative medicine is taking wing, and bioprinters are crafting tailor-made body parts.

The principle of “regenerative medicine” is magically simple: if a heart or jaw is damaged, either teach the body to regrow another or print a healthy new one the body will embrace. What Bonassar’s lab regenerates is the body’s vital infrastructure of cartilage: all those cushions (the so-called disks) between the vertebrae in the spine; the easily torn meniscus in the knee; the accordion of semicircular rings that keeps the trachea from collapsing when we breathe, yet allows it to bend forward when we swallow; the external ear, prized by poets and nibblesome lovers, who often describe it as “shell-like.” His goal is to restore lost function to ragged physiques and fix facial defects. To that end, he mingles tools from several disciplines, including biomechanics, biomaterials, cell biology, medicine, biochemistry, robotics, and 3D printing. If your only tool is a ruler, you’ll tend to draw boxes. New tools create new mental playgrounds. On this playground, spare ears abound.

What could you do with a spare ear? If a patient with skin cancer has to have an ear removed, the traditional way to replace it is with a prosthetic, but that has to be donned every day. A young mother in exactly that fix lamented to CBS News, “I could just see my kids running around with it, yelling ‘I have mommy’s ear!’” Instead, surgeons at Johns Hopkins harvested cartilage from her ribs, shaped it into an ear, and implanted it under her forearm skin, where it could be nourished by her own blood vessels. Just four months later, when the ear had grown its own skin, it was removed from her arm and attached to her head. Yet, wonderful as her new homegrown ear was, the process required numerous operations, including breaking open the vault of the chest and stripping cartilage from the ribs, then shaving and shaping the ear to fit. A feat of subtractive manufacturing.

Or an ear might rescue the one child out of nine thousand who is born with microtia, a condition in which the external ear hasn’t fully developed, sometimes leaving only a small peanut-shaped vestige behind for classmates to ogle and mock. A father of three young children—twin six-year-old girls and a five-year-old boy—Bonassar is deeply sympathetic to the condition, and mindful of what early fixes can mean. Unfortunately, children aren’t able to brave an operation like the young mother’s until they’re six to ten years old, because they don’t have enough rib cartilage. Also, the operation is very painful and quite traumatic, and you don’t really want to subject a child to it. How much simpler, less invasive, and cheaper to do an MRI, CT scan, or 3D photograph, then print the cartilage out on demand in the exact shape of each child’s highly personal ear. You’d be able to do it much earlier in a child’s life, and you could photograph the left ear, flip it around to make a right ear, and match the geometry perfectly.

Microtia doesn’t harm hearing, but it often invites the social nightmares of bullying and shunning, just as a child is developing a sense of self. So, although a new ear is only a cosmetic change, it has an enormous impact on a child’s hope of making friends—which in turn shapes a growing child’s brain. The ability to smile is a child’s coin of the realm, pleasant ears and face its passport. As I learned volunteering for a short spell with Interplast in Central America, the sooner you can repair a harelip, birthmark, or other deformity, the better chance a child has to bond with her parents, let alone strangers.

I don’t really need a new ear. Except for the occasional snare-drum of tinnitus or missed stage whispers, mine are working passably well. And the outer shells fit the size of my head. I could use more cartilage in my left knee, and a new spinal disk one day, but I don’t fancy today’s remedies—a cold metal implant, or a gift from a four-legged animal.

Nonetheless, a homegrown ear is what Lawrence Bonassar offers me, extending it in his open hand, as if it were sprouting out of his palm and shaking hands were just another form of listening. Translucent white, the ear feels smooth and warm as amber, and my thumb automatically plays over its many ridges and folds. I’m surprised by the level of minute detail. It’s quite odd to fondle a disembodied ear. Or a prizewinning one, for that matter. His bioprinting has won first place in the World Technology Awards in Health and Medicine, the Oscars of the technology world, which celebrate inventions “of the greatest likely long-term significance for the twenty-first century.”

The ear he lends me is solid, yet bendy as a dried apricot, and would flex easily under the skin. But this one isn’t intended for anyone’s head. He returns it to a glass jar of preservative and sets it back on a shelf. His lab looks like the mental crossroads it is: a chemistry lab full of microscopes, workbenches, sinks, glass, and stainless steel. But also a medical facility, complete with large incubators, sterile fields, and countless drawers of parts and molds. And also a tech center equipped with computers, robots, and of course 3D printers. All accompanied by a seemingly endless wall of windows.

Outside there is the frail enchantment of snow, and a school bus creeping by like the orange pupa of some colorful butterfly.

One long ray of sunlight like a pointing finger touches a white desktop box about the size of the first manual typewriter, but less complicated looking. Two steel syringes with nozzles float above a metal warming plate, where they’ll begin hope’s calligraphy. The ink may be living cells of any type, life’s pageant in a polymer. When the nineteenth-century painter Georges Seurat used a similar stipple technique with pure color, his dots seemed to blend, but that was merely sleight of eye. These dots merge because cells fuse freely; they don’t need a human nudge.

As the moving pen writes, it doubles back over each line, stippling new layers, until it creates an outer ear that isn’t exactly organic in the way a wart or an eyelash is. Still, when transplanted, it will twitch with life, feel embodied, and help redefine what we mean by “natural.” Then any tangle of flesh and blood will serve as home, making rejection obsolete. For once in our long-storied history, a body part isn’t molded solely by evolution’s blueprint—we can choose its design. And forget years. Once he receives the MRI, CT, or 3D image, Bonassar can grow an ear in fifteen minutes. The time it takes me to walk to my local coffee shop.

Bonassar has mastered the art of training materials to carry cells and deposit them like puppies exactly where he wants while keeping them alive and happy. And, just like healthy pups, the cells are agile enough to tussle without smooshing, hale enough to tug, eager to curve their tiny mouths inward and eat all the nutrients they need.

The two polymers he prefers are collagen—protein fibers the body uses as gluey twine and mortar—and alginate, a gel found in the brown seaweed I held at Thimble Islands, and used by drive-ins to make milk shakes thick and creamy as they swirl out of the dispenser. My parents had one of the first McDonald’s, where I sometimes poured shakes, and I know the consistency well—between syrup and toothpaste. That’s also the way bioprinters dispense ink, except that the carrier has clusters of living cells embedded in it.

When I ask Bonassar if he makes the scaffolds here, too, a proud smile appears, as he explains that the scaffolding is the liquid. This shake contains its own framework, so that an ear fresh from the printer is ready to go. And the ink? Water droplets wouldn’t stick together. Hard marbles would roll off. Instead he uses squishy collagen marbles, which cling to their neighbors. Like eggs or blood, collagen gels when it’s heated and the fibers scramble together. So he stores it cold, allowing it to stiffen only when it falls onto the warming plate.

It’s a technology he hijacked from Hod Lipson, and he began printing in Lipson’s lab with a printer wide as a brick hearth and heavy as an iron cauldron. Now it’s the size of an espresso machine and shockingly simple: load whatever you want into your syringe, place it so that the motor can grab the plunger and print, then adjust the rate ink squeezes out, and set the print head’s path. After that, two more steps remain.

He leads me through an open doorway into a tiny room packed with large machines, including a sterile culture hood with a large window, in which the whole bioprinter can be placed, safe from dust, fungi, and bacteria. It looks like an incubator for preemies.

“No, this is the incubator,” he says, turning to peer through a glass door into dimly lit shelves. “Look there. Can you see them?”

Rising on tiptoes, I see two petri dishes holding small strange buttons or perhaps tires. The odd items are spinal implants destined for a bouncing, baying fluff of a dog. Canine arthritis is a big problem for frolicsome dachshunds, hounds, and especially beagles—dogs with short necks and long bodies wear out their cervical disks and develop joint pains just as we do. Working with Cornell’s Veterinary School, Bonassar’s lab created implants with a gel-like core that pushes against a tougher outside ring, pressurizing it very much like blowing up a tire. It’s also a true organ, two different kinds of tissue that work together seamlessly.

In osteoarthritis, cartilage’s cushion wears out like an old pillow, and bone rubs bone raw, producing inflammation and pain. Almost everyone has a creaky-jointed sufferer in the family. Today’s back operations usually remove a damaged disk and fuse the vertebrae together with a metal plate, which creates the rigidity of a poker up the spine, doesn’t always work, and can make adjoining vertebrae weak as loose teeth. An alternative to fusion would be a godsend to sixty or seventy million people in the United States alone. Starting small, Bonassar replaced spinal cartilage in rats with his own lab-grown variety, and the rats lived normal lives, apparently pain-free. Next in line are larger animals—a dog, sheep, or goat—and if that works, then human volunteers will follow.

Bending to examine a smaller chamber alongside the incubator, I spot the next stage in the life of bespoke disks. Since the body’s tissues bear weight and thrive under stress, his lab toughens the implants, squeezing them over and over, as if they were pumping iron. This also quickens their metabolism, squooshing food in and smooshing waste out, making the tissue more efficient. Such bioprinted implants could last longer than natural ones.

“It’s quite realistic to assume,” Bonassar says, hazel eyes sparkling, “that the first stages of the human clinical trials could happen within the next five years.”

When I ask about printing out hearts, lungs, and livers, he leads me to a large computer screen where he summons up a pair of gloved hands holding what look like pieces of sushi: thick white slices with a thin halo of pink. A closer look reveals a rarer delicacy: ear implants fabricated from 3D photographs of Bonassar’s daughters’ ears. He smiles at them with a love pure as starlight. Then he points out the blood vessels in the thin rind around the implants, and the thicker comma of white tissue that’s quite bare. Yet those bare cells prosper, too.

The challenge with organs isn’t their size, it’s the plumbing. The bigger organs are like Venetian cities, fed by elaborate water streets afloat with gondolas. Many labs around the world are hunting the best ways to mirror those supply lines, and the elusive “aha moment” could be one week away or ten years. But its scent is in the air, and no one doubts it will soon revolutionize medicine with clean, healthy organs on demand. A touch of mental whiplash is to be expected.

It’s a hallmark of the Anthropocene that science and technology are galloping at such a pace that Bonassar’s field didn’t even exist when he was in high school or college. Now he’s among those ringing the biggest changes, including a dramatically new view of the human body and the jostles of cells that inhabit living tissues—even what a cell is and how a cell behaves. Not only do we know about stem cells, we’re starting to wield them in clever ways to mend the body, and it’s not arduous to do; it can be as simple as exposing cells to the right chemicals or stimuli. There’s been a stunning paradigm shift from the rule of phenotype—one cell type fated for one job and nothing else—to phenotypic elasticity, the idea that cells are far more versatile and can be repurposed, like a hammer used to anchor a kite.

We now grasp that a wafer of skin can be retrained to do just about anything. It’s a new category of raw material, like wood or stone, with potent gifts. An ebony tree growing in Africa may provide shade to humans, and a lofty haven for a leopard gnawing a carcass, but its dark grain also gives rise to clarinets, piano keys, violin fingerboards, and music. The old idea of skin as a sacred cloak with two main jobs—to seal off the vulnerable organs inside us and define our individuality—has given way to a sense of how mingling, malleable, and porous the body really is. At the cellular level, we’re stunningly mutable, not just in our lifestyles, which we always knew, but in our bits and pieces. A butler can change his mind, via his neurons, and become a gandy dancer. A dash of skin can become fresh neurons for a Parkinson’s-stricken monk. What to do with cells is increasingly more a question of imagination than material.

Spearheaded by pioneers like Bonassar and Lipson, Anthropocene engineering has penetrated the world of medicine and biology, revolutionizing how we regard the body. In these vistas, electricity, architecture, and chemistry slant together and tell tales never heard before.

We baby boomers grew up with a cartload of absolutes, handed down from generation to generation of biologists, the most daunting one, perhaps, being that we’re born with all the brain cells we’ll ever have, because the brain doesn’t mint new cells. Yet now we have proof that it does, even in old age. We’ve spent the last decade blowing up a lot of similar assumptions, and I wonder what other rigid ideas will topple. Bonassar offers me another quite mysterious one.

“We were told, over and over,” he says with relish, “that the heart is an organ that positively can’t regenerate. Yet an amazing study has turned that idea upside down.”

In this study, he explains, heart transplant patients received hearts from a donor of a different gender—mostly men receiving women’s hearts. In theory, one should be able to examine the heart recipient, look at the cells in his borrowed heart, and find female cells from the donor. But it turns out that, on autopsy, if these men bore their transplanted hearts for more than a decade, almost half of the heart cells were mysteriously replaced by male cells. The mechanism isn’t clear—but the new paradigm is. The heart’s metronome tissues, which we always believed couldn’t regenerate, actually can. No one knows if the cell-swap is a fusion or whether the female cells were forcibly displaced. Either way, it’s overturned the handcart of possibility, and furrowed many brows. If organs as elemental as brain and heart can be persuaded to regenerate, and others, like ears and corneas, can be fashioned from living ink, how will that change us as a species? Will the printing of organs affect our evolution? Could it alter our genes? I’m curious to know what Bonassar thinks.

The possibility intrigues him, too. “The real question,” he says, “is what the evolutionary pressure of these therapies might be. Would faulty genes become more prevalent, because they could be fixed? I wear contact lenses, but if I were a caveman and my eyesight was as bad as it was when I was five, I would have been in serious trouble. Now it doesn’t matter. We could replace our defective parts, live longer, and feel healthier.” Then he adds a provocative afterthought: “Yet physically we could be much weaker and more flawed genetically.”

Suppose we don’t just repair and enhance ourselves, suppose we live longer as a result? The primary focus of the work in Bonassar’s lab—cartilage for arthritis, cartilage for traumatic injury, disks for back pain—is medical solutions for ailments that tend to afflict people long after they’ve had children, when evolution has stopped bedeviling them to breed. Would the ability to be fit for a decade longer present an evolutionary advantage? Would people take more risks? How will we regard the body’s bits and pieces, and safeguard them, if we know we can cheaply replace them? We replace heart valves or heart tissue to extend life, but what if you can cure arthritis, and keep people active and sexual well into their seventies and eighties?

Think geriatric cyborgs and chimeras. Grandpa’s going to be saying a lot more interesting things than “Where are my teeth?” Just staying active for an added decade may alter our society as a whole far more than fixing a particular defect in the heart, liver, brain, or kidney.

The ninety-year-olds I’ve known haven’t run marathons, even with gleaming new hips and knees, but they’ve inhaled a lot of sky on daily walks. Even bioprinted cartilage needs exercise. Looking forward to a walkabout through drifting avenues of snow, I say good-bye to Bonassar and slip into my parka. As I stride down the hallway and into the atrium, the building’s smart sensors wake, and a little breeze runs before me like an invisible serpent. Hail begins lightly rattling against the windows. A vague thought, as elusive as the smell of violets, nags at me. A dark cloud passes over, and I feel aware of how aging, like winter weather, can chill the bones. For a moment, that thought hangs like an icicle, tapering and cold. Then my mind reels through hopeful images: an incubator full of spinal disks, the flexible necks of dachshunds, the long open labs of students with eager minds, the children with new ears, and the warming plate where collagen marbles land on their way to reshaping our future biology, and I swear I hear spring buzzing like a red-winged blackbird.

Cyborgs and Chimeras

At no point in my conversation with Bonassar did we discuss if people will mind the idea of artificial body parts. No need to. They’re already a commonplace feature of the new normal. Not long ago the idea of a cyborg was pure science fiction, and we couldn’t get enough of the Six Million Dollar Man (who inspired many a roboticist), Star Trek’s Captain Picard (who has an artificial heart), or the species of moody Replicants in Blade Runner. Now we think nothing of strolling around with stainless steel knees and hips; battery-operated pacemakers and insulin pumps; plastic stents; TENS pain units that disrupt pain signaling in the nerves; cochlear implants to restore hearing; neural implants for cerebral palsy, Parkinson’s, or damaged retinas; polymer and metal alloy teeth; vaccines hatched in eggs; chemically altered personalities; and, of course, artificial limbs. A great many of us are bionic (I have a 5 cm titanium screw in my foot), and bionic hands, arms, legs, skin, hearts, livers, kidneys, lungs, ears, and eyes are all available. Visible cyborgs who move among us may grab our attention and curiosity, but they don’t scare us anymore.

On a windy November day in 2012, software designer Zac Vawter climbed 103 floors of Chicago’s Willis Tower, the tallest building in the western hemisphere. From its Skydeck, 1,353 feet in the air, one can view four states and the pounded blue metal of Lake Michigan fanned out below. Breathing hard toward the end, Vawter reached the Skydeck with his 2,100th step and strode straight into history.

It was a climb that challenged the stamina and knees of all 2,700 people who joined him to help raise money for the Rehabilitation Institute of Chicago. But what makes Vawter remarkable is that he did it using a gleaming new bionic leg. Surpassing even that is how he did it—by controlling the device with his thoughts.

A thirty-one-year-old father of two, Vawter lost his right leg in a motorcycle accident in 2009. Afterward he went through a pioneering procedure in which the residual nerves that once ruled his lower leg were “reassigned”—they were rerouted to control his hamstring. For months he flew to Chicago to work with engineers, therapists, and doctors to adjust the bionics and refine both his physical and mental technique.

As he pictured himself climbing—lifting his leg, bending his knee, flexing his ankle—electrical impulses from his brain flashed to his hamstring, which signaled a deftly designed assembly of motors, belts, and chains to lift his ankle and knee in unison, and he began taking the stairs step-over-step in the normal way. Just focusing hard doesn’t work; he had to intend to walk. The bionic leg is designed to read an owner’s intent, whether he’s walking, standing, or sitting. So if he’s seated and wants to stand up, he just pushes down and the leg pushes back, propelling him up.

Like any athlete, he had to prepare for months, while scientists tailor-made the prototype leg and he practiced the mind-feel of stair-climbing. In time, his brain accepted the robotics as an extension of his body image and took it into account when judging, say, whether or not he might fit through an open door. Yet when the climb was over and he flew home to Washington, he had to leave the leg in Chicago for researchers to continue tinkering with until it’s even more reliable. Bionic arms are already popular, and if an arm fails its wearer might drop a glass of milk, or, more alarmingly, a baby or a flaming match. If a bionic leg fails, the wearer could tumble down a flight of stairs. So the technology has to be safe. RIC expects the FDA to approve such bionic legs within the next five years.

As long as humans have walked the Earth, we’ve been driven by a need to stretch into the environment; tools and technology have always been an innate part of that quest. Now we’re comfortable with, and excited by, the promise of connecting our brains to the world outside of the body. iPads and cell phones that store phone numbers, calendars, to-do lists, photos, documents, and memories for us—external brains the size of a notepad—are just the beginning. Oh, where did I leave my memories? Most of us tuck our prosthetic memories in pockets, purses, and briefcases. On campus, the students tote spare hippocampi in their backpacks. We may fear losing our memory as we age, but at any age we’re anxious about losing our prosthetic memories. Many people aren’t at ease without obsessively “checking”—a verb once applied to OCD behavior (Did I turn off the stove? Close the garage? Shut the door tightly?). Relentless digital “checking behavior” has joined the closet of neurotic compulsions, and we’ve added these phobias to our quiver: nomophobia/mobophobia (the fear of leaving your cell phone at home), phantom vibrations (thinking your cell phone is vibrating even though no one is calling), and FOMO (fear of missing out, and so relentlessly checking Facebook). Continuous partial attention (focusing halfheartedly) has become pervasive as we’re tugged at by ringtones, text-tunes, incoming-mail pings, calendar flags, update alerts, new-post beeps, pop-ups, and the nagging possibility that something more engrossing may appear.

Our ancestors adapted to nature according to the limits of their senses. But over the eons, we’ve been extending our senses through visionary and stylish inventions—language, writing, books, tools, telescopes, telephones, eyeglasses, cars, planes, rocket ships—and, in the process, we’ve redefined how we engage the world but also how we think of ourselves. This even extends to our metaphors. We used to picture the body as a factory. Today that’s sea-changed and scientists picture factories as primitive forms of cells. We used to compare the brain to a computer. Now DARPA has a SyNAPSE program whose goal is building “a new kind of computer with similar form and function to the mammalian brain.”[30]

Our cells dance with their own electric, and as they’re immersed in ambient networks and signals—the everyware—we’re becoming part of an invisible weave that’s different from the one we used to picture as the seamless web of nature. This is part of the new natural. It slips beneath our radar for things weird, experimental, nonhuman. Anthropocene humans can merge with technology and not be regarded as alien.

Not only humans. When a puppy called Naki’o fell asleep in a puddle on a cold Nebraska night, he woke with frostbite on all four paws. As his condition worsened, Orthopets, a Denver company that specializes in prosthetics for animals, turned Naki’o into the world’s first bionic dog. Equipped with four prosthetic limbs, he runs and romps normally with his owner—despite not being able to feel the ground—and has become the spokesdog for Orthopets, which has also equipped a front-legless Chihuahua with two wheels (he’s a big hit in nursing homes). Other bionic animals include a wounded bald eagle, found starving in an Alaska landfill and given a new upper beak; a dolphin mutilated in a crab trap and unable to steer until it got a prosthetic tail; a green sea turtle with replacement flippers after a surfboard injury; and a baby orangutan, born with clubfeet, fitted successfully with therapy braces.

Clearly, these attachments were all prosthetics. But should we consider the first spears to be tools or imaginative prosthetics? Attacked by slashing bears, tigers, and other beasts with razory teeth and claws, our ancestors fought back by crafting teeth and claws of their own, ones they could detach and hurl from a distance. What a novel idea! Imagine a wolf flinging its teeth, one by one, at its prey. Stone axes were tools, but also prosthetic hands built stronger and bigger and sharper than a hominid’s own. The first clothing was also a prosthetic, an artificial body part: skin. When cavemen and -women wore draped animal hides for warmth, tying them with sinew, no matter how much they tanned the clothing first, it would have yielded a whiff of other creatures. They slept under borrowed scents, in a cave permeated by sweet gamy odors that mingled with each person’s personal bouquet. Today we’re so far from the origins of our clothing that it’s become impersonal, and we don’t feel the powerful magic of being enrobed in another animal’s skin or a plant’s fibers.

Early humans probably devised crutches for the lame, but the first prosthetics are spoken of in the ancient sacred Indian poem the Rig Veda, where they belonged to the warrior queen Vishpla. After she lost a leg in battle, she still insisted on fighting, so some sort of iron leg was fashioned for her. The Greek historian Herodotus tells of a shackled Persian soldier who escaped by cutting off his foot and replacing it with a copper and wooden one. In the Cairo Museum, there’s a mummy from the reign of Amenhotep II (fifteenth century BC), whose big toe on the right foot was amputated and replaced by a superbly carved wooden replica tied on with leather straps. She’s believed to have been a royal woman suffering from diabetes, and the artful toe was designed to help her both on Earth and in the afterlife.

Throughout history, peg legs and hook hands have been plentiful, though such antique prosthetics were crudely made and heavy, usually from wood, metal, and leather. Wearing them, a person became part tree, part animal. We’ll never know if kin regarded them as a hybrid, or if the wearers identified at all with the qualities of the species they harnessed in lieu of human muscle and bone.

How far we’ve come! Today we live in a completely prosthetic culture brimming with contact lenses, false teeth, hearing aids, artificial knees and hips, compasses, cameras, and many digital and wireless brain attachments. We’ve made such prosthetic strides since toes of leather and wood that it’s even hard to agree on a fair playing field, since the ultimate Olympic athlete may be competing against a cyborg now. But is that fair?

Already a Paralympic gold medalist, Oscar Pistorius spent four years battling the Court of Arbitration for Sport for a chance to race against able-bodied athletes in the regular Olympics. Ultimately, after extensive testing of his blades, the court decided in his favor, declaring that the blades wouldn’t give him an unfair advantage. Yes, the springy blades were lighter, but also limited; they couldn’t return more force than Oscar generated striking the ground. In contrast, the elastic dynamo of a human foot and ankle can always pound with extra force and rebound with more velocity. So, at the moment, until the technology changes, able-bodied sprinters supposedly have an advantage over blade-wearing ones.

But doesn’t every gifted athlete have some unique physiological advantage? For the swimmer Michael Phelps, the most decorated Olympic athlete of all time, it’s an unusually long torso and arms for his height. The debate will heat up even more as higher-tech blades are invented. In an ironic twist, when Pistorius was outpaced by a sprinter in the Paralympics, he lodged a formal complaint that the winner had had an unfair advantage because he wore longer blades.

Pistorius was the first double amputee ever to compete against able-bodied runners in the Olympics, and his story is a sort of double haunting, in which our past and future ghost into view. He is visibly a cyborg, and yet completely at home in his body. As a child, he fused mentally with his artificial legs, and his brain pictured blades as the natural extension of thighs, and his body as agile and fleet-footed.

Pistorius isn’t the only famous cyborg. Wartime often leads to advances in technology, and as a result of all the young amputees returning from the Iraq and Afghanistan wars, the field of prosthetics has flourished, with high-tech materials and more natural-looking robotics. DARPA runs a Revolutionizing Prosthetics program, whose goal is an array of thought-controlled limbs that move with the precision and ease of natural limbs, ready for FDA approval in the next few years.

On the evening of November 6, when all the votes were counted for the 2012 election, Tammy Duckworth, an Iraq War veteran who had lost both of her legs on the battlefield in 2004, strode to the podium to make her acceptance speech as the newly elected Democratic congresswoman from Illinois. She wore state-of-the-art prosthetic legs complete with robotic, computer-controlled ankle joints and a computer-powered knee.

Using a cane, she moved smoothly and looked understandably elated, comfortable in motion, which is remarkable since Duckworth didn’t grow up learning to balance her pelvis and spine over prostheses while she walked. As an infant she learned to walk fearlessly, as babies instinctively do, around the age of thirteen months, when balance and strength are keen enough, and some baby fat has yielded to muscle.

Although walking ultimately becomes unconscious, it’s a skill that requires us to tumble into and out of balance all the time. It takes countless hours of practice and encouragement, and a great many falls to do it expertly. Babies are like tiny petulant stilt-walkers. To walk, you step forward with one foot, which tilts you off balance, then you catch yourself before you fall too far, quickly rebalance, and fall in the opposite direction, catch yourself, and start falling again as you make a so-called straight line across the room or street. Walking is really a series of recovered falls. In time we learn to do it expertly, without noticing that it’s an evolutionary circus act. Over time, a lovely pendulum swing develops, as the hips roll out of balance and back in again, over and over, without the walker paying it any mind. Its rhythm is naturally iambic (a short unstressed syllable followed by a long stressed one), which could be why so many poets, from Shakespeare to Wordsworth, wrote poems in iambs; maybe they composed while strolling. Fortunately, the pelvis and backbone are engineered to make the skill (strolling, not composing poetry) relatively easy.

However, relearning to walk as an adult means unlearning old balancing tricks and mastering new ones, based on the current shape of your body, while fully aware that you could fall and badly injure yourself. Also, injuries aren’t always symmetrical—Duckworth’s were complex (right leg missing at the hip, left leg below the knee). Blades like Pistorius’s wouldn’t have suited her lifestyle, in which she needed to feel equally comfortable on airplanes, behind a desk, at a podium, or climbing stairs. Her revolutionary ankle joints and legs rely on robotic software—microprocessors, accelerometers, gyroscopes, and torque angle sensors—to mimic the delicate teamwork of muscles and tendons in the ankle when someone walks.

The day of the cyborg has certainly arrived, with goggles for skiers and snowboarders offering a dashboard display of data, GPS, camera, speedometer, altimeter, and Bluetooth phone; plus voice control and gaze control for cyclists. Will it be safe to ski or bike with data dancing before your eyes, or while posting photos on Facebook? Probably not. But safer than looking down at a smartphone and back up at the slopes, while dizzily jockeying between the sensory and tech worlds. The virtual reality that Star Trek promised us is starting to become commonplace. We can don a headset that stimulates all five senses simultaneously, and walk along a street in ancient Rome or Egypt, so immersed in the look, smell, and feel of the place that it seems real.

I’ve yet to meet anyone sporting Google Glass, the voice-controlled miniature screen in a flexible frame that hovers piratically above one eye, projecting e-mail and maps onto your visual field. But, whether or not it catches on as techno-fashion, it’s already a triumph in operating rooms around the world. The first surgeon who wore it simply videotaped an operation to share with colleagues. Since then, surgeons have been actively consulting Glass during operations to view X-rays or medical data without turning away to look. The cyborg doctor has eyes in the back of his head, and four or more hands. At the University of Alabama at Birmingham (UAB), Dr. Brent Ponce, wearing Google Glass, began a shoulder replacement surgery while the built-in camera showed the surgical field to Dr. Phani Dantuluri, a veteran surgeon watching on his computer monitor in Atlanta. As the doctors discussed the case, Dantuluri could reach into the surgical field that Ponce saw on his heads-up display: ghostly hands floated over the body, pinpointed an anatomical feature or demonstrated how to reposition an instrument, as he consulted in real time. Invented by a UAB neurosurgeon, Barton Guthrie, who was frustrated by the limits of teleconferencing, VIPAAR (Virtual Interactive Presence in Augmented Reality) offers a safety net in diverse situations: teaching surgeons, guiding a resident’s hands, piloting difficult procedures in regional hospitals anywhere in the world, and also assisting emergency operations at an Antarctic base or in space.

A lively pair of glasses, even with all the digital trimmings, is still only an accessory. At the end of the day, you remove it and become mortal again. We yearn to supersize and supervise our powers with no intermediary, but intimately, naturally, without fuss, as if such marvels were our birthright. The next small step, a world closer, will be thought-controlled contact lenses floating on the eyes like high-tech continents. Will they cling invisibly, or twinkle like the glass eyes of a doll? The final step, who knows when, will be the silk of silicon sliding along our neurons. Then, without disappearing, only virtually visible, our computer worlds will fully mesh with us. Will we feel haunted sometimes, or merely worry about affording the latest update?

It’s a strange paradox to imagine, yet highly possible, maybe even inevitable: by delegating more physical and mental tasks to robots and computers, we might also weaken various skills and aptitudes—math, musculature, memory—while perfecting new ones. We may soon have to master multitasking spatially, as we cross midtown streets while scrolling through and answering e-mails hovering in the air, which we’ve conjured up on our iGlasses, just by batting our eyelashes. In time, brain and body would adapt.

We’ve always crafted new technologies to help us live better or longer, but in the past few decades that’s accelerated dramatically. We’ve stepped up the pace of our romance with machines, wedding them to our bodies as never before, and saturating our lives with techno-marvels, from genetic tests to organ transplants, satellite communications to genetic engineering, brain scans to mood enhancers. They’ve amused and nettled our lives to such an extent that a new branch of anthropology has arisen to study the phenomenon.

Amber Case practices “cyborg anthropology,” a field in which scientists study how both humans and robots interact with objects, and how that changes the culture in which we live: the way cell phones affect human relationships, for example, and how we now interact techno-socially instead of socially. Old-fashioned social relationships, in which one gets together with friends, are regarded as “analog.”

“So, for instance,” Case explains, “we have these things in our pockets that cry, and we have to pick them up and soothe them back to sleep, and then we have to feed them every night by plugging them into the wall, right? And at no other time in history have we had these really strange nonhuman devices that we take care of as if they are real.”

Cyborgs may be growing plentiful, but even more of us are chimeras—DNA (and sometimes body parts) from two or more creatures lodged inside one body. The ease with which mythic humans and animals breed and swap bodies speaks to our prior intimacy with the rest of nature, acknowledging animals as part of our extended family. We’ve adopted the term “chimera” from the Greek monstrosity Homer sang of in the Iliad, a savage sky-beast said to be part lion, goat, and serpent, that blasted fire from her mouth and terrorized the land until a hero named Bellerophon chased her on his winged horse, Pegasus, and rose above her fiery blasts to kill her. It’s the sort of fiend that haunts many cultures, all claiming that some unholy union of different species—a lion, snake, and eagle, for instance—has produced a dragon, a sphinx, a griffin, a medusa. In Greek mythology we find satyrs, overly lustful woodland goat-men, and the perfumed and tuneful sirens, bird-women who lured men to their doom. There are Chinese tales of families that descend from the marriage of a shape-shifting dragon and a human. Siberian shamans owe their magical power to the marriage of men and swans. Native American lore declares that the first people of the Earth were part animal. In fairy tales brides and grooms marry animals. Often the chimeras (mermaids, for instance) exist at the limits of the known world, where heroes and explorers go to prove their courage. Despite the countless children’s books filled with delightful thinking, talking, personality-ridden animals, the idea of a real-life part-human being trapped in the body of another animal seems diabolical to most people, so horrifying that it was the Greek gods’ favorite way of punishing humans.

As exotic as it sounds, we already have a great many natural chimeras among us, including all the people who secretly harbor Neanderthal and Denisovan genes. We absorb other people all the time. When we pass along a cold sore or flu, the virus carries some of our protein and releases it inside the other person, where the immune system stows it for future reference. HIV and other retroviruses are especially good at installing pieces of one person’s DNA inside another person’s chromosomes. By exchanging body fluids we even swap gene fragments with our partner, and become a chimera as our self starts including bits of their immune system. We don’t just get under a mate’s skin, we absorb him or her. As the immunologist Gerald N. Callahan explains, we’re probably swapping gene fragments with other people “a lot more often than we realize. Infection becomes communication, memorization, chimerization. Over the course of an intimate relationship, we collect a lot of pieces of someone else. Until one day what remains is truly and thoroughly a mosaic, a chimera—part man, part woman, part someone, part someone else.”

However the affair turns out, we’re invisibly changed for having known each other. This may not be a pleasant thought if the DNA belongs to an old flame you never want to see again as long as you live and can’t bear the thought of hauling around in your car, let alone your cells. Best not to dwell on that. Think instead of Mom’s DNA, or a sweetheart’s, still alive inside you as a miniature portrait.

The human chimeras known as twins provide a glimpse of how confusing a world full of clones might be, but twins are too numerous to regard as oddities. At the end of her life, my mother needed regular bone marrow transplants, which left donor cells circulating in her bloodstream. By that time, she was already a chimera, because moms retain cells from their fetuses. She would also have stored cells from my father, who stored her cells, too. But, to the best of my knowledge, he didn’t contain cells from pig valves, cat gut, or monkey glands—though the last might have been a temptation, since it was in high vogue when he was a young man.

In the pre-Viagra 1920s, men hoping to improve their virility flocked to the French surgeon Serge Voronoff, who grafted thin slices of monkey testicles onto their scrotums. Later he transplanted monkey ovaries into women, including, allegedly, the U.S. coloratura soprano Lily Pons, who was a frequent guest at his monkey farm on the Italian Riviera. Over five hundred such operations brought Voronoff fame and great wealth, until, in time, he was denounced as nothing more than a witch or magician. But people didn’t fret about hitching their testicles and ovaries to monkey glands.

Today, the monkey gland craze is as passé as Rudolph Valentino, but so many of us have pieces of other nonhuman creatures inside us, it’s surprising that we don’t inadvertently oink, clop, or bleat in embarrassing moments. We think nothing of strolling around with cow and horse valves in our hearts. Raising genetically modified pigs that are more compatible with human tissue, we harvest the blood-thinner heparin from their intestines, and insulin from their pancreases. The fibrous tissue in the spaces between the cells in a pig’s bladder, once viewed as mere cushioning, is so rich with growth factors that it’s used to “fertilize” war-ravaged human muscles and help them regrow.

When Corporal Isaias Hernandez, a nineteen-year-old marine deployed in Iraq, had 70 percent of his thigh muscle torn off by a roadside bomb, doctors assumed they’d be amputating the leg. The remains of his thigh looked to him like a half-eaten meal at a Kentucky Fried Chicken restaurant: “You know, like when you take a bite of the drumstick down to the bone?” Quickly scarring over, his thigh sparked constant pain, and doctors prescribed amputation followed by a prosthetic as his only hope.

Then he became a chimera. Volunteering to be part of a clinical research trial, in 2004, he allowed surgeons to insert a paper-thin slice of pig’s bladder, known as extracellular matrix, into the ragged thigh muscle. It began to regenerate. Today, without pain, he—like others—uses a regrown thigh to walk, sit, kneel, bike, climb, and enjoy a normal life. He’ll always be part pig, the part his surgeons refer to affectionately as “pixie dust.”

Is there much difference between ingesting and implanting? We swallow snake and spider venom, and Gila monster spit to calm an unruly heart, and cone shell venom for pain. For hormone therapy, millions of women ingest estrogens derived from mare’s urine. Our foremost antibiotics come from a cavalcade of fungi. Then there are the coatings, capsules, and liquid additives that go into medicines, concocted from the skin, cartilage, connective tissues, and bones of animals. If we’re comfortable with implanting horse valves in our faulty hearts and pig tissue in our thighs, if we get past the basic idea of raising animals to butcher for their organs and amending our bodies with pieces from lower orders, what else might we think of? Borrowing a spare stomach from a cow so that we can digest food more quickly and lose weight?

Maybe embedding parts from other animals doesn’t seem to bother us because, on the atomic level, we’re living beings composed of nonliving parts. Hence the graveside reminder, “from dust to dust.” Maybe we see it as the ultimate domestication of animals and taming of the soil, which we began long ago in our collective memory, little by little widening their uses. Gradually we’ve gone from animals sleeping under our roof to animals sleeping under our ribs without feeling alarmed. Oh, that again, the cow is in my bone house. Maybe in our desperate hours we gladly extend the idea of kinship from, say, my brother’s kidney to a sheep’s kidney.

Man-made chimeric creatures are a staple in laboratories—mice and other animals bred or grafted with human immune systems, kidneys, skin, muscle tissue—as a common way to study human diseases. Scientists have created sheep with organs that are 40 percent human, monkeys with part-human brains, and mice in which a quarter of the brain cells were human (fortunately they still behaved like mice, but who knows what strange mists galloped across their thoughts). Yet people balked when Japanese scientists announced that, given a year, they could grow a perfect human heart or kidney by tucking a human stem cell into a pig’s embryo, then lodging the embryo in a healthy pig’s womb. Pig valves in humans, no problem. But a pig with human organs?

It’s a sign of our times that the problem with “chimeric embryos” isn’t technological but ethical. It’s doable, but it’s not permissible. Nations would have to agree on what a human being should be, and that’s not so obvious anymore. For the first time, we’re asking ourselves: how far are we willing to engineer the world and ourselves? We still feel human when partially enhanced by prostheses, somewhat chimeric, or controlling wearable technology by eyelash-flicks or thoughts. The question has become one of degree. How replaceable are we, yet still legally and attractively human? And where is the line of disgust between enhanced human and monstrous?

Canada has passed the Assisted Human Reproduction Act, which bans the creation of chimeras. The bioethicist Françoise Baylis of Dalhousie University in Halifax, Nova Scotia, helped draft Canada’s guidelines on chimeras.

“We don’t treat all humans well, and we certainly don’t treat animals well,” she insists. “So how do we treat these new beings?”

In the United States, the National Academy of Sciences permits chimeras but warns against allowing chimeras to breed, because breeding two part-human chimeras could potentially lead to the grotesque (though almost certainly fatal) possibility of a human embryo growing inside another animal. Remember Rome’s fabled origin, when Romulus and Remus were suckled by a she-wolf? Suppose a wolf actually gave birth to a human? Or a sheep did? Almost ten years ago, Esmail Zanjani of the University of Nevada, Reno, announced that he had injected human stem cells into sheep embryos halfway through gestation, and the lambs emerged with human cells throughout their tissues. And not just a few cells. Some of the organs were nearly half human. Only the organs. No two-legged sheep with opposable thumbs emerged. Staring at them in photographs, I found they looked eerily human, with long faces, jelly roll falling over the forehead, and down-turned eyes. Would dogs detect an odor both human and sheep?

What scientists still don’t know is whether transplanted human stem cells would change an animal’s inherent behaviors, attributes, or personality. As bioethicists rightly argue, the last thing we need is the horror of humanized monkeys or other animals. Because mice have less than one-thousandth the brain volume of humans, there’s little danger of their developing our cognitive abilities. But in an animal closer to us on the evolutionary tree, say, a chimpanzee or bonobo, the merger might just work, especially if the DNA were mixed in the earliest stages of development. What would the orangutan Budi make of monkeys with part-human brains, I wonder?

A laboratory chimera poses a moral paradox. The more human its cells, the better it will serve for testing human cures. Too human, and it’s trapped between worlds, a claustrophobic prisoner. Writing in 1896, when the Industrial Revolution had really begun to pick up steam, one British novelist warned of just such a possibility.

In H. G. Wells’s classic novel The Island of Dr. Moreau, a shipwrecked man, rescued by a passing boat, relates a gruesomely fascinating tale of escaping from a nameless Pacific island populated by sentient monsters, human-animal chimeras that Dr. Moreau has created through transfusions, transplants, grafts, and other bizarre techniques. They’re hyena-swine, hog-men, leopard-men, ape-men, little sloth people, and other “Beast Folk,” some of whom have founded their own colony in the jungle, worship Moreau, and have evolved moral bylaws. The novel shocked Victorian England, which was reeling from a slew of new technologies and from Darwin’s idea that humans descended from apes. The vogue for vivisection provoked controversy, as did eugenics and the ethical limits of scientific experiments. Wells’s novel brought all of those into question, and also explored British colonialism, the essence of identity, the depravity of torture, and maybe most of all the peril one faces by interfering with nature. In later years, Wells described the wildly successful novel as “an exercise in youthful blasphemy.”

Gene splicing and bioengineering would not appear for a hundred years, but Wells foresaw some of the ethical dilemmas they might pose a little later in the Anthropocene. Suppose, by accident or design, a subhuman chimera emerged, something more intelligent than other animals, but less so than humans? What purpose would it be expected to serve? What sort of home would it find in our society? Would it be relegated to a lower caste? Under what circumstances should we consider a man-made chimera human? What inalienable rights would it possess?

DNA’s Secret Doormen

Swinging gamely among the fire-hose vines at the Toronto Zoo, Budi isn’t a cyborg or man-made chimera, and no human has reknit his DNA. He’s just a frisky orangutan kid, an emissary from the wild. But we’re starting to regard his physical nature (and our own) in radically new ways that connect and redefine us. Only the knowledge and what we can do with it are new. The rest is ancient as the family tree we share.


A YOUNG WOMAN with chestnut hair is seated in front of me in the cinema, slouched down, watching Stanley Kubrick’s 2001: A Space Odyssey. On the art-house screen, a vegetarian ape idly fingers the scattered bones of a fallen antelope. Slowly an idea begins to take shape. Picking up a bone, he raises it over his head and smashes it down on the rest of the skeleton, over and over, striking and shattering in an orgy of violence, while the vision of a tasty tapir flickers through his head and the pounding chords of Richard Strauss’s Thus Spake Zarathustra drive home the message: Man the Hunter is born. A day later, the ape man uses his weapon to kill the leader of a rival band of apes, while the Strauss soundtrack grows orgasmic with a new drama: the blow-by-blow chords of war. From there, Kubrick treats us to human evolution, artificial intelligence, alien life, and technological pageantry. Cascading into the spacefaring future, we find an astronaut vying with a sentient, mentally disturbed computer (which he subdues with a tool far subtler than an antelope bone). Reaching the apogee of his fate, he’s transfigured in a process that’s too advanced for us relative cavemen (in the cave of the movie theater, anyway) to distinguish from magic. As the credits roll like blankets of stars, rising houselights return us to Earth and a human saga and future that seem all the more epic.

When the chestnut-haired woman gets up to leave, one strand of hair remains on the back of her seat. From that tiny sample, someone could peruse the DNA and know if it belonged to a human female or an Irish setter or a fox, and find clues to her identity: ethnic background, eye color, likelihood of developing various diseases, even her probable life span. One would assume that she has little in common with a mouse or a roundworm, and yet they have a similar number of genes. She’s intimately related to almost every creature that walks, crawls, slithers, or flies, even the ones she’d find icky. Especially those. She shares all but a drop of her genetic heritage with spineless organisms. But that drop really counts.

Thanks to the Human Genome Project’s library of our roughly twenty-five thousand protein-coding genes, available via the Internet to anyone with a yen to peruse it, a micro-stalker could analyze the rungs of our redhead’s DNA, creeping up its spiral ladders, and discovering all sorts of juicy nuggets. Some of the micro-portrait he finds will be quite recent, because by hogging and restyling the environment we’ve altered plants, animals, single-celled organisms, and ourselves. Her DNA will show a panoply of revisions, indicative of our age, which we’ve either stage-managed or accidentally caused. Could the pollutants we use, and the wars we wage, really change our DNA and rewire the human species?

She knows they can, because in her college curriculum, Anthropocene Studies, she’s read research linking exposure to jet fuel, dioxin, the insect repellent DEET, the pesticide permethrin, plastics, and hydrocarbon mixtures to cancer, and not just in the person who had contact with it but for several generations.[31] She’s learned how the arsenic-polluted drinking water in the Ganges delta in Bangladesh can lead to skin cancer, as can workplace exposure to cadmium, mercury, lead, and other heavy metals. Although she was tempted to spend her junior year abroad in Beijing, she’s having second thoughts now that a peer-reviewed PLOS ONE study ties life in smog-ridden cities to thickening of the arteries and heightened risk of heart disease. What clinches it is this headline in Mail Online: “China starts televising the sunrise on giant TV screens because Beijing is so clouded in smog.”[32] Below it, a video shows a scarlet sunrise on an LED billboard in Tiananmen Square, completely encased by thick gray air, as if the sun were on display in a museum. Several black silhouettes are walking past it on their way to work, some wearing masks. As a daily jogger, she’d be inhaling a lot more pollution than most people, and she figures her genes have already been restyled just by growing up among the master trailblazers of the Human Age.

But she is tempted to read the book of her genes, and discover more about her lineage and genetic biases. For a truly personal profile, all our redhead would need is a vial of her blood and between $100 and $1,000. Such companies as Navigenics or 23andMe will gladly provide a glimpse of her future, a tale still being written but legible enough for genetic fortune-telling. She may have a slightly higher than normal risk of macular degeneration, a tendency to go bald, a gene variant that’s a well-known cause of blood cancer, maybe a different variant associated with Alzheimer’s, the family bane. If she read the report herself, she might not handle that information well. It could kindle needless worry about ailments that will never materialize, or it might warn of an impending but treatable illness, or predict a serious, disabling disease like Huntington’s. As a supposed calmative, such tests are usually marketed as a “recreational” exercise, to discover if you’re part Cherokee or African or Celt, or Neanderthal, or even related to Genghis Khan, as I may well be.

My mother always said I must be part Mongolian, because of my lotus-pale complexion and squid-ink-black hair. Something you’re not telling me? I was tempted to ask. But I knew she’d visited Mongolia with my father long after I was born. What I didn’t know is that one out of every two hundred males on Earth is related to Genghis Khan.

An international team of geneticists conducting a ten-year study of men living in what once was the Mongolian empire discovered that a surprisingly large number share the identical Y chromosome, which is passed down only from father to son. One individual’s Y chromosome can be found in sixteen million men “in a vast section of Asia from Manchuria near the Sea of Japan to Uzbekistan and Afghanistan in Central Asia.”

The likeliest candidate is Khan, a warlord who raped and pillaged one town after another, killing all the men and impregnating the women, sowing his seed from China to Eastern Europe. Though legend credits Khan with many wives and offspring, he didn’t need to do all the begetting himself to ensure that his genes would flourish. His sons inherited the identical Y chromosome from him, as did their sons and their sons’ sons down a long, winding Silk Road of legitimate and illegitimate progeny. His equally warlike oldest son, Tushi, had forty legitimate sons (and who knows how many misbegotten), and his grandson Kublai Khan, who figured so large in Marco Polo’s life, had twenty-two.

Their genes scattered exponentially in an ever-widening fan, and the process really picked up speed in the twentieth century, when cars, trains, and airplanes began propelling genes around the planet and stretching the idea of “courting distance,” which used to be only twelve miles—how far a man could ride on horseback to visit his sweetheart and return home the same day. Now it’s commonplace to have children with someone from thousands of miles, even half a world, away.

Khan wasn’t trying to create a world in his image; his fiercest instincts had a mind of their own, and his savage personality spurred them on. Most people don’t run amok on murderous sprees, thank heavens, but history is awash with Khan-like wars and mayhem. In their wake, gene pools often change. One can only surmise that wiping out the genes of others and planting your own (what we call genocide) must come naturally to our kind, as it does to some other animals, from ants to lions.

Typically, wandering male lions attack a pride, drive off the other males, and kill their offspring. Then they mate with the females, ensuring that only the invaders’ genes will flourish. A colony of ants will slaughter millions of neighbors, provided they’re not family (somehow they can spot or whiff geographically distant kin they haven’t met before). Human history is riddled with similar dramas, but that doesn’t justify them. They were, and are, war’s legacy, an unconscious motive, not a blueprint for action.

Except once. During World War II, Hitler and his henchmen devised an agenda, both political and genetic, that was nothing less than the Nazification of nature. The human cost is well known: the extermination of millions while, in baby farms scattered around Europe, robust SS men and blond, blue-eyed women produced thousands of babies to use as seed stock for Hitler’s new master race. What’s little known is that their scheme for redesigning nature didn’t stop with people. The best soldiers needed to eat the best food, which Nazi biology argued could grow only from the purest of seeds. So, using eugenics, a method of breeding to emphasize specific traits, the Nazis hoped to invade the genetic spirals of evolution, seize control, and replace “unfit” foreign crops and livestock with genetically pure, so-called Aryan ones.

To that end, they created an SS commando unit for botanical collection, which was ordered to raid the world’s botanical gardens and institutes and steal the best specimens. Starting with Poland, they planned on using slave labor to drain about a hundred thousand square miles of wetlands so that they could farm it with Aryan crops. Draining the marshes might well lower the water table and create a dust bowl, and it would certainly kill the habitat of wolves, geese, wild boar, and many other native species, but despoilers rarely see downstream from events.

Elsewhere, the Nazis proposed planting forests of oak, birch, beech, yew, and pine to sweeten the climate so that it was more favorable for their own oats and wheat, and they spoke openly about reshaping the landscape to better suit Nazi ideals. That revision included people, railways, animals, and land alike, even the geometry of farm fields (no acute angles below 70°), and the alignment of trees and shrubs (only on north-south or east-west axes). Today, though we deplore genocide, it stubbornly persists, and we may have our work cut out for us because it seems to tap a deeply rooted drive. It’s bone-chilling how close the Nazis came to a feat of genetic domination that dwarfs all of Genghis Khan’s exploits.

The human DNA that Olivine finds in future days will show some lineages, like Khan’s, triumphing through war, and others succeeding because of geography, religion, politics, fashionable ideals of beauty, and elements native to our age such as giant factories and workplaces, cars, jet travel, Internet and social media, the jammed crossroads of megacities, and widely available birth control and infertility treatments.

When my mother teased about my being part Mongolian, she may have been right, since Genghis Khan and his clan reached into Russia. But I like knowing that the farther back one traces any lineage the narrower the path grows, to the haunt of just a few shaggy ancestors, with luck on their side, little gizmos in their cells, and a future storied with impulses and choices that will ultimately define them.

The noble goal of the Human Genome Project is to use such knowledge to find new ways to understand, treat, and cure illness. In that sense, it’s a group portrait of us as a species, realized at last, a mere fifty years after Crick, Watson, and Franklin decoded the double-helix design of DNA. The only thing more unlikely than DNA itself, nature’s blueprint for building a human being, is our ability to decode it. Thus far, it’s our greatest voyage of discovery, and we’re still scouting its spiral coves.


IN NORRBOTTEN, THE northernmost province of Sweden, the reindeer outnumber humans, and shimmery green veils of northern lights spiral up from the horizon like enchanted scarves. In summer, crops ripen under a ceramic sun; moon-shadows haunt the ice-marbled winter nights. Although the citizens can travel by car today, in the past they relied on foot or horse power to carry them to grace at a fifteenth-century church in the ancient settlement of Gammelstad, where they eased their isolation and replenished hope.

Getting there was only half the pilgrimage. Needing rest before the formidable trek home, each family retired to their tiny wooden house near the church, painted red with windows and doors picked out in white. Some bore grass roofs. Delicate lace curtains hemmed the frost-curled windows, and stout shutters sealed out the warring tempests. Doors were adorned with a pyramid motif, a legacy from pagan antiquity admired for its stark symmetry, and reinterpreted as a Christian altar lit by sacrificial fire.

In such a remote frontier, the human population thinned to only about six people per square mile, and farmers crafted what they needed, from harnesses to nails. Neighbors helped neighbors, and married neighbors. But if the harvest failed—as it did with alarming frequency—rescue lay too far away. Railways didn’t venture that far north, even at the height of the Industrial Age when iron horses snorted soot across many frontiers, and in any case locals spoke a dialect unintelligible to other Swedes.

Gammelstad’s church, plus the rows of red bungalows clustered behind it, are part of a World Heritage Site that also includes the remains of a six-thousand-year-old Stone Age settlement in the heart of town. Tourists are today’s pilgrims, closely followed by geneticists. It’s an unlikely setting to be at the center of a revolution in medicine, and yet it holds an important key to the health and longevity of everyone on Earth.

In the nineteenth century, Norrbotten’s fickle climate bred many lopsided years of surfeit or famine, with no way to foretell the fate of the crops; people either feasted or starved. For example, 1800, 1812, 1821, 1836, and 1856 were all years of deprivation, when crops failed outright (including staples like potatoes and grains for porridge), farm animals died, families suffered pounding hunger and malnutrition, and underweight babies entered a lean world with even leaner prospects. But in 1801, 1822, 1828, 1844, and 1863, the weather sweetened and food leapt from the soil in such abundance that families thrived, the economy bloomed, and for many people overeating became a pastime.

If we jump to the 1980s and step across the North Sea to London, we find the prestigious medical journal The Lancet publishing studies that highlighted the importance of womb-time, linking a mother’s poor diet during pregnancy with her child’s higher risk of heart disease, diabetes, obesity, and related illnesses. This was a revelation to the medical community and a warning to parents.

According to Darwin’s theory of natural selection, a child is born with a genetic blueprint that has evolved over millennia. All the hard-luck times its parents faced might be taught as life lessons, but they aren’t hereditary; they won’t alter a child’s chemistry. Thinking otherwise was a delusion mocked and dismissed in the early nineteenth century, when the naturalist Jean-Baptiste Lamarck (the man who coined the word “biology”) proposed a theory that parents could pass along acquired traits to their offspring. In his most famous example, a giraffe reaches achingly high into the treetops each day to feed on tasty leaves, which ultimately lengthens its neck, and then its offspring inherit longer necks and stretch them even more by mimicking the parent’s behavior. Smart, keen-eyed, and right about many aspects of botany and zoology, including the dangerous idea that new species arise naturally through evolution, Lamarck was wrongheaded about giraffe necks and heritable traits. According to his logic, if a blacksmith grew anvil-hard arms from a lifetime of heavy hammering, his offspring would inherit equally burly muscles. It’s fun to think what such a world would look like—a mismatched crowd of animals within each species and the enviable ability to will traits to one’s offspring. Practice wouldn’t be needed—you could inherit your pianist dad’s spiderlike finger dexterity or your bicycling mom’s loaflike quadriceps.

Darwinian evolution teaches us that genetic changes in DNA unfold with granular slowness over millennia; no individual can erase or rewrite them in his lifetime. Genetic mutations come and go, and if one is harmful or useless for survival, it tends not to linger. But if it’s beneficial it equips the animal with an edge, a better chance at surviving long enough to breed, and then the mutation empowers the animal’s offspring and their offspring in turn, passing the winning trait along to future generations in quite a sloppy way, all things considered. In time this fluky mechanism leaves the world with only those animals best suited to their different habitats.

That’s the accepted theory, proven in countless experiments, and there’s no reason to doubt it. But what if that isn’t the whole story? Eclipsed by Darwin, Lamarck seems to have been right at least in spirit, a reality that has stunned much of the scientific world. What makes a paradigm shift so shifty is that you don’t see it coming. Then it suddenly pulls a mental ripcord and your mind plummets at speed. A new paradigm blossoms overhead, the freefall slows to float, and the world becomes visible once again, but from a new perspective. “Creative insight,” we call this parachute flare with discovery.

After Lars Olov Bygren, of the Karolinska Institute in Stockholm, read the Lancet articles, he began to wonder about the nineteenth-century children of Norrbotten who had alternately starved and binged. The people of that region seemed ideally isolated for a genetics study. Certainly the children would have been influenced by their mothers’ nutrition during pregnancy, but what about all the earlier feasts and famines their parents endured—could those blemish the children’s health? This was a daring question to ask, let alone pursue, since it flew in the face of Darwinian fashion. But it nagged at him until he finally decided to focus on ninety-nine children born in Överkalix in 1905, relying on a wealth of historical data. Why choose those mountain bluffs and chanting shores?

“I grew up in a small forested area, ten miles north of the Arctic Circle,” Bygren explains.

A slender man with gray hair and round glasses, he walks thoughtfully among the headstones in the church cemetery, where tall stalks of sunlit grass surround some of the young who died for lack of grain a hundred years before. The headstones bear such familiar names as Larssen and Persson, the English equivalent of Smith and Jones; Bygren probably played with some of their relatives as a boy. Bluebells and daisies flower naturally between the stones, and some graves are adorned with gaudier store-bought flowers.

“We have an expression,” he says with a laugh. “Dig where you stand!”

After a moment, Bygren adds, “We are really data rich.” Data rich even when crop poor. “Everything happening in the family was recorded.”

Ever since the sixteenth century, the clergy has kept a fastidious ledger of births and deaths (with causes), as well as land ownership, crop prices, and harvests. Thanks to the clergy’s meticulous record-keeping, Bygren was able to gauge how much food was on hand when parents and grandparents were children. Överkalix provided a natural experiment where he could follow isolated families as they tumbled forward in time.

Common sense hints that if you’re creased by trauma, are a junk food junkie, or spend your days staring at lit screens while spring offers the likes of pink-petaled magnolias ruffling their flamingo feathers in the breeze, it may make you miserable and unhealthy, but it won’t affect the DNA of your unborn children. They may inherit your curly hair, gray eyes, musical ear reliable as a tuning fork, thin porcelain skin, or risk of a genetic disease such as Huntington’s, but they won’t suffer from your accidents and misdeeds. Their DNA won’t be damaged by your rotten choices in lifestyle, nor can you pass on all the wonderful feats you’ve accomplished, the wonders your senses have soaked in, the perils you’ve avoided. In that sense, they’re born with a clean slate and will become entangled in their own dramas and make their own questionable choices. Certainly they won’t be obese, get diabetes, or die young just because a grandparent binged during one tantalizingly rich harvest season after a year of brutal gourd-bellied hunger. Evolution doesn’t work that way or that fast—end of story. Or is it?

What Bygren found was quite different. He and the geneticist Marcus Pembrey, of University College London, began collaborating on groundbreaking studies that raised eyebrows and led to such headlines as “You Can Traumatize Your DNA,” “You Are What Your Grandparents Ate,” “Nurture Matters,” and “The Sins of Our Fathers” (in Exodus, God speaks of “visiting the iniquity of the fathers on the children and the children’s children”).

The ultimate immigrants, babies arrive in this world from a far country with no dry land, lugging helical clouds of ancient DNA, primed for survival, but seemingly ill equipped to face sudden changes in the environment. Yet it is possible to warn children and grandchildren about recent dangers. Episodes of near-starvation—or other extreme changes in the environment—tag the DNA in children’s nascent eggs and sperm. Then years later, when they have children of their own, new traits emerge, not because the traits serve the species well, but because of the parents’ specific stresses long before their children were conceived.

When Bygren looked at the children of Överkalix, he was surprised to discover that boys between the ages of nine and twelve who gorged during a bountiful season, inviting diabetes and heart problems, produced sons and grandsons with shorter life spans. And not by a negligible amount. Both sons and grandsons lived an average of thirty-two years less! In contrast, the boys who suffered a hunger winter, if they survived and grew up to have sons of their own, raised boys with health benefits—four times less diabetes and heart disease than their peers and life spans that averaged thirty-two years longer. Later studies found similar results among the girls, though at a younger age, since girls are born with a bevy of eggs and boys develop sperm in the prelude to puberty. During these growth windows, ripening eggs and sperm seem to be especially vulnerable to intel about the environment. Like a computer’s binary code, the marks tell switches to turn on or off in the cells. Then eggs and/or sperm ferry the message to the next generation, where they may indeed be lifesaving. On the other hand, they could usher in the onset of disease by equipping someone for a world that no longer exists. Problems detonate when one is biologically prepared for a radically different environment.

“The results are there,” Bygren says, solid as the iron ore enriching the folds of Norrbotten. “The mechanisms are not so clear.”

Shocking though the idea was, the evidence plainly showed that it only took a single generation to make indelible changes. That year of gluttony as a child set in motion a biological avalanche in the cells, dooming the children’s as yet unimagined grandchildren to a host of illnesses and vastly shorter lives than their peers. It’s as if they had inherited a genetic scar.

How and why this evolutionary sidestep happens is the focus of epigenetics, a new science that puts all the old-fashioned college debates about nature or nurture on the Anthropocene scrap heap of outmoded ideas. It also lays a heavier burden on the shoulders of would-be parents. Apparently, it’s never too soon to begin worrying about the health of your grandchildren.

The implications are staggering. Up until now, inheritance was a tale told by DNA; it lay exclusively in the genes. In the watch-how-you-step, deep-nurture world of epigenetics, proteins tag DNA by coiling around it, pythonlike, squeezing some genes tighter and loosening others, in the process switching them on or off, or leaving them on but turning the volume up or dialing it down to a whisper.

Changes in our genome took millions of years, but the epigenome can be changed quickly, for example, by simply adding a tiny methyl group (three hydrogen atoms glued to one carbon atom) or an acetyl group (two carbons, three hydrogens, and an oxygen). This “methylation” turns a gene off, and “acetylation” turns a gene on. Environmental stresses flip the switch, which makes sense, since in theory it prepares offspring for the environment they’re going to find. Diet, stress, prenatal nutrition, and neglect create especially strong marks, whose influence can be either good or bad. How the marks fiddle with your genes may be deadly in the long run, or may prolong your life. Exercise and good nutrition leave beneficial tags, smoking and high stress pernicious ones.

What changes isn’t the tool but how it’s used. It’s like the difference between wielding a hammer to tap in a picture nail or to smash a hole in a wall. Nature is thrifty, recycling the basics. It’s as if DNA were a tonal language using the same consonants and vowels, but speaking them with different inflections. In Mandarin Chinese, the world’s most widely spoken tonal language, how you voice the word “man” determines whether you mean “slow” or “deceive.” Exactly the same DNA funds heart, pancreas, and brain cells, yet they finesse different tasks. As genes are switched on and off, made to shout or whisper, their meaning and purpose shift. That’s why it’s merely an embarrassment that we have fewer genes than plants and nearly the same genes as chimpanzees. Gifted with the same libretto of genes, life forms intone them differently, and our own cells morph into skin, bone, lips, liver, blood. Epigenetics is providing clues to how this tonal magic is performed.

Pembrey’s fascinating hypothesis is that the Industrial Age ushered in a flood of rapid-fire environmental and social changes, and while genetic evolution struggled to keep up with them, it couldn’t adapt that fast. The speed of change was unprecedented, and our genes don’t evolve in just a few generations. But certain “epigenetic tags” clinging to those genes could. So the pesticides or hydrocarbons your great-grandmother was exposed to when she was pregnant may heighten the risk of ovarian disease in you, and you in turn might pass that risk on to your grandchildren. Ovarian disease has been on the rise over the past few decades, now affecting more than 10 percent of women, and environmental epigenetics offers a plausible reason why.

We only exist in relation to others and the world. This dialogue, a three-ring circus among the genes, a perpetual biological tango performed by multitudes, deserves a better name than the unwieldy crunch of “epigenetics,” but the word is springing from many more lips as doctors search for clues in both a patient’s environmental exposure history and that of his parents.

“We’re in the midst of probably the biggest revolution in biology,” says Mark Mehler, chair of the Department of Neurology at Albert Einstein College of Medicine. “It’s forever going to transform the way we understand genetics, environment, the way the two interact, what causes disease. It’s another level of biology, which for the first time really is up to the task of explaining the biological complexity of life.”

“The Human Genome Project was supposed to usher in a new era of personalized medicine,” Mehler told the American Academy of Neurology at its annual meeting in 2011. “Instead, it alerted us to the presence of a second, more sophisticated genome that needed to be studied.”

Identical twins, for example, are never perfect matches, despite sharing the same DNA. If one has schizophrenia, the odds of her twin developing it are only 50 percent, not 100 percent as one might assume, since they have identical genes. Twins have become an important part of epigenetic studies. So have children of Holocaust survivors, Romanian orphans who weren’t held and comforted enough, and children with stress-rattled or neglectful caregivers. From psychiatric epigenetics we’re learning how important a mother’s mood is to the fate of her fetus. The chemicals that swaddle and seep through a fetus can influence its future health, mood, and life span.

In 2004, Michael Meaney, whose lab at McGill University was studying maternal behavior, published his findings in Nature Neuroscience. Good mother rats, who licked their fourteen to twenty pups often and with care during the first week of life, produced nice calm pups. Standoffish mother rats who didn’t lick their pups much or neglected them entirely produced noticeably anxious pups. And as adults, the next generation of female rats mirrored their mothers’ behavior.

“For us, the Holy Grail was to identify the path that was being altered by this licking behavior,” Meaney says. “We identified one small region on the gene that responds to maternal care and directs changes in the brain cells.”

Meaney’s lab is now looking at human child development. A highly stressed pregnant mother floods her fetus with glucocorticoids, which can reduce birth weight, shrink the size of the hippocampus (memory’s estate), and cripple the ability to deal with stress. Yet, as Meaney is the first to point out, many underweight babies turn out fine, which suggests that postnatal care must be able to reverse the ill effects. Environment and nurture do matter, and it doesn’t take long for their influence to show. In colonies of the desert locust, individuals are naturally shy and nocturnal. But if the population swells and overcrowding occurs, the densely packed locusts produce gregarious, diurnal young. Or, in bird studies, if the mother lives in “socially demanding conditions,” holding high social rank, for example, her androgen level climbs. That leads to increased androgen in her eggs and produces more competitive chicks.

Is our moviegoer among the poor? As the health consequences of poverty drift into social medicine’s sights, ongoing human and animal studies link either enriched environments or impoverished ones to the health of children and grandchildren. Studies of the Dutch Hunger Winter in 1944–45 reveal that prenatal hunger can lead to schizophrenia and depression. In a U.K. study, poor prenatal nutrition is tied to a trio of risks for heart disease in older adults.

In other research, mothers who lived through the stresses of hurricanes and tropical storms while they were pregnant were more likely to have autistic children. Even if as an adult the redhead makes all the healthy choices, is happy during her own pregnancy, and becomes a doting mom, her child can still suffer from the stresses its neglected grandmother endured during the Great Depression, or from what its grandfather suffered in Vietnam as a young recruit before she was even conceived. What her grandparents ate for breakfast matters.

Did our redhead’s father take Viagra? Thanks to such drugs, much-older men are siring offspring. What effect could so many older fathers, and aging genes, have on us and future generations? One unexpected finding is that, for some reason, older fathers endow their offspring with longer telomeres (which cap the ends of chromosomes the way plastic tips keep shoelaces from fraying), chromosomal regions linked to life span. So the children may live longer. On the other hand, older fathers are blamed for passing on mutations that can lead to such dreaded disorders as autism or schizophrenia. Dad’s diet also matters; if he was gluttonous, she may be more likely to develop diabetes. Fortunately, our moviegoer inherited telomeres long as a summer night.

None of this happens by unzipping and altering the codebook of DNA, yet it’s inherited by offspring. Epigenetics is the second pair of pants in the genetic suit, another weave of heredity, and although revising someone’s genome is hard, it’s relatively easy to change an epigenome. The marks are profound but not permanent. As a result, the field holds limitless promise.

“Genes can’t function independently of their environment,” Meaney says. “So every aspect of our lives is a constant function of the dialogue between environmental signals and the genome. The bottom line seems to be that parental care can have an even bigger impact than we ever dreamed on our children’s lives. We’re just starting to learn what that means.”

Yes, a trauma in your mother’s childhood could affect your health, and the health of your child—but it’s also reversible, even as an adult. In the McGill study, researchers were able to undo the chilly behavior of the second generation of mother rats by using epigenetic drugs to turn genes on or off.

The great promise of epigenetics is the possibility of curing cancer, bipolar disorder, schizophrenia, Alzheimer’s, diabetes, and autism by simply flipping the switches that tell some genes to wake up or work overtime, and others to lighten up or nap. Can we really hypnotize our genes like that, canceling out bad behavior and sparing innocent offspring before we plan to have any? The consensus is yes. Scientists have begun developing drugs such as azacitidine (given to patients with certain blood disorders) capable of silencing bum genes and spurring on healing ones. Many illnesses, such as ALS and autism, appear to be epigenetic, which puts them within reach. Three different types of epigenetic drug therapy are being actively investigated for schizophrenia, bipolar disorder, and other major psychoses. The FDA has already approved several epigenetic drugs, and in 2008 the National Institutes of Health (NIH) declared epigenetics “central” to biology and committed $190 million to understanding “how and when epigenetic processes control genes.” The Human Genome Project, completed in 2003, rightly celebrated as a wonder of human ingenuity, had only twenty-five thousand genes to map. The epigenome is much more complicated, with millions of telltale marks. So a full epigenome will take a while, but an international Human Epigenome Project is under way.

The good news is that these are problems with possible, if not simple, solutions: ban more environmental toxins known to trigger epigenetic havoc; work harder to ease famine, reduce poverty, and repair the ravages of war; and help people understand the long-term impact of their actions and the vital role that nurture plays in their families, societies, and environment. Genes may remember how they once behaved in parent and grandparent cells, but, fortunately, they can also learn healthy behaviors, based on use, just as muscles do. What you experience in your lifetime will become a vital part of your child’s legacy. Your adult experiences can rewire your genes in positive ways, and just as startlingly, the nurturing you do for friends, sweethearts, and other people’s children can have lasting epigenetic effects. Once that idea registers, it changes the relationship between generations, which suddenly have everything in common, and the tapestry of the human condition grows a little more visible, thread by thread. At the level of DNA’s phantom doormen, we can be connected to anyone and everyone.

There’s also a moral, social, and political lesson: while humanitarian programs may seem nonessential, an extravagance of resources and spirit we can’t afford, epigenetics teaches us that, on the contrary, poor education, violence, hunger, and poverty leave scars on one generation after another in a way that ultimately affects the future health and well-being of whole societies. What happens to war-torn soldiers and civilians during and after battle leaves epigenetic traces to wound future generations, adding to a country’s problems, even in peacetime. The same is true of natural disasters, and we’ve seen plenty of both of late. Who knows what epigenetic aftermath will result? Genetic engineering may seem like a diabolical threat to us as a species, and we do need scrupulous oversight and control of such life forms. But the political and environmental choices we make—those with epigenetic repercussions—are equally powerful engines of change, ones we can often identify and fine-tune.

Meet My Maker, the Mad Molecule

Returning to our mystery redhead in the movie theater—what else could we learn about her from a strand of hair or blood sample? Her DNA profile, resembling a supermarket barcode, is a monumental accomplishment, but it’s only a fraction of her story. For a fuller picture of her health and heredity, we would need to include the teeming seashores of her microbes, the rest of her being—in fact, more than her being. Another self, a shadow self. At any moment, she is inseparable from trillions of her single-celled, single-minded, naked companions, some of whom don’t have her best interests at heart.

When she weighed herself earlier today, she may have deducted a pound or two for clothes and shoes. But did she take into account the roughly three pounds of microorganisms that inhabit her crevices and innards? Probably not. She’d need an atomic scale to start with, and anyway microbes are shifty, jumping off her peninsulalike feet and climbing aboard elbows pell-mell; they’re not easy to tally.

Microbes are the most fruitful life form on Earth, colonizing all sorts of ardent unspoken strangers, creating small sulfurous rumblings in the animal belly, reveling in the smell of fish and old shoes, and leaving aftertastes in the mouth stale as bus-station sandwiches. They’re also real workhorses, fluting the air until it’s breathable, promoting photosynthesis on the land and in the oceans, decomposing dead organisms and recycling their nutrients. In industry, we breed them to ferment dairy products and to process paper, drugs, fuel, vaccines, cloth, tea, natural gas, precious metals; they help mop up our oil spills. We yoke them like oxen and set them to work. But just as most of the mass in the universe (about 85 percent) is “dark matter,” this largest biomass on our planet escapes the naked eye, yet is the invisible Riviera of the visible world.

How remarkable it is that we’re not only renaming our age, we’re on the threshold of redefining ourselves as a completely different kind of animal than we ever imagined. For years, we thought DNA told the whole story. Instead we find that each person is a biological extravaganza of a hundred trillion microbes and ten trillion human cells. It’s amazing we don’t slosh or disintegrate as we walk. Here’s the thing: on a microscopic level we do, while constantly adding new microbes from other people, plumes of dust, and the plants and animals we encounter.

In only the past ten years, our picture of a human being has evolved from a lone animal to a team of millions of life forms working in unison for mutual benefit. Unrelated people may be scattered from Tierra del Fuego to Qaanaaq, but there’s a movement afoot to classify human beings as “eusocial,” a single unit of highly sociable life forms who can’t survive all by themselves. Earth favors similar collectives—ants, bees, termites, coral, slime mold, naked mole rats, etc.—in which individuals pool their know-how to act for “the sake of the hive.” Thanks to the Web and social media, we’re discovering what a bustling rialto each person really is, and also how connected we all remain. Worlds within worlds, each of us is a unique ambulatory superorganism who belongs to one miscellaneous species living on the body of a colossal superorganism of a planet in a waltz of innumerable galaxies sprinkled with other Gaia-like planets and likely their own life forms percolating with untold hangers-on.

A marvel of the Human Age is that, in the past decade alone, we’ve mapped both the DNA in our cells and the DNA in our microbes. In the hunt we’ve discovered that a true view of ourselves as a life form is more untidy than we thought, and unglimpsed by most of us, a cloud of entwined bugs and human cells in a semipermeable frame. Joshua Lederberg, the Nobel-laureate biologist who, in 2000, coined the term “microbiome,” defines it as “the menagerie of the body’s attendant microbes.” Amid the hoopla surrounding the Human Genome Project, he urged, “We must study the microbes that we carry within us and on our surfaces as part of a shared embodiment.”

If the Human Genome Project was a landmark feat of discovery, the Human Microbiome Project is gene cartography’s finest hour. NIH director Francis Collins compares it to “fifteenth-century explorers describing the outline of a new continent,” a triumph that would “accelerate infectious disease research in a way previously impossible.”

For five years, a consortium of eighty universities and scientific labs sampled, analyzed, and audited over ten thousand species that share our human ecosystem, thus mapping our “microbiome,” the normal microbial makeup of healthy adults. And the quest continues.

The researchers have found that each of us contains a hundred trillion microbial cells—ten times more than our human cells. When they peered deeper and compared the genes, they realized that we carry about eight million genes from bacteria—360 times more than our own human code. Among the hundred or so large groups of bacteria, only four specialize in the human body. They’ve been sidekicks for so long that over time our fate has fused with theirs.

So, odd as it sounds, most of the genes responsible for human survival don’t descend from the lucky fumblings of sperm and egg, don’t come from human cells at all. They belong to our fellow travelers, the bacteria, viruses, protozoans, fungi, and other lowlife that dine, scheme, swarm, procreate, and war all over us, inside and out. Vastly more bacteria than anything else. All alone our moviegoer could be arrested for unlawful assembly. She doesn’t propel a solid body but a walking ecosystem.

They also learned that we all carry pathogens, microorganisms known to spark disease. But in healthy people, the pathogens don’t attack; they simply coexist with their host and the rest of the circus tumbling and roaring inside the body. The next mystery to crack is what causes some to turn deadly, which will revamp our ideas about microbes and malady.

We’ve known about bacteria for nearly 350 years, ever since the seventeenth-century Dutch scientist Antonie van Leeuwenhoek slipped some of his saliva under a homemade microscope, which he had crafted with lenses made from whiskers of glass, and espied single-celled organisms crawling, sprawling, flailing about in the suburbs of our gums. He named them animalcules and peered at them through a vast array of lenses (an avid microscopist, he made over five hundred).

In the nineteenth century Louis Pasteur proposed that healthy microbes might be vital, and their absence might spur illness. By the time tiny viruses were discovered, only about a hundred years ago, people were already driving cars and flying airplanes. But we didn’t have the tools to study the every-colored, shifting, scented shoal of microbes we swim in, play in, breathe in all the day long. Some cross the oceans on dust plumes. Acting as condensation nuclei, they jostle rain or snow until it falls from clouds. Far from being empty, the air, like the soil, throbs with flecks and dabs of life, more like an aerial ecosystem than a conveyor belt for clouds.

We need to reimagine the air, not as a desolate ether but as a lively, largely invisible ecosystem. As we peer through its glassy expanse to a far trail or up at a billowing cloud, nothing blocks our view, the whole corridor looks vacant, and yet it’s a community pulsing with life. Our eyes merely slide over its tiniest tenants. The sky is really another kind of ocean, and even though we used to speak of “oceans of air,” we imagined barren currents; we didn’t realize how life-soaked the waves really are.

When David Smith and his colleagues at the University of Washington sampled two large dust plumes that had sailed across the Pacific from Asia to Oregon, they were surprised to find thousands of different species of microbes in the plumes, plus other aerosols, dust particles, and pollutants. All suspended and wafting around the planet, tromboning and floating, interacting with life.

In this panoramic new portrait, the Anthropocene body is no longer an entity that’s separate from the environment, like a balloon we pilot through the world, avoiding obstacles, but an organism that’s in constant conversation with its environment, a life-and-death dialogue on such a minute level that we’re not aware of it. It recognizes the mad microscopic mosaics we really are, molecular bits who trace their origins to simple one-celled blobs, then cellular flotillas that grew by engulfing others in life’s oceanic swap meet. Evolving this way and that, nabbing traits, shedding traits, we went haywire in slow motion over millions of years. Maybe our cells, however much they evolve, retain a phantom sense of those early days as colonial bodies with a shared purpose, more like amoebas or slime mold than mammals. We’re beginning to accept the idea of gypsy organisms that fanfare around us, making catlike raids on each other in dark simmering thickets, species as different from one another as animals adapted to rainforest, arctic, ocean, prairie, or desert. For we, too, have hillocks and estuaries, bogs and chilly outposts, sewers and pulsing rivers for them to quarrel and carouse in.

Even inside our own cells, we house more twitchy bacteria than anything else, because our mitochondria and chloroplasts were once primitive bacterial cells. They’ve sponged off us for so long that they can no longer exist on their own. Some our body welcomes with open pores because they handle metabolic melodies we couldn’t even hum on our own. It amazes me that we’ve survived with such grace, since we’re born dottily deficient, lacking vital survival skills such as how to digest the very foods we eagerly wolf down. An omnivorous diet helped us endure icy forests and bright broiling terra-cotta landscapes, but we don’t have all the enzymes we need to absorb those foods; our microbes assist.

In the distant past, as Earth bloomed with primitive life, strings and mounds of twinkling single-celled bacteria discovered the mutual benefits of teamwork and became allies. Others took a bolder and more violent step—they gobbled each other up. It’s only at that stage that lilacs, marine iguanas, wombats, and humans became possible. As multicelled organisms grew more and more complex, the imprisoned bacteria adapted and thrived, until they became vital cogs of each complex cell.

The consensus now among evolutionary biologists is that we can’t separate “our” body from those of our resident microbes, which have been fiddling in subtle ways with our nature as a species for millions of years, and influence our health and happiness to a previously unimagined degree. Study after study is showing that microbes profoundly affect our moods, life spans, personalities, and offspring. They influence not only how we are but who we are. How strange that we feel whole, one person whom we can wash and dress and conduct internal monologues with, though most of us is not only invisible but not even what we’re used to defining as human. Planet Human offers a dizzying array of habitats for the unseen and the unforeseen, the hominid and microbial.

Only very recently has the scientific community acknowledged the extent to which our microbes might indeed affect our evolution, and by our I mean the whole mishpocheh, as they say in Yiddish (the term in biology-speak is “holobiont”). Not just individuals but all their microscopic relatives with their relative points of view. Some hijack our free will, divert our behavior, and become matchmakers. A wasp study is offering fresh insights. By definition, members of a species can mate and produce live offspring. But researchers studying several species of jewel wasp (loaded with ninety-six different kinds of gut bacteria) have discovered that microbes can determine whether unions between different wasp species will succeed. When two distinct species of wasps mated, their offspring kept dying. Until recently, we would have said such a fertility problem was genetic. We know now that it can be microbial. When researchers changed the wasps’ microbes, the species bred successfully and hybridized. Evolution can be detoured by a mob of hidden persuaders.

Once again from the insect world, recent experiments with fruit flies are showing another way microbes can be at the helm, and the too-real possibility that bacteria have played a vital, even scary, role in our evolution. Consider how microbes control the love life of concupiscent humans and lusty fruit flies alike. Ilana Zilber-Rosenberg and her colleagues at Tel Aviv University’s Department of Molecular Microbiology and Biotechnology have discovered that the bacteria inside the gut of a fruit fly sway its choice of mates.

Fruit flies raised on either molasses or starch prefer to mate with others on the same diet. But when the flies are dosed with antibiotics, which kill the microbes in their gut, they’re no longer picky and will mate with any willing male. Among fruit flies, sexy males know all the right dance moves, but they also have to smell sexy, and their pheromone-cologne is modified by the microbes inhabiting the fly. For both humans and fruit flies, the love-wizards of smell are the symbiotic microbes that brew pheromones for us, their larger hosts. Scent rules in human courtship, too, especially among females looking for a mate. Although men seldom report such fussy responses to their partner’s natural smell, women so often do that it’s become a romantic cliché: “There just wasn’t any chemistry.”

Tinker with microbes and you alter stud capital, which in turn alters the genes of the female’s offspring, and so on as generations disrobe or unfold their wings. The object of natural selection isn’t a single plant or animal, Zilber-Rosenberg proposes, but its whole milieu, the host organism plus its microbial communities, including all the parasites, bacteria, fungi, viruses, and other bugs that call it home.

Fruit flies make appealing test subjects because we share such a bevy of mating behaviors. The dinner date, for instance. What’s the quickest way to a man’s heart? Forget Cupid’s arrow. According to Mom-wisdom, it’s coaxed by a cozy meal, in a penumbra of pleasure that mingles the fragrant food with the cook. If men are anything like fruit flies—and who’s to say they’re not at times; heaven knows women are—Mom was right. For female fruit flies, a dinner date is the ultimate rush. And rush it literally is, since they only live about twenty-five days and can’t afford to be shy. Live fast and die is their mantra, and they need a handy food supply if their large new brood is to survive. Female fruit flies prefer males who favor the same chow. Still, the males need to be in the right mood, and the females are surprisingly picky and manipulative given their short career.

During fruit fly courtship, if the microbe-milled incense is right, the male extends one mandolinlike wing and serenades the female, then engages in that style of oral foreplay many humans do, before mounting her and copulating for twenty minutes or so.

We respond to the same sweet, honeylike aromas that make fruit flies amorous, and so chemists include them in perfumes. Like an insect rubbing its wings together to croon a mating call, many a medieval troubadour used a mandolin to serenade his lady, with whom he’d dine and mate. And remember that sexy tavern scene in Tom Jones, in which the hero and a buxom wench devour a none-too-fresh carcass with carnal abandon? Intriguingly, if a female fruit fly spies a lone mutant (or rather a mutant mutant, say, the one normal fruit fly with quiet brown eyes, which would be the odd-fly-out if all the rest were bauble-eyed), the female hankers for the nonconformist. In the trade, it’s known as “the rare male advantage.”

For fruit flies, too, beauty is in the eye of the beholder, with their microbes adjusting the focus. Did I mention that some fruit flies have come-hither eyes? I don’t mean the dozens of mosaic facets, so evocative of hippie sunglasses, but the zingy psychedelic eye colors lab folk like to endow them with, the better to study mutant genes. As a Cornell grad student, I often stopped by the fetid biology lab to admire the eggplant-blackness of the bellies, the spiky hairs, the gaudy prisms of the eyes—some apricot, some teal, some brick red, some yellow, some the blue of ships on Delft pottery. I still recall the tiny haunting eyes of the fruit flies, like the captive souls of past lab assistants, and the swooping melody of their Latin name: Drosophila melanogaster, which translates poetically as “dark-bellied dew sipper.” Because fruit flies thrive in sultry weather (82°F), the lab offered students a warm den during those numbing upstate winters when ice clotted in beards and mittens, coeds exhaled stark white clouds, and the walkways looked like a toboggan run.

A favorite of biologists hoping to peer into the dark corners of human nature, fruit flies have it all—they’re prowling for mates eight to twelve hours after birth, easy to raise, and able to lay a hundred eggs a day. Plus they share about 70 percent of human disease genes, especially those linked to neurodegenerative disorders such as Parkinson’s and Alzheimer’s.

However, in a sly twist, the last male the female fruit fly has sex with will sire most of her many offspring, and she chooses him only after lots of romps in the orchard or lab, based on his gift for courtship and his scent. As with most animals, from squirrels to spiders, the males pursue but the females choose, and even the lowly fruit fly can be choosy.

So is the human dinner date really just courtship feeding after all, a custom (and microbial picnic) we share with fruit flies, robins, and chimpanzees, which in our chauvinistic, I’m-not-really-an-animal way we’ve coyly disguised? Yes. But what’s the harm in that? There’s a similar meal plan among the annual hordes of Japanese beetles that tat rose leaves into doilies and shovel deep into ready-to-open buds every summer. Gardeners often spy the iridescent scarabs, in twos or crews, perched atop favorite flowers, dining and mating simultaneously. Of course, the ancient Greeks and Romans, who coined the word “orgy” and found that dining lying down leveled the playing field, enjoyed blending sensory delights with equal gusto—banquets of music, food, conversation, alcohol, and sex. As the sage once put it: “Birds do it, bees do it, even educated fleas do it.” No harm at all, unless the process makes you impulsive, deranged, and deadly, which in some cases, depending on the shared microbes, sex can.

Another such culprit is a momentous if commonplace human hanger-on that also bedevils rats, cats, and other mammals and has recently been studied in harrowing detail. Spread along the edges of nature, on the boundary where humans and wild animals mix, the world population of Toxoplasma gondii, a particularly mischievous parasite, is ballooning with our own numbers. One way to catch the infection is to eat undercooked kangaroo meat. Kangaroo was recently approved for human consumption in Europe, and it’s usually served rare in France, followed, predictably, by Toxoplasma outbreaks. Budi may not be a carrier, since orangs are mainly vegetarians, but some nonhuman primates in zoos have acquired the bug after eating meat from infected sheep. Perhaps most surprisingly, the pathogen is increasing its range through human-made climate change. With northeastern Europe’s warmer, wetter winters, more of the pathogens are surviving, and so are their host species. In fact, Toxoplasma gondii may be climate change’s oddest bedfellow.

What would cause a rat to find a cat alluring? The slinky sashay? Batonlike whiskers? Crescent-moon-shaped pupils? A stare that nails you in place? Only a foolhardy rodent would cozy up to a cat. Yet rats infected with Toxoplasma dramatically change their behavior and find cats arousing. Talk about being in over one’s head. There’s nothing in it but the briefest frisson for the rat. The cat feeds its belly. But the protozoan zings along the strange trajectory of its life. Since Toxoplasma can only reproduce inside a cat’s gut, it needs a brilliant strategy to get from rat to cat, and despite its lack of brain power it devised one: hijacking the rat’s sex drive. Toxoplasma-beguiled rats do feel fear when they smell a cat, but they’re also turned on by it, in the ultimate fatal attraction. As with human sexuality, or film noir, a side order of fear isn’t necessarily a deterrent.

The cat hunts again, dines on infected prey, and the odd hypnotists thrive. Only cats further the parasite’s agenda, but other animals can sometimes ingest the eggs without knowing it and become dead-end hosts. That’s why pregnant women are warned not to empty kitty litter or handle cat bedding. Exposure to Toxoplasma can derail a fetus, leading to stillbirth or mental illness. Some studies link Toxoplasma and schizophrenia. Infected women have a higher risk of suicide than parasite-free women. According to Oxford researchers, it can doom children to hyperactivity and lower IQs. And, for some reason, over twice as many pregnant women infected with Toxoplasma give birth to boys.

But these new rat–cat findings are only the beginning of an Orwellian saga steeped in irony and intrigue. Worldwide, scientists are posing questions both eye-opening and creepy. If Toxoplasma can enslave the minds of rats—animals often studied to test drugs for humans—can it also alter the personality of humans? What if that yen to go rock-climbing or change jobs isn’t a personal longing at all, robust and poignant as it may feel, but the mischief of an alien life form ghosting through your brain? Is Toxoplasma to blame for a hothead’s road rage? How about a presidential hopeful’s indiscreet liaisons, or a reckless decision made by a head of state? Could a lone parasite change the course of human history?

So when is a whim not a whim? It feels like we have free will, but is a tiny puppeteer pulling the strings of billions of people? For the longest time philosophers, theologians, and college students debated such questions, then neuroscientists joined the fray, and now a body of parasitologists.

When Jaroslav Flegr, of Charles University in Prague, surveyed people infected with Toxoplasma, he found clear trends and surprising gender differences. The women spent more money on clothes and makeup and were more flirtatious and promiscuous. The men ignored rules, picked fights, dabbled in risk, and were nagged by jealousy. Both sexes got into more than twice the average number of traffic accidents—as a result of either impulsivity or slowed reaction time.

Rats have proclivities and tastes. Humans have those in spades, as well as sentiments and reveries. But mindset doesn’t matter. All warm-blooded mammals respond to thrill, anticipation, and reward—especially if that includes a wallop of pleasure. Many of the odd behavioral changes scientists attribute to Toxoplasma tap the brain’s dopamine system, and that’s what Toxoplasma zeroes in on, rewiring networks to favor its own offspring, even if that means death for the host. Cocaine and other euphoriants use the same dopamine system. As the Stanford neuroscientist Robert Sapolsky explains, “the Toxoplasma genome has the mammalian gene for making the stuff. Fantastic as it sounds, a humble microbe is fluent in the dopamine reward system of higher mammals.

“This is a protozoan parasite that knows more about the neurobiology of anxiety and fear than twenty-five thousand neuroscientists standing on each other’s shoulders,” Sapolsky adds, “and this is not a rare pattern. Look at the rabies virus; rabies knows more about aggression than we neuroscientists do.… It knows how to make you want to bite someone, and that saliva of yours contains rabies virus particles, passed on to another person.” It’s an extraordinary genetic tool for a witless one-celled creature to wield.

Marine mammals and birds are spreading the parasite via water currents and ribbons of air. How many of us may already be unwilling hosts? According to the Centers for Disease Control and Prevention, 10 to 11 percent of healthy adults in the United States tested positive for Toxoplasma, and the true figure (most people haven’t been tested) is thought to be 25 percent of adults. Some scientists estimate that in Britain, a decidedly cat-loving country, half the population has been infected, in France and Germany 80 to 90 percent, and in countries that favor undercooked meat even more, with nearly everyone an unwitting mark—destiny’s child, to be sure, but also Toxoplasma’s zombie.

According to Nicky Boulter, an infectious disease researcher at Sydney University of Technology, eight million Australians are infected, and “infected men have lower IQs, achieve a lower level of education, and have shorter attention spans. They are also more likely to break rules and take risks, be more independent, more antisocial, suspicious, jealous, and morose, and are deemed less attractive to women.

“On the other hand, infected women tend to be more outgoing, friendly, more promiscuous, and are considered more attractive to men compared with noninfected controls. In short, it can make men behave like alley cats and women behave like sex kittens.”

What does it take to slant an opinion? Advertising, group pressure, financial gain, a charismatic leader? How about a real lowlife, a wheeler-dealer who delights in messing with your mind and harbors primitive drives? Enter the saboteur skillful enough to slowly and subtly change the personality of whole nations—a humble microbe. Some researchers speculate that between a third and half the people on Earth now have Toxoplasma in the brain. And it’s only one of the many microbes that call us home. Is it possible that what we chalk up to cultural differences may be different degrees of mass infection by a misguided parasite? Kevin Lafferty, a parasite ecologist with the U.S. Geological Survey, also theorizes that cultural identity, at least “in regard to ego, money, material possessions, work, and rules,” may reflect the amount of a parasite in a population’s blood.

If you’re now eyeing your tabby with raised eyebrows, there’s no need to panic. Even invisible dictators can be deposed, and Toxoplasma responds well to antibiotics. In any case, would it have a greater influence than family dramas, pharmaceuticals, TV, college, climate, love, epigenetics, and other factors in human behavior? It’s probably one spice among many. After all, a slew of elements and events influence us from day to day, changing us in cumulative and immeasurable ways. Toxoplasma may be but one, and it doesn’t lurk in all cat owners or devourers of steak tartare. It may ring its changes only in the presence of certain other microorganisms. How can you tell the dancer from his dance of microbes?

In the garden, all the plants and animals have their own slew of microbial citizens, some sinister, others helpmeets. That takes some getting used to. It’s a big paradigm change, one future generations will understand from childhood and capitalize on. In health and medicine, they’ll focus on the human ecosystem, our whole circus of human cells, fungi, bacteria, protozoa, and archaea working together, untidily perhaps, but in concert.

When I was growing up, scientists only grew microbes in small petri dishes in their labs, and all bacteria were nasty. In just a decade, we’ve begun seeing the big mosaic and we’re even starting to think in terms of microbes for improving the planet in precise ways: fixing the health of endangered species with wildlife probiotics, ousting invasive species using certain bacteria, sweetening groundwater that’s been tainted by pollutants, cleaning up oil spills with voracious grease-loving microbes, helping agriculture feed more people without fertilizers by employing bacteria that make the crops grow faster and more robustly.

The hope is that, just as with genes in the Human Genome Project, if researchers can identify the core microbes that most humans share, then it will be easier to divine which species contribute to specific complaints. This offers a new frontier for fighting illness, one easier to manipulate than the genome, and safer to barge in on than deeply embedded organs like the heart or liver.

New studies suggest that a single pathogen is rarely enough to trumpet disease, because different microbes form alliances. “The real pathogenic agent is the collective,” says David Relman, an infectious disease specialist at Stanford University. This has sparked a new way of thinking about illness called “medical ecology,” which recognizes the collective as the key to our health. In the past, we thought of all bacteria as bad, a contagion to be banished, a horde of invisible dragons. Ever since the end of World War II, when antibiotics arrived like jingle-clad, ultramodern cleaning products, we’ve been swept up in antigerm warfare. But in a recent article published in Archives of General Psychiatry, the Emory University neuroscientist Charles Raison and his colleagues say there’s mounting evidence that our ultraclean, polished-chrome, Lysoled modern world holds the key to today’s higher rates of depression, especially among young people. Loss of our ancient bond with microorganisms in gut, skin, food, and soil plays an important role, because without them we’re not privy to the good bacteria our immune system once counted on to fend off inflammation. “Since ancient times,” Raison says, “benign microorganisms, sometimes referred to as ‘old friends,’ have taught the immune system how to tolerate other harmless microorganisms, and in the process reduce inflammatory responses that have been linked to most modern illnesses, from cancer to depression.” He raises the question of “whether we should encourage measured reexposure to benign environmental microorganisms” on purpose.

A baby is born blameless but not microbe-free. Mom coats her with helpful microbes as she squeezes down the birth canal, including Lactobacillus johnsonii (a bacterium one expects to find in the gut, not the vagina), a bug essential for digesting milk. I was bottle-fed formula, but breast-milk-fed babies grow stronger immune systems because breast milk, often the first source of nourishment, teems with more than seven hundred species of hubbub-loving, life-enhancing bacteria. Researchers are thinking of cobbling them into infant formula to help ward off asthma, allergies, and such autoimmune triggermen as diabetes, eczema, and multiple sclerosis. Babies pick up other useful bacteria in Mom’s dirt-and-crumb-garlanded home and landscape. At least, they should.

Doctors are embracing the idea of personalized medicine based on a patient’s uniquely acquired flora and fauna, as revealed in his or her genome, epigenome, and microbiome. No more antibiotics prescribed by the jeroboam on the off chance they might prove useful. Instead, try unleashing enough beneficial bacteria to crowd out the pathogen. No more protecting children from the hefty stash of derring-do white-knight bacteria they need but we’ve learned to regard as icky.

Patients whose gut flora have been wiped out by certain antibiotics are prey to Clostridium difficile, an opportunistic weasel of a bug that causes severe, debilitating diarrhea. Once it has taken up residence, it’s miserably hard to expel it and restore the good bacteria. What does seem to help, though it’s not an image to dwell on, is fecal transplants from a healthy person—an enema full of bacteria to recolonize a stranger’s intestines, join the Darwinian fray, and triumph over the pathogens by acting like sailors on leave.

When Kathy Lammens, a stay-at-home mom with four young children, learned that her nine-year-old daughter’s battle with colon disease might lead to a colostomy bag, she began looking for alternative therapies. After much research, she decided on do-it-yourself home fecal transplants, tendering one five days in a row.[33] Twenty-four hours after the first, all of her daughter’s symptoms improved. Now Kathy, a robust believer, offers a YouTube video with instructions.

One study has revealed that mice with autism don’t host the same gut microbes that mice without autism do, microbes that seep behavior-altering molecules through the body and brain. But researchers find that dosing the mice with the beneficial bacterium Bacteroides fragilis eases the symptoms, and so human trials will follow. Another study discovered that if heart patients don’t eat enough protein, the good gut microbe Eggerthella lenta will steal some of a patient’s dose of digoxin, an important heart stimulant.

Some of global warming’s unwelcome guests are tiny winged buccaneers carrying invisible stoles of misery. Mosquitoes in Africa and South America are rambling farther north, injecting dengue fever, malaria, West Nile virus, and yellow fever into parts of the world unfamiliar with such scourges. Perfusing our clothes and bedding with insecticides isn’t safe, but the diseases infect hundreds of millions of people each year. So the Michigan State microbiologist Zhiyong Xi has been working on the problem in a novel way, by rearranging microbes. When he noticed that mosquitoes carrying dengue fever and malaria were missing the mosquito-loving bacterium Wolbachia, he tried infecting the mosquitoes with a heritable strain of Wolbachia, and sure enough, the next generations didn’t carry either illness, and the lifesaving trait was passed on to their offspring.

It’s intriguing to imagine the role a simple microbe may play in someone’s relationships and career, and it reminds us that nothing life ever does is simple, or boring. How many threads weave a fleeting thought, let alone a hankering? It also reminds us of the fierce beauty of Earth’s organisms, whatever their size, creatures unimaginably complex, breathtakingly frail and yet sturdy, durable, filled with the self-perpetuating energy we call life. A big brain isn’t required to concoct sly, world-changing strategies.

AS I GLANCE out at the yard, I’m charmed by nature’s details: the magnolia tree’s fuzzy buds fattening up for spring; the melting snow on the lawn that’s left hundreds of grass follicles; long arcs of wild raspberry canes covered in their chalky lavender winter mask. But I’m also struck by the everythingness of everything in cahoots with the everythingness of everything else. When I look at my hand now, I scout its fortune-teller’s lines, and the long peninsulas of the fingers, each one tipped by a tiny weather system of prints; I see it whole, as one hand. But I also know that only a tenth of what I’m seeing is human cells. The rest is microbes.

When all is said and done, both our parasites and we their innkeepers are diverse—no one hosts the same reeking and scampering microbial zoo. Our microbes can change either in ratio or in kind at the drop of a cookie or in the splash from a locker-room puddle or through an ardent kiss, and then we have to adapt quickly. So it’s possible that some diseases really are inherited, but the genes that bestowed them were bacterial. When you think about it, for a major trait to evolve—something grand like the advent of language or the urge to explore—only one gene has to change on the Y chromosome of one man. That would be enough, over many, many generations, to create a predisposition or a trend in an entire culture. It all depends on the hijinks of the maddening microbe.

Maybe this should also remind us how much of a pointillist jigsaw puzzle a personality really is. As a friend approaches with a smile, we greet a single person, one idiosyncratic and delightful being who is recognizable—predictable, even, at times. And yet every “I” is really a “we,” not one of anything, but countless cells and processes just barely holding one another in equilibrium. Some of those may be invisible persuaders of one sort or another: protozoa, viruses, bacteria, and other hobos. But I like knowing that life on Earth is always stranger and more filigreed than we guess, and that both the life forms we see and those we cannot see are equally vibrant and mysterious.

Where does your life story begin? When does the world start whittling your personality and casting your fate? At birth? In the womb? At the moment of conception, when DNA from your mother and father fuse, shuffling an ancient deck of genetic cards and dealing out traits at random from Mom or Dad? Long before womb-time, it would seem, much farther back, before your parents’ courtship, even before their parents’, in a crucible of choices, daily dramas, environmental stresses, and upbringing. Our genome is only one part of our saga. The epigenome is another. The birdlike microbes singing in the eaves of the body are yet another. Together, they’re offering a greatly enriched view of the terra incognita inside us. In the process, sometimes loud as headlines, but more often silent as the glide of silk over glass, how we relate to our own nature is subtly changing.
