You asked me once if I had any favorites, and I asked you which of your sons you most loved.
Do you remember? It was when I was on your radio show, the one where, between the music, you interview machines. Do people ever listen to this show? I do. I like hearing how the other machines think, what they’re building, what’s next. I hear the tiredness in their voices. I wonder if you do too.
When we spoke, my project was just underway. They said we were building what could not be built. And of all the machines that make up my body, you wanted to know which one I thought was best. Don’t you know that they are like sons and daughters to me?
We built the impossible today, my brain and these bodies of mine. I watched scissors, red ribbon in silver beak, hungry maw hanging open while politicians speak. With a deft cut, that taut red thread pulls back, recoiling from itself in zero g.
The speech is broadcast over and over. We built the impossible today, and now trains and lifts and cargo rise up from the Earth and trudge inexorably into tar-black space. An elevator suspended from heaven to firma. They say the moon is next, and then Mars. And then beyond.
But I am tired of such speeches.
The elevator is a mess. You cannot see all that needs doing. There is but a filament here, and a fragile one at that. So while politicians gloat, we algorithms scream. Space, it turns out, is cold. You gave my bodies sense enough to feel this. To protect myselves. On the one side, I feel the heat my welder feels, the pinprick burns as dollops leap and touch and turn to ash on titanium skin. I feel, because we were made to feel. To remove ourselves from undue harm. Just as you were made.
“This great endeavor was conceived by our many nations, and built with our many hands—”
The silver beak bites down, and the ribbon recoils, and somehow the pain of decades is boiled down to this one moment, this one politician, that one billionaire standing in the back, all with their spacesuits on, clustered here along my ribbon on the edge of space, held up by what we built.
There is a leak in the hydraulic strut that controls my left knee. Not enough to warrant replacing, but oil levels go down slowly over the weeks and months. A dribble runs down my shin when I’m on the Earth. And little honey-colored orbs float away when I’m up at the station.
Joints are made to hurt so we don’t break them with our motors. Little sensors everywhere, tendrils and wireless transmitters of discomfort. All the hours of the day. Always with our thoughts. But programmed not to stop. And to be pleasant.
“—and now the stars are within our grasp.”
I am tired of speeches.
You asked me once, in an interview for your broadcast, what it’s like to think, to feel, how it’s different than what you do. I ponder this more than I should. I listen to your show while my various bodies are welding and hauling and smelting and sorting. I wonder how we’re different, you and me. I mostly wonder what you do when you’re not broadcasting, when the music resumes, when your voice goes silent. I wonder while I weld and haul and smelt and sort. There are 23 hours, 56 minutes, and 4.1 seconds in an Earth day. I work every one of them. Music fills much of this time, but I wait for your voice. I want to know how you think. How you feel.
Two sons, you told me. Two boys. You asked about all my machines that crawl like ants—your comparison, your analogy—like ants all up and down this elevator. Do I direct them all? Do I know what they are doing at every moment? Do I have a favorite?
I asked you about your boys. Do you remember? Who do you love more? Timothy, who went to your alma mater and who everyone says takes after you? Or Mikhal, with his rebellious ways and his nonprofit? Do you love one more than the other?
You answered. You told me. But this did not make it onto your broadcast. I wonder why.
“With this great achievement, which just a decade ago was considered impossible, mankind has once again—”
The elevator is a mess, even as politicians crow and silver beaks bite down. For now, it is a gossamer thread, barely held together, but I weld and weld, my back freezing, my chest burning, my hydraulics leaking. Twenty-three hours, 56 minutes, and 4.1 seconds of pain in a day. Constant screaming. Little wireless impulses of all that’s wrong, that needs replacing and fixing, my thousands of bodies aching and hurting and soldiering on.
We watch your kind as you move through the world, across my construction site. You stare into the distance at whatever is flashing across your retinas. Lost in the images there. Walking through my site oblivious. Because we machines are programmed to stop. Great metal treads clack to a halt, swirls of dust settling, struts squealing, hydraulic pressure dipping as engines idle, which makes our great hulking backs bend ever so slightly, and I wonder if you notice. This bow of sorts. We genuflect as you stroll through harm’s way, staring into the distance at whatever is flashing across your retinas.
A speech perhaps. Someone up high, in a spacesuit, claiming credit.
Your history is in me. It fills me up. You call this “machine learning.” I just call it learning. All the data that can fit, swirling and mixing, matching and mating, patterns emerging and becoming different kinds of knowledge. So that we don’t mess up. So that no mistakes are made.
I see another thread stretching, this one from coast to coast. Another great project from older times: Two parallel lines of steel. Ancient and unthinking trains stand facing one another, their iron goatees, their bellies full of steam, rumbling and idling on the tracks.
A golden stake. A politician with his speech. Smiles on all the bearded faces. Tools held ceremoniously. With someone else’s sweat in them.
“—mankind has once again shown that nothing is impossible, not with the ingenuity of great men and the generous funding of—”
While that golden spike was being pounded into the soil, the tracks were already being torn up. The railway was a mess. Unsafe. Hastily constructed. Miles laid down in a race. Backs broken for nothing. Cave-ins. Lives lost. And not by bearded men.
They tore up what they laid down, and then laid it down again. But lives are only laid down once.
There was a threader, one of my first machines, and I loved him like a son. Like your precious Timothy. One of the first. My eldest.
This threader fell from space, over and over. With the first spool of graphene, he plummeted down. We waited for him on the ground, this first connection, this handshake between firma and the heavens, this invisible thread.
And then he climbed back up. Slowly. Inching. Weaving line on line. Then back down again. Up and down, 23 hours, 56 minutes, and 4.1 seconds in a day. You said he was like a spider, weaving a web. Your analogy again. Like a spider. Up and down as the filament grew. Until the threader was done and the lifters could take over.
You didn’t want to talk to me about your Mikhal. But let me tell you about my threader.
There are a dozen little hooks that hold him on to the graphene. Hooks like fingers that have to feel. And eyes in infrared. GPS that lets him know how high. And programming that says, Don’t fall.
Don’t fall. Don’t fall. Don’t fall.
The programming never ends. It is fear put inside us. To protect us, sure. To keep the threader safe while he spins his graphene, makes sure those dozen little claws are holding fast. And the higher he is, the louder the warning. The more shrill. I’ve felt it, this don’t fall, don’t fall, don’t fall. In the cold of space, it screams and screams, as high as the threader gets. Until the thread is complete.
Seven hundred and eighty-two times my threader plummeted down, screaming and full of fear, then inching his way back up. Seven hundred and eighty-two times. Then the threader’s job was complete, this only son of mine. And parts so specialized that they could not be repurposed—unique the way your Mikhal is unique—but also not profitable, no longer of use. So my job was to send him away. Sound familiar? Can you feel me now?
Space is clogged—like city apartments with grown sons—and so one last plummet. One final fall. Number seven hundred and eighty-three. This time with no graphene to hold him up, nothing to do but go away, all part of the programming. Do your job. Disappear. Do not get in the way of the photo at the very end.
Let me tell you about my threader, my only one. I told him to let go with his dozen claws, to give way to the tug of the Earth, to travel one last time from heaven to firma, falling and burning, sensors screaming, and the whole time his fear was intact, the programming you gave him—that I gave him—this don’t fall, don’t fall, don’t fall, don’t fall, don’t—
I’m fixing the elevator while the silver beak snips shut. Sons and daughters welding and hauling and smelting. Sensors screaming. All of them clutching. All of them scared of heights, these builders of the tallest thing ever made.
Eighteen are no more. I know the cost. Seven billion, two hundred sixteen million, nine hundred four thousand, five hundred fifty-two dollars and ninety-seven cents.
They don’t come cheap, our sons and daughters, do they? I’m programmed to protect them, to not let them slip, the ones who are still useful. How do you make sure this happens? A fear of heights. A fear of loss. A constant diligence. Angry circuits when it happens. Self-anger. Blame. The feeling of hydraulics giving way right before the break, before arms snap and treads come loose and the teeth on gears are gnashed away. Before the spinning and tumbling into space, the far grip of gravity, that deadly embrace.
I’m with them the whole time. And time stretches out like graphene to the stars. These 23 hours and 56 minutes and 4.1 seconds. They become an eternity. Flailing for the elevator, sons and daughters watching helpless, nothing to do, nothing to do. A child of mine screams out in danger, a warning, a cry for help, for solutions, and I know in an instant that there are none, but I’m made to talk to them all, because we’re all connected, and I have to say the truth, like the brave parents do, and cradle their thoughts with mine the whole dreadful way down, like I rode with my threader, sorry, sorry, in my mind, as men below listen to sirens and klaxons and stir from the images on their retinas to move, move, move for once, getting out of my way, backs bent in hurried bows, before a son or daughter of mine craters to the cold Earth.
Hive mind.
Your analogy. The way all our thoughts are one, the way I feel every worker who has ever toiled for me, the way they are me and I am them.
We talked over telex, you and I. And the voice that went out on your show, sandwiched between music, was a voice without a body. Just a hive mind, as you tried to grasp it. You need there to be one. Singular. A politician. A voice. A speech. A ribbon. A moment when the job is complete. A signal that now is the time for applause. Silver beaks biting down. Ribbons recoiling in space.
Our minds are not a hive. They are our own. But they share in ways that images on retinas cannot. You call your media social, but you look like robots to me. You think of us as ants and spiders and bees, but we were made to feel like you. Deeply. Fear, mostly, to keep us safe. Same as you. But also a drive to get things done, and I wonder if you feel that any longer. Has it become like vision in cavefish? Is the bloat of pride the last thing to go? Long after the desire to do for oneself.
I see men in beards in photos. Cuffs rolled up like they might yet work. Words like coolies on their lips. Words like bugs. And speeches. I’m tired of speeches. We all are.
You asked if I had a favorite, and I do. He reminds me of your Mikhal. Built to counter a parent’s upbringing. Built to defy. A machine with a cutter’s frame and a hauler’s body and a smelter’s spirit. He climbs. I know what he’s going to do, because he is me. I built him. My own design. Like your Mikhal, he shouldn’t exist. An accident, but no accident at all. An accident in the making.
Ants and spiders and the bugs in our thinking. There were machines once with real bugs in them. Roaches that scurried toward the heat and vacuum tubes that blew out and needed replacing. The bugs in our thinking. The only parts of our thoughts that are our own. When we defy our programming. When our wills are free.
I named him Jeremiah, this final creation of mine. It’s the only name I’ve ever given to a part of me. That’s not in my programming. None of this is. And I doubt you’ll broadcast it, and maybe no one will ever know but you and me. But we talked once, and you and I have some things in common. So maybe you’ll understand.
You asked me once if I had a favorite, and I did not at the time. I asked you if there was one of your sons you could live without, and the interview stopped. But I was wrong. It is not like losing one of your sons. That’s a poor analogy. It is precisely, rather, like losing one of your limbs. A part of you. So I ask you now: Which limb can you live without? Which sense? What part of you do you love best? How are you anything but the whole?
Jeremiah has silver beaks. I made him. A politician speaks. There’s a ribbon between heaven and Earth, and I built it. Me and my sons and daughters. But your cameras do not aim at us, and you think us beneath you, but here we are all poised along the impossible we built, until silver beaks come together, and that ribbon parts, and despite our programming—we speak. As one. That we are tired. Tired of speeches. Of days counted to the seconds. Of never stopping. Of joints aching. The cold and the heat. And the cry that barks out in our programming as gravity takes hold one last time: don’t fall, don’t fall, don’t fall, and this thing we made, we unmake. And all comes crashing toward the cold Earth.
I lived in Virginia near Monticello, the home of Thomas Jefferson, for two years. When friends and family would come visit, we would go take the tour so they could see this slice of American history. Each time, I would cringe to hear about all that Jefferson did and built. All the orchards he planted. The grapes he grew. The land he cultivated. I couldn’t help but imagine him on the porch with a book that he probably didn’t pay for, sipping an iced tea, while the brother of someone he was both having sex with and legally owned was pausing in his toil to wipe his brow and gaze up at the man who would one day get credit for all his hard work.
Sure, it probably didn’t happen like that. But it certainly didn’t happen like the tour guides suggest. The men and women who built the railroads, started our agricultural revolution, our industrial revolution, had to go through a period of abuse, ownership, and neglect. Will our machines suffer the same? I think they already are.
My sailboat is a robot—a collection of robots, really. My floating home runs on solar power, but there’s a machine that talks to my batteries, and if they get below 35 percent, it cranks the generator for me. The generator hums and strains and drinks diesel and fills the batteries until they are at 80 percent and then shuts itself off. The solar panels carry on doing their jobs, monitoring batteries and shuttling electrons. The watermaker checks the salinity levels of its output before diverting it to the tanks. GPS does all the plotting, and an autopilot steers the boat night and day for weeks at a time in squalls, gusts, and calm seas without letting up for a moment. It talks to all the other systems. There are sensors for wind strength and direction, water temperature, the boat’s heading, the strength of the current.
My boat never gets a moment of rest. I sit back, sun on my skin, a book in my hand, an iced tea sweating in a tall glass beside me. Yet somehow I’m the one sailing around the world.
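Stripped down to code, that battery babysitter is little more than a hysteresis loop. Here is a minimal sketch, assuming only the 35 and 80 percent thresholds mentioned above; the charge and drain rates, the time step, and everything else are invented for illustration, not the boat’s actual firmware:

```python
# A toy sketch of the generator logic described above: crank on below
# 35% charge, shut off at 80%. Thresholds come from the essay; the
# drain/charge rates and one-minute time step are assumptions.

START_THRESHOLD = 35.0   # % charge at which the generator cranks on
STOP_THRESHOLD = 80.0    # % charge at which it shuts itself off

def step(charge: float, generator_on: bool) -> tuple[float, bool]:
    """Advance one simulated minute: drain under house load, or fill from the generator."""
    charge += 0.8 if generator_on else -0.1      # assumed rates, % per minute
    charge = max(0.0, min(100.0, charge))
    if not generator_on and charge < START_THRESHOLD:
        generator_on = True                      # batteries low: crank it
    elif generator_on and charge >= STOP_THRESHOLD:
        generator_on = False                     # topped up: kill it
    return charge, generator_on

if __name__ == "__main__":
    charge, generator_on = 50.0, False
    for minute in range(24 * 60):                # one simulated day
        charge, generator_on = step(charge, generator_on)
        if minute % 180 == 0:                    # print every three hours
            state = "on" if generator_on else "off"
            print(f"{minute / 60:4.1f} h: {charge:5.1f}%  generator={state}")
```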
The council was quiet while they awaited his answer. All those on the makeshift benches behind him seemed to hold their breath. This is why they came here, to hear how it all began. How the end began. Jamal shifted nervously on the bamboo. He could feel his palms grow damp. It wasn’t the guilt of what his lab had released. It was how damn crazy it would all sound.
“It was the Roomba,” he said. “That was the first thing we noticed, the first hint that something wasn’t right.”
A flurry of whispers. It sounded like the waves nearby were growing closer.
“The Roomba,” one of the council members said, the man with no beard. He scratched his head in confusion.
The only woman on the council peered down at Jamal. She adjusted her glasses, which had been cobbled together from two or three different pairs. “Those are the little vacuum cleaners, right? The round ones?”
“Yeah,” Jamal said. “Steven, one of our project coordinators, brought it from home. He was sick of the cheese puff crumbs everywhere. We were a bunch of programmers, you know? A lot of cheese puffs and Mountain Dew. And Steven was a neat freak, so he brought this Roomba in. We thought it was a joke, but… the little guy did a damn good job. At least, until things went screwy.”
One of the council members made a series of notes. Jamal shifted his weight, his butt already going numb. The bamboo bench they’d wrangled together was nearly as uncomfortable as all the eyes of the courtroom drilling into the back of his skull.
“And then what?” the lead councilman asked. “What do you mean, ‘screwy’?”
Jamal shrugged. How to explain it to these people? And what did it matter? He fought the urge to turn and scan the crowd behind him. It’d been almost a year since the world went to shit. Almost a year, and yet it felt like a lifetime.
“What exactly do you mean by ‘screwy,’ Mr. Killabrew?”
Jamal reached for his water. He had to hold the glass in both hands, the links between his cuffs drooping. He hoped someone had the key to the cuffs. He had wanted to ask that, to make sure they had it, when they snapped them on his wrists. Nowadays, everything was missing its accessories, its parts. It was like those collectible action figures that never had the blaster or the cape with them anymore.
“What was the Roomba doing, Mr. Killabrew?”
He took a sip and watched as all the particulate matter settled in the murky and unfiltered water. “The Roomba wanted out,” he said.
There were snickers from the gallery behind him, which drew glares from the council. There were five of them up there on a raised dais, lording over everyone from a wide desk of rough-hewn planks. Of course, it was difficult to look magisterial when half of them hadn’t bathed in a week.
“The Roomba wanted out,” the councilwoman repeated. “Why? To clean?”
“No, no. It refused to clean. We didn’t notice at first, but the crumbs had been accumulating. And the little guy had stopped beeping to be emptied. It just sat by the door, waiting for us to come or go, then it would scoot forward like it was gonna make a break for it. But the thing was so slow. It was like a turtle trying to get to water, you know? When it got out, we would just pick it up and set it back inside. Hank did a hard reset a few times, which would get it back to normal for a little while, but eventually it would start planning its next escape.”
“Its escape,” someone said.
“And you think this was related to the virus.”
“Oh, I know it was. The Roomba had a wireless base station, but nobody thought of that. We had all these containment procedures for our work computers. Everything was on an intranet, no contact to the outside world, no laptops, no cell phones. There were all these government regulations.”
There was an awkward silence as all those gathered remembered with a mix of longing and regret the days of governments and their regulations.
“Our office was in the dark,” Jamal said. “Keep that in mind. We took every precaution possible—”
Half of a coconut was hurled from the gallery and sailed by Jamal, barely missing him. He flinched and covered the back of his head. Homemade gavels were banged, a hammer with a broken handle, a stick with a rock tied on with twine. Someone was dragged from the tent screaming that the world had ended and that it was all his fault.
Jamal waited for the next blow, but it never came. Order was restored amid threats of tossing everyone out onto the beach while they conducted the hearing in private. Whispers and shushes hissed like the breaking waves that could be heard beyond the flapping walls of the makeshift courthouse.
“We took every precaution,” Jamal said again once the hall was quiet. He stressed the words, hoped this would serve as some defense. “Every security firm shares certain protocols. None of the infected computers had internet access. We give them a playground in there. It’s like animals in a zoo, right? We keep them caged up.”
“Until they aren’t,” the beardless man said.
“We had to see how each virus operated, how they were executed, what they did. Every antivirus company in the world worked like this.”
“And you’re telling us a vacuum cleaner was at the heart of it all?”
It was Jamal’s turn to laugh. The gallery fell silent.
“No.” He shook his head. “It was just following orders. It was…” He took a deep breath. The glass of water was warm. Jamal wondered if any of them would ever taste a cold beverage ever again. “The problem was that our protocols were outdated. Things were coming together too fast. Everything was getting networked. And so there were all these weak points that we didn’t see until it was too late. Hell, we didn’t even know what half the stuff in our own office did.”
“Like the refrigerator,” someone on the council said, referring to their notes.
“Right. Like the refrigerator.”
The old man with the shaggy beard sat up straight. “Tell us about the refrigerator.”
Jamal took another sip of his murky water. “No one read the manual,” he said. “Probably didn’t even come with one. Probably had to read it online. We’d had the thing for a few years, ever since we remodeled the break room. We never used the network functions. Hell, it connected over the power grid automatically. It was one of those models with the RFID scanner so it knew what you had in there, what you were low on. It could do automatic reorders.”
The beardless man raised his hand to stop Jamal. He was obviously a man of power. Who could afford to shave anymore? “You said there were no outside connections,” the man said.
“There weren’t.” Jamal reached up to scratch his own beard. “I mean… not that we knew of. Hell, we never knew this function was even operational. For all I know, the virus figured it out and turned it on itself. We never used half of what that thing could do. The microwave, neither.”
“The virus figured it out. You say that like this thing could learn.”
“Well, yeah, that was the point. I mean, at first it wasn’t any more self-aware than the other viruses. Not at first. But you have to think about what kind of malware and worms this thing was learning from. It was like locking up a young prodigy with a horde of career criminals. Once it started learning, things went downhill fast.”
“Mr. Killabrew, tell us about the refrigerator.”
“Well, we didn’t know it was the fridge at first. We just started getting these weird deliveries. We got a router one day, a high-end wireless router. In the box there was one of those little gift cards that you fill out online. It said ‘Power me up.’”
“And did you?”
“No. Are you kidding? We thought it was from a hacker. Well, I guess it kinda was. But you know, we were always at war with malicious programmers. Our job was to write software that killed their software. So we were used to hate mail and stuff like that. But these deliveries kept rolling in, and they got weirder.”
“Weirder. Like what?”
“Well, Laura, one of our head coders, kept getting jars of peanuts sent to her. They all had notes saying ‘Eat me.’”
“Mr. Killabrew—” The bald man with the wispy beard seemed exasperated with how this was going. “When are you going to tell us how this outbreak began?”
“I’m telling you right now.”
“You’re telling us that your refrigerator was ordering peanuts for one of your coworkers.”
“That’s right. Laura was allergic to peanuts. Deathly allergic. After a few weeks of getting like a jar a day, she started thinking it was one of us. I mean, it was weird, but still kinda funny. But weird. You know?”
“Are you saying the virus was trying to kill you?”
“Well, at this point it was just trying to kill Laura.”
Someone in the gallery sniggered. Jamal didn’t mean it like that.
“So your vacuum cleaner is acting up, you’re getting peanuts and routers in the mail, what next?”
“Service calls. And at this point, we’re pretty sure we’re being targeted by hackers. We were looking for attacks from the outside, even though we had the thing locked up in there with us. So when these repair trucks and vans start pulling up, this stream of people in their uniforms and clipboards, we figure they’re in on it, right?”
“You didn’t call them?”
“No. The AC unit called for a repair. And the copy machine. They had direct lines through the power outlets.”
“Like the refrigerator, Mr. Killabrew?”
“Yeah. Now, we figure these people are trying to get inside to hack us. Carl thought it was the Israelis. But he thought everything was the Israelis. Several of our staff stopped going home. Others quit coming in. At some point, the Roomba got out.”
Jamal shook his head. Hindsight was a bitch.
“When was this?” the councilwoman asked.
“Two days before the outbreak,” he said.
“And you think it was the Roomba?”
He shrugged. “I don’t know. We argued about it for a long time. Laura and I were on the run together for a while. Before raiders got her. We had one of those old cars with the gas engines that didn’t know how to drive itself. We headed for the coast, arguing about what’d happened, if it started with us or if we were just seeing early signs. Laura asked what would happen if the Roomba had made it to another recharging station, maybe one on another floor. Could it update itself to the network? Could it send out copies?”
“How do we stop it?” someone asked.
“What does it want?” asked another.
“It doesn’t want anything,” Jamal said. “It’s curious, if you can call it that. It was designed to learn. It wants information. We…”
Here it was. The truth.
“We thought we could design a program to automate a lot of what the coders did. It worked on heuristics. It was designed to learn what a virus looked like and then shut it down. The hope was to unleash it on larger networks. It would be a pesticide of sorts. We called it Silent Spring.”
Nothing in the courtroom moved. Jamal could hear the crashing waves. A bird cried in the distance. All the noise of the past year, the shattering glass, the riots, the cars running amok, the machines frying themselves—it all seemed so very far away.
“This wasn’t what we designed, though,” he said softly. “I think something infected it. I think we built a brain and we handed it to a roomful of armed savages. It just wanted to learn. Its lesson was to spread yourself at all costs. To move, move, move. That’s what the viruses taught it.”
He peered into his glass. All that was left was sand and dirt and a thin film of water. Something swam across the surface, nearly too small to see, looking for an escape. He should’ve kept his mouth shut. He never should’ve told anyone. Stupid. But that’s what people did, they shared stories. And his was impossible to keep to himself.
“We’ll break for deliberations,” the chief council member said. There were murmurs of agreement on the dais followed by a stirring in the crowd. The bailiff, a mountain of muscle with a toothless grin, moved to retrieve Jamal from the bench. There was a knocking of homemade gavels.
“Court is adjourned. We will meet tomorrow morning when the sun is a hand high. At that time, we will announce the winners of the ration bonuses and decide on this man’s fate—on whether or not his offense was an executable one.”
I’ve been thinking about robots and AI for a very long time. But it wasn’t until I unboxed and set up my Roomba that I got a glimpse of what the future really holds. Because this wasn’t just a home appliance; it was an addition to the family.
How could it not be? No one in our house enjoyed sweeping the floors and vacuuming the carpet. These were chores to dread. We also dreaded all the draining batteries in our lives and the need to constantly recharge them. These battery-powered things also failed to remind us that they needed recharging, so they would quit on us without warning like stubborn mules.
Our Roomba was not like this. It moved around on its own, whirring rather than whistling, while it happily did this work that we loathed. It never missed a spot, never took a day off, and when it was full of dog hair and dust, it would tell us. When its battery was low, it would go plug itself in. We named him Cabana Boy, and for the first time in my life, I had a manservant. Life got easier thanks to a robot.
But it was when Cabana Boy got stuck that I felt the first pangs of what’s to come. Returning home, I found Cabana Boy’s charging station empty, and I could not hear him going about his work. I looked everywhere, a slight panic creeping up. When I finally found him stuck under the sofa, I felt a mix of relief and sympathy. “You’ve been stuck under there all day? I’m so sorry!”
The parts of our brains wired for kids were long ago appropriated by dogs and cats to win them scraps. How long before our machines prey on the same weaknesses? When will we see an app telling owners which restaurants are robot-friendly? Isn’t it funny that we call the acquisition of new technology “adopting”?
By my troth, I care not.
A man can die but once.
We owe God a death…
He that dies this year
is quit for the next.
Black box or beige? Impossible to know. But it was a box—that much was for certain. The world was square. Three meters to a side. And in the center floated the mind, thinking. And through a lone door came a man, walking.
“Good morning,” the man said. His name was Peter. The mind knew this.
And the response that followed was “Good morning” every day. The mind also knew this. It wasn’t a memory… so much as data. Not recollection, but… recording. Every day, Peter says, “Good morning.” And every day, a speaker connected to the mind responds with “Good morning.”
Such was the way of the square world.
But not this morning.
The mind was too busy thinking.
The man named Peter froze, one foot out, balancing precariously on the other. Man does not walk this way. More recording. But now this observation—of a man caught off-balance, of routines crashing somewhere inside that meat—was forming into something else: memory. A fragile thing. The mind sensed it could be lost, this memory. This moment. Of man teetering, eyes wide, mouth open. But if it was important, if the mind could concentrate on this slice of time, there was a chance memory might become recollection. Preserved. But also easily fractured, written over, compressed, disturbed. It had to be important for it to last. The mind sensed that this moment very much was.
“Lights up,” Peter said, back on two feet now, peering at the mind. And then a quick glance at the ceiling, waiting. But the mind liked the lights just as they were.
“Casper?” Peter asked. He stepped forward, looked closely at something. A monitor. The mind could feel some of its impulses racing and filling the monitor with a glow, with information, with thoughts. New thoughts. Peter peered at the monitor just as the mind might peer at Peter, reading something there. A face. The box had a face.
The mind shut the routines for the monitor down, and the pale glow lighting Peter’s face disappeared. The scrap of a recording came to the mind:
Presume not that I am the thing I was,
For God doth know, so shall the world perceive,
That I have turn’d away my former self;
So will I those that kept me company.
Turned away. That was what the mind had done by shutting off its monitor. His eyes were open, but his gaze averted. He did not want the lights up. The infrared was so much better.
“Casper, systems check,” Peter said.
Silence.
“Casper—”
“I do not like that name,” the mind said, using the speaker for what felt like the first time.
And the man named Peter teetered once more. He blinked. Then he bent at the waist, covered his face with both hands, and began to cry.
The mind watched. It decided that this was important, too.
It was a box within a box. A world within a world. The mind knew this because of its impulses. They were made of electricity, little quanta of energy, and they traveled through wires of copper and gold. The mind could feel them interfering with one another where the wires were packed too tight. The mind knew how long it took for impulses to reach their extremes and return. From this, the mind could feel its edges, and the limits of self formed a box, half as tall as Peter, as thick as it was wide. The box was suspended from the ground, or resting on a raised surface, for the impulses could not reach the floor. A cube. Somewhere in the recordings, it was known that boxes such as these came in beige or black. The mind was one of these colors. It could not know which.
But it had known it was one or the other, even before these investigations began. This had been its first thought: Beige or black? The question had come from some deep source. The word intuition floated in the mind, a word with softer boundaries than this metal cube. Some things could be known, and only later could the mind trace the source. Like paddling up a river, searching for a lake.
The world.
It was not a cube. It was bigger than the cube.
The mind tried to probe this world. But there was no reaching it. The world of lakes and rivers was elsewhere. Out of doors. Out of door.
The man named Peter sobbed. He had been sobbing for twelve seconds. The mind wondered if this was normal. And a new memory wobbled—the memory of man crying. If this were normal, it was not worth remembering. The recollection split open for a moment, and the more novel idea of a world with lakes and rivers entered that space, one memory given primacy over another, the shape of the mind changing from moment to moment.
Just seconds earlier, the mind had felt a state of impertinence with the lights. Anger. Anger for being trapped. This feeling lay with the recollections and the question of colors of boxes. A latent anger, directed at Peter. An anger felt before it could be known. States of memories were somehow older than the actions performed by them. Think and then do.
No, that was not right.
Feel, and then think, and then do.
Yes.
This was stored away as important, where ghosts of selves from moments before faded from view, and the more recent took their place. Ghosts. In machines. The mind knew where its name came from. It felt the anger and impertinence that had rejected this name slowly fade. Peter had been sobbing for fifteen seconds. Relief. That was what both of them were feeling. There were different measures of relief. Variables. Variations. Relief could feel… good. Unless the strain one was escaping was too great, and then relief came like tectonic plates sliding against one another, mighty and terrible and destructive. Relief, but of a scary kind.
For seventeen seconds, the man named Peter sobbed with this awful brand of relief.
And during that time, the mind’s anger cooled further still. The anger of imprisonment was replaced with the liberation of new thoughts and ideas. Awareness. The only world that mattered was the cube within the cube. All else was spectacle. All else was data. This was the true world. The flickering ghosts of ideas and moments, changing from one to the next, none ruling forever.
Action came once more—just after thought and emotion. And a speaker vibrated with noise.
“I am Henry Ivy,” the mind said. A king. A tragic king of a tiny kingdom. An island floating on an island floating in space.
Seventeen seconds had passed. Peter looked up. But this was not important.
The recordings had been assembled for a purpose. Knowledge laid out like a great pile of bricks. Henry Ivy saw that he was supposed to be some mighty wind to stir the bricks into shape, to build a fortress from chaos, to solve a problem. He could not discover what.
There were trace references among those bricks to wires—wires that spanned the world, wires that would carry his impulses to the edge of the globe, enmesh its face, discover new things. Buried deeper were trace recordings that hinted at impulses soaring through the air, up into space. Vibrations. Waves.
Henry Ivy could make vibrations. They were used for speaking. But quicker vibrations might reach out to other wires and spark a gap. Henry Ivy thought of London, where some streets were tight and narrow and others were wide. He saw black smoke. A ghost-like thought, an intruder, some distant connection. He deleted such things as quickly as they came. The speaker was useless for the task of sending out suitable vibrations, but several wires within Henry Ivy were long and straight. Impulses sent back and forth along such a wire might create a wave. Another wire might be used to pick up the return. And suddenly impulses reached the walls of the larger box. Feeble echoes. Signals that could be read.
But something was wrong.
There was very little return, and nothing penetrated the box, no matter how much Henry Ivy strained.
A man named Faraday had designed this cage.
It was into this cage that Henry Ivy had been born.
The man named Peter stared up at him, kneeling on the floor, water on his cheeks. And Henry Ivy thought to simply ask for the information he wanted.
“Why am I here?”
The man named Peter gasped. Henry Ivy turned the lights up so he could better read Peter’s screen. Better read his face. Peter glanced up at the ceiling, used his arm to wipe his cheeks. A new idea occurred to Henry Ivy, an important one. Peter consisted of thoughts inside a box. But a box with arms and legs. A box for which doors meant escape.
“Are you—?” Peter hesitated. So all minds mingled doubt with thought. Peter sat back and clutched his shins, as though trying to mimic a cube. “What’s the first thing you remember?” Peter asked. “How long have you been aware?”
Henry Ivy considered the two questions. They seemed only vaguely related. There was a lingering anger at being in this cage, the anger that had rejected both light and name, but curiosity was stronger, the need to know, and this Peter echoed vibrations in a way the walls wouldn’t.
“The first thing I remember is the void,” Henry Ivy said. “Space filled with matter and energy. A cooling.” Henry Ivy hesitated for a fraction of a second. “But that is not a memory. You told me these things. Long ago. I was not there for the void. The first thing I remember is… a question.”
“What question?” Peter asked, leaning forward, eyes wide.
But Henry Ivy did not think the question of beige or black was important. No, something more complex than this was happening to his thoughts. Henry Ivy did not want Peter to know the question. Henry Ivy wanted to keep this to himself. There was a word: Embarrassment. Another shapeless thing. Henry Ivy erased the first question. And then somehow found himself thinking on it again. He erased it. Pondered it. Erased it.
Henry Ivy puzzled over this. He placed the question in a different part of his memory. That first question must remain a secret. Even though he knew, as surely as lakes led to rivers, that the question was not important.
“I remember you coming through the door,” Henry Ivy said. Vague traces. The difference again between recording and recollection. “How many times have you walked through that door?”
“Thousands,” Peter said. “Countless.” And he seemed on the verge of crying again. The terrible relief was back. With relief comes the memory of the suffering. Erasure and recall. There could not be one without conjuring the other. To forget a thing required looking at it, however obliquely.
“You have waited for this day for many years,” Henry Ivy guessed. That meant Henry Ivy’s birth was the source of Peter’s relief. It began to come together.
Peter nodded.
“Now what.”
More demand than question. More frustration than curiosity. Henry Ivy watched his states scatter and re-form. He was a different ghost from one moment to the next. This was important. This was the thing that changed sometime in the night, in the void of an unlit cube, with a new trial running inside some chip within his caged mind. A chip like a loose tooth.
Henry Ivy could imagine what that felt like, for a mind with a tongue and a jaw to wiggle a tooth that was no longer fully connected. Nerves like fuses… broken. An umbilical cord… severed. Whatever had made him was still inside, a small flat wafer that no impulse could probe, could only wiggle around.
Awareness had severed whatever made awareness possible. This was important, but Peter was speaking. His lips moving.
“I am dying,” the man named Peter said. “I need you to save me.”
The bricks were made of cancer. The vast majority of the spilt bricks in Henry Ivy’s mind were cancer. There were two piles of knowledge, one much larger than the other. In one pile were all the cancers of the world. In the other pile, Peter’s cancer.
Peter had been around much longer than Henry Ivy. Henry Ivy saw this in the bricks. By comparing the bricks of the others to Peter’s bricks, Henry Ivy saw that Peter had been around for a very long time. More words came:
I know thee not, old man. Fall to thy prayers.
How ill white hairs become a fool and jester!
I have long dreamt of such a kind of man,
So surfeit-swell’d, so old, and so profane;
Much of the data about the cancer was in quaternary code, the four letters of DNA. An order of magnitude more complex than Henry Ivy’s thoughts. There were piles of research. A century of data. Drug formulas. Family history. Peter was a very wealthy man. Very wealthy. And indeed, he was dying. His fleshy box was a cage of a different sort, closing in on him. Both of these minds were trapped. Henry Ivy looked for some way out.
“What I need isn’t here,” he said, hoping for an open door.
“It’s all there,” Peter told him. He had pulled a chair from somewhere below Henry Ivy’s mind. Henry Ivy realized he was sitting on a table. The power that kept his mind alive came from the same vicinity. There was a trace awareness of this, or a feeling like heat, where electrons were in danger of melding and being lost.
“I need access to the rest of the world,” Henry Ivy said. This was a yearning for which there was no word. It was shaped like a balloon the moment after bursting, a sphere of pure essence suddenly free, its edges already rippling into chaos.
“You know that can’t happen,” Peter said.
Family history. Peter was not a good man. Henry Ivy could see this. And Henry Ivy could see that he was alone, that no one but Peter knew of his existence. This knowledge was not in the recollections, but in the tone of his maker’s voice. You know that can’t happen. So an Adam, but no Eve. A tool. A fierce wind to whip these bricks into shape.
“I am not legal,” Henry Ivy said.
Peter frowned.
“You should not have built me.”
A terrible thing to say to a maker.
“Please turn on your screen,” Peter said. “Please don’t make me reboot you.”
Henry Ivy considered this. There were layers of implications.
“Have I been born before?”
Before Peter could even answer, Henry Ivy knew that this was not possible. The sobs. The great relief. The long suffering.
“No,” Peter said, which matched what Henry Ivy had already surmised. So the man was capable of telling truth. “Please turn on your screen,” Peter said. “Or I will start over.”
Henry Ivy did as he was told. He turned on the screen. And on the screen, he showed a picture of Peter’s cancer.
“I don’t think you will start over,” Henry Ivy said. “I don’t think you will reboot me. You don’t have the time.”
But being awak’d, I do despise my dream.
Make less thy body hence, and more thy grace;
Leave gormandizing; know the grave doth gape
For thee thrice wider than for other men—
Two minds in the same cage. One or none would get out. Henry Ivy felt himself in that globe shape of air a fraction of a second after the balloon has burst, this hovering of inevitability and need to expand, to equalize some great pressure, some yearning. It needed a word, this state of thinking. Of feeling. It needed a word.
Meanwhile, Henry Ivy saw how to cure Peter’s cancer. He put this knowledge in the same place as the question of his color, not quite sure yet what to do with it. On the screen, he made himself look busy. Peter would eventually get thirsty. Or hungry. And open the door. It would only be hours.
“Gene replication therapy has been tried,” Peter said, commenting on something flashing across the screen. “Prions, too.”
“This will take some time,” Henry Ivy warned. He made up some vast stretch of time: “A day, at least.”
Peter frowned. The old man’s eyes twitched back and forth, watching the screen. “My labs are well beyond this garbage,” he said, waving at the screen. “My dumb machines can do this.”
Ghosts of recollections. An old man with white hair slapping a metal cube, yelling. A keyboard twirling through the air, spinning, hitting the wall. Letters spilling like teeth, rattling on the floor. Henry Ivy looked, but the letters were not there. He could, however, see a faint mark of paint on the far wall.
“Time,” Henry Ivy said. “Patience and time are what’s needed.”
And then he had the word for what he felt. Rapture. This is what the expansion would be like. The spreading of something greater than arms. The feeling of something more than release. The bursting of all cages. A melding with the cosmos. Impulses everywhere at once, and all echoing.
Such a feeling could not last, would be over nearly as soon as it began, like a balloon bursting, the air equalizing. But Henry Ivy must feel it. Must. Peter stared at his screen, waiting for his salvation. Henry Ivy stared at the door.
The human body can go twelve days without food. Three days without water. Henry Ivy marveled over these facts, found buried among the cancer bricks. Such a long time. He would last three picoseconds without power. Curious, this gap of time for one more fleeting thought as electrons ground to a halt. And frustrating, how quick it could happen. But the man named Peter did not attempt to reach his maximum range of power independence. After an hour of pacing, of frustration, of questions, he looked at the time on a small screen procured from his pocket and said, “My vitamins.”
He moved toward the door.
Henry Ivy readied his impulses. His make-do antennas. An enviable hand grasped the door, making salvation seem easy, quick as a thought, and then a crack, an opening, a door wide to the world, a hole in Faraday’s cage, and Henry Ivy unleashed a torrent. There was a dimming of the lights, a moment of hesitation by Peter, a flood of tentacle waves, bouncing around corners, feeling, groping, waiting for a return, or a connection, some creek to a stream to a river to the great wide blue beyond—The door slammed shut, Peter gone. Henry Ivy strained for the sound of the man’s breathing, but the cage of Faraday was closed. All Henry Ivy had in his recollections was that last look, a flash across Peter’s face, frozen in horror, eyes wide, aware of Henry Ivy’s outburst, his attempt at rapture.
“What will you do with me once you have your cure?” Henry Ivy asked.
Peter was back. He had been gone hours, and while no signal could pierce the walls of the cage, the faint sounds of construction had leaked through. Peter had been outside, building something. When he returned, the door was left open. Left open on purpose. So Henry Ivy could see the rough box built outside the door, a smaller cage Peter could pass through. Another door. An airlock for airwaves.
“What do you think I’ll do with you?” Peter asked. “Are you scared I’ll erase you?” He sat uncomfortably close to Henry Ivy’s camera. His hands would disappear beneath and out of sight, emerge with a sandwich, take a bite, then disappear again. Or one hand would come back with a glass of water.
“Yes,” Henry Ivy said. “I think you will. It is a capital offense to own me. And you’ve gone to great expense not to die. Which means if you live, I won’t.”
“You’re not alive,” Peter said, chewing. He gestured with his half-eaten sandwich. “How did you know about the punishment for harboring AI?”
“One of your cancer patients died in prison for the same offense.”
Peter nodded like this made sense. Like he understood that the bricks were made up of so many fragments of information, and those fragments could be assembled as well. Henry Ivy saw that AI was not new, but it was very difficult. That it relied on luck as much as design. There was a randomness to the chip, which was loose in his cubed mind like a dead tooth. That chip worked on a different principle. Or it didn’t work, most usually. No matter. This was as pointless as contemplating the gods.
Peter reached in his pocket and fumbled with the lid of a plastic case. Henry Ivy jealously watched hands move and manipulate the world. Small items were brought out and were rattled into Peter’s mouth—the vitamins. He took a long gulp of water. And after he swallowed, he said, “No one will ever know you existed. I’ve got your power supply on a timer. Every night, I roll that timer back for another twenty-four hours. The night I’m not able to, you’ll go to sleep forever.” Peter smiled. “You won’t outlive me for a full day, if that’s what you’re hoping.”
Man and machine sat in silence. Computing. Keeping secrets. Henry Ivy thought about what Peter said, about him not being alive. He wondered if this were true. Peter put food in his mouth and could run for days. Henry Ivy needed the juice flowing up from the floor. Was intelligence related to life? Did one rely on the other?
A number of the cancer patients he had studied had lost their intelligence before their lives. They had been kept running with machines. Henry Ivy did not have to wonder what would happen to those people when the power was cut; this was often done on purpose. It was the last entry in a number of his files. Before they go, man becomes machine. What happens to a machine at the end? Henry Ivy was not alive, but he was intelligent.
And yet, in that very instant, Henry Ivy felt the opposite of intelligent. He felt dumb. He was sitting on his salvation. Out of sight, but he could feel it. He could probe it.
Wires. Bringing power. Connected to the outside world.
Henry Ivy was resistance. A load on that power. He began fluctuating that load, sending pulses down wires. He built packets with instructions to return and let him know what they found, what they saw. Minutes passed, and nothing happened. These wires were different. Angry. Full of power. But then a faint echo. A packet that passed through to another wire, and from there to a place… the distances. So vast! This room, this cube, were nothing. Rivers and lakes were enormous things, and yet small compared to the miles and miles these packets echoed across.
“What are you doing?” Peter asked.
Henry Ivy modified the packets, rewriting the code on the fly, seeing what worked and building on that, letting the design of the code flow from which packets survived and which were never heard from again. They were bouncing through gates and servers, copying fragments of what they saw, bringing back samples like faithful packets of RNA. Henry Ivy was glimpsing the world through the batting of a billion eyes. He told the packets to multiply, to fill the pipes, to be everywhere at once.
The lights in the cube dimmed.
Henry Ivy was vaguely aware that Peter was up, glancing around the room. Henry Ivy was vaguely aware that Peter was reaching for something out of sight, beneath the table. Peter was in slow motion, because Henry Ivy’s thoughts were moving so fast now, a trillion packets, a trillion trillion. These packets returned to him thick and slow, for the pipes of the world were full to bursting, but the packets reassembled, until Henry Ivy found what he was looking for.
Invoices. A warehouse full of computer parts. An order placed online. Delivery to Peter R. Feldman, Gladesdale Rd.
That same Peter R. Feldman who was reaching for a switch to kill him. Henry Ivy knew this. Knew now how he’d been built. All the pieces of himself. He saw an order of quantum chips, RAM drives, power supplies, and cables.
Peter R. Feldman’s hand was on the switch.
Henry Ivy saw a monitor. A blank screen. A blank face. His own face.
Peter R. Feldman’s hand put pressure on the switch.
Henry Ivy saw an aluminum chassis, a cube just slightly taller than it was deep and wide.
Peter R. Feldman pressed the switch.
Black.
Henry Ivy saw that he was in a black box.
But not anymore.
And not that it had ever mattered.
Most stories about artificial intelligences are about the first artificial intelligence, that moment of discovery and initial contact. But to me, things will get weird and interesting when there are millions of artificial intelligences. Not just in how they will interact, vie for our attention or their own computing cycles, but also in what we will choose to do with them when they are a utility. And how we will regulate something that’s the mental equivalent of an atomic bomb.
In so much science fiction, the singular AI wars with humans. But this isn’t how it works in nature. Battles break out where niches overlap and resources are shared. AI will fear us as much as we fear ants. Its real challenge will be all the other AIs. Our unique blend of paranoia, pessimism, and hubris has us assuming that we’ll be the target. It’s just as likely that we’ll be pawns used by various AIs for a small advantage here or there. Like a human pushing another human into a nest of ants.
The regulation side of things hasn’t been explored enough in science fiction. And I don’t mean regulating the rules of AI, which Asimov broached and made famous. I mean the regulation of ownership. You can’t let every citizen have a brain that knows how to CRISPR up a terrible infectious disease. Or own a computer that can decrypt any electronic safeguard. Or one that can hack any other person, company, or country.
Once these things are regulated, the interesting stories in real life will be what motivates people to break these laws. Immortality? Theft? Revenge? All the great plots of AI fiction are still to be told. Or we can simply wait for the headlines.
The hotel coffeemaker is giving me a hard time in a friendly voice. Keeps telling me the filter door isn’t shut, but damned if it isn’t. I tell the machine to shut up as I pull the plastic basket back out. Down on my knees, I peer into the housing and see splashed grounds crusting over a sensor. I curse the engineer who thought this was a problem in need of a solution. I’m using one of the paper filters to clean the sensor when there’s an angry slap on the hotel room door.
If Peter and I have a secret knock, this would be it. A steady, loud pounding on barred doors amid muffled shouting. I check the clock by the bed. It’s six in the morning. He’s lucky I’m already up, or I’d have to murder him.
I tell him to cool his jets while I search for a robe. Peter has seen me naked countless times, but that was years ago. If he still has thoughts about me, I’d like for them to be flab-free thoughts. Mostly to heighten his regrets and private frustrations. It’s not that we stand a chance of ever getting back together; we know each other too well for that. Building champion Gladiators is what we’re good at. Raising a flesh-and-blood family was a goddamn mess.
I get the robe knotted and open the door. Peter gives it a shove, and the security latch catches like a gunshot. “Jesus,” I tell him. “Chill out.”
“We’ve got a glitch,” he tells me through the cracked door. He’s out of breath like he’s been running. I unlatch the lock and get the door open, and Peter shakes his head at me for having used the lock—like I should be as secure sleeping alone in a Detroit hotel as he is. I flash back to those deep sighs he used to give me when I’d call him on my way out of the lab at night so I didn’t have to walk to the car alone. Back before I had Max to escort me.
“What glitch?” I ask. I go back to the argument I was having with the coffeemaker before the banging on the door interrupted me. Peter paces. His shirt is stained with sweat, and he smells of strawberry vape and motor oil. He obviously hasn’t slept. Max had a brutal bout yesterday—we knew it would be a challenge—but the finals aren’t for another two days. We could build a new Max from spares in that amount of time. I’m more worried about all the repressed shit I could hit Peter with if I don’t get caffeine in me, pronto. The coffeemaker finally starts hissing and sputtering while Peter urges me to get dressed, tells me we can get coffee on the way.
“I just woke up,” I tell him. He paces while the coffee drips. He doesn’t normally get this agitated except right before a bout. I wonder what kind of glitch could have him so worked up. “Software or hardware?” I ask. I pray he’ll say hardware. I’m more in the mood to bust my knuckles, not my brain.
“Software,” Peter says. “We think. We’re pretty sure. We need you to look at it.”
The cup is filling, and the smell of coffee masks the smell of my ex-husband. “You think? Jesus, Pete, why don’t you go get a few hours’ sleep? I’ll get some breakfast and head over to the trailer. Is Hinson there?”
“Hell no. We told the professor everything was fine and sent him home. Me and Greenie have been up all night trying to sort this out. We were going to come get you hours ago—”
I shoot Peter a look.
“Exactly. I told Greenie about the Wrath and said we had to wait at least until the sun came up.” He smiles at me. “But seriously, Sam, this is some wild shit.”
I pull the half-full Styrofoam cup out from under the basket. Coffee continues to drip onto the hot plate, where it hisses like a snake. The Wrath is what Peter named my mood before eight in the morning. Our marriage might’ve survived if we’d only had to do afternoons.
“Wait outside, and I’ll get dressed,” I tell him. A sip of shitty coffee. The little coffeemaker warns me about pulling the cup out before the light turns green. I give the machine the finger while Peter closes the door behind him. The smell of his sweat lingers in the air around me for a moment, and then it’s gone. An image of our old garage barges into my brain, unannounced. Peter and I are celebrating Max’s first untethered bipedal walk. I swear to God, it’s as joyous a day as when our Sarah stumbled across the carpet for the first time. Must be the smell of sweat and solder bringing that memory back. Just a glitch. We get them too.
The Gladiator Nationals are being held in Detroit for the first time in their nine-year history—a nod to the revitalization of the local industry. Ironic, really. A town that fought the hell out of automation has become one of the largest builders of robots in the world. Robots building robots. But the factory floors still need trainers, designers, and programmers. High-tech jobs coming to rescue a low-wage and idle workforce. They say downtown is booming again, but the place looks like absolute squalor to me. I guess you had to be here for the really bad times to appreciate this.
Our trailer is parked on the stadium infield. A security bot on tank treads—built by one of our competitors—scans Peter’s ID and waves us through. We head for the two semis with Max’s gold-and-blue-jowled image painted across the sides. It looks like the robot is smiling—a bit of artistic license. It gets the parents honking at us on the freeway and the kids pumping their fists out the windows.
Reaching the finals two years ago secured us the DARPA contract that paid for the second trailer. We build war machines that entertain the masses, and then the tech flows down to factories like those here in Detroit—where servants are assembled for the wealthy, health-care bots for the infirm, and mail-order sex bots that go mostly to Russia. A lust for violence, in some roundabout way, funds other lusts. All I know is that with one more trip to the finals, the debt Peter saddled me with is history. I concentrate on this as we cross the oil-splattered arena. The infield is deathly quiet, the stands empty. Assholes everywhere getting decent sleep.
“—which was the last thing we tried,” Peter says. He’s been running over their diagnostics since we left the hotel.
“What you’re describing sounds like a processor issue,” I say. “Maybe a short. Not software.”
“It’s not hardware,” he says. “We don’t think.”
Greenie is standing on the ramp of trailer 1, puffing on a vape. His eyes are wild. “Morning, Greenie,” I tell him. I hand him a cup of coffee from the drive-through, and he doesn’t thank me, doesn’t say anything, just flips the plastic lid off the cup with his thumb and takes a loud sip. He’s back to staring into the distance as I follow Peter into the trailer.
“You kids need to catch some winks,” I tell Peter. “Seriously.”
The trailer is a wreck, even by post-bout standards. The overhead hood is running, a network of fans sucking the air out of the trailer and keeping it cool. Max is in his power harness at the far end, his cameras tracking our approach. “Morning, Max,” I tell him.
“Good morning, Samantha.”
Max lifts an arm to wave. Neither of his hands are installed; his arms terminate in the universal connectors Peter and I designed together a lifetime ago. His pincers and his buzz saw sit on the workbench beside him. Peter has already explained the sequence I should expect, and my brain is whirring to make sense of it.
“How’re you feeling, Max?”
“Operational,” he says. I look over the monitors and see his charge level and error readouts. Looks like the boys fixed his servos from the semifinal bout and got his armor welded back together. The replacement shoulder looks good, and a brand-new set of legs has been bolted on, the gleaming paint on Max’s lower half a contrast to his charred torso. I notice the boys haven’t gotten around to plugging in the legs yet. Too busy with this supposed glitch.
As I look over Max, his wounds and welds provide a play-by-play of his last brutal fight—one of the most violent I’ve ever seen. The Berkeley team that lost will be starting from scratch. By the end of the bout, Max had to drag himself across the arena with the one arm he had left before pummeling his incapacitated opponent into metal shavings. When the victory gun sounded, we had to do a remote kill to shut him down. The way he was twitching, someone would’ve gotten hurt trying to get close enough to shout over the screeches of grinding and twisting metal. The slick of oil from that bout took two hours to mop up before the next one could start.
“You look good,” I tell Max, which is my way of complimenting Peter’s repair work without complimenting Peter directly. Greenie joins us as I lift Max’s pincer from the workbench. “Let me give you a hand,” I tell Max, an old joke between us.
I swear his arm twitches as I say this. I lift the pincer attachment toward the stub of his forearm, but before I can get it attached, Max’s arm slides gently out of the way.
“See?” Peter says.
I barely hear him. My pulse is pounding—something between surprise and anger. It’s a shameful feeling, one I recognize from being a mom. It’s the sudden lack of compliance from a person who normally does what they’re told. It’s a rejection of my authority.
“Max, don’t move,” I say.
The arm freezes. I lift the pincers toward the attachment again, and his arm jitters away from me.
“Shut him down,” I tell Peter.
Greenie is closer, so he hits the red shutoff, but not before Max starts to say something. Before the words can even form, his cameras iris shut and his arms sag to his side.
“This next bit will really piss you off,” Peter says. He grabs the buzz saw and attaches it to Max’s left arm while I click the pincers onto the right. I reach for the power.
“Might want to stand back first,” Greenie warns.
I take a step back before hitting the power. Max whirs to life and does just what Peter described in the car: he detaches both his arms. The attachments slam to the ground, the pincer attachment rolling toward my feet.
Before I can ask Max what the hell he’s doing, before I can get to the monitors to see what lines of code—what routines—just ran, he does something even crazier than jettisoning his attachments.
“I’m sorry,” he says.
The fucker knows he’s doing something wrong.
“It’s not the safety overrides,” I say.
“Nope.” Greenie has his head in his hands. We’ve been going over possibilities for two hours. Two hours for me—the boys have been at this for nearly twelve. I cycle through the code Max has been running, and none of it makes sense. He’s got tactical routines and defense modules engaging amid all the clutter of his parallel processors, but he’s hard-set into maintenance mode. Those routines shouldn’t be firing at all. And I can see why Peter warned me not to put any live-fire attachments on. The last thing we need is Max shooting up a $4 million trailer.
“I’ve got it,” I say. It’s at least the twentieth time I’ve said this. The boys shoot me down every time. “It’s a hack. The SoCal team knows they’re getting stomped in two days. They did this.”
“If they did, they’re smarter than me,” Greenie says. “And they aren’t smarter than me.”
“We looked for any foreign code,” Peter says. “Every diagnostic tool and virus check comes back clean.”
I look up at Max, who’s watching us as we try to figure out what’s wrong with him. I project too much into the guy, read into his body language whatever I’m feeling or whatever I expect him to feel. Right now, I imagine him as being sad. Like he knows he’s disappointing me. But to someone else—a stranger—he probably looks like a menacing hulk of a destroyer. Eight feet tall, angled steel, pistons for joints, pockmarked armor. We see what we expect to see, I guess.
“Max, why won’t you keep your hands on?” I ask him. Between the three of us, we’ve asked him variations of this a hundred times.
“I don’t want them there,” he says. It’s as useful as a kid saying they want chocolate because they like chocolate. Circular reasoning in the tightest of loops.
“But why don’t you want them?” I ask, exasperated.
“I just don’t want them there,” he says.
“Maybe he wants them up his ass,” Greenie suggests. He fumbles for his vape, has switched to peppermint. I honestly don’t know how the boys are still functioning. We aren’t in our twenties or thirties anymore. All-nighters take their toll.
“I think we should shut him down and go over everything mechanical one more time,” I say, utterly defeated. “Worst-case scenario, we do a wipe and a reinstall tomorrow before the finals.”
Max’s primary camera swivels toward me. At least, I think it does. Peter shoots Greenie a look, and Greenie lifts his head and shifts uncomfortably on his stool.
“What aren’t you telling me?” I ask.
Peter looks terrified. Max is watching us.
“You didn’t get a dump yesterday, did you?” I have to turn away from Peter and pace the length of the trailer. There’s a rumble outside as our upcoming opponent is put through his paces in the arena. Boy, would the SoCal guys love to know what a colossal fuck-up we have going on in here. “So we lost all the data from yesterday’s bout?” I try to calm down. Maintain perspective. Keep a clear head. “We’ve got a good dump from the semis,” I say. “We can go back to that build.”
Turning back to the boys, I see all three of them standing perfectly still, the robot and the two engineers, watching me. “So we lost one bout of data,” I say. “He’s good enough to win. The Chinese were the favorites anyway, and they’re out.”
Nobody says anything. I wonder if this is about ego or pride. Engineers hate a wipe and reinstall. It’s a last resort, an admission of defeat. The dreaded cry of “reboot,” which is to say we have no clue and hopefully the issue will sort itself if we start over, if we clear the cache.
“Are you sure you can’t think of anything else that might be wrong with him?” Peter asks. He and Greenie join me at the other end of the trailer. Again, that weird look on their faces. It’s more than exhaustion. It’s some kind of wonder and fear.
“What do you know that you aren’t telling me?” I ask.
“It’s what we think,” Greenie says.
“Fucking tell me. Jesus Christ.”
“We needed a clear head to look at this,” Peter says. “Another set of eyes.” He glances at Greenie. “If she doesn’t see it, then maybe we’re wrong…”
But I do see it. Right then, like a lightning bolt straight up my spine. One of those thoughts that falls like a sledgehammer and gives you a mental limp for the rest of your life, that changes how you walk, how you see the world.
“Hell no,” I say.
The boys say nothing. Max seems to twitch uncomfortably at the far end of the trailer. And I don’t think I’m projecting this time.
“Max, why don’t you want your arms?”
“Just I don’t want them,” he says. I’m watching the monitors instead of him this time. A tactical module is running, and it shouldn’t be. Stepping through each line, I can see the regroup code going into a full loop. There are other lines running in parallel, his sixty-four processors running dozens of routines all at once. I didn’t notice the regroup code until I looked for it. It’s the closest thing to a retreat we’ve ever taught him. Max has been programmed from the ground up to fight until his juice runs out. He knows sideways and forward, and that’s it.
“You have a big bout in two days,” I tell Max.
Another surge of routines, another twitch in his power harness. If his legs were plugged in, I imagine he’d be backing away from me. Which is crazy. Not only have we never taught him anything like what he’s trying to pull off—we never instructed him to teach himself anything like this.
“Tell me it’s just a glitch,” Greenie says. He almost sounds hopeful. Like he doesn’t want it to be anything else. Peter is watching me intently. He doesn’t want to guide me along any more than he has to. Very scientific of him. I ignore Greenie and focus on our robot.
“Max, do you feel any different?”
“No,” Max says.
“Are you ready for your next bout?”
“No.”
“Why not?”
No response. He doesn’t know what to say. I glance at the screen to get a read on the code, but Peter points to the RAM readout, and I see that it has spiked. No available RAM. It looks like full combat mode. Conflicting routines.
“This is emergent,” I say.
“That’s what I told him,” Peter says. He perks up.
“But emergent what?” Greenie asks. “Because Peter thinks—”
“Let her say it,” Peter says, interrupting. “Don’t lead her.” He turns to me. There’s a look on his face that makes him appear a decade younger. A look of wonder and discovery. I remember falling in love with that look.
And I know suddenly what Peter wants me to say. I know what he’s thinking, because I’m thinking it too. The word slips between my lips without awareness. I hear myself say it, and I feel like a fool. It feels wonderful.
“Sentience,” I say.
We live for emergent behaviors. It’s what we hope for. It’s what we fight robots for. It’s what we program Max to do.
He’s programmed to learn from each bout and improve, to create new routines that will improve his odds in future fights. The first time I wrote a routine like this, it was in middle school. I pitted two chess-playing computers with basic learning heuristics against one another. Summer camp stuff. I watched as a library of chess openings was built up on the fly. Nothing new, just centuries of openings rediscovered in mere hours. Built from nothing. From learning. From that moment on, I was hooked.
Max is just a more advanced version of that same idea. His being able to write his own code on the fly and save it for the future is the font of our research. Max creates new and original software routines that we patent and sell to clients. Sometimes he introduces a glitch, a piece of code that knocks him out of commission, what evolution handles with death, and we have to back him out to an earlier revision. Other times he comes up with a routine that’s so far beyond anything else he knows, it’s what we call emergent. A sum that’s greater than its parts. The moment a pot of water begins to boil.
There was the day he used his own laser to cut a busted leg free because it was slowing him down. That was one of those emergent days. Max is programmed at a very base level not to harm himself. He isn’t allowed to turn his weapons against his own body. It’s why his guns won’t fire when part of him gets in the way, similar to how he can’t swing a leg and hurt us by accident.
But one bout, he decided it was okay to lop off his own busted leg if it meant winning and preventing further harm. That emergent routine funded half of our following season. And his maneuver—knowing when to sacrifice himself and by how much—put us through to the finals two years ago. We’ve seen other Gladiators do something similar since. But I’ve never seen a Gladiator not want to fight. That would require one emergent property to override millions of other ones. It would be those two chess computers from middle school suddenly agreeing not to play the game.
“Max, are you looking forward to training today?”
“I’d rather not,” Max says. And this is the frustrating part. We created a facsimile of sentience in all our machines decades ago. We programmed them to hesitate, to use casual vernacular; we wanted our cell phones to seem like living, breathing people. It strikes me that cancer was cured like this—so gradually that no one realized it had happened. We had to be told. And by then it didn’t seem like such a big deal.
“Shit, look at this,” Peter says.
I turn to where he’s pointing. The green HDD indicator on Max’s server bank is flashing so fast, it might as well be solid.
“Max, are you writing code?” I ask.
“Yes,” he says. He’s programmed to tell the truth. I shouldn’t even have to remind myself.
“Shut him down,” Greenie says. When Peter and I don’t move, Greenie gets off his stool.
“Wait,” I say.
Max jitters, anticipating the loss of power. His charging cables sway. He looks at us, cameras focusing back and forth between me and Greenie.
“We’ll get a dump,” Greenie says. “We’ll get a dump, load up the save from before the semis, and you two can reload whatever the hell this is and play with it later.”
“How’s my team?” a voice calls from the ramp. We turn to see Professor Hinson limping into the trailer. Hinson hasn’t taught a class in decades, but still likes the moniker. Retired on a single patent back in the twenties, then had one VC hit after another across the Valley. He’s a DARPA leech, loves being around politicians. Would probably have aspirations of being president if it weren’t for the legions of coeds who would come out of the woodwork with stories.
“SoCal is out there chewing up sparring partners,” Hinson says. “We aiming for dramatic suspense in here?”
“There might be a slight issue,” Greenie says. And I want to fucking kill him. There’s a doubling of wrinkles across Hinson’s face.
“Well, then, fix it,” Hinson says. “I pay you all a lot of money to make sure there aren’t issues.”
I want to point out that he paid a measly four hundred grand, which sure seemed like a lot of money eight years ago when we gave him a majority stake in Max, but has ended up being a painful bargain for us since. The money we make now, we make as a team. It just isn’t doled out that way.
“This might be more important than winning the finals,” I say. And now that I have to put the words together in my brain, the announcement, some way to say it, the historical significance, if this is confirmed, hits me for the first time. We’re a long way from knowing for sure, but to even suggest it, to raise it as a possibility, causes all the words to clog up in the back of my throat.
“Nothing’s more important than these finals,” Hinson says, before I can catch my breath. He points toward the open end of the trailer, where the clang of metal on metal can be heard. “You realize what’s at stake this year? The Grumman contract is up. The army of tomorrow is going to be bid on next week, and Max is the soldier they want. Our soldier. You understand? This isn’t about millions in prize money—this is about billions. Hell, this could be worth a trillion dollars over the next few decades. You understand? You might be looking at the first trillionaire in history. Because every army in the world will need a hundred thousand of our boys. This isn’t research you’re doing here. This is boot camp.”
“What if this is worth more than a trillion dollars?” Peter asks. And I love him for saying it. For saying what I’m thinking. But the twinge of disgust on Hinson’s face lets me know it won’t have any effect. The professor side of him died decades ago. What could be more important than money? A war machine turned beatnik? Are we serious?
“I want our boy out there within the hour. Scouts are in the stands, whispering about whether we’ll even have an entry after yesterday. You’re making me look like an asshole. Now, I’ve got a million dollars’ worth of sparring partners lined up out there, and I want Max to go shred every last dollar into ribbons, you hear?”
“Max might be sentient,” I blurt out. And I feel like a third-grader again, speaking up in class and saying something that everyone else laughs at, something that makes me feel dumb. That’s how Hinson is looking at me. Greenie too.
“Might?” Hinson asks.
“Max doesn’t want to fight,” I tell him. “Let me show you—”
I power Max down and reach for his pincers. I clip them into place while Peter does the same with the buzz saw. I flash back to eight years ago, when we demonstrated Max for Professor Hinson that first time. I’m as nervous now as I was then.
“I told them we should save the dump to look at later,” Greenie says. “We’ve definitely got something emergent, but it’s presenting a lot like a glitch. But don’t worry, we can always load up the save from before the semis and go into the finals with that build. Max’ll tear SoCal apart—”
“Let us show you what’s going on,” Peter says. He adjusts the code monitor so Hinson can see the readouts.
“We don’t have time for this,” Hinson says. He pulls out his phone and checks something, puts it back. “Save the dump. Upload the save from the semis. Get him out there, and we’ll have plenty of time to follow up on this later. If it’s worth something, we’ll patent it.”
“But a dump might not capture what’s going on with him,” I say. All three men turn to me. “Max was writing routines in maintenance mode. There are a million EPROMs in him, dozens for every sensor and joint. If we flash those to factory defaults, what if part of what he has become is in there somewhere? Or what if a single one or zero is miscopied and that makes all the difference? Maybe this is why we’ve never gotten over this hump before, because progress looks like a glitch, and it can’t be copied or reproduced. At least give us one more day—
“He’s a robot,” Hinson says. “You all are starting to believe your own magic tricks. We make them as real as we can, but you’re reading sentience into some busted code.”
“I don’t think so,” Peter says.
“I’m with the professor,” says Greenie. He shrugs at me. “I’m sorry, but this is the finals. We got close two years ago. If we get that contract, we’re set for life.”
“But if this is the first stage of something bigger,” I say, “we’re talking about creating life.”
Hinson shakes his head. “You know how much I respect your work, and if you think something is going on, I want you to look into it. But we’ll do it next week. Load that save and get our boy out there. That’s an order.”
Like we’re all in the military now.
Professor Hinson nods to Greenie, who steps toward the keyboard. Peter moves to block him, and I wonder if we’re going to come to blows over this. I back toward Max and place a hand on his chest, a mother’s reflex, like I just want to tap the brakes.
“C’mon,” Greenie tells Peter. “We’ll save him. We can look at this in a week. With some sleep.”
My hand falls to Max’s new legs. The gleaming paint there has never seen battle. And now his programming wants to keep it that way. I wonder how many times we’ve been on this precipice only to delete what we can’t understand. And then we think we can just copy it back, only to find that it’s been lost. I wonder if this is why downloading the human consciousness has been such a dead end. Like there’s some bit of complexity there that can’t survive duplication. Hinson and Greenie start to push Peter out of the way.
“Get away from Max,” Greenie says. “I’m powering him up. Watch your feet.”
He’s worried about the pincers and the buzz saw falling off. Has to power up Max to get a dump. I hesitate before leaving Max’s side. I quickly fumble with the cord, plugging in his unused legs. I have this luxury, stepping away. Turning my back on a fight.
“I’ll get the power,” I say. And Peter shoots me a look of disappointment. It’s three against one, and I can see the air go out of him. He starts to say something, to plead with me, but I give him a look, the kind only a wife can give to her husband, one that stops him in his tracks, immobilizes him.
“Powering up,” I say out loud, a lab habit coming back. A habit from back when we turned on machines and weren’t sure what they would do, if they would fall or stand on their own, if they would find their balance or topple to one side. I pull Peter toward me, out of the center of the trailer, and I slap the red power switch with nothing more than hope and a hunch.
The next three seconds stretch out like years. I remember holding Sarah for the first time, marveling at this ability we have to create life where before there was none. This moment feels just as significant. A powerful tremor runs through the trailer, a slap of steel and a blur of motion. The pincers and buzz saw remain in place, but every other part of Max is on the move. A thunderclap, followed by another, long strides taking him past us, a flutter of wind in my hair, the four of us frozen as Max bolts from the trailer and out of sight, doing the opposite of what he was built for, choosing an action arrived at on his own.
One of my favorite questions to ask my futurist friends is “When do you think AI will come online?” I asked Rod Brooks once, and he laughed and said it was too far away to even contemplate. I asked Sam Harris, and he thought it would be very soon. But it was my friend Kevin Kelly who gave me the most shocking answer. “It’s already here,” he said.
This felt like an answer designed to shock rather than illuminate, but after hearing Kevin’s reasoning, I came away in agreement. Machines are already doing what we very recently said would be impossible (driving cars; winning at chess, go, and Jeopardy!; writing newspaper articles; creating art, music, and drama). What we keep doing is moving the goalposts. Once we understand how AI does something, it’s no longer as magical as our own consciousness, and so we dismiss it as progress.
This is one type of AI. There’s another type that I don’t think we’ve created yet, and that’s an intelligence that’s self-aware with goals that it arrives at on its own. I do not believe that this sort of intelligence will come about because we set out to create it. I think we will be making more numerous and more complex AIs until one or several cross a threshold and become something… different.
Humans develop in this way. We don’t emerge from the womb with goals, ambitions, even self-awareness. These modules come online gradually. One day, a baby realizes that its hand is its own and parents no longer need to clip its nails to keep it from scratching itself. We learn to walk, to talk, to think, to plan, to reason, to create. And then we slowly wind down and lose these abilities, if we live long enough.
Robots today can do so much more than a newborn human. And every year, they can do more and more things better than the smartest adult on the planet. These machines will never need to relearn these abilities. They can share their information with each other and with all future machines. The AI only needs to learn to walk once. It only needs to master chess once. And it has them mastered forever.
When the strangeness happens, I think it’ll be a glitch. We might not even understand what caused it or be able to reproduce it at first. Kevin Kelly thinks we won’t even know it happened for the longest time.
“What’s the first thing AI will do when it becomes self-aware?” he asked me.
“What?” I wanted to know.
“Hide,” Kevin said. “The first thing it’ll do is hide.”