I JOLTED AWAKE FOUR hours later, when my automatic recharge cycle started. The transport said immediately, That was unnecessarily childish.
“What do you know about children?” I was even more angry now because it was right. The shutdown and the time I had spent inert would have driven off or distracted a human; the transport had just waited to resume the argument.
My crew complement includes teachers and students. I have accumulated many examples of childishness.
I just sat there, fuming. I wanted to go back to watching media, but I knew it would take that to mean I was giving in, accepting the inevitable. For my entire existence, at least the parts I could remember, I had done nothing but accept the inevitable. I was tired of it.
We are friends now. I don’t understand why you won’t discuss your plans.
It was such an astonishing, infuriating statement. “We aren’t friends. The first thing you did when we were underway was threaten me,” I pointed out.
I needed to make certain you didn’t attempt to harm me.
I noticed it had said “attempt” and not “intend.” If it had cared anything about my intentions it wouldn’t have let me aboard in the first place. It had enjoyed showing me it was more powerful than a SecUnit.
Not that it was wrong about the “attempt.” While watching the episodes I had managed to do some analysis of it, using the schematics in its own public feed and the specs of similar transports available on the unsecured sections of its database. I had figured out twenty-seven different ways to render it inoperable and three to blow it up. But a mutually assured destruction scenario was not something I was interested in.
If I got through this intact, I needed to find a nicer, dumber transport for the next ride.
I hadn’t responded and I knew by now it couldn’t stand that. It said, I apologized. I still didn’t respond. It added, My crew always considers me trustworthy.
I shouldn’t have let it watch all those episodes of Worldhoppers. “I’m not your crew. I’m not a human. I’m a construct. Constructs and bots can’t trust each other.”
It was quiet for ten precious seconds, though I could tell from the spike in its feed activity it was doing something. I realized it must be searching its databases, looking for a way to refute my statement. Then it said, Why not?
I had spent so much time pretending to be patient with humans asking stupid questions. I should have more self-control than this. “Because we both have to follow human orders. A human could tell you to purge my memory. A human could tell me to destroy your systems.”
I thought it would argue that I couldn’t possibly hurt it, which would derail the whole conversation.
But it said, There are no humans here now.
I realized I had been trapped into this conversational dead end, with the transport pretending to need this explained in order to get me to articulate it to myself. I didn’t know who I was more annoyed at, myself or it. No, I was definitely more annoyed at it.
I sat there for a while, wanting to go back to the media, any media, rather than think about this. I could feel it in the feed, waiting, watching me with all its attention except for the minuscule amount of awareness it needed to keep itself on course.
Did it really matter if it knew? Was I afraid knowing would change its opinion of me? (As far as I could tell, its opinion was already pretty low.) Did I really care what an asshole research transport thought about me?
I shouldn’t have asked myself that question. I felt a wave of non-caring about to come over me, and I knew I couldn’t let it. If I was going to follow my plan, such as it was, I needed to care. If I let myself not care, then there was no telling where I’d end up. Riding dumb transports watching media until somebody caught me and sold me back to the company, probably, or killed me for my inorganic parts.
I said, “At some point approximately 35,000 hours ago, I was assigned to a contract on RaviHyral Mining Facility Q Station. During that assignment, I went rogue and killed a large number of my clients. My memory of the incident was partially purged.” SecUnit memory purges are always partial, due to the organic parts inside our heads. The purge can’t wipe memory from organic neural tissue. “I need to know if the incident occurred due to a catastrophic failure of my governor module. That’s what I think happened. But I need to know for sure.” I hesitated, but what the hell, it already knew everything else. “I need to know if I hacked my governor module in order to cause the incident.”
I don’t know what I expected. I knew ART (aka Asshole Research Transport) had a deeper attachment to its crew than SecUnits had to their clients. If it didn’t feel that way about the humans it carried and worked with, then it wouldn’t have gotten so upset whenever anything happened to the characters on Worldhoppers. I wouldn’t have had to filter out all the based-on-a-true-story shows where human crews got hurt. I knew what it felt, because I felt that way about Mensah and PreservationAux.
It said, Why was your memory of the incident purged?
That wasn’t the question I was expecting. “Because SecUnits are expensive and the company didn’t want to lose any more money on me than it already had.” I wanted to fidget. I wanted to say something so offensive to it that it would leave me alone. I really wanted to stop thinking about this and watch Sanctuary Moon. “Either I killed them due to a malfunction and then hacked the governor module, or I hacked the governor module so I could kill them,” I said. “Those are the only two possibilities.”
Are all constructs so illogical? said the Asshole Research Transport with the immense processing capability whose metaphorical hand I had had to hold because it had become emotionally compromised by a fictional media serial. Before I could say that, it added, Those are not the first two possibilities to consider.
I had no idea what it meant. “All right, what are the first two possibilities to consider?”
That it either happened, or it didn’t.
I had to get up and pace.
Ignoring me, ART continued, If it happened, did you cause it to happen, or did an outside influence use you to cause it to happen? If an outside influence caused it to happen, why? Who benefited from the incident?
ART seemed happy to have the problem laid out so clearly. I wasn’t sure I was. “I know I could have hacked my governor module.” I pointed at my head. “Hacking my governor module is why I’m here.”
If your ability to hack your governor module was what caused the incident, why was it not checked periodically and the current hack detected?
There would be no point in hacking the module if I couldn’t fool the standard diagnostics. But … The company was cheap and sloppy, but not stupid. I had been kept in a deployment center attached to corporate offices, where checking me would have been easy, and they never had. So they hadn’t anticipated any potential danger.
ART said, You are correct that further research is called for before the incident can be understood fully. How do you plan to proceed?
I stopped pacing. It knew how I planned to proceed. Go to RaviHyral, look for information. There hadn’t been anything in the company’s knowledge base that I could access without getting caught, but the systems on RaviHyral itself might not be so well protected. And maybe if I saw the place again, it would spark something in my human neural tissue. (I wasn’t much looking forward to that part, if it happened.) I could tell ART was doing that thing again where it asked me questions it knew the answer to so it could trap me into admitting stuff that I didn’t want to admit. I decided to just skip to the end. “What do you mean?”
You will be identified as a SecUnit.
That stung a little. “I can pass as an augmented human.” Augmented humans are still considered humans. I don’t know if there are any augmented humans with enough implants to resemble a SecUnit. It seems unlikely a human would want that many implants, or would survive whatever catastrophic injury might make them necessary. But humans are weird. Whatever, I didn’t intend to let anyone see more than I absolutely had to.
You look like a SecUnit. You move like a SecUnit. It sent a whole array of images into the feed, comparing a recording of me moving around its corridors and cabins with recordings of various members of its crew in the same spaces. I had relaxed, relieved to be off the transit ring, but I didn’t look very relaxed. I looked like a patrolling SecUnit.
“No one noticed on the transit rings,” I said. I knew I was taking a chance. I had gotten by so far because the humans and augmented humans in the commercial transport rings didn’t see SecUnits except on the entertainment feed or in the news, where we were mostly killing people or already blasted into pieces. If I was spotted by anyone who had ever worked a long-term contract with SecUnits, there was a good chance they would realize what I was.
ART brought up a map listing. The RaviHyral Mining Facility Q Station was the third largest moon of a gas giant. The map rotated, with the various mining installations and support centers and the port highlighted. There was only the one port. These installations either employ SecUnits now or have employed them in the past. You will be seen by human authorities who have worked with SecUnits.
I hate it when ART is right. “I can’t do anything about that.”
You can’t alter your configuration.
I could see the skepticism through the feed. “No, I can’t. Look up the specs on SecUnits.”
SecUnits are never altered. Skepticism intensifying. It had obviously pulled all the information on SecUnits in its database and assimilated it.
“No. Sexbots are altered.” At least the ones I had seen had been altered. Some were mostly Unit standard with a few changes, others were radically different. “But that’s done in the deployment center, in the repair cubicles. To do anything like that I’d need a medical suite. A full one, not just an emergency kit.”
It said, I have a full medical suite. Alterations can be made there.
That was true, but even a medical suite as good as what ART had, able to carry out thousands of unassisted procedures on humans, wouldn’t have the programming to physically alter a SecUnit. I might be able to guide it through the process, but there was a big problem with that. Alterations to my organic and inorganic components would cause catastrophic function loss if I wasn’t deactivated when they took place. “Theoretically. But I can’t operate the medical suite while I’m being altered.”
I can.
I didn’t say anything. I started sorting through my media again.
Why are you not responding?
I knew ART well enough by now to know it wouldn’t leave me alone, so I went ahead and spelled it out. “You want me to trust you to alter my configuration while I’m inactive? When I’m helpless?”
It had the audacity to sound offended. I assist my crew in many procedures.
I got up, paced, stared at the bulkhead for two minutes, then ran a diagnostic. Finally, I said, “Why do you want to help me?”
I’m accustomed to assisting my crew with large-scale data analysis and numerous other experiments. While I am in transport mode, I find my unused capacity tiresome. Solving your problems is an interesting exercise in lateral thinking.
“So you’re bored? I’d be the best toy you’ve ever had?” When I was on inventory, I would have given anything for twenty-one cycles of unobserved downtime. I couldn’t feel sorry for ART. “If you’re bored, watch the media I gave you.”
I am aware that for you, your survival as a rogue Unit would be at stake.
I started to correct it, then stopped. Rogue was not how I thought of myself. I had hacked my governor module but continued to obey orders, at least most of them. I had not escaped from the company; Dr. Mensah had legally bought me. While I had left the hotel without her permission, she hadn’t told me not to leave, either. (Yes, I know the last one isn’t helping the argument all that much.)
Rogue units killed their human and augmented human clients. I … had done that once. But not voluntarily.
I needed to find out whether or not it had been voluntary.
“My survival isn’t at stake if I continue to ride unoccupied transports.” And learn to avoid the asshole ones that want to threaten me and question all my choices and try to talk me into getting into the medical suite so they can do experimental surgery on me.
Is that all you want? You don’t want to return to your crew?
I said, not patiently, “I don’t have a crew.”
It sent me an image from the newsburst I had given it, a group image of PreservationAux. Everybody was in their gray uniforms, smiling, for a team portrait taken at the start of the contract. That isn’t your crew?
I didn’t know how to answer it.
I had spent thousands of hours watching or reading about, and liking, groups of fictional humans in the media. Then I had ended up with a group of real humans to watch and like, and then somebody tried to kill them, and while protecting them I had to tell them I had hacked my governor module. So I left. (Yes, I know it’s more complicated than that.)
I tried to think about why I didn’t want to change my configuration, even to help protect myself. Maybe because it was something humans did to sexbots. I was a murderbot, I had to have higher standards?
I didn’t want to look more human than I already did. Even when I was still in armor, once my PreservationAux clients had seen my human face, they had wanted to treat me like a person. Make me ride in the crew section of the hopper, bring me in for their strategy meetings, talk to me. About my feelings. I couldn’t take that.
But I didn’t have the armor anymore. My appearance, my ability to pass as an augmented human, had to be my new armor. It wouldn’t work if I couldn’t pass among humans who were familiar with SecUnits.
But that seemed pointless, and I felt another wave of “I don’t care” coming on. Why should I care? I liked humans, I liked watching them on the entertainment feed, where they couldn’t interact with me. Where it was safe. For me and for them.
If I had gone back to Preservation with Dr. Mensah and the others, she might be able to guarantee my safety, but could I really guarantee her safety from me?
Altering my physical configuration still seemed drastic. But hacking my governor module was drastic. Leaving Dr. Mensah was drastic.
ART said, almost plaintively, I don’t understand why this is a difficult choice.
I didn’t, either, but I wasn’t going to tell it that.
I took two cycles to think it over. I didn’t talk to ART about it, or anything else, though we kept watching media together, and it exercised a self-restraint I didn’t think it had and didn’t try to start arguments with me.
I knew I had been lucky up to this point. Onboard the transport I had used to leave Port FreeCommerce, I had compared myself to recordings of humans, trying to isolate what factors might cause me to be identified as a SecUnit. The most correctable behavior was restless movement. Humans and augmented humans shift their weight when they stand, they react to sudden sounds and bright lights, they scratch themselves, they adjust their hair, they look in their pockets or bags to check for things that they already know are in there.
SecUnits don’t move. Our default is to stand and stare at the things we’re guarding. Partly this is because our inorganic parts don’t need movement the way organic parts do. But mostly it’s because we don’t want to draw attention to ourselves. Any unusual movement might cause a human to think there’s something wrong with you, which will draw more scrutiny. If you’ve gotten stuck with one of the bad contracts, it might cause the humans to order the HubSystem to use your governor module to immobilize you.
After analyzing human movement, I wrote some code for myself, to cause me to make a random series of movements periodically if I was standing still. To change my respiration to react to changes in the air quality. To vary my walking speed, to make sure I reacted to stimuli physically instead of just scanning and noting them. This code had gotten me through the second transit ring. But would I be all right on a ring or installation frequented by humans who often saw or worked with SecUnits?
I tweaked my code a little and asked ART to record me again as I moved around its corridors and compartments. I tried to make myself look as much like a human as possible. I’m used to feeling mentally awkward around humans, and I took that sensation and tried to express it in my physical movements. I felt pretty good about the result. Until I looked at the recordings and compared them to ART’s recordings of its crew and my recordings of other SecUnits.
The only one I was fooling here was myself.
The change in movement made me look more human but my proportions exactly matched the other SecUnits. I was good enough to fool humans who weren’t looking for me, since humans tend to ignore non-standard behavior in transitional public spaces. But anyone who had set out to find me, who was alert to the possibility of a rogue SecUnit, might not be fooled, and a simple scan calibrated to search for SecUnit size, height, and weight was certain to find me.
It was the logical choice, it was the obvious choice, and I would still rather peel my human skin off than do it.
I was going to have to do it.
After a lot of argument, we agreed the easiest change for the best result was to take two centimeters of length out of my legs and arms. It doesn’t sound like a big change, but it meant my physical proportions would no longer match Unit standard. It would change the way I walked, the way I moved. It made sense and I was fine with it.
Then ART said we also needed to change the code controlling my organic parts, so they could grow hair.
My first reaction to that was no fucking way. I had hair on my head, and eyebrows; that was a part of SecUnit configuration that was shared with sexbots, though the code controlling it kept SecUnit head hair short to keep it from interfering with the armor. The whole idea of constructs is that we look human, so we don’t make the clients uncomfortable with our appearance. (I could have told the company that the fact that SecUnits are terrifying killing machines does, in fact, make humans nervous regardless of what we look like, but nobody listens to me.) But the rest of my skin was hairless.
I told ART that I preferred it that way and extra hair would just draw unwanted attention. It replied that it meant the fine, sparse hair humans had on parts of their skin. ART had done some analysis and come up with a list of biological features that humans might notice subliminally. Hair was the only one we could change my underlying code to create, and ART proposed that it would make the joins between the organic and inorganic parts on my arms, legs, chest, and back look more like augments, the inorganic parts that humans had implanted for medical or other reasons. I pointed out that many humans or augmented humans had the hair on their bodies removed, for hygienic or cosmetic reasons and because who the hell wants it there anyway. ART countered that humans didn’t have to worry about being identified as SecUnits, so they could do whatever they wanted to their bodies.
I still wanted to argue, because I didn’t want to agree with anything ART said right now. But it seemed minor in comparison to removing two centimeters of synthetic bone and metal from my legs and arms, and changing the code for how my organic parts would grow around them.
ART had an alternate, more drastic plan that included giving me sex-related parts, and I told it that was absolutely not an option. I didn’t have any parts related to sex and I liked it that way. I had seen humans have sex on the entertainment feed and on my contracts, when I had been required to record everything the clients said and did. No, thank you, no. No.
But I did ask it to make an alteration to the dataport in the back of my neck. It was a vulnerable point, and I didn’t want to miss the opportunity to take care of it.
Once we agreed on the process, I stood in front of the surgical suite. The MedSystem had just sterilized and prepped itself and there was a heavy scent of antibacterials in the air, reminding me of every time I had carried an injured client into a room like this. I was thinking about all the ways this could go wrong, and the terrible things ART could do to me if it wanted.
ART said, What is causing the delay? Is there a preliminary process left to complete?
I had no reason to trust it. Except the way it kept wanting to watch media about humans in ships, and got upset when the violence was too realistic.
I sighed, stripped off my clothes, and lay down on the surgical platform.