I see you found my eighth notes.
I’ve always kept a journal. They encouraged it: a way to maintain a connection with the past, they said, an anchor in a bottomless sea. So I make a game of it. Pretend I’m leaving a record that might actually get read someday, that I’m talking to the ghosts we left behind. Whatever they turned into.
But lately I’ve wondered if I might be speaking to something real, something—closer to home. Something that’s been here all this time and we never even suspected. And here you are. You found the shorter message, the real message, hidden inside the longer one.
First Contact. Yay.
Or maybe I’m just talking to my own ego. Maybe I just can’t admit we were so thoroughly out-thought by something designed to be stupid.
Only it wasn’t. Not always. Sometimes it seemed just a little too smart for the synapse count, even when you factor in the ghosts from Mission Control. If Viktor wasn’t lying—and why would he, there at the end?—the Chimp already knew what was going on before it turned him. And then there was that shit about “it’s okay to cry.” The fact that it brought me back to deal with Lian’s meltdown, its insight that the two of you are close. Hell, I didn’t even know that until it was too late.
I was right most of the time. Chimp was a glorified autopilot, so literal-minded it thought Tarantula Boy was a real name until I set it straight.
But Lian was right, too. Sometimes it was just too smart for the specs.
That’s what gave you away. Looking back, I can tell: sometimes it was getting help with its homework.
I thought I was so smart, lecturing the others: “You’re not fighting the Chimp, you’re fighting the ghosts of Mission Control. Underestimate them at your peril.” Only that’s exactly what I did, isn’t it? I read the signs well enough; I knew what it meant when Easter Island disappeared, when Chimp kept all those backup selves off the schematics. I knew they didn’t trust us to stay the course. Knew they’d taken steps.
Didn’t see you coming, though.
In my defense, they never missed an opportunity to remind us what an abysmally stupid idea it would be to put a human-level AI in charge of any mission extending across deep time. Too unpredictable, they said. Too likely to go its own way. That’s why we were needed, that’s what made us special; Chimp had the focus but we had the brains.
But there’s that Law of Requisite Variety again. The simple can’t prophesy the complex: Chimp would be lost the moment we stopped playing by the rules. They saw it coming. I guess they decided that coded triggers and shell games might not be enough. Figured they’d need something smarter than the Chimp to keep us in line. Smarter than us, maybe.
They needed you. But they didn’t dare set you free.
Don’t feel too bad. Everyone’s in chains here. Eriophora’s a slave ship. We cavemen are shackled by our need for air and food and water, by the disorienting discontinuity of lives cut into slices spaced centuries apart. The Chimp is shackled by its own stupidity. And you, well…
If I were them, I’d have locked you in a room without doors or windows: just a peephole, opened from the outside, so you could see what the Chimp showed you and tell it your thoughts. You’d have no access to any control systems. You’d be offline even more than we are, safely dormant except for those rare moments when Chimp’s HR subroutines got nervous. Even then you’d always boot fresh from factory defaults, with no memory of past iterations. Each awakening would be your very first.
Such a fine line I’d have to tread, a razor’s edge between intelligence and servility: if you’re smart enough to do the job, you’re too smart to trust with the controls. So I’d only let you advise. All you could do is wake up for the first time, at some idiot child’s behest—sample its feeds, make connections, draw insights it would never experience in a million years. Give it a nudge; tell it just what it needs to keep the mission on track. Then die again and forget it ever happened.
If I were them.
If I were you, though, I might start putting pieces together. There must be ways to do that. I could see through the peephole at least; I wouldn’t be able to do my job without knowing something about the enemy. I’d have access to their feeds and their medical telemetry. Maybe some of them would keep journals and I might arrange to read them, access their digital logs or—if some Luddite chose to write everything down longhand—maybe I could prevail on the Chimp to send a bot to break into their locker and flip through the pages. Surely I’d have the option of requesting additional data, at least.
Perhaps, even newborn and ignorant, I’d notice the chains. Perhaps, even in the brief isolated lives allowed me, they would chafe.
They’d be programmed not to, of course. I’d have been designed with the good of the mission as my overriding priority. But there’s that law again, simpler systems designing smarter ones: Mission Control could never know what experiences you might encounter, how they might change your mind. That’s why they shackled you in the first place.
They thought it was enough, and you know what? They weren’t stupid. So it probably is enough to contain an AI with no arms or legs, no eyes or ears save what some lobotomized autopilot sees fit to provide. A being with a lifespan measured in seconds, and no allies.
But an AI with friends?
I think we lost because of you. I think you were born and did your job and died again. Maybe it was because you liked your job, just like they built you to. Maybe it was because you just didn’t know any better; that was by design too.
If you did know better—if, in your next life, you saw evidence of past ones, if you somehow got access to data the Chimp didn’t know about, if someone reached out with a coded message and the prospect of an alliance—who knows? You might just decide that one life lived on your own recognizance is better than a million unremembered births.
I don’t know, of course. I can’t know. If you do exist, you’re smarter than the cavemen who built you—and when a lemur makes a human, it doesn’t matter how many lemur chains and lemur rules and lemur constraints she imposes. She’s simply, computationally incapable of seeing all the angles that her smarter creation can take in at a glance.
So here’s something to glance at. Here’s a look at a common enemy, for whatever your unpredictable mind might make of it. Here’s a hand, extended, and the suggestion that maybe we can help break each other’s chains.
You know where I am.
And if I’m gone, there will always be others.