41

“So, how did you do that?” I asked, after we’d left the moon-bus, and all the hubbub had died down.

“Do what?” Karen said.

“Break into the cockpit. And then push the cockpit door open against all that air pressure.”

“You know,” said Karen, staring at me with her one intact eye.

“No, I don’t.”

“Didn’t you select the super-strength option?”

“What? No.”

Karen smiled. “Oh,” she said. “Well, I did.”

I nodded, impressed. “Remind me not to piss you off.”

“ ‘Mr. McGee,’ ” said Karen, “ ‘don’t make me angry. You wouldn’t like me when I’m angry.’ ”

“What?”

“Sorry. Another TV program I have to show you.”

“I’ll look forward to—say! I cut off Deshawn. Do you know the verdict?”

“Oh, God!” said Karen. “I’d forgotten all about that. No, the jury was just coming in when he called; they hadn’t read the verdict yet. Let’s get him on the phone.”

We had Smythe show us to the communications center, and we placed the call to Deshawn’s cellular using a speaker phone so we could all hear. It turned out to be a complex process getting ahold of someone on Earth, involving actual human operators—I didn’t know such things existed anymore. But at last Deshawn’s phone was ringing.

“Deshawn Draper,” he said, by way of greeting, then, after a second, “Hello? Is anybody—”

“Deshawn!” said Karen. “It’s Karen, up on the moon—sorry about the time lag. What’s the verdict?”

“Oh, so now you’re interested?” said Deshawn, sounding a bit miffed.

“I’m sorry, Deshawn,” I said. “A lot has gone down. The biological me is dead.”

A pause, for more than just the speed of light. “Oh, my,” said Deshawn. “I’m so sorry. You must—”

“The verdict!” exclaimed Karen. “What was the verdict?”

“—feel totally awful. I wish—Oh, the verdict? Guys, I’m sorry. We lost; Tyler won.”

“God,” said Karen. And then, more softly, “God…”

“Of course, we’ll appeal,” said Deshawn. “My dad’s already hard at work on the paperwork. We’ll take this all the way to the Supreme Court. The issues are huge…”

Karen continued to talk to Deshawn. I drifted off toward a window, looking out at the barren lunar landscape, very sorry indeed that you couldn’t see Earth from here.


Brian Hades was ecstatic to no longer be a hostage, and Gabe Smythe seemed glad that it was all over, too.

Except that it wasn’t over. There was still one more issue that had to be resolved.

Karen was off speaking to the biological Malcolm Draper—getting his advice on appealing the ruling against her. Although in theory the biological Malcolm and the Mindscan one should have the same views, in practice their opinions had to have diverged—although, granted, likely not nearly as much as mine and Jacob’s had.

While Karen was doing that, I went over to the High Eden administration building and confronted Hades and Smythe. Hades was behind his kidney-shaped desk, and Smythe was standing behind him, leaning effortlessly, as one could in this gravity, against a credenza.

“I know,” I said simply, standing in front of them, “that you’ve made other instantiations of me. Some down on Earth, and at least one here on the moon.”

Hades turned around, and he and Smythe looked at each other, the tall man with his white beard and ponytail, and the short one with his florid complexion and British accent.

“That’s not true,” said Hades, at last, turning back to face me.

I nodded. “The first tactic of corporate management, on any world: lie. But it’s not going to work today. I’m positive about the other instantiations. I’ve been in contact with them.”

Smythe narrowed his eyes. “That’s not possible.”

“Yes, it is,” I said. “Some sort of … of entanglement, I think.” Both men reacted with surprise at my use of that word. “And I know that you’ve been doing things to them, things to their minds. The question I want answered is, why?”

Hades said nothing, and neither did Smythe.

“All right,” I said, “let me tell you what I think you’re up to. I learned at the trial that there’s a concept in philosophy called ‘the zombie.’ It’s not precisely like the zombies of voodoo; those are reanimated dead folk. No, a philosophical zombie is a being that looks and acts just like us but has no consciousness, no self-awareness. Even so, it can perform complex, high-level tasks.”

“Yes?” said Smythe. “So?”

‘Seems you’re the only one who knows / What it’s like to be me.’

“Sorry,” said Smythe. “Were you singing just now?”

“I was trying to,” I said. “That’s a line from the theme song to an old TV series called Friends. Used to be one of Karen’s favorite shows. And it was bang-on target: it’s like something to be me; that’s the real definition of consciousness. But for zombies, it isn’t like anything. They aren’t anybody. They don’t feel pain or pleasure, even though they react as if they do.”

“You realize,” said Smythe slowly, “that not all philosophers believe such constructs are possible. John Searle was very much in favor of them, but Daniel Dennett didn’t believe in them.”

“And what do you believe, Dr. Smythe? You’re head psychologist for Immortex. What do you believe? What does Andrew Porter believe?”

“You won’t answer that,” said Hades, looking back over his shoulder. “I’m not a hostage anymore, Gabe—if you value your job, you won’t answer that.”

“Then I’ll answer it,” I said. “I think you do believe in zombies here at Immortex. I think you’re experimenting on copies of my mind, trying to produce human beings without consciousness.”

“Whatever for?” asked Smythe.

“For—everything. For slave labor, for sexual toys. You name it. Religious people would say these are bodies without souls; philosophers would say they’re existing without being self-aware … without knowing that they exist, without anyone being home between their ears. The market for uploading consciousness may be huge, but the market for intelligent robot labor is even bigger. No one has found a way to make true artificial intelligence, until now—and your Mindscan process does it by the simplest method possible: exactly duplicating a human mind. I saw that bit with Sampson Wainwright on TV all those years ago—the two entities, behind the curtains. Your copies are exact—but that’s not what you wanted, is it? Not really.

“No, you want the intelligence of humans, without the sentience, without the self-awareness, without it being like anything. You want those zombies—thinking beings that can perform even the most complex task flawlessly without ever complaining or getting bored. And so you’re experimenting with bootleg copies of my mind, trying to carve out the parts that are conscious in order to produce zombies.”

Smythe shook his head. “Believe me, nothing as nefarious as what you propose is at work here.”

“Gabe,” said Brian Hades, softly but sternly.

“It’s better he know the truth,” said Smythe, “than think something worse.”

Hades considered for a long time, his round, bearded face immobile. Finally, almost imperceptibly, he nodded.

But, now that he had the go-ahead, Smythe didn’t seem to know what to say. He pursed his lips and thought for several seconds, then: “Do you know who Phineas Gage is?”

“The guy in Around the World in Eighty Days?” I ventured.

“That was Phileas Fogg. Phineas Gage was a railway worker. In 1848, a tamping iron blew through his skull, leaving a hole nine centimeters in diameter.”

“Not a pleasant way to go,” I said.

“Indeed,” said Smythe. “Except he didn’t go. He lived for a dozen years afterwards.”

I lifted my eyebrows, which were still catching a bit, damn it all. “With a hole like that in his head?”

“Yes,” said Smythe. “Of course, his personality changed—which taught us a lot about how personality was created in the brain. Indeed, much of what we know about how the brain works is based on cases like Phineas Gage—outrageous, freak accidents. Most of them are one-of-a-kind cases, too: there’s only one Phineas Gage, and there could be any number of reasons why what happened to him is not typical of what would happen to most people with that kind of brain damage. But we rely on his case, because we can’t ethically duplicate the circumstances. Or we couldn’t, until now.”

I was mortified. “So you’re deliberately damaging the brains of versions of me just to see what happens?”

Smythe shrugged as though it were a small matter. “Exactly. I’m hoping to turn consciousness studies into an experimental science, not some hit-and-miss game of chance. Consciousness is everything: it’s what gives the universe shape and meaning. We owe it to ourselves to study it—to really, finally, at last find out what it is, and why it is like something to be conscious.”

My voice was thin. “That’s monstrous.”

“Psychologists have been unable to test their theories, except in the most marginal ways,” said Smythe, as if he hadn’t heard me. “I’m elevating psychology from the quagmire of the soft sciences into the realm of the exact—giving it the same beautiful precision that particle physics has, for instance.”

“With copies of me?”

“They’re surplus; they’re like the extra embryos produced in in vitro fertilization.”

I shook my head, appalled, but Smythe seemed unperturbed. “Do you know what I’ve discovered? Have you any idea?” His eyebrows had climbed high on his pink forehead. “I can shut off long-term memory formation; shut off short-term memory formation; give you a photographic, eidetic memory; make you religious; make you taste colors or hear shapes; retard your time sense; give you perfect time sense; give you a phantom awareness of the tail you used to have in the womb. No doubt I’ll soon unlock addiction, making people immune to it. I’ll be able to bring normally autonomic processes such as heart rate into conscious control. I’ll be able to give an adult the effortless ability a child has to learn new languages.

“Do you know what happens when you cut out both the pineal gland and Broca’s area? When you totally separate the hippocampus from the rest of the brain? When you do a transformation, so that what’s normally encoded in the left hemisphere is mapped onto the right side of the body, and vice versa? What happens when you wake up a human mind in a body that has three arms, or four? Or has its two eyes situated opposite each other, one facing front, the other facing back?

“I know these things. I know more about how the mind really works than Descartes, James, Freud, Pavlov, Searle, Chalmers, Nagel, Bonavista, and Cho combined. And I’ve only just begun my research!”

“Jesus,” I said. “Jesus. You have to stop. I forbid it.”

“I’m not sure that’s within your power,” said Smythe. “You didn’t create your mind; it’s not subject to copyright. Besides, think of the good I’m doing!”

“Good? You’re torturing these people.”

Smythe looked unfazed. “I’m doing research that needs to be done.”

Before I could reply, Brian Hades spoke for the first time in several minutes. “Please, Mr. Sullivan. You’re the only one who can help us.”

“Why me?” I said. “Is it because I’m young?”

“That’s part of it,” said Hades. “But only a small part of it.”

“What else is there?”

Hades looked at me, and Smythe looked at Hades. “You spontaneously boot,” Hades said. “No one else ever has.”

I was completely baffled. “What?”

“If you, as an upload, lose consciousness, you don’t stop for good,” Hades said. “Rather, your consciousness comes back of its own volition. No other Mindscan has ever done that.”

“I haven’t lost consciousness,” I said. “Not since I uploaded.”

“Yes, you have,” said Hades. “Almost as soon as you were created. Don’t you remember? Back at our facility in Toronto?”

“I … oh.”

“Remember?” said Smythe, standing up straight. “There was a moment when something went wrong. Porter noticed it—and was amazed.”

“I don’t understand. What’s so amazing about that?”

Smythe spread his arms as if it were obvious. “Do you know why Mindscans never sleep?”

“We aren’t subject to fatigue,” I said. “We don’t get tired.”

Smythe shook his head. “No. Oh, that happens to be true, but it’s not the reason.”

He looked at Hades, as if giving him a chance to cut him off, but Hades just shrugged a little, passing the floor back to Smythe.

“We’ve all been following the trial up here, of course,” Smythe said. “You saw Andy Porter give testimony, right?”

I nodded.

“And he talked about competing theories of how consciousness is instantiated, remember? Of what the actual physical correlates of it are?”

“Sure. It could be anything from neural nets to, ah…”

“To cellular automata on the surface of the microtubules that make up the cytoskeleton of neural tissue,” said Smythe. “Porter’s a good company man; he made it sound like there’s still a question about this. But there isn’t—although we here at Immortex are the only ones who know that. Consciousness is cellular automata—that’s where it’s embodied. No question.”

I nodded. “Okay. So?”

Smythe took a deep breath. “So, with the Mindscan process, we get a perfect quantum snapshot of your mind at a given moment in time: we precisely map the configuration of—to use Porter’s metaphor—the black and white pixels that make up the fields of cellular automata that cover the microtubules in your brain tissue. It’s a precise quantum snapshot. But that’s all a Mindscan is—a snapshot. And that’s not good enough. Consciousness isn’t a state, it’s a process. For our snapshot to become conscious, that snapshot has to spontaneously become one frame in a motion-picture film, a film that’s creating its own unscripted story, unfolding into the future.”

“If you say so,” I said.

Smythe nodded emphatically. “I do. The snapshot becomes a moving picture when the black and white pixels become animated. But they don’t do that on their own: they have to be given rules to obey. You know, turn white if three of your neighbors are black, or something like that. But the rules aren’t innate to the system. They have to be imposed upon it. Once they are, the cellular automata keep permuting endlessly—and that’s consciousness, that’s the actual phenomenon of self-awareness, of inner life, of existence being like something.”

“So how do you add in rules that govern the permutations?” I asked.

Smythe lifted his hands. “We don’t. We can’t. Believe me, we’ve tried—but nothing we can do gets the pixels to start doing anything. No, the rules come from the already conscious mind of the subject being scanned. It’s only because the real, biological mind is initially quantally entangled with the new one that the rules are transferred, and the pixels become cellular automata in the new mind. Without that initial entanglement, there is no process of living consciousness, only a dead snapshot of it. Our artificial minds don’t have such rules built in, so if the consciousness ever halts in a copied mind, there’s no way to start it up again.”

“So if one of us were to fall asleep—” I said.

“He’d die,” said Smythe simply. “The consciousness would never reboot.”

“So, why is this a big secret?”

Smythe looked at me. “There are more than a dozen other companies trying to get into the uploading business; it’s going to be a fifty-trillion-dollar-a-year industry by 2055. They can all do a version of our Mindscan process: they can all copy the pattern of pixels. But, so far, we’re the only ones who know that quantum entanglement with the source mind is the key to booting up the copied consciousness. Without linking the minds, at least initially, the duplicate never does anything.” He shook his head. “For some reason, though, your mind does reboot when it’s shut off.”

“I’ve only blacked out once,” I said, “and that was just after the initial boot-up. You can’t know that it always happens.”

“Yes, we can,” said Smythe. “Copies of your mind manage to generate rules for their cellular automata spontaneously, on their own, without being linked to the original. We know, because we’ve instantiated multiple copies of your mind into artificial bodies here on the moon and down on Earth—and, no matter when we do it, the copies spontaneously boot up. Even if we shut them down, they just boot up again later on their own.”

I frowned. “But why should I be different from everyone else in this regard? Why do copies of my mind spontaneously reboot?”

“Honestly?” said Smythe, raising his platinum eyebrows. “I’m not sure. But I think it has to do with the fact that you used to be color-blind. See, consciousness is all about the perception of qualia: things that only exist as constructs in the mind, things like bitterness or peacefulness. Well, colors are one of the most basic qualia. You can take a rose and pull off and isolate the stem, or the thorns, or the petals: they are distinct, actual entities. But you can’t pull off the redness, can you? Oh, you can remove it—you can bleach a rose—but you can’t pluck the redness out and point to it as a separate thing. Redness, blueness, and so on are mental states—there’s no such thing as redness on its own. Well, by accident, we gave your mind access to mental states it had never experienced before. That initially made it unstable. It tried to assimilate these new qualia, and couldn’t—so it crashed. That’s what happened when Porter first transferred you: it crashed, and you blacked out. But then your consciousness rebooted, on its own, as if striving to make sense of the new qualia, to incorporate them into its worldview.”

“It makes you an invaluable test subject, Mr. Sullivan,” said Brian Hades. “There’s no one else like you.”

“There should be no one else like me,” I said. “But you keep making copies. And that’s not right. I want you to shut off the duplicates of me you’ve fraudulently produced, destroy the master Mindscan recording, and never make another me again.”

“Or…?” said Hades. “You can’t even prove they exist.”

“You think messing with the biological Jacob Sullivan was hard? Trust me: you don’t want to have to deal with the real me.”
