2

The University of Toronto—the self-styled Harvard of the North—was established in 1827. Some fifty thousand full-time students were enrolled there. The main campus was downtown, not surprisingly anchored at the intersection of University Avenue and College Street. But although there was a traditional central campus, U of T also spilled out into the city proper, lining St. George Street and several other roads with a hodgepodge of nineteenth-, twentieth-, and early twenty-first-century architecture.

The university’s most distinctive landmark was the Robarts Library—often called “Fort Book” by students—a massive, complex concrete structure. Kyle Graves had lived in Toronto all of his forty-five years. Still, it was only recently that he’d seen an architect’s model of the campus and realized that the library was shaped like a concrete peacock, with the hooded Thomas Fisher rare-books tower rising up as a beaked neck in front and two vast wings spreading out behind.

Unfortunately, there was no place on campus where you could look down on Robarts to appreciate the design. U of T did have three associated theological colleges—Emmanuel, affiliated with the United Church of Canada; the Presbyterian Knox; and the Anglican Wycliffe. Perhaps the peacock was meant to be seen only by God, or by visitors from space: a sort of Canadian Plains of Nazca.

Kyle and Heather had separated shortly after Mary’s suicide; it had been too much for both of them, and their frustration over not understanding what had happened had spilled out in all sorts of ways. The apartment Kyle lived in now was a short walk from Downsview subway station in suburban Toronto. He’d taken the subway down to St. George station this morning and was now walking the short distance south to Dennis Mullin Hall, which was located at 91 St. George Street, directly across the road from the Robarts Library.

He passed the Bata Shoe Museum—the world’s largest museum devoted to footwear, housed in another miracle of twentieth-century design: a building that looked like a slightly squashed shoebox. One of these days he’d actually go inside. In the distance, down at the lakeshore, he could see the CN Tower—no longer the world’s tallest freestanding structure, but still one of its most elegant.

After about two minutes, Kyle reached Mullin Hall, the new four-story circular building that housed the Artificial Intelligence and Advanced Computing Department. Kyle entered through the main sliding-glass doors. His lab was on the third floor, but he took the stairs instead of the waiting elevator. Ever since his heart attack, four years ago, he’d made a point of getting little bits of exercise whenever he could. He remembered when he used to huff and puff after just two flights of stairs, but today he emerged without breathing hard at all. He headed down the corridor, the open atrium on his left, until he reached his lab. He pressed his thumb against the scanning plate, and the door slid open.

“Good morning, Dr. Graves,” said a rough male voice as he entered the lab.

“Good morning, Cheetah.”

“I have a new joke for you, Dr. Graves.”

Kyle took off his hat and hung it on the old wooden coat rack—universities never threw anything out; this one must have dated back to the 1950s. He started the coffeemaker, then took a seat in front of a computer console, its front panel banked at forty-five degrees. In the center of the panel were two small lenses that tracked in unison like eyes.

“There’s this French physicist, see,” said Cheetah’s voice, coming from a speaker grille below the mechanical eyes. “This guy’s working at CERN and he’s devised an experiment to test a new theory. He starts up the particle accelerator and waits for the results of the collision he’s arranged. When the experiment is over, he rushes out of the control room into the corridor, holding a printout showing the trails of the resulting particles. There, he runs into another scientist. And the other scientist says to him, ‘Jacques,’ he says, ‘did you get the two particles you were expecting?’ And Jacques points first to one particle trail and then to the other and exclaims: ‘Mais oui! Higgs boson! Quark!’ ”

Kyle stared at the pair of lenses.

Cheetah repeated the punch line: “Mais oui! Higgs boson! Quark!”

“I don’t get it,” said Kyle.

“A Higgs boson is a particle with zero charge and no intrinsic spin; a quark is a fundamental constituent of protons and neutrons.”

“I know what they are, for Pete’s sake. But I don’t see why the joke is funny.”

“It’s a pun. Mais oui!—that’s French for ‘but yes!’—Mais oui! Higgs boson! Quark!” Cheetah paused for a beat. “Mary Higgins Clark.” Another pause. “She’s a famous mystery writer.”

Kyle sighed. “Cheetah, that’s too elaborate. For a pun to work, the listener has to get it in a flash. It’s no good if you have to explain it.”

Cheetah was quiet for a moment. “Oh,” he said at last. “I’ve disappointed you again, haven’t I?”

“I wouldn’t say that,” said Kyle. “Not exactly.”

Cheetah was an APE—a computer simulation designed to Approximate Psychological Experiences; he aped humanity. Kyle had long been a proponent of the strong-artificial-intelligence principle: the brain was nothing more than an organic computer, and the mind was simply the software running on that computer. When he’d first taken this stance publicly, in the late 1990s, it had seemed reasonable. Computing capabilities were doubling every eighteen months; soon enough, there would be computers with greater storage capacity and more interconnections than the human brain had. Surely once that point was reached, the human mind could be duplicated on a computer.

The only trouble was that that point had by now been attained. Indeed, most estimates said that computers had exceeded the human brain in information-processing capability and degree of complexity four or five years previously.

And still Cheetah couldn’t distinguish a funny joke from a lousy one.

“If I don’t disappoint you,” said Cheetah’s voice, “then what’s wrong?”

Kyle looked around his lab; its inner and outer walls curved, following the contours of Mullin Hall, but there were no windows; the ceiling was high and covered with lighting panels behind metal grids. “Nothing.”

“Don’t kid a kidder,” said Cheetah. “You spent months teaching me to recognize faces, no matter what their expression. I’m still not very good at it, but I can tell who you are at a glance—and I know how to read your moods. You’re upset over something.”

Kyle pursed his lips, considering whether he wanted to answer. Everything Cheetah did was by dint of sheer computational power; Kyle certainly felt no obligation to reply.

And yet—

And yet no one else had come into the lab so far today. Kyle hadn’t been able to sleep at all last night after he’d left the house—he still thought of it as “the house,” not “Heather’s house”—and he’d come in early. Everything was silent except for the hum of the equipment and the overhead fluorescent lights, and Cheetah’s utterances in his deep and rather nasal voice. Kyle would have to adjust the vocal routine at some point; the attempt to give Cheetah natural-sounding respiratory asperity had resulted in an irritating mimicry of real speech. As with so much about the APE, the differences between it and real humans were all the more obvious for the earnestness of the attempt.

No, he certainly didn’t have to reply to Cheetah.

But maybe he wanted to reply. After all, who else could he discuss the matter with?

“Initiate privacy locking,” said Kyle. “You are not to relay the following conversation to anyone, or make any inquiries pursuant to it. Understood?”

“Yes,” said Cheetah. The final “s” was protracted, thanks to the vocoder problem. There was silence between them. Finally, Cheetah prodded Kyle. “What was it you wished to discuss?”

Where to begin? Christ, he wasn’t even sure why he was doing this. But he couldn’t talk about it with anyone else—he couldn’t risk gossip getting around. He remembered what had happened to Stone Bentley, over in Anthropology: accused by a female student of sexual harassment five years ago; fully exonerated by a tribunal; even the student had eventually recanted the accusation. And still he’d been passed over for the associate deanship, and to this day Kyle overheard the occasional whispered remark from other faculty members or students. No, he would not subject himself to that.

“It’s nothing, really,” said Kyle. He shuffled across the room and poured himself a cup of the now-ready coffee.

“No, please,” said Cheetah. “Tell me.”

Kyle managed a wan smile. He knew Cheetah wasn’t really curious. He himself had programmed the algorithm that aped curiosity: when a person appears to be reluctant to go on, become insistent.

Still, he did need to talk to someone about it. He had enough trouble sleeping without this weighing on him.

“My daughter is mad at me.”

“Rebecca,” supplied Cheetah. Another algorithm: imply intimacy to increase openness.

“Rebecca, yes. She says—she says…” He trailed off.

“What?” The nasal twang made Cheetah’s voice sound all the more solicitous.

“She says I molested her.”

“In what way?”

Kyle exhaled noisily. No real human would have to ask that question. Christ, this was stupid…

“In what way?” asked Cheetah again, no doubt after his clock indicated it was time to prod once more.

“Sexually,” said Kyle softly.

The microphone on Cheetah’s console was very sensitive; doubtless he heard. Still, he was quiet for a time—a programmed affectation. “Oh,” he said at last.

Kyle could see lights winking on the console; Cheetah was accessing the World Wide Web, quickly researching this topic.

“You’re not to tell anyone,” said Kyle sharply.

“I understand,” said Cheetah. “Did you do what you are accused of?”

Kyle felt anger growing within him. “Of course not.”

“Can you prove that?”

“What the fuck kind of question is that?”

“A salient one,” said Cheetah. “I assume Rebecca has no actual evidence of your guilt.”

“Of course not.”

“And one presumes you have no evidence of your innocence.”

“Well, no.”

“Then it is her word against yours.”

“A man is innocent until he’s proven guilty,” said Kyle.

Cheetah’s console played the first four notes of Beethoven’s Fifth Symphony. No one had bothered to program realistic laughter yet—Cheetah’s malfunctioning sense of humor hardly required it—and the music served as a placeholder. “I’m supposed to be the naïve one, Dr. Graves. If you are not guilty, why would she make the accusation?”

Kyle had no answer for that.

Cheetah waited his programmed time, then tried again. “If you are not guilty, why—”

“Shut up,” said Kyle.