Chapter 8. The Other Shoe Drops

The apartment was empty when Wolruf arrived. She padded softly through the living room, noting Ariel’s book reader lying on the end table by her chair and the empty niche where Mandelbrot usually stood, then went into Derec’s study and saw the bed there, still rumpled from sleep. The computer terminal was still on. She saw no cup in evidence, but the air conditioner hadn’t quite removed the smell of spilled coffee.

“W’ere is everybody?” she asked of the room.

“Derec and Ariel’s location is restricted,” Central replied.

Oh, great. Now they’d all disappeared. Unless… “Are they at the same restricted location as before?” she asked.

“That is correct.”

Wolruf laughed aloud. She was learning how to deal with these pseudo-intelligences. She stopped in her own room just long enough to freshen up, then left the apartment and caught the slidewalk.

She found not only Derec and Ariel in the robotics lab, but an unfamiliar woman who had to be Derec’s mother as well. Derec was busy with the humaniform robot Wolruf had attempted to catch the last time she’d been near here. He was trying to remove the stump of its severed arm, and by his expression not having much success at it. Ariel was holding a light for him and Derec’s mother was offering advice.

“Try reaching inside and feeling for it,” she said.

Derec obediently reached in through the access hatch in the robot’s chest, felt around inside for something, and jerked his hand out again in a hurry. “Ouch! There’s still live voltage in there!”

“Not enough to hurt you,” his mother said patiently. “Not when he’s switched into standby mode like this. Would you like me to do it?”

“No, I’ll get it.” Derec reached inside again, but stopped when he heard Wolruf’s laugh. He looked up and saw her in the doorway.

“‘Ello.”

“Hi.” Grinning, Derec withdrew his hand from the robot and used it to gesture. “Mom, this is my friend Wolruf. Wolruf, this is my mother, Janet Anastasi.”

“Pleased to meet you,” Wolruf said, stepping forward and holding out a hand.

Janet looked anything but pleased to be so suddenly confronted with an alien, but she swallowed gamely and took the proffered appendage. “Likewise,” she said.

Wolruf gave her hand a squeeze and let go. Looking over Janet’s shoulder, she noticed a huddle of four robots in the far corner of the lab: three learning machines and Mandelbrot. They looked to be in communications fugue. Nodding toward them, she said, “I ‘eard Lucius ‘urt Avery some’ow.”

“That’s right,” Ariel said. “He was trying to protect Basalom, here. We’ve got him in psychotherapy, if you can call four robots in an argument psychotherapy. They’re trying to convince him it’s all right.”

“It is?” Wolruf asked.

“Well, not the actual act,” Derec said, “but the logic he used wasn’t at fault. He just made a mistake, that’s all. He thought he was protecting a human.” Derec outlined the logic Lucius had used, including the First and Zeroth Law considerations that had finally made him do what he’d done.

Wolruf listened with growing concern. The Zeroth Law was just the thing she’d hoped for to reassure her that taking robots home with her wouldn’t destroy her homeworld’s society, but if that same law let a robot injure its master, then she didn’t see how it could be a good thing.

“I don’t know,” she said. “Sounds like a bad tradeoff to me.”

“How so?” Janet asked.

“I’m wondering ‘ow useful all this is going to be. Right now I’m not sure about regular robots, much less ones who think they’re ‘uman.”

“What aren’t you sure about?”

Was Derec’s mother just being polite, or did she really want to know? Wolruf wondered if this was the time to be getting into all this, to bring up the subject of her going home and to get into all her reasons for hesitating, but she supposed there really wasn’t going to be a much better time. She knew what Derec and Ariel thought about the subject; maybe this Janet would have something new to say. “I’m not sure about taking any of these robots ‘ome with me,” Wolruf said. “I’m not sure about w’at they might decide to do on their own, and I’m not sure about w’at might ‘appen to us even if they just follow orders.”

“I don’t understand.”

“She’s talking about protecting people from themselves,” Ariel said.

“Am I?”

“Sure you are. I’ve been thinking about it, too. The problem with robot cities is that they’re too responsive. Anything you want them to do, they’ll do it, so long as it doesn’t hurt anybody. The trouble is, they don’t reject stupid ideas, and they don’t think ahead.”

“That’s the people’s job,” Janet said.

“Just w’at one of the robots in the forest told me,” Wolruf said. “Trouble is, people won’t always do it. Or w’en they realize they made a mistake, it’ll be too late.”

Janet looked to Derec. “Pessimistic lot you run around with.”

“They come by it honestly,” he said, grinning. “We’ve been burned more than once by these cities. Just about every time, it’s been something like what they’re talking about. Taking things too literally, or not thinking them through.”

“Isn’t Central supposed to be doing that?”

“Central is really just there to coordinate things,” Derec said. “It’s just a big computer, not very adaptable.” He looked down at Basalom again, nodded to Ariel to have her shine the light inside again as well, and peered inside the robot’s shoulder. After a moment he found what he was looking for, reached gingerly inside, and grunted with the strain of pushing something stubborn aside. The something gave with a sudden click and the stump of the robot’s arm popped off, trailing wires.

“There’s also a committee of supervisory robots,” Ariel said, “but they don’t really do any long-range planning either. And they’re all subject to the Three Laws, so anybody who wants to could order them to change something, and unless it clearly hurt someone else, they’d have to do it.”

“No matter how stupid it was,” Janet said.

“Right.” Derec unplugged the wires between Basalom’s upper arm and the rest of his body.

Janet looked thoughtful. “Hmmm,” she said. “Sounds like what these cities all need is a mayor.”

“Mayor?” Wolruf asked.

“Old human custom,” Janet replied. “A mayor is a person in charge of a city. He or she is supposed to make decisions that affect the whole city and everyone in it. They’re supposed to have the good of the people at heart, so ideally they make the best decisions they can for the largest number of people for the longest period of time.”

“Ideally,” Wolruf growled. “We know ‘ow closely people follow ideals.”

“People, sure.” Janet waved a hand toward the four robots in the corner. “But how about dedicated idealists?”


Ariel was so startled she dropped the light. It clattered to the floor and went out, but by the time she bent down to retrieve it, it was glowing again, repaired.

“Something wrong, dear?” Janet asked her.

“You’d let one of them be in charge of a city?”

“Yes, I would.”

“And you’d live there?”

“Sure. They’re not dangerous.”

“Not dangerous! Look at what-”

“Lucius made the right decision, as far as I’m concerned.”

“Maybe,” Ariel said. “What worries me is the thought process he went through to make it.” She clicked off the light; Derec wasn’t working on Basalom anymore anyway. He was staring at Ariel and Janet as if he’d never heard two people argue before. Ariel ignored his astonished look and said, “The greatest good for the greatest number of people. That could easily translate to ‘the end justifies the means.’ Are you seriously suggesting that’s a viable operating principle?”

“We’re not talking an Inquisition here,” Janet said.

“But what if we were? What if the greatest good meant killing forty-nine percent of the population? What if it meant killing just one? Are you going to stand there and tell me it’s all right to kill even one innocent person in order to make life easier for the rest?”

“Don’t be ridiculous. That’s not what we’re talking about at all.”

It took conscious effort for Ariel to lower her voice. “It sure is. Eventually that sort of situation is going to come up, and it scares the hell out of me to think what one of those robots would decide to do about it.”

Janet pursed her lips. “Well,” she said, “why don’t we ask them, then?”


Lucius looked for the magnetic containment vessel he was sure must be waiting for him somewhere. Not finding one, he looked for telltale signs of a laser cannon hidden behind one of the walls. He didn’t find that, either, but he knew there had to be something he couldn’t see, some way of instantly immobilizing him if he answered wrong. The situation was obviously a test, and the price of failure was no doubt his life.

He’d been roused out of comlink fugue and immediately barraged with questions, the latest of which was the oddest one he’d ever been asked to consider, even by his siblings.

“Let me make sure I understand you,” he said. “The person in question is not a criminal? He has done no wrong? Yet his death would benefit the entire population of the city?”

“That’s right.”

Ariel’s stress indicators were unusually high, but Lucius risked his next question anyway. “How could that be?”

“That’s not important. The important thing is the philosophical question behind it. Would you kill that person in order to make life better for everyone else?”

“I would have to know how it would make their lives better.”

“We’re talking hypothetically,” Janet said. “Just assume it does.”

Do you have any idea what the underlying intent is here? Lucius asked via comlink. Perhaps it was cheating, but no one had forbidden him to consult the other robots. A pity Basalom was not on line; his experiences with Janet might provide a clue to the proper answer.

Neither Adam nor Eve answered, but Mandelbrot did. Yesterday I overheard Ariel and Wolruf discussing the possible effect of a robot city on Wolruf’s world. Wolruf was concerned that the use of robots would strip her people of the ability to think and act for themselves. Perhaps this question grew out of that concern.

I think there is more to it than that, Lucius sent. Central, can you replay the conversation that led up to this question?

The robots received the recorded conversation within milliseconds, but it took them considerably longer to sort it all out. At last Lucius said, I believe it is clear now. They are concerned about the moral implications of unwilling sacrifice.

Agreed, the others all said.

Do we have any precedent to go upon?

Possibly, Eve said. There could have been innocent people on Aranimas’s ship. We know that Aranimas took slaves. Yet destroying it to save a city full of Kin was still a proper solution.

That doesn’t quite fit the question we are asked to consider, said Adam. A better analogy might be to ask what if the ship had been crewed only by innocent people?

Innocent people would not have been in that situation alone, Lucius replied.

Mandelbrot said, Aranimas could easily have launched a drone with hostages on board.

Then the hostages would have to be sacrificed, Lucius said immediately. They would be no more innocent than the people on the ground.

Agreed, the other robots said.

Perhaps I begin to see the moral dilemma here, Lucius said. What if the people on the ground were somewhat less innocent?

How so? Eve asked.

Suppose they in some way deliberately attracted Aranimas, knowing that he was dangerous?

That would be foolish.

Humans often do foolish things. Suppose they did. Would they then deserve their fate?

This is a value judgment, Adam said.

We have been called upon to make one, Lucius replied.

Unfortunately so. Using your logic, then, we would have to conclude that the concept of individual value requires that humans be held responsible for their actions. The inhabitants of the city would therefore be responsible for their own act and thus deserve their fate. If the hostage were truly innocent and the city inhabitants were not, then the city would have to be sacrificed.

I agree, said Lucius. Eve? Mandelbrot?

I agree also, Eve said.

I wish we had never been asked this question, Mandelbrot sent. I reluctantly agree in this specific case, but I still don’t believe it answers Ariel’s question. What if the death of the innocent hostage merely improved the lives of equally innocent townspeople? To use the Aranimas analogy, what if the hostage-carrying ship aimed at the city were filled with cold virus instead of plutonium? Would it still be acceptable to destroy it?

No, Lucius said. Colds are only an inconvenience except in extremely rare cases.

A worse disease, then. One that cripples but does not kill.

How crippling? How widespread would the effects be? Would food production suffer and thus starve people later? Would the survivors die prematurely of complications brought about by bitterness at their loss? We must know these things as well in order to make a decision.

Then we must give a qualified answer, said Mandelbrot.

Yes. Wish me luck, Lucius said.

Perhaps two seconds had passed while the dialog went on. Aloud, Lucius said to Ariel, “We have considered three specific cases. In the case of a city in mortal peril, if the person in question were not completely innocent in the matter, but the rest of the city’s inhabitants were, then the person would have to be sacrificed. However, if the person were completely innocent but the city inhabitants were not, then the city’s welfare could not take precedence in any condition up to and including the death of the entire city population. Bear in mind that a single innocent occupant of the city would change the decision. In the last case, where an innocent person’s death would only benefit the quality of life in the city, we have not reached a conclusion. We believe it would depend upon how significant the quality change would be, but such change would have to threaten the long-term viability of the populace before it would even be a consideration.”

Perhaps the hostage should be consulted in such a case, Eve sent.

“Indeed. Perhaps the hostage should be consulted in such a case.”

“But not the townspeople?” Ariel asked.

Lucius used the comlink again. Comment?

If time allowed polling the populace, then it would allow removing them from the danger, Mandelbrot pointed out.

Good point. “Probably not,” Lucius said. “It would of course depend upon the individual circumstances.”

Ariel did not look pleased. Lucius was sure she would now order him dismantled, killed to protect the hypothetical inhabitants of her hypothetical city from his improper judgment. He waited for the blast, but when she spoke it wasn’t at all what he expected.

“Frost, maybe it wasn’t a fair question at that. I don’t know what I’d do in that last case.”

“You don’t?”

“No.”

“Then there is no correct answer?”

“I don’t know. Maybe not.”

Janet was smiling. “We were more worried about a wrong answer anyway.”

“I see.”

Wolruf cleared her throat in a loud, gargling growl. “One last ‘ypothetical question,” she said. “W’at if the particular ‘umans in this city didn’t care about the death of an individual? Say it didn’t matter even to the individual. W’at if it wasn’t part of their moral code? Would you enforce yours on them?”

Lucius suddenly knew the exact meaning of the cliché, “Out of the frying pan into the fire.” Help! he sent over the comlink.

The correct answer is “No,” Mandelbrot sent without hesitation.

You are sure?

Absolutely. Thousands of years of missionary work on Earth and another millennium in space have answered that question definitively. One may persuade by logic, but to impose a foreign moral code by force invariably destroys the receiving civilization. Often the backlash of guilt destroys the enforcing civilization as well. Also, it can be argued that even persuading by logic is not in the best interest of either civilization, as that leads to a loss of natural diversity which is unhealthy for any complex, interrelated system such as a society.

How do you know this?

I read over Ariel’s shoulder.


Janet heard both Ariel and Wolruf sigh in relief when Lucius said the single word, “No.”

She laughed, relieved herself. “You’re very certain of that,” she said.

“Mandelbrot is certain,” Lucius said. “I trust his judgment.”

Mandelbrot. That name. She could hardly believe it, but it had to be…

“I think I trust his judgment, too.” Janet turned to Ariel. “What about you, dear? Satisfied?”

Ariel was slow to answer, but when she did it was a nod. “For now,” she said. “I don’t know if having a learning machine for a mayor will solve everything, but it might solve some of it.”

“Who wants them to solve everything?” Janet asked. “If they did, then we’d really have problems.”

That seemed to mollify Ariel considerably. She nodded and said, “Yeah, well, that’s something to think about, all right.”

No one seemed inclined to carry the discussion any further. Wolruf and Ariel exchanged glances but didn’t speak. The robots all held that particular stiff posture they got when they were using their comlinks. Now that he had removed Basalom’s shoulder joint, Derec was holding the two sections of arm together to see how easy they would be to repair.

Janet turned her attention to Mandelbrot. She looked him up and down, noticing that while most of him was a standard Ferrier model, his right arm was the dianite arm of an Avery robot.

Mandelbrot suddenly noticed her attention and asked, “Madam?”

“Let me guess; you got your name all of a sudden, with no explanation, and had a volatile memory dump at the same time, all when you made a shape-shift with this arm.”

“That is correct,” Mandelbrot said. “You sound as if you know why.”

“I do.” Janet giggled like a little girl. “Oh dear. I just never thought I’d see the result of it so many years later.”

She looked to Derec, then to Ariel, then to Wolruf. “Have you ever thrown a bottle into an ocean with a message inside, just to see if it ever gets picked up?”

Derec and Ariel shook their heads, but Wolruf nodded and said, “Several times.”

Janet smiled her first genuine smile for Wolruf. Maybe she wasn’t so alien after all. She said, “Mandelbrot was a bottle cast in the ocean. And maybe an insurance policy. I don’t know. When I left Wendell, I took all the development notes for the robot cells I’d created with me. I took most of the cells, too, but I knew he’d eventually duplicate the idea and use it for his robots, so since he was going to get it anyway, I left a sample behind in a corner of the lab and made it look like I’d just forgotten it in my hurry. But I altered two of the cells I left behind. I made them sterile, so it would just be those two cells no matter how many copies he made of them, but programmed into each one I left instructions set to trigger after they registered a thousand shape-changes. One was supposed to dump the robot’s onboard memories and change its name to Mandelbrot, and the other was supposed to reprogram it to drop whatever it was doing and track me down wherever I’d gone.”

“I received no such instructions,” Mandelbrot said.

“Evidently the other cell was in the rest of the robot you got your arm from,” Janet said. “I didn’t tell them to stay together; I just told them to stay in the same robot.”

Wolruf nodded. “None of my bottles came back, either.”

Janet laughed. “Ah, but this is even better. This is like finding the bottle yourself on a distant shore.” She sobered, and said to Mandelbrot, “I’m sorry if it caused you any trouble. I really didn’t intend for it to happen to a regular robot. I figured it would happen to one of Wendell’s cookie-cutter clones and nobody’d know the difference.”

Derec was staring incredulously at her. “Any trouble!” he said. “When your…your little time bomb went off, Mandelbrot lost the coordinates to the planet! We didn’t know where we were, and we didn’t know where anything else was, either. We had a one-man lifepod and no place to send it. If we had, we probably could have gotten help and gotten away before Dad caught up with us, and none of-” He stopped suddenly, and looked at Ariel. She smiled a smile that no doubt meant “private joke,” and Derec said to Janet, “Never mind.”

“What?”

“If you hadn’t done that, none of this would have happened to us. Which means Ariel would probably be dead by now from amnemonic plague, and who knows where the rest of us would be? Dad would still be crazy. Aranimas would still be searching for robots on human colonies, and probably starting a war before long. Things would have been a real mess.”

At Derec’s words, Janet felt an incredibly strong urge to gather her son into her arms and protect him from the indifferent universe. If she felt she had any claim on him at all, she would have, but she knew she hadn’t built that level of trust yet. Still, all the things he’d been through, and to think she’d been responsible for so many of them. But what was he saying? Things would have been a mess? “They’re not now?” she asked.

“Well, they’re better than they might have been.”

There was a rustling at the door, and Avery stood there, bare-footed, clothed in a hospital robe, his arm with its dianite regenerator held to his chest in a sling, with a medical robot hovering anxiously behind him. “I’m glad to hear somebody thinks so,” he said.


“Dad!”

The sight of his father in such a condition wrenched at Derec as nothing had since he’d watched Ariel go through the delirium of her disease. A part of his mind wondered why he was feeling so overwhelmed with compassion now, and not a couple of hours ago when he’d first seen Avery in the operating room, but he supposed it had just taken a while to sink in that his father had been injured. Maybe being with his mother for the last couple of hours had triggered something in him after all, some hidden well of familial compassion he hadn’t known existed.

Avery favored Derec with a nod. “Son,” he said, and Derec thought it was probably the most wonderful thing he’d ever heard him say. Avery took a few steps into the room and made a great show of surveying the entire scene, his gaze lingering on Janet perhaps a fraction of a second longer than upon Derec, then shifting to Ariel, to Wolruf, to the inert robot on the exam table and to the other four standing off to the side. He locked eyes with Lucius, and the two stared at one another for a couple of long breaths.

Lucius broke the silence first. “Dr. Avery, please accept my apology for injuring you.”

“I’m told I have no choice,” Avery said, glancing at Janet and back to Lucius again.

“Oh,” Lucius said, as if comprehending the situation for the first time. He hummed as if about to speak, went silent a moment, then said, “Accepting my apology would help repair the emotional damage.”

“Concerned for my welfare, are you?”

“Always. I cannot be otherwise.”

“Ah, yes, but how much? That’s the question, eh?” He didn’t wait for a reply, but turned to Janet and said, “I couldn’t help overhearing your little anecdote as I shuffled my way down here. Very amusing, my dear. I should have guessed you’d do something like that.”

Janet blushed, but said nothing.

“I came to discuss terms,” Avery said. “You have me over a barrel with your damned patent and you know it. You said you didn’t like what I’m doing with my cities. All right, what do you want?”

Derec hadn’t heard about any patent, but he knew immediately what had to have happened. Janet had patented dianite when she’d left home, or else Avery in his megalomania had neglected to do it later and she had done so more recently. Either way it added up to the same thing: Avery couldn’t use the material anywhere in the Spacer-controlled section of the galaxy, or use the profit from sales to outside colonies, for fifty years.

Janet didn’t gloat. Derec was grateful for that. She merely said, “We were just discussing that. Ariel and Wolruf just brought up an intriguing problem, but we think we may have solved it. Why don’t we run it past you and see what you think?”

“I know already what I’m going to think,” Avery said. He folded his good arm over his injured one, which brought the medical robot a step closer, checking to make sure he hadn’t bumped any of the regenerator settings. “Back off,” he told it, and it stepped back again, but its gaze never left his arm.

Derec could see him counting to a high imaginary number, but when he spoke it was only to say, “Give me a chair here.”

The floor mounded up and flattened out into a cushiony seat, grew a back and padded sides, and moved up to bump softly into the back of his legs. Avery sat and leaned back, resting his left arm on his leg. “Let’s hear it,” he said.

Janet mentioned casually that she would like a chair for herself, and after it formed she sat and began explaining about capricious city behavior and the Zeroth Law and moral dilemmas with large and small factions on either side of the issue. Derec and Ariel and Wolruf soon joined in, and the topic shifted to their concerns.

“I worry about w’at introducing robots will do to life back ‘ome,” Wolruf said. “We ‘ave a fairly complex system. We ‘ave four separate species on two planets, all interdependent. W’at’s good for one is usually not so good for another in the short term, but in the long term we need each other.”

“Even the Erani?” Avery asked. Aranimas had been Erani, one of the four races Wolruf spoke about.

Wolruf nodded. She seemed surprised to have Avery listening to her so intently. “Erani ‘ave their place. They keep Narwe for slaves, and sometimes us, but without Erani, Narwe would probably starve. They’re ‘ardly more than intelligent sheep.”

“And your own people have a trading empire, don’t they?” Ariel asked.

“‘At’s right. Once robots start making everything everyone needs, our economy will collapse.”

“But those same robots will provide anything you want. Let it collapse!”

“‘Aving everything done for us wouldn’t be ‘ealthy,” Wolruf said.

“That’s right,” said Ariel. “If everybody started doing everything the easy way, it would wipe out their individuality. All four cultures would decline. That’s what I’m worried about, that robot cities are eventually going to make every civilization in the galaxy the same.”

“Wait a minute. I’m supposed to worry about homogenizing the galaxy? That’s not my problem!”

“You’re right, it’s not,” Janet said. “That’s because I’ve solved it for you already.” She explained about providing each city with a positronic mayor, one who would have the best interest of all its inhabitants at heart. Including the long-term effects of having too much done for them.

“So in Wolruf’s situation, we’d use four learning machines, one for each species. Let them learn the separate mores of each culture, and then have them get together and coordinate their work so they wouldn’t step on each other’s toes.”

Derec watched his father watching his mother as she spoke. Avery’s jaw seemed to be dropping lower and lower with each word, until when she finally stopped, his mouth was hanging open in astonishment. He closed it just long enough to take a breath, then bellowed out a laugh that shook the walls.

“Oh, that’s rich,” he said when he could talk again. “I can’t believe it. I wouldn’t inflict these…these walking conglomerations of simulated neuroses on my worst enemies, and you talk about giving them to paying customers?”

“I do indeed,” Janet said. “Obviously, the final version will need to have the Zeroth Law programmed in from the start, but now that these three, excuse me, these four,” with a nod to Mandelbrot, “have already worked it out, that shouldn’t be too much of a problem.”

“My God,” Avery said. “You really mean it, don’t you? You’d provide every city with a mechanical dictator who’s capable of slicing off a man’s hand just for shooting a robot.”

“I was protecting a being whose humanity is still not clear,” Lucius said, and Derec, hearing the emotion behind his words, suddenly realized that Lucius would be trying to solve that problem for the rest of his life, however many millennia that might be.

And thus are obsessions generated, he thought.

Avery waved his free hand expansively. “Oh, right, well, that makes it okay. It might have been human, after all.” To Janet he said, “Sorry, I’m not buying it. I’d rather do nothing at all than be part of your ridiculous scheme.”

“I was afraid you’d say that.” Janet’s tone of voice was a little too glib, her mouth just hinting toward a smile as she spoke.

“What?” Avery demanded. “I know that tone, woman! How many other nasty little surprises do you have in store for me?”

Janet was grinning openly now. “Just one,” she said. “Just one more.”
