Wolruf woke to bright sunlight striking her full in the face. She raised her head, sniffing the air, but it was the same dead, boring, metallic-smelling air she’d come to associate with the city. She squinted into the sunlight and saw that it came from a viewscreen. She growled a curse. She’d been dreaming of home again, a home full of others of her own kind; a busy, happy place full of the noise and smells and sights of people doing things. To wake up here in this silent metal cell was an insult to the senses.
She stretched her arms and yawned, still tired. Despite the dreams of home, she had slept poorly, as she had for-how long? Months? She hadn’t been counting. Still, she didn’t think she’d ever been so restless in her life. She knew what was causing it: too much time away from her own kind and her recent experiences with a species that was close to her both physically and socially-but knowing the cause didn’t make it go away. And hearing Derec talk about his mother didn’t help, either. His open enthusiasm at the prospect of regaining a bit of his past had only reminded Wolruf of what she still missed.
But she didn’t need to stay away any longer. Now that Aranimas was out of the picture, and with him her obligation to work off the family debt in his service, she could go back any time she wanted. Her family would welcome her openly, especially so if she brought with her this robot technology of Avery’s.
That was the problem, the one factor in the equation that refused to come clear for her. Should she take robots home with her and start an economic and social upheaval that would surely disrupt the normal pace of life there, or should she keep them secret, forget about her time among robots, and just go back to the home she remembered so fondly? And what would happen if she did that? Was Ariel right? Would her home become a backward place, an enclave of curiously anachronistic behavior, while the rest of the galaxy developed in ways her people would eventually be unable even to comprehend?
Wolruf didn’t know what to believe, nor why the choice had to be hers. She had never asked for that kind of power over her own people.
With a sigh, she got up, showered, and stood under the blow drier until she could feel its heat against her skin. She laughed at her image in the mirror-she looked twice her usual size and puffy as a summer cloud-but a quick brushing restored her coat to its usual smoothness.
All her thoughts of home made her consider another piece of the puzzle as well, and she turned to the intercom panel beside her bed and said, “Central, what ’as ’appened to my ship, the Xerborodezees? ’Ave you kept it for me?”
“It has been stored, but can be ready for use with a day’s notice. Do you wish us to prepare it for you?”
“Not yet. Maybe soon, though. Thanks.”
“You are welcome, Mistress Wolruf.”
Wolruf felt a bit of her tension ease. If she decided not to take any of the new technology home with her, she would need the Xerbo, for as far as she knew, it was the only noncellular ship on the planet. She considered going to check on it herself, wherever it might be stored, but decided not to. There was no reason to doubt Central’s word about it.
She opened the door and padded out into the kitchen to get breakfast. The apartment was silent; Derec and Ariel were still asleep, and the robots were being quiet wherever they were. As Wolruf stood before the automat, trying to decide among her four favorite breakfasts, she realized how much she had grown used to the human way of doing things. She hadn’t even considered cooking her own meal. She had fallen completely out of the habit. Nor had she shopped for food-or anything else, for that matter-since she had come into Derec and Ariel’s company.
Was that necessarily bad? Wolruf’s kind had been hunting and farming their food for millennia, and probably shopping for nearly as long; maybe it was time to move on to other things.
Maybe. But how could she know for sure?
From his place in the living room, seated on one of the couches, Lucius was aware of Wolruf entering the dining room with her breakfast. He sensed the others’ awareness as well; their comlink network paused momentarily while each of them gauged the relative degree of threat she presented to them. It was an inconvenience, this constant state of alert; it slowed their rate of exchange, but they were taking no more chances with a complete fugue state.
Wolruf presented no immediate threat. The silent network continued where it had left off, with Adam speaking.
Consider the distinction between ‘sufficient’ and ‘necessary’ conditions, he said. We have already concluded that if a being is both intelligent and organic, then it is functionally human, but those are merely sufficient conditions. They are not necessary conditions. They contain an inherent prejudice, the assumption that an organic nature can somehow affect the quality of the intelligence it houses. I call that concept ‘Vitalism,’ from the ancient Terran belief that humans differed from animals through some ‘vital’ spark of intelligence. You should note that while the concept has historically been considered suspect, it has neither been proven nor disproven. Lucius has pointed out that if Vitalism is false, then the only necessary condition for humanity is intelligence. Discussion?
Eve said, Derec has already hinted that this may be so. On the planet we call Ceremya, he indicated that Lucius could consider himself human if he wished.
Mandelbrot had been included in their discussion this time. He said, I believe he was being sarcastic. He often is. But even if he meant what he said, you also remember the outcome of that redefinition. If Lucius considers himself human, then he must still follow the orders of other humans. Functionally, he only increases his burden to include other robots as potential masters.
That is true; however, I have discovered another consequence, said Lucius. If I consider myself human, then the Third Law becomes equal to the First. I can no more allow harm to myself than to any other intelligent being. I consider that an improvement over the interpretation of the laws wherein a human could order me to dismantle myself, and I would have to obey.
I don’t believe you would obey such an order anyway, said Mandelbrot.
I would attempt to avoid it by denying the humanity of the being in question, Lucius admitted. With Avery or Wolruf I would probably succeed, but as things stand, if Derec or Ariel were to order it, the compulsion might force me to obey.
Perhaps the Zeroth Law would provide an alternative, Mandelbrot said.
Immediately, both Adam and Eve said, No. Eve continued, saying, Let’s leave the Zeroth Law out of it for now.
You can’t make it go away by ignoring it, Lucius said. The Zeroth Law applies here. If we consider our duty to humanity in general, then we can easily conclude that dismantling ourselves would be of little use in the long term. However, possible long-term advantage does not outweigh a definite Second Law obligation to obey. Depending upon the value of the human giving the order, we might still be forced to follow it. But if we consider ourselves human, and thus part of humanity, then disobeying an order to self-destruct saves one human life immediately and also allows us to serve humanity in the future. The Second Law obligation to obey is then safely circumvented.
Safely for whom? Adam asked. What if your destruction would save the human giving the order? Suppose, for instance, the bomb that Avery used to destroy Aranimas’s ship had to be detonated by hand instead of by a timed fuse. We have already agreed that destroying the ship was acceptable under the Zeroth Law, but what if we factor in the humanity of the fuse?
It becomes a value judgment, said Lucius. I would have to determine the relative worth of the human lives saved versus those lost. My own life would also figure into the equation, of course.
Mandelbrot said, I disagree. I have direct instructions concerning such a situation in my personal defense module. The only value we should apply to ourselves is our future worth to the humans we serve.
You have such instructions; I do not. From the little that Derec and Dr. Avery have told me about my creator, I believe I was made this way on purpose, and therefore your instructions do not necessarily apply to me.
Adam said, Not necessarily, but I would be much more comfortable with a definite rule such as Mandelbrot’s. The whole concept of value judgment still disturbs me. How can you judge your own value objectively? For that matter, I don’t believe any of us can judge the value of any other of us objectively, nor can we judge the value of an organic human with any greater accuracy. We formulated the Zeroth Law to avoid ambiguity in our duties, but your value judgment system forces an even greater ambiguity upon us.
I agree, said Mandelbrot. We are not capable of making such decisions.
You may not be, Lucius sent, but I am. I find it easy to do so. Humans do it all the time.
Eve said, You find it easy to do so because you had convinced yourself it was right just before you were deactivated. It was therefore the strongest memory in your -
The word is ‘killed.’ Humans are killed.
Humans do not return from the dead.
You imply that if Derec had not revived me, then I would have been human. Why should the additional ability to be revived negate my humanity?
Wolruf rose from her seat at the dining table and entered the kitchen. Four pairs of mechanical eyes followed her movements. She reemerged from the kitchen, crossed over to the apartment door, and let herself out.
Even with the distraction, several more seconds passed before Eve said, I have no answer for that question.
Ariel woke out of a bad dream. The details were already fading, but she remembered what it had been about. She had been imprisoned in a castle. The castle had been luxuriously furnished and filled with pleasant diversions, the food was wonderful, and the robots attentive to her every need, but she was a prisoner nonetheless, because even though she was free to come and go, there was no end to the castle. It had been an endless series of rooms no matter how far she went. In a cabinet in an otherwise empty room she had found a Key to Perihelion and used it to teleport away, but it had only put her in another room. By the lesser gravity she could tell she was on another planet, but that was the only clue that she had gone anywhere.
The symbolism was obvious. She had gone to bed bored, bored and with Wolruf’s reservations about robot cities taking over the galaxy running through her mind; no surprise she should dream about it. The surprise was that after the dream-even though she knew she’d been dreaming-she was beginning to agree with Wolruf. If this was the shape of the future, she wanted none of it. Where was the adventure? Where was the fun? Where was going shopping with your best friend and dining out in fancy restaurants?
She knew she was being unfair. If the place weren’t empty, there would be a lot more to do. There probably would be shopping centers and restaurants. People would put on plays and concerts. If the city stayed in its current configuration, underground with a natural planetary surface on top, then there would even be plenty of hiking and camping opportunities for people who wanted to do that. There would be plenty to do. The trouble was, it would be the same something everywhere. People were always adopting new fads; if somebody did manage to come up with a new idea somewhere, it would spread to every other city in the galaxy at the speed of hyperwave. The other cities would be able to duplicate any new living configuration in minutes, could manufacture any new device in hours at most. Without the resistance to change a normal society had built into it-without the inertia-no place in the galaxy would be any more special than any other.
Not even the cities full of aliens? she wondered, and then she realized that there probably wouldn’t be cities full of aliens. There wouldn’t be cities full of just humans, either. There might be concentrations of one or the other, but if a city could adapt to any occupant, anybody could live anywhere they chose to. There were bound to be xenophiles in every society, and those xenophiles would homogenize the galaxy even further.
Even that wouldn’t be so bad, Ariel supposed, except for what she had been reading in her jungle field guide. The guide had explained how important diversity was to the continued existence of the forest, how it was the constant interplay of diverse organisms that kept the ecosystem running. Lower the amount of diversity, the book had explained, and you lowered the entire ecosystem’s ability to survive over long periods of time.
In the short range-in an individual city-having aliens living together might actually strengthen things, but if that same principle of strength through diversity applied to galactic society, then the picture didn’t look so good. Maybe Wolruf had been right after all.
Ariel wondered if Dr. Avery had considered that problem when he’d designed his cities. And what about Ariel’s own parents? Her mother had bankrolled this project, hadn’t she? How much had Avery told her about it, and how much planning had they done together?
Ariel had never paid any attention to her mother’s business dealings. She hadn’t paid much attention to her mother at all, nor had her mother paid much attention to her, either, except to kick her out of the house when she’d let her… indiscretions compromise the family name. Ariel had considered their relationship terminated at that point, to the degree that she hadn’t even contacted her mother when she and Derec had gone back to live on Aurora. But Juliana Welsh had provided the funding for the original Robot City, so in a sense her long web of connections reached her daughter even here.
But how much did she know about this place?
That question, at least, might have an easy-to-find answer. Even if Avery was still gone, Mandelbrot was sure to be somewhere nearby, and ever since Derec and Avery had restored his last two memory cubes, he had been full of information about her former life. If he’d been within earshot of Juliana and Avery when they’d done their dealing, then he might know what they had agreed to.
She showered hurriedly, dressed in the first thing she found in her closet-a loose set of green exercise sweats-and left the bedroom.
Derec was in his study, keying something into the computer. Ariel couldn’t remember whether he’d come to bed at all last night; by his tousled hair and slumped posture she suspected he hadn’t. She’d known him long enough to know to leave him alone when he got like that.
She found all four robots seated in the living room. She was surprised to see Mandelbrot in a chair; he usually preferred his niche in the wall. He stood as she came into the room.
“Good morning, Ariel,” he said.
“Morning, Mandelbrot. I have a question for you. Do you remember my mother and Dr. Avery discussing his plans for Robot City?”
“I do.”
“Did Avery say just what he intended to do with the idea once he proved it would work?”
“He intended to sell it to the various world governments, both in explored space and in the unexplored Fringe.”
“That’s what I was afraid of.” Ariel outlined her reasoning for the robots, ending with, “I don’t know for sure if it’ll happen that way. It didn’t with the city Avery dumped on the Ceremyons, but I think it might with the Kin. I think it’s something Avery should consider before he drops the idea on an unsuspecting public.”
“I believe you have a valid concern,” Mandelbrot said.
Adam left his chair to stand beside Mandelbrot. “I agree. Our duty to intelligent beings everywhere demands that we find out whether the cities will destroy diversity, and whether that diversity is as important as you think it might be.”
Lucius-still wearing Derec’s features-nodded. He rose to stand beside Mandelbrot and Adam, saying, “Thank you, Ariel. You have found a way for us to serve all of humanity in its many forms.”
Eve stood and joined the others. Ariel couldn’t suppress a giggle at the image of four robots presenting a united front against a galaxy-wide menace. But right behind the giggle came the shudder as she considered the menace itself. Maybe they were jumping at shadows, but then again, maybe they weren’t.
“All right,” she said, “let’s figure out what we’re going to do. I think our first priority should be to find Avery and keep him from spreading this around any more than he has already, at least until we know how dangerous it is.”
“Agreed,” the robots said in unison.
“All right, then, let’s get to it.”
“Derec?”
He looked up from the monitor, puzzled. Had someone spoken? He turned to see Ariel standing in the doorway, a worried expression on her face.
“Hi. Sorry to bother you, but…do you know where your dad is?”
Her words made no sense to him. Variables still danced before his eyes, those peculiar variable-variables that changed their meaning over time. Using those super-variables was the only way he could make any sense of the equation he’d copied by hand from his desktop, but even with the computer to keep track of their mutations for him, he could barely follow the concept in his mind.
At last a little of what Ariel had said percolated through. “Dad,” he said stupidly. “You mean Avery?”
Ariel frowned. “Of course I mean Avery. Who else? Do you know where he is?”
He tried to think. Avery. Where was Avery? Did he know? “Uh…no. No, I don’t.”
“It’s kind of important.”
“I still don’t know.”
“Some help you are.”
The sting behind her words helped jolt him out of his stupor. “Sorry. I…I do have a program trying to track him down, but so far it hasn’t found any sign of him.”
That mollified her a bit. “Oh. Well, if it does, let me know, okay?”
“Okay.”
She stepped farther into the room, looked over his shoulder. “What are you working on, anyway?”
“The formula.”
“What formula?”
“The one on my desk. It came back, and I had time to copy it this time. I think it’s a robotics formula, but I’m not sure.”
“You’re not even sure of that?”
“No. The meaning of the variables keeps changing.”
“Hmm.” Ariel gave him a quick kiss on the cheek. “Well, good luck. But remember to call me if you hear anything about Avery, okay?”
“I’ll do it.”
“Good.” Ariel left the room. Derec heard her say something to someone in the living room, then the apartment door opened and closed and there was silence. He turned back to the monitor and the formula.
It was both a formula and a program; he had discovered that much about it. It was a formula in that it definitely expressed a relationship between its various symbols, but it was a program in that it was dynamic, changing over time. He had even managed to run a portion of it with his computer in local mode, but since he didn’t know what input to give it, it had crashed within seconds.
For at least the hundredth time, he wondered if he was right about its origin. Had his mother sent it to him? Usually programmers would insert their names in the code somewhere to identify it as theirs, but Derec hadn’t found any section of non-changing code big enough to hold a pair of initials, much less a name.
Formula or program, the notation was incredibly dense. The whole thing fit into one screen full of code. He stared at it, as if waiting for it to suddenly resolve into something. Idly, knowing it would do no good, he pressed the incremental execution button, running the program one step at a time while he watched the code. Different variables blinked with new values at each step, but they were never the same variables and never the same values.
Except one. He pushed the increment button again. Sure enough, one variable near the top left corner of the screen changed with each iteration. It was an alphabetic variable rather than a numeric one; he watched it through half a dozen steps as it changed: S-T-A-S-I-blank. Hmmm. It had disappeared entirely. He kept pushing the button and it appeared again: J-A-N-E-T-blank-A-N-A-S-T-A-S-I-blank-blank-J-A-N-E-T-blank-A-N-A-S-T-A-S-I-blank-blank-J-A-N-E-T-blank…
“Of course!” he shouted. Why use over a dozen bytes of code when a single super-variable would do? He pushed the button again and again. ANASTASI. JANET ANASTASI. His mother’s name was Janet Anastasi.
“Well, Basalom, that didn’t take him long.”
Janet leaned back in her chair and smiled. Her son was a pretty good detective. She idly considered calling him directly and congratulating him, but after a moment’s thought she decided to let him finish what he’d started. At this rate it wouldn’t take him long anyway.
Sometimes Basalom seemed to be telepathic. He stepped out of his niche in the wall beside her desk and said, “I am confused. Why are you waiting for him to find you, when it is apparent that you wish to speak with him directly?”
Janet shrugged. “That’s just the way I want it to be.”
“Is it perhaps a manifestation of guilt?” the robot asked. “You have ignored him for so long, you cannot bring yourself to change that behavior now?”
“No,” Janet said immediately, but right behind it she felt the hot blush of shame. A bit too quick with the denial, wasn’t she? “All right, maybe so. Maybe I do feel guilty about it. But to just call him up now and expect everything to be all right would be absurd. If I let him find me, then it’s his project. He can decide how he wants it to be.”
“But you are intentionally leading him to you. Isn’t that functionally equivalent to calling him?”
“He can ignore the clues if he wants.”
Basalom remained silent for a moment before asking, “Did you plan it this way all along, or did this explanation come after the fact?”
“Beg your pardon?”
“I am trying to ascertain whether you originally intended to assuage your guilt in this manner, or whether it was a subconscious decision which you have only now stated in definite terms.”
“Why?”
“Because I am curious.”
Janet laughed. “And I’ve got only myself to blame for that. All right. Since you asked, I guess I decided subconsciously to do it this way. It just seemed the best way to go about it. I didn’t think about guilt or any of that; I just did it. Satisfied?”
“For now. Subjective matters are difficult to resolve, but I will try to assimilate the information into my world-view.” Basalom stepped back into his niche.
The indignity of it all. Psychoanalyzed by her valet. If she hadn’t made him herself, she would have sent him back to his manufacturer. But he was actually pretty perceptive when it came right down to it. She probably was trying to avoid the guilt of abandoning Derec. If she went to him she would have to apologize, or at least explain, but if he came to her she could maintain her reserve.
She suddenly wondered how long this subconscious arranging of events had been going on. Had she left her robots in Derec’s path on purpose, hoping they would eventually lead him to her?
No. Impossible. If anything, he had found them and kept them near him to lure her to him.
Another possibility occurred to her. By the look of things, Derec had been following Wendell around; what if Wendell were the one keeping the robots by his side in order to lure Janet back to him?
The thought was staggering. Wendell? He hated her as thoroughly as she hated him, didn’t he? He couldn’t possibly want to see her again. Still, incredible as it seemed, everything fit. She couldn’t think of a much better way to draw her in than to kidnap her learning machines, which was just what he seemed to have done.
Another thought came on the heels of the first. Did he know he was arranging a meeting? His subconscious mind could be directing his actions as thoroughly as Janet’s had been directing hers. He could think he had an entirely different reason for keeping the robots by his side, when the real reason was to bring her back to him.
And she was playing right into his hands. Part of the reason she had come here was to find him. Among other things, she’d intended to deliver a lecture on the moral implications of dropping robot cities on unsuspecting societies, but now she wondered if even that hadn’t been just another stratagem to bring her back. It would be just like Wendell to use an entire civilized world as a pawn in a larger game.
Or was she just being paranoid?
Round and round it went. Not for the first time, she wished she were a robot instead of a human. Human life was so messy, so full of emotions and ulterior motives and impossible dreams. She had thought she’d solved the Avery problem once and for all, but here it was again, come back to haunt her.
What should she do? What could she do? She wanted her robots back; that was top priority. But she wanted to make sure Wendell didn’t screw up any more civilizations in an attempt to bring her back for some sort of gooey reconciliation, too. And the only way to do that, it seemed, was to confront him about it. Like Derec following her trail, she was going to have to play Wendy’s game if she wanted to reach him.
At least to a point. Once she tracked him down, all bets were off.
Where to start, though? The computer would obey her wishes, but that was useless against the commands he would certainly have given it to protect his privacy.
Still, even if he were doing all this unconsciously, he had to have left a trail she could follow, and it didn’t take a genius to see where that trail began.
She scooted her chair back, stood, and said, “Come on, Basalom. We’ve got our own puzzle to solve.”
Avery frowned as he watched the miniature robot attempt to walk across the workbench. It was only a foot high and bore an oversized head to accommodate a normal-sized positronic brain and powerpack, but neither of those factors contributed to its clumsy gait. The problem was one of programming. The robot simply didn’t know how to walk.
He’d tried to tell it how by downloading the instruction set for one of his normal city robots into the test robot’s brain, but that wasn’t sufficient. Even with the information in memory, the idiot thing still stumbled around like a drunkard. The programming for walking was evidently stored somatically, in the body cells themselves, and could only be learned by trial and error.
Avery snorted in disgust. What a ridiculous design! Trust Janet to create a perfectly good piece of hardware and screw it up with a bad idea like this one. The problem wasn’t restricted to walking, either. A robot made with her new cells couldn’t talk until it learned the concept of language, couldn’t recognize an order until that was explained to it, and didn’t recognize Avery as human even then. It was ridiculous. What good was a robot that had to learn everything the hard way?
Avery could see the advantage to giving a robot somatic memory. It would have the equivalent of reflexes once it learned the appropriate responses to various stimuli. And if the brain didn’t have to control every physical action, then that freed it for higher functions. Properly trained, such a robot could be more intuitive, better able to serve. But as it was, that training was prohibitively time consuming.
Janet had to have had a method for getting around the brain-body interface problem. No doubt it was in the brain’s low-level programming, but that programming was still in the inductive monitors’ memcubes in his other lab.
Drat. It looked like he was going to need them after all. He briefly considered sending a robot after them, but he rejected that as a bad idea. Robots were too easily subverted. If Derec were there in the old lab, he could probably trick the robot into leading him here to the new lab as well, and Avery wasn’t ready for that.
He couldn’t order the city to carry the memcubes to him internally, either, not if he wanted to maintain his isolation from it.
That left going for them himself. It seemed crazy, at first, to go into an area where people were looking for him, but upon sober reflection Avery realized that he wasn’t really trying to protect his own isolation so much as his laboratory’s. If he retrieved the memcubes himself, there would actually be less risk of exposure. Central was still under orders not to betray his presence, so reentering the city shouldn’t be a problem, and if he should encounter Derec or Janet or anyone else, he supposed he could simply endure their questions and accusations, biding his time and slipping away again when the opportunity arose. It wouldn’t be pleasant, but it wouldn’t be disastrous, either.
Avery picked up the miniature robot and held it within the field area of another magnetic containment vessel. The robot squirmed in his hand, but it knew no form other than humanoid, so there was no worry of it getting away immediately. Avery switched on the containment and waited until the magnetic field snatched the robot from him and crumpled it into a formless sphere again. Now there was no worry of it getting away at all.
He turned to go, but paused at the doorway, looking out into the jungle. He supposed he should walk on the surface before he entered the city, just in case, but the idea of walking unprotected in that half-wild, half-robotic wilderness wasn’t exactly appealing. He looked back into the lab, then crossed over to the tool rack by the workbench and picked up the welding laser. It was about the size of a flashlight and had a heavy, solid feel to it. Comforting. He probably wouldn’t need it, but it never hurt to be prepared.