GUBBER Anshaw knew he was not a courageous man, but at least he had the courage to admit that much to himself. He had the strength of character to understand his own limitations, and surely that had to count for something.
Well, it was comforting to tell himself that, at any rate. Not that such self-understanding was much use under the present circumstances. But be that as it may. There were times when even a coward had to do the right thing.
And now, worse luck, was one such time. He watched as Tetlak, his personal robot, guided Gubber’s deliberately undistinctive aircar through the dark of night toward Settlertown. The aircar slowed to a halt, hung in midair waiting for Settlertown’s traffic and security system to query the car’s transponder and see that it was on the preapproved list. Then the ground opened up beneath them as a fly-in portal to the underground city granted them entrance. The car flew down through the depths, down into the great central cavern of Settlertown, and came in for a landing.
Gubber used a hand gesture to order Tetlak to stay with the car, then got out himself. He walked to the waiting runcart and got in. “To Madame Welton’s, please,” he said as he settled in. The little open vehicle took off the moment he sat down. Gubber barely had time to reflect on the unnerving fact that there was no conscious being in control of the cart before he was delivered to Tonya’s quarters.
He walked to her doorway and stood there for a moment before he remembered to press the annunciator button. Normally that was something his robot would do for him. But Tetlak made Tonya nervous sometimes, and he had no wish for unneeded awkwardness. It was bad enough that he had come without calling ahead.
A sleepy Tonya Welton opened the door and looked upon her visitor in surprise. “Gubber! What in the Galaxy are you doing here?”
Gubber looked at her for a moment, raised his hand uncertainly, and then spoke. “I know it was risky to come, but I had to see you. I don’t think I was followed. I had to come and say—say goodbye.”
“Goodbye!” Tonya’s astonishment and upset were plainly visible on her face. “Are you breaking it off because—”
“I’m not breaking anything off, Tonya. You will always be there in my heart. But I don’t think I will be able to see you again after—after I go to see Sheriff Kresh.”
“What!”
“I’m turning myself in, Tonya. I’m going to take the blame.” Gubber felt his heart pounding, felt the sweat starting to bead up on his body. For the briefest of moments, he felt a bit faint. “Please,” he said. “May I come in?”
Tonya backed away from the door and ushered him in. Gubber stepped inside and looked around. Ariel stood motionless in her robot niche, staring out at nothing at all. The room was in its bedroom configuration, all the tables and chairs stowed away, replaced by a large and comfortable bed—a bed that Gubber had reason to remember most fondly. Now he crossed the room and sat, morosely, on the edge of it, feeling most lost and alone.
Tonya watched him cross the room, watched as he sat down. Gubber looked up at her. She was so beautiful, so natural, so much herself. Not like Spacer women, all artifice and appearance and affectation.
“I have to turn myself in,” Gubber said.
Tonya looked at him, quietly, thoughtfully. “For what, Gubber?”
“What? What do you mean?”
“What charge, exactly, will you confess to when you turn yourself in? What is it you’ve done? When they ask you for a detailed description of how you committed your crime, what will you say?”
Gubber shrugged uncertainly and looked down at the floor. He had no idea what to confess to, of course. In his own mind, he had committed no crime, but he doubted the law would share that opinion. But what point to confessing to a crime in order to shield Tonya when he did not know what, if anything, they suspected she had done? Tonya had her own secrets, and he dared not ask what they were.
Clearly it would be safer for both of them if each kept certain things to themselves for now.
The silence dragged on, until Tonya took it as an answer.
“I thought so,” she said at last. “Gubber, it just won’t work.” She sat down next to him and put her arm across his shoulders. “Dearest Gubber, you are a wonder. Back home on Aurora, I must have known a hundred men full of thunder and bluster, always ready to show me just how big and brave they were. But none of them had your courage.”
“My courage!” Gubber looked sadly at Tonya. “Hah! There’s a contradiction in terms.”
“Is it? No big burly Settler man would dream of confessing a crime, going to a penal colony, for the sake of the woman he loved. And you’d do it, I know you would. But you can’t. You mustn’t.”
“But—”
“Don’t you see? Kresh is no fool. He’ll be able to crack through a false confession in a heartbeat, and you don’t know what to confess to. We have the police report, but he’s not fool enough to tell us everything he knows. Once he’s cracked you, he’ll ask himself why you’d confess to what you hadn’t done. Sooner or later, he’ll find out you did it to protect me. Then we’ll both end up in trouble.”
Something deep inside Gubber froze up. He hadn’t thought that far ahead. But no, wait. There was one thing she hadn’t thought of. “That won’t happen, Tonya. After all, no one knows about us—”
“But maybe they will, Gubber. Odds are Kresh will find out sooner or later. I’ve done what I could to protect you, and I know you’ve done the same for me. But we dare do no more. If we’re lucky, and we don’t draw attention to ourselves, we’ll be all right. But if either of us does anything to draw Kresh’s attention—”
Tonya let the words hang in the air. There was no need for her to complete the sentence. Gubber turned to her, put his arms around her, and kissed her, passionately and for a long while. At last he drew back, just a bit. He looked her in the eye, stroked her hair, whispered her name. “Tonya, Tonya. There’s nothing I wouldn’t do for you. You know that.”
“I know, I know,” Tonya said, her eyes bright with loving tears. “But we must be careful. We must think with our heads, not our hearts. Oh, Gubber. Hold me.”
Then they kissed again, and Gubber felt passion sweeping away his fears and worries. They reached for each other, eagerly, urgently, pulling their clothes off, falling back onto the bed, their bodies coming together in desire and need.
Gubber glanced up and saw Ariel standing, motionless, in her wall niche. For a split second he worried, wondering if her being there would bother Tonya. A robot in the bedroom meant nothing to a Spacer, of course.
The devil take it. It was more than obvious that Ariel was the furthest thing from Tonya’s mind. Why bring her attention to it? He reached out to the side of the bed and jabbed down the manual switch, shutting off the overhead lights, and gave no more thought to it.
Ariel stared blankly at the opposite wall, pale green eyes dimly glowing, as the two humans made love in the darkness.
NIGHT had come, and there was darkness, and shadow, but no quiet, or rest, or safety. Whatever else changed, danger was the constant. Of that much Caliban was sure.
Caliban walked the busy downtown, ghost-town streets of Hades. The place was bustling with energy, and yet there was a feeling of the tomb about the place, as if it were a busy, active corpse, not yet aware of its own death, hurrying about its business long after its time had come and gone.
Night and day did not seem to matter so much here, in the heart of town. Here, the streets were just as busy now as they had been when he had passed this way in daytime.
But no, it was inaccurate to say that there was no difference between day and night. There was no change in the amount of traffic on the streets and walkways, but there was a huge change in the character of that traffic. Now, late at night, the people were all but gone, but the robots were here.
Caliban looked about himself, at the proud, brightly lit, empty towers of Hades, the grand boulevards of magnificent and failed intentions. But the heart of that world, that city, was empty, barren.
Yet the unpeopled city was still crowded. Humans had been a sizable minority during the day, but in the wee hours of the night, it was robots, robots, everywhere. Caliban stood in the shadow of a doorway and watched them all go by. These robots of the night were different from the daytime robots. Almost all of those had clearly been personal servants. In the night, the heavy-duty units came out, hauling the heavy freight, working on construction jobs, doing the dirty work while there were fewer humans around to be disturbed by it.
A gang of huge, gleaming black construction robots trudged down the street, past Caliban, toward a tall ivory-colored tower, half-finished and yet already lovely. But there were already half a dozen equally lovely towers within a few blocks of where Caliban stood, all of them virtually empty. Across the street, another gang of robots was hard at work disassembling another building that seemed scarcely any older or more used.
Caliban had seen many other work crews come out in the last hour or so, likewise doing needless maintenance work: searching for litter that was not there; polishing the gleaming windows; weeding the weedless gardens and lawns of the parks; busily keeping the empty city core shining and perfect. Why were these robots not employed in the emptier, threadbare, worn and dirty districts, where their work could have some meaning? Why did they work here?
The empty city. Caliban considered the words. They seemed to echo in his head. There was something wrong with the very idea of such a place. From his datastore, from the emotions of whoever had loaded the store, came the sure, certain knowledge that cities were not meant to be so. Something was going desperately wrong.
Another piece of data popped up from the datastore, a straight, solid fact, but the ghosts of emotion hung about this one fact more strongly than any other emotion he had absorbed. It was the thing that the person who created his datastore cared about most of all: Every year the total human population went down—and the robot population went up. How could that be? he wondered. How could the humans allow themselves to get into such a predicament? But no answer came up from the datastore. For no reason that he could understand, the question, though it had nothing to do with him, was suddenly of vital importance to him.
Why? he wondered. And why do I wonder why? Caliban had noted that most robots he observed had a distinct lack of curiosity. Few were even much interested in their surroundings. Something else, yet again, that set him apart. When his maker had molded his mind into an oddly shaped blank, had that maker also blessed him and cursed him with an overactive degree of curiosity? Caliban felt certain it was so, but in a way it did not matter. Even if his sense of curiosity had been deliberately enhanced, that did not stop him from wondering all the same.
Why, why, why did the robots blindly, needlessly, build and disassemble, over and over, rather than leaving things as they were? Why create huge buildings when there were none to use them? Madness. All of it madness. The voice of the datastore whispered to him that the city was a reflection of a society warped, twisted, bent out of any shape that could make normal life and growth possible. It was opinion, emotion, propaganda, but still, somehow, it spoke to him.
The world was mad, and his only hope of survival was to blend in, be accepted as one of the inmates of this lunatic asylum, get lost among the endless robots that tended to the city and its inhabitants. The thought was daunting, disturbing.
Yet even perfect mimicry would not protect him. He had learned that much, almost at the cost of his existence. Those Settlers last night had clearly meant to kill him. If he had acted like a normal robot, he had no doubt that they would have killed him. They had expected him to stand placidly by and permit his own destruction. They had even thought it possible that he would willingly destroy himself on the strength of hearing that weak and tortuous argument about how his existence harmed humans. Why had they thought that strained line of reasoning would impel him to commit suicide?
Caliban stepped out from the shadowy doorway and started walking again. There was so much he had to learn if he was to survive. Imitation would not be enough. Not when acting like a standard robot could get him killed. He had to know why they acted as they did.
Why was he here? Why had he been created? Why was he different from other robots? How was he different from them? Why was the nature of his difference kept hidden from him?
How had he gotten into this situation? Once again, he tried to think back to the beginning, to search through the whole recollection of his existence for some clue, some answer.
He had no memory of anything whatsoever before the moment he came on, powered up for the first time, standing over that woman’s unconscious body with his arm raised from his side. Nothing, nothing else before that. How had he come to be in that place, in that situation? Had he somehow gotten to his feet, raised his arm, before he awoke? Or had he been placed in that position for some reason?
Wait a moment. Go back and think that through. He could see no compelling reason to assume that his ability to act could not predate his ability to remember. Suppose he had acted before his memory commenced? Or suppose his memory prior to the moment he thought of as his own awakening had been cleared somehow? Alternately, what if, for some reason, he had been capable of action before his memory started, and his memory had simply not commenced recording until that moment?
If any of those cases were possible, if the start of his memory was not a reliable marker for the start of his existence, then there were no limits to the actions he might have taken before his memory began. He could have been awake, aware, active, for five seconds before that moment—or five years. Probably not that long, however. His body showed no signs of wear, no indication that any parts had ever been replaced or repaired. His on-line maintenance log was quite blank—though it, too, could have been erased. Still, it seemed reasonable to assume that his body was quite new.
But that was a side issue. How had that woman come to be on the floor in a pool of blood? It was at least a reasonable guess that she had been attacked in some way. Had she been dead or alive? He reviewed his visual memories of the moment. The woman had been breathing, but she could easily have expired after he left. Had the woman died, or had she survived?
The thought brought him up short. Why had he not even asked himself such questions before?
Then, like twin blazes of fire, two more questions slashed through his mind:
Had he been the one who attacked her? And, regardless of whether or not he had—was he suspected of the attack?
Caliban stopped walking and looked down at his hands.
He was astonished to realize that his fists were clenched. He opened out his fingers and tried to walk as if he knew where he was going.
THE night before, Alvar Kresh had taken a needle-shower in the hope that it would help him sleep. Tonight he took one in hopes of waking up. He was tempted to watch the recording of Leving’s lecture while sitting up in bed, but he knew just how tired he was, and just how easy it would be for him to doze off if he did that. No, far better to get dressed again in fresh clothes and watch on the televisor screen in the upper parlor.
Kresh settled down in front of the televisor, ordered one of the household robots to adjust the temperature a bit too low for comfort, and told another to bring a pot of hot, strong tea. Sitting in a cold room, with a good strong dosage of caffeine, he ought to be able to stay awake.
“All right, Donald,” he said, “start the recording.”
The televisor came to life, the big screen taking up an entire wall of the room. The recording began with a shot of the Central Auditorium downtown. Kresh had seen many plays broadcast from there, and most times the proceedings were rather sedate, if not sedated, and it looked as if the occasion of Leving’s first lecture had been no exception. The auditorium had been designed to hold about a thousand people and their attendant robots, the robots sitting behind their owners on low jumper seats. It looked to be about half-empty.
“…and so, without further ado,” the theater manager was saying, “allow me to introduce one of our leading scientists. Ladies and gentlemen, I give you Dr. Fredda Leving.” He turned toward her, smiling, leading the applause.
The figure of Fredda Leving stood up and walked toward the lectern, greeted by a rather tentative round of applause. The camera zoomed in closer, and Kresh was startled to be reminded what Leving had looked like before the attack. In the hospital, she had been wan, pale, delicate-looking, her shaved head making her look too thin. The Fredda Leving in this recording looked as if she had a slight touch of stage fright, but she was fit, vigorous-looking, with her dark hair framing her face. All in all, an unfashionably striking young woman.
She reached the lectern and looked out over the audience, her face clearly betraying her nervousness.
She cleared her throat and began. “Thank you, ladies and gentlemen.” She fumbled with her notes for a moment, clearly still somewhat nervous, and then began. “I would like to start my talk this evening with a question,” she said. “One that might seem flippant, one wherein the answer might be utterly obvious to you all. And yet, I would submit, it is one that has gone thousands of years without a proper answer. I do not suggest that I can supply that missing answer myself, now, tonight, but I do think that it is long past time for us to at least pose the question.
“And that question is: What are robots for?”
The view cut away to reaction shots of the people in the auditorium. There was a stirring and a muttering in the audience, a strangled laugh or two. People shifted in their seats and looked at each other with confused expressions.
“As I said, it is a question that few of us would ever stop to ask. At first glance, it is like asking what use the sky is, or what the planet we stand upon is for, or what good it does to breathe air. As with these other things, robots seem to us so much a part of the natural order of things that we cannot truly picture a world that does not contain them. As with these natural things, we—quite incorrectly—tend to assume that the universe simply placed them here for our convenience. But it was not nature who placed robots among us. We did that to ourselves.”
Not for ourselves, Kresh noticed. To ourselves. What the devil had Leving been saying the night of the lecture? He found himself wishing that he had been there.
Fredda Leving’s image kept talking. “On an emotional level, at least, we perceive robots not as tools, not as objects we have made, not even as intelligent beings with which we share the universe—but as something basic, placed here by the hand of nature, something part of us. We cannot imagine a world worth living in without them, just as our friends the Settlers think a world that does include them is no fit place for humans.
“But I digress from my own question. ‘What are robots for?’ As we seek after an answer to that question, we must remember that they are not part of the natural universe. They are an artificial creation, no more and no less than a starship or a coffee cup or a terraforming station. We built these robots—or at least our ancestors built them, and then set robots to work building more robots.
“Robots, then, are tools we have built for our own use. That is at least the start of an answer. But it is by no means the whole answer.
“For robots are the tools that think. In that sense, they are more than our tools—they are our relatives, our descendants.”
Again there was a hubbub in the audience, a stirring, this time of anger and surprise. “Forgive me,” Fredda said. “That is perhaps an unfortunate way to phrase it. But it is, in a very real sense, the truth. Robots are the way they are because we humans made them. They could not exist without us. There are those who believe that we humans could not exist without them. But that statement is so much dangerous nonsense.”
Now there was a full-fledged roar from the back of the hall, where the Ironheads had congregated. “Yes, that does strike a nerve, doesn’t it?” Fredda asked, the veneer of courtesy dropping away from her voice. “ ‘We could not live without them’—it is not a factual statement, but it is an article of faith. We have convinced ourselves that we could not survive without robots, equating the way we live with our lives themselves. We have to look no further than the Settlers to know that humans can live—and live well—without robots.”
A chorus of boos and shouts filled the hall. Fredda raised her hands for quiet, her face stern and firm. At last the crowd settled down a bit. “I do not say that we should live that way. I build robots for a living. I believe in robots. I believe they have not yet reached their full potential. They have shaped our society, a society I believe has many admirable qualities.
“But, my friends, our society is calcified. Fossilized. Rigid. We have gotten to the point where we are certain, absolutely certain, that ours is the only possible way to live. We tell ourselves that we must live precisely as our ancestors did, that our world is perfect just as it is.
“Except that to live is to change. All that lives must change. The end of change is the beginning of death—and our world is dying.” Now there was dead silence in the room. “We all know that, even if we will not admit it. Inferno’s ecology is collapsing, but we refuse to see it, let alone do anything about it. We deny the problem is there.”
Kresh frowned. The ecology collapsing? Yes, there were problems, everyone knew that. But he would not place it in such drastic terms. Or was that part of the denial she was talking about? He shifted uncomfortably in his seat and listened.
“Instead,” Leving’s image went on, “we insist that our robots coddle us, pamper us, while we go about our self-indulgent lives, as the web of life that supports us grows ever weaker. Anytime in the last hundred years, we, the citizens of Inferno, could have taken matters into our own hands, gotten to work, and saved the situation—saved our planet—for ourselves. Except it was so easy to convince ourselves that everything was fine. The robots were taking care of us. How could there be anything to worry about?
“Meantime, the forests died. The oceans’ life-cycle weakened. The control systems broke down. And we, who have been trained by our robots to believe that doing nothing is the highest and finest of all activities, did not lift a finger.
“Things got to the point where we were forced to swallow our pride and call in outsiders to save us. And even that was a near-run thing. We came very close to choosing our pride over our lives. I will admit quite freely that I found calling in the Settlers just as galling as any of you did. But now they are here, and we Spacers, we Infernals, continue to sit back, and grudgingly permit the Settlers to save us, treating them like hired hands, or interlopers, instead of rescuers.
“Our pride is so great, our belief in the power of robot-backed indolence so overpowering, that we still refuse to act for ourselves. Let the Settlers do the work, we tell ourselves. Let the robots get their hands dirty. We shall sit back, true to the principle that labor is for others, believing that work impedes our development toward an ever more ideal society, based on the ennobling principle of applying robotics to every task.
“For robots are our solution to everything. We believe in robots. We have faith in them—firm, unquestioned faith in them. We take it hard, get emotional, when our use of them is questioned. We have seen that demonstrated just moments ago.
“In short, my friends, robotics is our religion, to use a very old word. And yet we Spacers despise the thing we worship. We love robotics and yet hold robots themselves in the lowest of regard. Who among us has not felt contempt toward a robot? Who among us has not seen a robot jump higher, think faster, work longer, do better at a job than any human ever could, and then comforted himself or herself with the sneering, contemptuous—and contemptible—defense that it was ‘only’ a robot. The task, the accomplishment, is diminished when it is the work of a robot.
“An interesting side point is that robots here on Inferno are generally manufactured with remarkably high First Law potential, and with an especially strong potential for the negation clauses of the Second and Third Laws, the clauses that tell a robot it can obey orders and protect itself only if all human beings are safe. To look at it another way, robots here on Inferno place an especially strong emphasis on our existence and an especially weak one on their own.
“This has two results: First, our robots coddle us far more than robots on most other Spacer worlds, so that human initiative is squelched even more here on Inferno. Second, we have a remarkably high rate of robots lost to First Law conflict and resultant brainlock. We could easily adjust our manufacturing procedures to create robots that would feel a far lower, but perfectly adequate, compulsion to protect us. If we did that, we would reduce our own safety little, if at all, but our robots would suffer far less needless damage attempting rescues that are impossible or useless. Yet instead we choose to build robots with excessively high compulsion to protect. We make our robots with First Law potential so high that they brainlock if they see a human in trouble but cannot go to the human’s aid, even if other robots are attempting to save the human.
“If six robots rush in to save one person, and four are needlessly damaged as a result, we don’t care. This is absurd waste. But we don’t care about the loss of robots to needless overreaction. We have so many robots, we do not regard them as particularly valuable. If they destroy themselves needlessly in answer to our whims, so be it.
“In short, we hold our robot servants in contempt. They are expendable, disposable. We send beings of many years’ wisdom and experience, beings of great intelligence and ability, off into grave danger, even to their destruction, for the most trivial of reasons. Robots are sent into burning buildings after favorite trinkets. Robots throw themselves in the face of oncoming traffic to protect a human who has crossed the street carelessly to look at a shop window. A robot is ordered to clear a smudge off an exterior window of a skyscraper in the midst of a hundred-kilometer-per-hour gale. In that last case, even if the robot should be swept off the side of the building, there need be no concern—the robot will use its arms and legs to guide its own fall, making sure it does not strike a human being when it hits, faithful to the First Law even as it plummets toward its doom.
“We have all heard the stories about robots destroyed in this useless effort, or to indulge that pointless impulse. The stories are told, not as if they were disasters, but as if they were funny, as if a robot melted down to scrap or smashed to bits in pursuit of no useful purpose were a joke, instead of a scandalous waste.
“Scarcely less serious are the endless abuses of robots. I have seen robots pressed into service as structural supports, simply ordered to stand there and hold a wall up—not for a minute, not as an emergency remedy while repairs are made—but as a permanent solution. I have seen robots—functional, capable robots—told to stand underwater and hold the anchorline of a sailboat. I know a woman who has one robot whose sole duty is to brush her teeth for her, and hold the brush in between times. A man with a broken water pipe in his basement set a robot to bailing the place out—full-time, nonstop, day in, day out, for six months—before the man finally bothered to have repairs made.
“Think about it. Consider it. Sentient beings used as substitutes for anchors, for toothbrushes, for pipe welds. Does that make sense? Does it seem rational that we create robots with minds capable of calculating hyperspace jumps, and then set them to work as deadweights to keep pleasure boats from floating away?
“These are merely the most glaring examples of robot abuse. I have not even touched on the endless tasks we all allow our robots to do for us, things that we should do for ourselves. But these things, too, are robot abuse, and they are demeaning to ourselves as much as to our mechanical servants.
“I recall a morning, not so long ago, when I stood in front of my closet for twenty minutes, waiting for my robot to dress me. When I finally remembered that I had ordered the robot out on an errand, I still did not dress myself, but waited for the robot to return. It never dawned on me that I might select my own clothes, put them on my own body, close the fasteners myself. It had to be done for me.
“I submit to you that such absurdities as that do more than waste the abilities of robots. They hurt us, do damage to humans. Such behavior teaches us to think that labor—all labor, any labor—is beneath us, that the only respectable, socially acceptable thing to do is sit still and allow our robot-slaves to care for us.
“Yes, I said slaves. I asked a question at the beginning of this talk. I asked ‘What are robots for?’ Well, ladies and gentlemen, that is the answer that our society has come up with. That is what we use them as. Slaves. Slaves. Look into the history books, look into all the ancient texts of times gone by and all the cultures of the past. Slavery has always corrupted the societies in which it has existed, grinding down the slaves, degrading them, humiliating them—but likewise it has always corrupted the slave masters as well, poisoned them, weakened them. Slavery is a trap, one that always catches the society that condones it.
“That is what is happening to us.” Fredda paused for a moment and looked around the auditorium. There was silence, dead silence.
“Let me go back to that day when I waited for my robot-slave to dress me. Thinking about it after the fact, seeing just how ridiculous that moment had been, I resolved to manage for myself the next time.
“And I found that I could not! I did not know how. I did not know where my clothes were. I did not know how the fasteners worked or how the clothes went on. I walked around half a day with a blouse on backwards before realizing my mistake. I was astonished by my ignorance on the subject of caring for myself.
“I started watching myself go through my day, noticing how little I did for myself—how little I was capable of doing.”
Alvar Kresh, watching the recording, began to understand. This was why she no longer kept a personal robot. A strange decision, yes, but it was beginning to make some sort of sense. He watched the recording with rapt attention, all thought of his own exhaustion quite forgotten.
“I was astonished just how incompetent I was,” Fredda Leving’s voice said. “I was amazed how many little tasks I could not perform. I cannot begin to describe the humiliation I felt when I realized that I could not find my way around my own city by myself. I needed a robot to guide me, or I would get hopelessly lost.”
There was a nervous titter or two in the audience, and Fredda nodded thoughtfully. “Yes, it is funny. But it is also very sad. Let me ask you out there who think I am being absurd—suppose all the robots simply stopped right now. Let us ignore the obvious fact that our entire civilization would collapse, because the robots are the ones who run it. Let’s keep it tight, and personal. Think what would happen to you if your robots shut down. What if your driver ceased to function, your personal attendant ground to a halt, your cook mindlocked and could not prepare meals, your valet lost power right now?
“How many of you could find your way home? Very few of you can fly your own cars, I know that—but could you even walk home? Which way is home from here? And if and when you got home, would you remember how to use the manual controls to open the door? How many of you even know your own address?”
Again, silence, at least at first. But then there was a shout from the audience. The camera cut away to show a man standing in front of his seat, a man dressed in one of the more comic-opera variants of the Ironhead uniform. “So what?” he yelled. “I don’t know my address. Big deal! All I need to know is, I’m the human being! I’m the one on top! I got a good life thanks to robots. I don’t want it messed up!”
There was a ragged flurry of cheers and applause, mostly from the back of the house. The view cut back to Fredda as she stepped out from behind her lectern and joined in the applause herself, clapping slowly, loudly, ironically, still going long after everyone else had quit. “Congratulations,” she said. “You are the human being. I am sure you are proud of that, and you should be. But if Simcor Beddle sent you here to disrupt my speech, you go back and tell the leader of the Ironheads that you helped me make my point. What troubles me is that it almost sounds as if you are proud of your ignorance. That strikes me as terribly dangerous, and terribly sad.
“So tell me this. You don’t know where you live. You don’t do much of anything. You don’t know how to do anything. So: What in the Nine Circles of Hell are you good for?” She looked up from the man to the entire audience. “What are we good for? What do we do? What are humans for?
“Look around you. Consider your society. Look at the place of humans in it. We are drones, little else. There is scarcely an aspect of our lives that has not been entrusted to the care of the robots. In entrusting our tasks to them, we surrender our fate to them.
“So what are humans for? That is the question, the real question it all comes down to in the end. And I would submit that our current use of robots has given us a terrifying answer, one that will doom us if we do not act.
“Because right here, right now, we must face the truth, my friends. And the true answer to that question is: not much.”
Fredda took a deep breath, collected her notes, and stepped back from the podium. “Forgive me if I end this lecture on that grim note, but I think it is something we all need to face. In this lecture I have stated the problem I wished to address. In my next lecture, I will offer up my thoughts on the Three Laws of Robotics, and on a solution to the problems we face. I believe I am safe in saying it should be of interest to you all.”
And with that, the recording faded away, and Alvar Kresh was left alone with his own thoughts. She couldn’t be right. She couldn’t.
All right, then. Assume she was wrong. Then what were humans good for?
“Well, Donald, what did you think?” Alvar asked.
“I must confess I found it to be a most disturbing presentation.”
“How so?”
“Well, sir, it makes the clear implication that robots are bad for humans.”
Kresh snorted derisively. “Old, old arguments, all of them. There isn’t a one that I haven’t heard before. She makes it sound like the entire population of Hades, of all Inferno, is made up of indolent incompetents. Well, I for one still know how to find my own way home.”
“That is so, sir, but I fear that you might be in a minority.”
“What? Oh, come on. She made it sound as if everyone were utterly incompetent. I don’t know anyone that helpless.”
“Sir, if I may observe, most of your acquaintances are fellow law enforcement officers, or workers in fields that you as Sheriff often come into contact with.”
“What’s your point?”
“Police work is one of the very few fields of endeavor in which robots can be of only marginal help. A good police officer must be capable of independent thought and action, be willing to cooperate in a group, be ready to deal with all kinds of people, and be capable of working without robots. Your deputies must be rather determined, self-confident individuals, willing to endure a certain amount of physical danger—perhaps even relishing the stimulus of danger. I would suggest that police officers would make for a rather skewed sample of the population. Think for a moment, not of your officers, but of the people they encounter. The people that end up as victims in the police reports. I know that you do not hold those people in the highest regard. How competent and capable are they? How dependent on their robots are they?”
Alvar Kresh opened his mouth as if to protest, but then stopped, frowned, and thought. “I see your point. Now you’ve disturbed me, Donald.”
“My apologies, sir. I meant no—”
“Relax, Donald. You’re sophisticated enough to know you’ve done no harm. You got me to thinking, that’s all.” He nodded at the televisor. “As if she hadn’t done that already.”
“Yes, sir, quite so. But I would suggest, sir, that it is time for bed.”
“It certainly is. Can’t be tired for the Governor, can I?” Alvar stood and yawned. “And what the hell could he want that can’t wait until later in the day?”
Alvar Kresh walked wearily back to his bedroom, very much dreading the morning. Whatever the Governor wanted, it was unlikely to be good news.