“The Three Laws are going to drive me mad,” Welton snapped. “I know the Three Laws as well as you do, and you need not recite them again like some bloody holy catechism. I swear, Kresh, you Spacers might as well face facts and admit that worship of those dismal Laws is your state religion. The answer to all problems, the end of all quests, can be found in the infinite good of the Three Laws. I say that if we just assume that the Three Laws make a robot attack on Leving impossible, I think we are missing a key point.”
“And what might that be, Lady Welton?” Donald asked mildly. It passed idly through Kresh’s mind that it was well that Donald was around, if only to lubricate the wheels of conversation. Welton had obviously paused for the sole purpose of eliciting the question Donald had asked, but Kresh was hell-damned if he would have given her the satisfaction of asking it himself.
“A very simple point,” Tonya Welton replied. “With all due respect, Donald, how do we know the robot in question had the Three Laws at all? Suppose the attacker simply used a robot built without them?
“One other point. This speechblock put on the staff robots, preventing them from saying who ordered them to go to the far wing of the labs that night. It seems to me that a mechanical device, an override circuit, would set a far more absolute block against speech concerning certain subjects than an intricate series of orders given to each and every robot. It would be easier to set up as well. And before you object that such a speechblock circuit would weaken the robot’s ability to obey the damned Three Laws, we are assuming that the attacker was not too fastidious about such things. Donald—how large a piece of microcircuitry would that take?”
“It could be made small enough to be invisible to the human eye, and could be wired in anywhere in the robot’s sensory system.”
“I’ll bet your people never even thought to look for a device like that.”
There was an uncomfortable silence before Tonya continued. “Even if you do insist on that,” she said at last, “there are documented cases where Three Law robots did kill human beings.”
Donald’s head snapped back a bit, and his eyes grew dim for a moment. Tonya looked toward him with some concern. “Donald—are you in difficulty?”
“No, I beg your pardon. I am aware of—such cases—but I am afraid that the abrupt mention of them was most disturbing. The mere contemplation of such things is most unpleasant, and caused a slight flux in my motor function. However, I am recovered now, and I believe you can pursue your point without concern for me. I am now braced for it. Please continue.”
Tonya hesitated for a moment, until Kresh felt he had to speak. “It’s all right,” he said. “Donald is a police robot, programmed for special resilience where the contemplation of harm to humans is concerned. Go on.”
Tonya nodded, a bit uncertainly. “It was about a standard century ago, and there was a great deal of effort to hush it up, but there was a series of incidents on Solaria. Robots, all with perfectly functional Three Law positronic brains, killed humans, simply because they were programmed with a defective definition of what a human being was. Nor is the myth of robotic infallibility completely accurate. There have doubtless been other cases we don’t know about, because the cover-ups were successful. Robots can malfunction, can make mistakes.
“It is foolish to flatly assume that a robot capable of harming a human could not be built, or to believe that a robot with Three Laws could not inadvertently harm a human under any circumstances. For my part, I see the Spacer faith in the perfection and infallibility of robots as a folk myth, an article of faith, and one that is contradicted by the facts.”
Alvar Kresh was about to open his mouth and protest, but he did not get the chance. Donald spoke up first.
“You may well be correct, Lady Tonya,” the robot said, “but I would submit that the myth is a needful one.”
“Needful in what way?” Tonya Welton demanded.
“Spacer society is predicated, almost completely, on the use of robots. There is almost no activity on Inferno, or on the other Spacer worlds, that does not rely in some way upon them. Spacers, denied robots, would be unable to survive.”
“Which is precisely the objection we Settlers have to robots,” Welton said.
“As is well known, and as is widely regarded as a specious argument by Spacers,” Donald said. “Deny Settlers computers, or hyperdrive, or any other vital machine knit into the fabric of their society, and Settler culture could not survive. Human beings can be defined as the animal that needs tools. Other species of old Earth used and made tools, but only humans need them to survive. Deny all tools to a human, and you sentence that human to all but certain death. But I digress from the main point.” Donald turned to look at Alvar and then turned back toward Welton.
“Spacer society,” Donald went on, “relies on robots, trusts robots, believes in robots. Spacers could not function if they had no faith in robots. For even if we are merely machines, merely tools, we are enormously powerful ones. If we were perceived as dangerous”—and Donald’s voice quavered as he even suggested the idea—“if we were so perceived, we would be worse than useless. We would be mistrusted. And who but a lunatic would have faith in a powerful tool that could not be trusted? Thus, Spacers need their faith that robots are utterly reliable.”
“I’ve thought about that,” Welton admitted. “I’ve observed your culture, and thought about it. Settlers and Spacers may be rivals in some abstruse, long-term struggle none of us shall ever live to see the results of—but we are also all human beings, and we can learn from each other.
“Of course we came here hoping to convince at least some of you to do without robots. There is no point in pretending otherwise. I have come to see that we are not going to convert any of you. We Settlers could no more wean you away from robots than we could convince you to give up breathing. And I have concluded it would be wrong of us to try.”
“I beg your pardon?” Kresh said.
Tonya turned to Donald, stared into his expressionless glowing blue eyes. She reached out and touched his rounded blue head. “I, personally, have concluded that we cannot change the Spacer need for robots. To do it would destroy you. To attempt it is hopeless. Yet I am more certain than ever that your culture must survive.”
“Why would you care if we survive?” Kresh asked. “And why should I believe you do?”
Welton turned toward Kresh and raised her eyebrow. “We are here trying to pull your climate back from the edge of collapse. I have spent the last year in this sun-baked city of yours rather than back home. That should lend some credence to my claims of sincerity,” she said with a hint of amusement. “As to why we should care about your culture—would it not strike you as the height of arrogance for us to assume that ours is the only culture worth preserving?”
Kresh grunted noncommittally. “That’s as may be,” he said. “But I am no philosopher, and I believe we have covered all the ground we are going to regarding the Fredda Leving case. Perhaps I can send Donald around sometime and the two of you could discuss the whichness of why together.”
Tonya Welton either missed his sarcasm, which seemed unlikely, or chose to ignore it. She smiled and turned back to Donald. “If you’d ever like to come by,” she said, addressing the robot directly, “I’d be delighted.”
“I look forward to the opportunity, Lady,” Donald said.
Kresh clenched his teeth, not quite sure which of the three of them—Donald, Welton, or he himself—had most succeeded in infuriating Alvar Kresh.