“One standard year and forty-two days.”
“What are the specifications for your on-board memory system?”
“A capacity of one hundred standard years non-erasable total recall for all I have seen and heard and learned.”
“Do you enjoy your work?”
“No,” said Kaelor. “Not for the most part.”
An unusual answer for a robot. Generally a robot, when given the chance, would wax lyrical over the joys of whatever task it was performing.
“Why do you not enjoy your work?” Fredda asked.
“Dr. Lentrall is often abrupt and rude. He will often ask for my opinion and then reject it. Furthermore, much of my work in recent days has involved simulations of events that would endanger humans.”
Uh-oh, thought Fredda. Clearly it was a mistake to ask that follow-up question. She would have to reinforce his knowledge of the lack of danger, and then change the subject, fast, before he could pursue that line of thought. Thank Space she had turned down his pseudo-clock-rate. “Simulations involve no actual danger to humans,” she said. “They are imaginary, and have no relation to actual events. Why did you grab Dr. Lentrall and force him under a bench yesterday?”
“I received a hyperwave message that he was in danger. First Law required me to protect him, so I did.”
“And you did it well,” Fredda said. She was trying to establish the point that his First Law imperatives were working well. In a real-life, nonsimulated situation, he had done the proper thing. “What is the status of your various systems, offered in summary form?”
“My positronic brain is functioning within nominal parameters, though near the acceptable limit for First Law-Second Law conflict. All visual and audio sensors and communications systems are functioning at specification. All processing and memory systems are functioning at specification. A Leving Labs model 2312 Robotic Test Meter is jacked into me and running constant baseline diagnostics. All motion and sensation below my neck, along with all hyperwave communication, have been cut off by the test meter, and I am incapable of motion or action other than speech, sight, thought, and motion of my head.”
“Other than the functions currently deactivated by the test meter, deliberate deactivations, and normal maintenance checks, have you always operated at specification?”
“Yes,” said Kaelor. “I remember everything.”
Fredda held back from the impulse to curse out loud, and forced herself to keep her professional demeanor. He had violated her order not to volunteer information, and had volunteered it in regard to the one area they cared about. Only a First Law imperative could have caused him to do such a thing. He knew exactly what they were after, and he was telling them, as best he could under the restrictions she had placed on him, that he had it.
Which meant he was not going to let them have it. They had lost. Fredda decided to abandon her super-cautious approach, and move more quickly toward what they needed.
“Do you remember the various simulations Dr. Lentrall performed, and the data upon which they were based?”
“Yes,” Kaelor said again. “I remember everything.”
A whole series of questions she dared not ask flickered through her mind, along with the answers she dared not hear from Kaelor. Like a chess player who could see checkmate eight moves ahead, she knew how the questions and answers would go, almost word for word.
Q: If you remember everything, you recall all the figures and information you saw in connection with your work with Dr. Lentrall. Why didn’t you act to replace as many of the lost datapoints as possible last night when Dr. Lentrall discovered his files were gone? Great harm would be done to his work and career if all those data were lost for all time.
A: Because doing so would remind Dr. Lentrall that I witnessed all his simulations of the Comet Grieg operation and that I therefore remembered the comet’s positional data. I could not provide that information, as it would make the comet intercept and retargeting possible, endangering many humans. That outweighed the possible harm to one man’s career.
Q: But the comet impact would enhance the planetary environment, benefiting many more humans in the future, and allowing them to live longer and better lives. Why did you not act to do good to those future generations?
A: I did not act for two reasons. First, I was specifically designed with a reduced capacity for judging the Three-Law consequences of hypothetical circumstances. I am incapable of considering the future and hypothetical well-being of human beings decades or centuries from now, most of whom do not yet exist. Second, the second clause of the First Law merely requires me to prevent injury to humans. It does not require me to perform any acts in order to benefit humans, though I can perform such acts if I choose. I am merely compelled to prevent harm to humans. Action compelled by First Law supersedes any impulse toward voluntary action.
Q: But many humans now alive are likely to die young, and die most unpleasantly, if we do not repair the climate. By preventing the comet impact, there is a high probability you are condemning those very real people to premature death. Where is the comet? I order you to tell me its coordinates, mass, and trajectory.
A: I cannot tell you. I must tell you. I cannot tell you—
And so on, unto death.
It would have gone on that way, if it had lasted even that long. Either the massive conflict between First and Second Law compulsions would have burned out his brain, or else Kaelor would have invoked the second clause of First Law. He could not, through inaction, allow harm to humans.
Merely by staying alive, with the unerasable information of where the comet was in his head, he represented a danger to humans. As long as he stayed alive, there was, in theory, a way to get past the confidentiality features of Kaelor’s brain assembly. There was no way Fredda could do it here, now, but in her own lab, with all her equipment, and with perhaps a week’s time, she could probably defeat the safeties and tap into everything he knew.
And Kaelor knew that, or at least he had to assume it was the case. In order to prevent harm to humans, Kaelor would have to will his own brain to disorganize, disassociate, lose its positronic pathing.
He would have to will himself to die.
That line of questioning would kill him, either through Law-Conflict burnout or compelled suicide. He was still perilously close to both deaths as it was. Maybe it was time to take some of the pressure off. She could reduce at least some of the stress produced by Second Law. “I release you from the prohibition against volunteering information and opinions. You may say whatever you wish.”
“I spent all of last night using my hyperwave link to tie into the data network and rebuild as many of Dr. Lentrall’s work files as possible, using my memories of various operations and interfaces with the computers to restore as much as I could while remaining in accordance with the Three Laws. I would estimate that I was able to restore approximately sixty percent of the results-level data, and perhaps twenty percent of the raw data.”
“Thank you,” said Lentrall. “That was most generous of you.”
“It was my duty, Dr. Lentrall. First Law prevented me from abstaining from an action that could prevent harm to a human.”
“Whether or not you had to do it, you did it,” said Lentrall. “Thank you.”
There was a moment’s silence, and Kaelor looked from Lentrall to Fredda and back again. “There is no need for these games,” he said. “I know what you want, and you know thhhat I I I knowww.”
Lentrall and Fredda exchanged a look, and it was plain Lentrall knew as well as she did that it was First Law conflict making it hard for Kaelor to speak.
Kaelor faced a moral conundrum few humans could have dealt with well: how to choose between probable harm and death to an unknown number of persons, and the misery and ruined lives of a planet with a failing climate. And it is my husband who must decide, Fredda told herself, the realization a sharp stab of pain. If we succeed here, I am presenting him with that nightmare choice. She thrust that line of thought to one side. She had to concentrate on Kaelor, and the precious knowledge hidden inside him. Fredda could see hope sliding away as the conflicts piled up inside the tortured robot’s mind. “We know,” she said at last, admitting defeat. “And we understand. We know that you cannot tell us, and we will not ask.” It was pointless to go further. It was inconceivable that Kaelor would be willing or able to tell them, or that he would survive long enough to do so, even