out as well. Except this one had not been killed with a single neat shot through the chest. It had been scorched on the left arm, had its head half-melted off, and then taken a final burn hole through the chest. Three shots at least, from increasingly close range. It looked as if this robot had been on the move, trying to react, and had nearly reached its attacker before it finally went down. Kresh felt more certain than ever that there was something suspicious about the ease with which the robots in the bedroom had died. He went farther down the hall and saw two more SPR robots, both shot through the head and the chest. The one in front of the entrance to Grieg’s office had been shot the same way.

He stepped back into the bedroom where Donald was waiting for him. “Donald,” he said, “who manufactured these security robots?”

“The models used here are manufactured by Rholand Scientific,” Donald said.

“Good,” Kresh said. “Then Fredda Leving can examine them without any bias. Hand me the voicephone and connect me.”

“Sir, in terms of security, may I remind you that Fredda Leving was present last night and may well have had the opportunity to tamper with the robots—”

“In terms of security we will be utterly paralyzed if we are too careful. Fredda Leving was not part of this, I can tell you that straight out.”

“I agree that the balance of probabilities weighs heavily against suspecting her,” Donald said. “However, these robots have clearly been tampered with, and she was perhaps the only person present who had the expertise for that job. My First Law potential does extend to preventing you from doing professional harm to yourself, and to the potential harm to others if an investigation this serious and dangerous were suborned. I must therefore point out there is no logical basis upon which she can be excluded absolutely.”

Kresh took a deep breath and forced himself not to explode. Handling robots could be a damned nuisance, and losing one’s patience only made it doubly hard. Of course the same thing was true of people: You were forced to deal with unreasonable demands by being excessively reasonable. “Donald,” he said in a calm, slow voice. “I agree with you that there is no logical basis for excluding Fredda Leving as a suspect. However, I can assure you that there are reasons, outside of logic, that make me utterly certain she had nothing to do with this.”

“Sir, you have said yourself, many times, that any human being is capable of murder.”

“But I have also said that no one human is capable of every murder. Fredda Leving might kill to defend herself, or in a fit of passion, but she is incapable of involving herself in this level of brutality. Nor is she much good as a conspirator, and this was clearly a conspiracy. Fredda Leving was not capable of this killing, and she would have no motive for it. Indeed, I cannot think of anyone with a better motive for keeping the Governor alive. Listen in and monitor her voice-stress if you like, but give me the phone and make the connection. That is a direct and absolute order.”

Donald hesitated a full half second before responding. Kresh thought he could almost see the First and Second Law potentials battling it out with each other. “Yes, sir,” he said at last, and handed over the phone.

It was a sign of just how rattled Donald was that he would kick up such a fuss over such a minor point. The sight of the Governor’s corpse had upset man and robot alike. Both of them knew that this was not merely a dead man—it was, in all probability, a whole planet suddenly thrown into peril.

With a beep and a click-tone the phone line connected. “Um—um—hello?”

Kresh recognized Fredda’s voice, sleepy and a bit muddled. “Dr. Leving, this is Sheriff Kresh. I’m afraid I must ask you to return to the Residence immediately, and to bring whatever technical equipment you have with you. I need you to examine some, ah—damaged robots.” It was a clumsy way to put it, but Kresh couldn’t think of anything else he could say on an unsecured line.

“What?” Fredda asked. “I’m sorry, what did you say?”

“Damaged robots,” Kresh repeated. “I need you to perform a fast, discreet examination. It is a matter of some urgency.”

“Well, all right, I suppose, if you say it’s urgent. It will take me a while to get to the Limbo Depot robotics lab and collect some examination gear. I didn’t bring anything with me. I’ll get there as fast as I can.”

“Thank you, Doctor.” Kresh handed the handset back to Donald. “Well?” he asked.

“Sir, I withdraw my objections. You were indeed correct. My voice-stress monitoring indicated no undue reaction to a call from you from the Residence at this hour. Either she has no idea whatsoever of what has happened, or she is a superb actress—an accomplishment I have never known Dr. Leving to possess.”

“Once in a while, Donald,” Kresh said, “you might try taking my word on questions of human behavior.”

“Sir, with all due respect, I have found no topic of importance wherein questions so utterly outnumber the answers.” Kresh gave the robot a good hard look. Had Donald just made a joke?

Prospero, Fredda told herself as she hurried to get ready. It had to be something to do with Prospero. Why else would Kresh be there at this hour, and calling her in? Something must have gone wrong with Prospero. Fredda Leving had hand-built the New Law robot, and programmed his gravitonic brain herself. She remembered how much of a pleasure it had been to work on the empty canvas of a gravitonic unit, with the chance to make bold strokes, work out whole new solutions, rather than being strait-jacketed by the limitations and conventions and excessive safety features of the positronic brain.

Ever since the long-forgotten day when true robots had first been invented, every robot ever built had been given a positronic brain. All the endless millions and billions of robots made in all those thousands of years had relied upon the same basic technology. Nothing else would ever do. The positronic brain quite literally defined the robot. No one would consider any mechanical being to be a robot unless it had a positronic brain—and, contrariwise, anything that contained a positronic brain was considered to be a robot. The two were seen as inseparable. Robots were trusted because they had positronic brains, and positronic brains were trusted because they went into robots. Trust in robots and in positronic brains was an article of faith.

The Three Laws were at the base of that faith. Positronic brains—and thus robots built with such brains—had the Three Laws built into them. More than built in: They had the Laws woven into them. Microcopies of the Laws were everywhere inside a positronic brain, strewn across every pathway, so that every action, every thought, every external event or internal calculation moved down pathways shaped and built by the Laws.

Every design formula for the positronic brain, every testing system, every manufacturing process, was built with the Three Laws in mind. In short, the positronic brain was inseparable from the Three Laws—and therein lay the problem.

Fredda Leving had once calculated that thirty percent of the volume of the average positronic brain was given over to pathing linked to the Three Laws, with roughly a hundred million microcopies of the Laws embedded in the structure of the average positronic brain, before any programming at all was done. As roughly thirty percent of positronic programming was also given over to the Three Laws, the case could be made that every one of those hundred million microcopies was completely superfluous. Fredda’s rough estimate was that fifty percent of the average robot’s nonconscious and preconscious autonomous processing dealt with the Laws and their application.

The needless, excessive, and redundant Three-Law processing resulted in a positronic brain that was hopelessly cluttered with nonproductive processing and markedly reduced in capacity. It was, as Fredda liked to put it, like a woman forced to interrupt her thoughts on the matter at hand a thousand times a second in order to see if the room were on fire. The excessive caution did not enhance safety, but it did produce drastically reduced efficiency.

But everything in the positronic brain was tied to the Three Laws. Remove or disable even one of those hundred million microcopies, and the brain would react. Disable more than a handful, and the brain would fail altogether. Try to produce positronic programming that did not include endless redundant checks for First, Second, and Third Law adherence, and the hardwired, built-in copies of the Three Laws would cause the positronic brain to refuse the programming and shut down.

Unless you threw out millennia of development work and started from scratch with a lump of sponge palladium and a hand calculator, there was no way to step clear of the ancient technology and produce a more efficient robot brain.

Then Gubber Anshaw invented the gravitonic brain. It was light-years ahead of the positronic in processing speed and capacity. Better still, it did not have the Three Laws burned into its every molecule, cluttering things up. The Three Laws could be programmed into the gravitonic brain, as deeply as you liked, but with only a few hundred copies placed in the key processing nodes. In theory, it was more liable to failure than the millions of copies in a
