I am in a laboratory of some sort, I am Caliban, I am a robot. The answers came from inside him, but not from his mind. From an on-board datastore, he realized, and that knowledge likewise came from the datastore. So that is where answers come from, he concluded.

He looked down to the floor and saw a body lying on its side there, its head near his feet. It was the crumpled form of a young woman, a pool of blood growing around her head and the upper part of her body. Instantly he recognized the concepts of woman, young, blood, the answers flitting into his awareness almost before he could form the questions. Truly a remarkable device, this on-board datastore.

Who is she? Why does she lie there? What is wrong with her? He waited in vain for the answers to spring forth, but no explanation came to him. The store could not—or would not—help him with those questions. Some answers, it seemed, it would not give. Caliban knelt down, peered at the woman more closely, dipped a finger in the pool of blood. His thermocouple sensors revealed that it was already rapidly cooling, coagulating. The principle of blood clotting snapped into his mind. It should be sticky, he thought, and tested the notion, pressing his forefinger to his thumb and then pulling them apart. Yes, a slight resistance.

But blood, and an injured human. A strange sensation stole over him, as he knew there was some reaction, some intense, deep-rooted response that he should have—some response that was not there at all.

The blood was pooling around Caliban’s feet now. He rose to his full two-meter height again and found that he did not desire to stand in a pool of blood. He wished to leave this place for more pleasant surroundings. He stepped clear of the blood and saw an open doorway at the far end of the room. He had no goal, no purpose, no understanding, no memory. One direction was as good as another. Once he started moving, there was no reason to stop.

Caliban left the laboratory, wholly and utterly unaware that he was leaving a trail of bloody footprints behind. He went through the doorway and kept on going, out of the room, out of the building, out into the city.

Sheriff’s Robot Donald DNL-111 surveyed the blood-splattered floor, grimly aware that, on all the Spacer worlds, only in the city of Hades on the planet of Inferno could a scene of such violence be reduced to a matter of routine.

But Inferno was different, which was of course the problem in the first place.

Here on Inferno it was happening more and more often. One human would attack another at night—it was nearly always night—and flee. A robot—it was nearly always a robot—would come across the crime scene and report it, then suffer a major cognitive dissonance breakdown, unable to cope with the direct, vivid, horrifying evidence of violence against a human being. Then the med-robots would rush in. The Sheriff’s dispatch center would summon Donald, the Sheriff’s personal robot, to the scene. If Donald judged the situation warranted Kresh’s attention, Donald instructed the household robot to waken Sheriff Alvar Kresh and suggest that he join Donald at the scene.

Tonight the dismal ritual would be played out in full. This attack, beyond question, required that the Sheriff investigate personally. The victim, after all, was Fredda Leving. Kresh would have to be summoned.

And so some other, subordinate robot would waken Kresh, dress him, and send him on his way here. That was unfortunate, as Kresh seemed to feel Donald was the only one who could do it properly. And when Alvar Kresh woke in a bad mood, he often flew his own aircar in order to work off his tension. Donald did not like the idea of his master flying himself in any circumstances. But the thought of Alvar Kresh in an evil mood, half-asleep, flying at night, was especially unpleasant.

But there was nothing Donald could do about all that, and a great deal to be done here. Donald was a short, almost rotund robot, painted a metallic shade of the Sheriff’s Department’s sky-blue and carefully designed to be an inconspicuous presence, the sort of robot that could not possibly disturb or upset or intimidate anyone. People responded better to an inquisitive police robot if it was not obtrusive. Donald’s head and body were rounded, the sides and planes of his form flowing into each other in smooth curves. His arms and legs were short, and no effort had been made to put anything more than the merest sketch of a human face on the front of his head.

He had two blue-glowing eyes, and a speaker grille for a mouth, but otherwise his head was utterly featureless, expressionless.

Which was perhaps just as well, for had his face been mobile enough to do so, he would have been hard-pressed to formulate an expression appropriate to his reaction now. Donald was a police robot, relatively hardened to the idea of someone harming a human, but even he was having a great deal of trouble dealing with this attack. He had not seen one this bad in a while. And he had never before been in the position of knowing the victim. And it was, after all, Fredda Leving herself who had built Donald, who had named him. Donald found that personal acquaintance with the victim only made his First Law tensions worse.

Fredda Leving was crumpled on the floor, her head in a pool of her own blood, two trails of bloody footprints leading from the scene in different directions, out two of the four doors to the room. There were no footprints leading in.

“Sir—sir—sir?” The robotic voice was raspy and rather crudely mechanical, spoken aloud rather than via hyperwave. Donald turned and looked at the speaker. It was the maintenance robot that had hyperwaved in the report.

“Yes, what is it?”

“Will she—will she—will she be all—all right right?” Donald looked down at the small tan robot. It was a DAA-BOR unit, not more than a meter and a half high. The word-stutter in its speech told Donald what he already knew. Before very much longer, this little robot was likely to be good for little more than the scrap heap, a victim of First Law dissonance.

Theory had it that a robot on the scene should be able to provide first aid, with the medical dispatch center ready to transmit any specialized medical knowledge that might be needed. But a serious head injury, with all the potential for brain damage, made that impossible. Even leaving aside the question of having surgical equipment in hand, this maintenance robot did not have the brain capacity, the fine motor skills, or the visual acuity needed to diagnose a head wound. The maintenance robot must have been caught in a classic First Law trap, knowing that Fredda Leving was badly injured, but knowing that any inexpert attempt to aid her could well injure her further. Caught between the injunction to do no harm and the command not to allow harm through inaction, the DAA-BOR’s positronic brain must have been severely damaged as it oscillated back and forth between the demands for action and inaction.

“I believe that the medical robots have the situation well in hand, Daabor 5132,” Donald replied. Perhaps some encouraging words from an authority figure like a high-end police robot might do some good, help stabilize the cognitive dissonance that was clearly disabling this robot. “I am certain that your prompt call for assistance helped to save her life. If you had not acted as you did, the medical team might well not have arrived in time.”

“Thank—thank—thank you, sir. That is good to know.”

“One thing puzzles me, however. Tell me, friend—where are all the other robots? Why are you the only one here? Where are the staff robots, and Madame Leving’s personal robot?”

“Ordered—ordered away,” the little robot answered, still struggling to get its speech under greater control. “Others ordered to leave area earlier in evening. They are in—are in the other wing of the laboratory. And Madame Leving does not bring a personal robot with her to work.”

Donald looked at the other robot in astonishment. Both statements were remarkable. That a leading roboticist did not keep a personal robot was incredible. No Spacer would venture out of the house without a personal robot in attendance. A citizen of Inferno would be far more likely to venture out stark naked than without a robot—and Inferno had a strong tradition of modesty, even among Spacer worlds.

But that was as nothing compared to the idea of the staff robots being ordered to leave. How could that be? And who ordered them to go? The assailant? It seemed an obvious conclusion. For the most fleeting of seconds, Donald hesitated. It was dangerous for this robot to answer such questions, given its fragile state of mind and diminished capacity. The additional conflicts between First and Second Laws could easily do irreparable harm. But no, it was necessary to ask the questions now. Daabor 5132 was likely to suffer a complete cognitive breakdown at any moment in any event, and this might be the only chance to ask. It would have been far better for a human, for Sheriff Kresh, to do the asking, but this robot could fail at any moment. Donald resolved to take the chance. “Who gave this order, friend? And how did you come to disobey that order?”
