the plastic storage boxes.

Susan had nearly finished her sandwich, her legs flopped over the arm of her chair, when Nate finally arrived, his tread light and his footsteps nearly silent on the tiled floor. When his gaze fell on her, he stopped, and a welcoming grin split his face. “Dr. Susan Calvin.”

Susan sat up properly in her chair. “Robot N8-C.”

“Call me Nate.”

“Only if you call me Susan.”

“Deal.”

For the second time in two days, Susan studied the robot. He still looked like nothing other than a tall, male human. He might have gears inside, but they did not stutter and whir. If anything, he seemed more graceful, more easily gliding than most humans. “Nate, can you sit for a little while? Do you have some time to talk?”

“I do.” Nate chose the chair catty-corner to Susan’s.

It amazed her how human that action seemed. Most people would have selected the exact same spot, comfortably close for conversation but not violating any personal space. Nothing about him suggested mechanization. Had he not told her, had her father not confirmed it, she would never have known his true nature. “Would you answer a hypothetical question for me?”

Nate spread his hands and nodded, clearly trying to calculate a purpose that had not yet become obvious. “If you wish.”

Susan leaned forward. “Let’s say a fire broke out in a chemical factory with one man trapped inside. Based on the last-known location of the man, and the composition of the fire, he is certainly dead. You also know exposure to the particular heated chemicals involved would destroy your circuitry. You’re told to go in and rescue the man. What do you do?”

Nate laughed. “Someone just learned about the Three Laws of Robotics.”

Caught, Susan could only join the laughter. “Indeed. So, what do you do?”

“Hypothetically.”

“Of course.”

Nate sat back with a sigh of consideration. “It would greatly depend on the specifics of the situation. The Laws have a balance that can actually push Number Two ahead of Number One or Number Three ahead of Number Two in certain situations. It’s not as black and white as the wording might, at first, seem.”

Susan continued to smile. She had been right.

Nate went on. “If I knew for a fact the man inside was alone and dead, Law Number One no longer takes priority. The issue of a human coming to harm from my actions or inactivity becomes moot.”

Susan nodded.

“Law Number Two commands me to obey all orders given by human beings. In your scenario, I’ve been ordered to rescue the man, presumably by a human being. If I know the man is dead, then the command becomes nonsensical; and, therefore, I am no longer obligated to follow it. In that case, Law Number Three comes into effect, and I must protect my own existence. So, assuming all the facts you and I presented, I would not enter the burning chemical factory.”

Susan had surmised as much when she had discussed it with her father.

“However,” Nate added, “if I had any reason to believe the man inside might still be alive, or another human being might be in danger, Law Number One would override all the others. With or without the command, I would do whatever I could to rescue those humans, even if it led to my own destruction.”

Susan liked that she could predict Nate’s actions, and she wished humans were that easy to read.

“Now let me add something to your scenario that might surprise you.”

Susan became all ears. This, she had not anticipated. “All right.”

“Let’s say I heard meowing coming from that burning factory and saw a girl crying and calling for her Fluffy. Then, I would also go inside.”

Susan paused in uncertainty. “To save a cat?”

“Yes.”

Susan tried to guess the reason. “Because . . . if the cat survived . . . the man might . . . also —”

“No.” Nate did not allow her to finish. “We’re still assuming the man is definitely dead.”

“The cat . . . ,” Susan started, then stopped. “The cat is not a human being. Law Number One says nothing about animals.”

“True.” Nate met Susan’s gaze directly. His brown eyes looked placidly into hers, so very real, so human. “But the girl is. Losing her cat would harm her emotionally. And so, by Law Number One, I’m driven to save it at risk to my own existence.”

Excitement thrilled through Susan, and she could do nothing more than stare. Instinctively, she had known the Three Laws of Robotics would not prove as solid and obvious as they originally seemed. However, she had not expected to discover such critical and expressive thinking from a robot. Her father had not given this positronic brain concept the credit it deserved. Clearly, robots did not just think and learn. Nate had applied logic to circumstances to account, not only for facts in evidence, but for complex human emotions. Susan knew more than a few living, breathing people with a lesser grasp of empathy than Nate. “Wow.” She could think of nothing else to say.

Nate straightened his dress khakis, then ran a hand through his short hair. He had gestures and mannerisms that made him seem all the more alive. “Are we finished, Susan?”

Susan met Nate’s gaze again, the remainder of her lunch forgotten. “Do you have a few more minutes?”

“A few.” Nate remained in his chair. “Do you have more hypotheticals?”

Susan sighed and straightened her own clothing. “Actually, this is a real situation. I have a patient who had brain surgery performed by a man they call ‘one of the greatest neurosurgeons in the world.’”

“Dr. Sudhish Mandar,” Nate filled in.

Startled again, Susan managed only a “yes.” Then, “Do you know him?”

“If you asked him, he’d say the greatest, not just one of a group at the top.” Nate planted both arms on the armrests of his chair, gripping the ends in his hands. “You should know that, although I have read all those books” — he gestured vaguely toward the shelves — “I am not considered a medical expert.”

Susan’s attention followed Nate’s motion. If he had read every book on those shelves, and retained even a quarter of it, he had as much knowledge as most physicians. “It’s not a medical question. It’s . . . moral.”

“Moral? And you’re asking me? Morality is a human construct.”

Susan disagreed. The Three Laws inflicted an honor on the robot that humans might not appreciate, even if they recognized it. “Nevertheless, I’d like your opinion.”

“Okay.” Nate settled back into his chair.

“The neurosurgeon has declared the surgery a success and no longer looks after the patient. For months, she was believed to have dementia” — Susan looked up to see if she needed to define the word; Nate waved her on — “due to presurgical swelling of the brain because of the initial problem or damage to the brain tissue during the surgery.”

Nate continued to nod his understanding.

Susan downplayed her role in the upcoming information. “Yesterday, we discovered this young girl’s dementia is actually due to heart failure. The cardiologists put her on medicine to help control it, and her mental status is clearing.”

Nate smiled. “That’s good news. Right?”

“Well, yes.” Susan gave a less than enthusiastic response. “Except they had to put her on very high doses of serious medications, and it’s only helping to keep the problem in check. It’s not a cure. My concern is the condition will soon overcome the medications; and, not only will the dementia return, but the heart failure will worsen.”

“And she’ll die?” Nate guessed.

“Eventually.” Susan knew a young, strong body had ways of compensating for congestive heart failure. Starling Woodruff’s already had, or it would have been discovered sooner. “She can probably go home, but she can’t live anything approaching a normal life until we fix the underlying problem causing the heart failure.”

“Which is?” Nate prodded.

Susan shook her head. “We don’t actually know. Heart failure occurs when the heart can no longer meet the metabolic demands of the body at normal venous pressure. A lot of things can cause it.” A huge list ran through
