said, his voice deceptively mild.
“Yes, they are. Gubber designed them that way and was justifiably proud of what he had done. But no one would listen to him—until he came to Fredda.”
“All right, that’s fine. But then we get into a problem area. I am not very happy to hear about this New Law experiment, to say the least, but it appears to have legal sanction from the Governor, and I don’t see that there is much I can do about it. But, as I understand it, these gravitonic brains have the New Laws as part of their integral makeup, just as the positronic brain’s basic structure must of necessity include the Three Laws. So how did you manage to erase those laws from Caliban’s brain?”
“They were never there in the first place,” Terach said. “There
“The Laws interconnect all the aspects of the brain so thoroughly that any attempt to modify one part of a positronic brain would affect every other part of it in complex and chaotic ways. Imagine that rearranging the furniture in your living room could cause the roof to catch fire, or the paint on the basement walls to change color, and that putting out the fire or repainting could cause the doors to fall off and the furniture to reset to its original configuration. The interior architecture of the positronic brain is just about that interconnected. In any sort of deep-core programming or redesign, anything beyond the most trivial sort of potential adjustment is hopelessly complex. By leaving the gravitonic brain with a clean structure, by deliberately
Jomaine looked up and saw the anger and disgust on Alvar Kresh’s face. Clearly the very idea of tampering with the Three Laws was the depths of perversion so far as he was concerned. “All right,” the Sheriff said, trying to keep his voice even. “But if there are no Laws built into the gravitonic brains, how do these damned New Laws get in there? Do you write them down on a piece of paper and hope that the robot thinks to read them over before going out to attack a few people?”
“No.” Jomaine swallowed hard. “No, no, sir. There is nothing casual or superficial about the way a Law set—either Law set—is embedded into a gravitonic brain. The lawset is embedded centrally, at key choke points of the brain’s topology, if you will. It is embedded not just once, but many times, with elaborate redundancy, at each of several hundred sites. The topology is rather complex, but suffice it to say that no cognitive or action-inductive processing can go on in a gravitonic brain without passing through a half dozen of these Law-support localities. The difference is that in a modern positronic brain, the Laws are written millions, even billions, of times, across the pseudocortex, just as there are billions of copies of your DNA written, one copy in each cell of your brain. But your brain can function fairly well if even a large number of cells are damaged, and your body will not break down if a few cells fail to copy their DNA properly.
“In a positronic brain, the concept of redundancy is taken to an extreme. All of the copies must agree at all times, and the diagnostic systems run checks constantly. If a few, or even just one, of the billions of redundant copies of the embedded Three Laws produce results that differ from the majority state, that can force a partial, perhaps even a complete, shutdown.” Jomaine could see in Kresh’s face that he was losing him.
“Forgive me,” Jomaine said. “I did not mean to lecture at you. But it is the existence of these billions of copies of the Laws that is so crippling to positronic brain development. An experimental brain cannot really
“I see the difficulty,” Donald said. “I must confess that I find the concept of a robot with your modified Three Laws rather distressing. But even so, I can see why your gravitonic brains do not have this inflexibility problem, because the Laws are not so widely distributed. But isn’t it riskier to run with fewer backups and copies?”
“Yes, it is. But the degree of risk involved is microscopic. Statistically speaking, your brain, Donald, is not likely to have a major Three Laws programming failure for a quadrillion years. A gravitonic brain with only a few hundred levels of redundancy is likely to have a Law-level programming failure sooner than that. Probably it can’t go more than a billion or two years between failures.
“Of course, either brain type will wear out in a few hundred years, or perhaps a few thousand at the outside, with special maintenance. Yes, the positronic brain is millions of times less likely to fail. But even if the chance of being sucked into a black hole is millions of times lower than the chance of being struck by a meteor, both are so unlikely that they might as well be impossible for all the difference it makes in our everyday lives. There is no increase in the
“That is a comforting argument, Dr. Terach, but I cannot agree that the danger levels can be treated as equivalent. If you were to view the question in terms of a probability ballistics analysis—”
“All right, Donald,” Kresh interrupted. “We can take it as read that nothing could be as safe as a positronic brain robot. But let’s forget about theory here, Terach. You’ve told me how the New Laws or Three Laws can be embedded into a gravitonic brain. What about Caliban? What about your splendid No Law rogue robot? Did you just leave the embedding step out of the manufacturing process on his brain?”
“No, no. Nothing that simple. There are matrices of paths meant to contain the Laws, which stand astride all the volitional areas of the gravitonic brain. In effect, they make the connection between the brain’s subtopologic structures. If those matrices are left blank, the connections aren’t complete and the robot is incapable of action. We
“What, Doctor, was the nature of the experiment?” Donald asked.
“To find out what laws a robot would choose for itself. Fredda believed—we believed—that a robot given no other Law-level instruction than to seek after a correct system of living would end up reinventing her New Laws. Instead of laws, she—we—embedded in Caliban’s matrices the desire, the need, for such laws. We gave him a very detailed, but carefully edited, on-board datastore that would serve as a source of information and experience to help guide his actions. He was to be run through a series of laboratory situations and simulations that would force him to make choices. The results of those choices would gradually embed themselves in the Law matrices, and thus write themselves in as the product of his own action.”
“Were you not at all concerned at the prospect of having a lawless robot in the labs?” Donald asked.
Jomaine nodded, conceding the point. “We knew there was a certain degree of risk to what we were doing. We were very careful about designing the matrices, about the whole process. We even built a prototype before Caliban, a sessile testbed unit, and gave it to Gubber to test in a double-blind setup.”
“Double-blind?” Kresh asked.
“Gubber did not know about the Caliban project. No one did, besides Fredda and myself. All Gubber knew was that we wanted him to display a series of situation simulations—essentially holographic versions of the same situations we wanted Caliban to confront—to the sessile free-matrix testbed unit, alongside a normally programmed Three Law sessile testbed. We would have preferred using a New Law robot, of course, because those were the Laws we wanted Caliban to come up with on his own. Unfortunately we hadn’t received any sort of approval for lab tests of New Law robots at that point, so that was no go.
“But the main test was to see if an un-Lawed brain could absorb and lock down a Law set. Gubber did not know which was which, or even that the two were supposed to be different. Afterwards he performed a standard battery of tests on the two units and found that the results were essentially identical. The sessile No Law robot had absorbed and integrated the Three Laws, just as predicted.”
“What happened to the testbed units?” Donald asked.
“The No Law, free-matrix unit was destroyed when the test was over. I suppose the Three Law unit was converted into a full robot and put to use somehow.”
“What goes into converting a sessile unit?”
“Oh, that is quite simple. A sessile is basically a fully assembled robot, except that the legs are left off the torso while it is hooked to the test stand and the monitor instruments are installed. Just plug the legs back in and off it goes.
“At any rate, Fredda intended Caliban as a final grand demonstration that a rational robot would select her