The last time I had seen Sunita, she had walked me to my quarters aboard Symposium. It was shortly before midnight on a night like any other; we had been in the laboratory late, running Vanguard through a series of tests to simulate what it would do when it lost radio contact with the base on Titan. It had recently developed the strange habit of reaching out to the comms systems of other departments aboard Symposium and persuading their AIs to do its talking for it, and we had to figure out a way to structure our tests so it couldn’t do that. There wouldn’t be anybody else to contact on Titan, after all; it had to learn to rely only on itself and what we brought with us. We had continued our conversation all the way from the lab to our personal berths, speculating on what the overnight results would show, planning a new test, exchanging rapid-fire ideas and adjustments so easily, so comfortably, as we had done for years. I had not known it would be our final conversation. I had not known to say goodbye. I had said, “I’m betting it figures it out,” and Sunita had smiled at me, a beautiful wide smile that lit her entire face, and she had said, “You know our naughty little child best. Sleep well.”
Three hours later I woke to screaming alarms and fire and pain.
I never saw Sunita again. She was gone. Vanguard was gone. Our mission was gone. And I was here, in this hateful box of metal on an ugly rock in the belt, with this cold, smirking woman before me, and I did not know what to say.
“I have so many things I want to ask you about it,” Ping said. “Do you mind? It’s the evolutionary aspect that interests me the most. I can’t say I keep up with the literature as much as I should, but how did you avoid Baldwin’s Law? You must have had precautions in place, to achieve project approval within the disarmament treaty. I know how very particular they are about avoiding the mistakes of the past.”
I glanced at Adisa, but his expression remained blandly unconcerned, as though Ping wasn’t using polite euphemisms to talk about the attempted genocide of his people and the destruction of his home. Mistakes of the past was what people said when they wanted to talk about the horrors of the Martian war without acknowledging that those horrors had been entirely intentional. Vice Admiral Dane Baldwin had been responsible for developing and deploying the United Earth Navy’s autonomous weapons on Mars: the threshers that razed the agricultural domes and kick-started a famine, the dusters that destroyed the solar panels and cast entire cities into a deadly winter, the slugs that poisoned the water supply and rendered nearly half of the survivors sterile.
Unintended consequences of technological advancement, Baldwin had said at the tribunal following the war. Not his fault. The machines made their own decisions. The machines were responsible.
I put my PD down and sat forward in my chair. I didn’t know what Mary Ping wanted from me, only that she meant to provoke, but I had been provoked by better than her a hundred times before.
I said, “There’s no such thing as Baldwin’s Law. It has no basis in theory or practice. Artificial intelligences are not inherently destructive. There have been fully evolutionary AIs since Zhao’s first Taijin mind, and most of them don’t turn into killing machines. Baldwin didn’t want to be held accountable for what he had created, so he blamed the machines for his own choices. He was never trying to do anything but create weapons of war.”
“But his excuse convinced the tribunal,” Ping said. “Oh, I know they found him guilty of some minor war crimes, but his only punishment was a few years of house arrest. He’s a free man now.”
That much was true. He had been the invited guest at an AI conference a few years before I left Earth. I had seen him in the hallways of the convention center, a red-faced man in a tailored suit—no sign of his naval uniform—talking in a booming voice while acolytes and admirers scurried behind him, asking questions he never deigned to hear.
“The war tribunal was not made up of experts in artificial intelligence,” I said.
“You truly don’t think violence is inevitable in the evolution of an advanced AI?” Ping asked.
“I know it isn’t.”
“But violence is inevitable in nature,” she said, “and isn’t the goal of an evolving AI to mimic nature as closely as possible? And do we truly understand what happens on the frontier between technology and nature—such as in your own lovely body?”
I curled my left hand into a fist but did not move it from the table.
“Nothing is inevitable with AI,” I said. I would have really liked for Adisa to jump in and get the conversation back on track, but he kept quiet. “The goal of an evolving AI is to improve itself for the tasks it is given, and to do so in ways that we can’t conceive or define. If that task is not a violent one, there is absolutely no reason for the AI to seek violent solutions.”
“Vanguard never did?”
“Vanguard was an explorer,” I said tightly. “Everything it did was toward the goal of collecting as much information as possible in an unfamiliar environment while not disturbing or altering that environment any more than absolutely necessary. Destructive actions would have made that goal harder to achieve.”
“You must have been so very proud of it.”
“Yes. I was.”
“Have you considered creating it again? If it could be done