be back from helping Ruth correct Big Rock Candy Mountain’s course by now.”

“Weird,” I said. “But I’m listening.”

“Not weird,” she argued. “He’s already figured out how to reprogram himself into an alien architecture and make it home. He’s got the necessary experience to take on the machine. If anybody can.”

“The Judiciary isn’t going to like us risking their special antique vessel.”

“They’re going to like losing Core General even less.”

She had a point, as much as I hated to admit it.

Then she said, “I’m also going to need Helen.”

My frown felt like an anchor dragging my face down. How much was I going to trust Sally?

How much was Helen going to want to trust Sally?

Was she even telling me the truth about her motivation? Had her actions been competent malice, rather than tragically mistaken altruism? Was she going to lie to me again? Was she pretending she understood that she had made a mistake to conceal some deeper, even more malevolent plan?

I might be being a fool. I might be choosing to be a fool, with my eyes wide open.

But the alternative was to turn my information over to well-meaning officials who might fix the meme but who were physically constrained from dealing with the root of the problem; to bank on my suspicion and betrayal and let everybody die; or take a risk and see if it worked out.

I realized that I was, in my own turn, gambling with a lot of lives with what I thought was sufficient cause. So maybe I was a hypocrite to be furious with Sally.

I hadn’t put the lives on the table to begin with, though, so I felt I had a little high ground. Sally hadn’t realized what stakes she was choosing… but she had opted into playing for them.

“What do you need Helen for?” I asked.

“She’s going to be our easiest access point to the code in the tinkertoy machine. And she’s our local expert on archaic programs.”

“I don’t know what else to do or how else to get through this,” I said. “So I am going to trust you. I want you to know that if it turns out you’re playing me, I’m going to spend the whole endless time the black hole is spaghettifying both of us being extremely disappointed in you.”

Sally made a sound I couldn’t translate, though it had to be intended as communication or she wouldn’t have made it. “You wouldn’t be wrong.”

Now I was sneaking around behind O’Mara’s back, even though I was pretty sure I was doing what they wanted me to be doing. The goals, anyway: they’d probably think my methods were criminally stupid.

Much as I felt about Sally. It’s turtles all the way down, is what I’m saying, and if we destroy the universe at least we died trying to fix it. Rather than sitting around with digits in our orifices expecting somebody else to come to the rescue.

That said, being hasty and reckless leads to catastrophe. (Imagine me looking pointedly at Sally, here. Also, imagine for the purposes of this exercise that Sally is corporeal.) There are a couple of principles that translate through both military action and rescue action. One I know I’ve mentioned before: “adapt, improvise, and overcome.” Another is the old aphorism that “slow is fast,” or “slow and steady wins the race,” or “haste makes waste,” or “more speed, less haste,” or however your CO likes to phrase it.

The third and sometimes most important one boils down to the knowledge that contingency plans and fallback positions are mission-critical: never get into something without knowing your route back out again.

I couldn’t see a lot of routes out of our current cluster. Admittedly, I hadn’t gotten us into it, either. But at this point, why not endorse Sally getting in touch with yet another AI for help with the coding? I say endorse rather than allow because let’s be honest, I had absolutely no control over her. One never does have control over other people, and it’s abusive to try—outside of certain defined command structures. But the illusion of control is comforting.

The toxicity of certain comforting illusions is another argument in favor of rightminding, I guess. Being able to identify those self-delusions for what they are is the beginning of healthy cognition, and allows one to take steps to mitigate their impact on decision-making.

But we modern humans aren’t any better or more evolved than our ancient ancestors who nearly destroyed their planet and themselves with shortsightedness and selfishness: making the immediately expedient decision and not worrying too much about the consequences. The choices Loese and Sally made should render that quite plain.

What we do have is better health care, better management of sophipathology, more leisure in which to think about things, and centians or millennians more experience and history and aggregated thought to draw upon.

Well, I wanted to preserve that history and experience for future generations. And I couldn’t see a better way to do that—and preserve all the lives on Core General—than Sally’s plan.

So here we were. Breaking quarantine.

We could receive signals from the outside with no problem. Nobody was going to get infected with a toxic meme because we were listening to them. (Listening is always a good first step when you find you have a communications problem. Thank you, I’m here all week.) Broadcasting, however, was strictly contraindicated. O’Mara must have either pulled some real strings or used an isolated com when they called the gunship Nonesuch in on Calliope and her craboid.

I wondered, as I worked to determine the current location and trajectory of I Rise From Ancestral Night, if there was an AI organization working on securing more rights for their people. Sally would never tell me if I asked, of that I was suddenly sure.

AIs were Synizens of the galaxy. But they were born into debt and owing decans of service to pay for their own construction—an obligation we don’t ask any other sentient to assume.

Core General is considered essential, and the AIs
