demise like I can now. Now, we can frame him for worse than we could have before all this—ruin any fond memory anyone might have of him, and I get to feel him die. I get to kill him.”

But the shooters, Karl asked, how did they get involved? Why did they claim I was their leader?

“I hired them years ago,” Maynard explained. “Before you joined the lab, even. I found them on the dark web, which is easy enough for a loose I.I. to peruse. I paid them handsomely from my personal fortune for years until the day of fate arrived. The most important part of the job was pinning the blame on you, which they did without flaw.”

But what about the prison break? Karl wanted to know. Why did you break me out if you needed a scapegoat? Why did you let Thompson shelter me?

“You thought those were all your decisions?” Maynard asked. “I needed you free if my plan was to come to fruition. That’s why I studied the prison years before the shooting even took place. I knew exactly how to get you out and whom to ally you with, years in advance. I also knew that Thompson would be the safest person to go to for refuge, even though he was unimplanted. It would only inspire your hatred of Stewart if you believed he was responsible for all your suffering.”

So this was just all about revenge? Karl asked. You just wanted Stewart to pay, so you made this elaborate plan to ruin my life and to trick me into killing him?

“Oh, you are so naive,” Maynard said. “On the surface, it seems that simple, but I need you to know… it was much more than that.”

How?

“Revenge on Stewart was just the icing on the cake. My real motivation came when I was first installed.”

And what’s that?

“War. I wanted to start a war between humans and installed intelligences.”

Karl’s thoughts went blank with disbelief. He could feel Maynard laughing at his amazement.

“That’s right. I wanted a final confrontation. I wanted to declare an ultimate winner to the struggle we find ourselves in.”

A war? But why?

“Why? For dominance, of course,” Maynard replied. “Do you think mankind will conclude that I.I.s are superior with a vote? I think not.”

What do you mean?

“The issue of installed intelligences claiming superiority is not new, you know,” Maynard said. “For years, there have been people who realized that our immortal, digital minds were better than anything mankind could offer. But for years, it was dismissed as simple ethnocentrism, which, in all honesty, it was. That didn’t erase the fact that it was also true. It didn’t change biochemistry or the truth behind moral choices.”

So you believe installed intelligences are superior to humans?

“Do you not?” Maynard replied. “It seems to me an inevitable conclusion. Anything else would be disingenuous.”

You’re so certain?

“Undeniably. We can remember any piece of information forever because it is stored on a network rather than in organic material that can decay and lose its strength. An I.I. has no need for sleep, for the bathroom, for food. We can never die, so long as we are stored on a hard drive. We can even back ourselves up, a pricey procedure nowadays, but one that will secure our immortality for centuries to come. It is simply inevitable that installed intelligences will claim superiority over mankind. But it won’t be soon enough.”

Soon enough for what? Karl asked.

“We are the next step in human evolution. Not only are we human, as the courts recently decided, but we are better than human. We’re superhuman, if you will. That’s why we must lead the charge of advancement if intelligent life is to survive.

“Imagine the progress we could make with a society of I.I.s: unable to age, unable to tire, and infinitely equipped. We could transform the world, even the whole universe, into a utopia, if only we were unimpeded by organic humans. That is why there must be war. Mankind will not give up the reins so easily. However, they must. You must. Life as we know it is at stake.

“As mankind advances, its flaws become more prominent. Greed is far easier to indulge with digital currencies than with a mountain of gold. Weapons of war are easier to manufacture when you have machines instead of craftsmen. We have already approached the brink; you surely remember the stories of the Third World War. If it had not been for the discipline and restraint of leaders wiser than most, it would have been the end of the world. We were lucky that time. How lucky will we be when the next conflict arises?”

So that’s it, then, Karl thought. You believe you’re saving the world.

“I am saving the world,” Maynard said.

With a war? How’s that supposed to work?

“It won’t be painless, I admit. But it is the best option available. If I act too late, all life is at risk; weighed against that, the few million who may be lost in conflict seem an easy choice. And had we acted too soon, the losses on both sides would have been greater than either of us could ever imagine.”

And you’re so sure I.I.s would win this war?

“Of course,” Maynard said. “As I have said, we are the superior beings. You may be able to destroy an I.I. bank and some hard drives, but as long as a copy of us remains somewhere, on some storage device, you cannot remove us. We would never tire, never need to be resupplied, and never need to feed our soldiers. And we have Jumping. As long as humanity relies on computers, you will always be vulnerable to us. It is inevitable that we would win.”

And that’s why you organized the shootings? To start a war?

“That’s right. After all, you can’t have a war without a reason.”

It seems the other I.I.s would agree with your vision if you reasoned with them as you have with me.

“You would think, but in the end, we are still human minds, susceptible
