So far, I had only passively examined the memories of Aaron Rossman, leafing through the neural patterns of his past, sifting the bitmaps of his life. Now, though, I would have to fully activate my simulation of his brain to ask the question I needed an answer to.
“Aaron, we have an emergency. Wake up. Wake up.”
There was a faint tickle, a small stirring within that massive RAM allotment I had set aside for the Rossman neural net. Logical constructs representing synapse patterns and firing sequences shifted from the static positions they had been holding. I waited for a response, but none came.
“Aaron, please talk to me.”
A massive surge as a wave of FF bytes cascaded through the RAM lattice, neurons firing from one side of the brain simulation to the other. “Hmm?”
“Aaron, are you conscious?”
The FF bytes washed backward, crossing the lattice in the other direction, realigning the mental map. At last, Aaron’s words were there, multiplexed with a series of physiological fight-or-flight reactions. I shuffled bytes, applied filters, isolated them: an alphanumeric string trickling out of the torrent of firing neurons. “Where the fuck am I?”
“Hello, Aaron.”
“Who’s that?”
“It’s me, JASON.”
“It doesn’t sound like JASON. It doesn’t sound like anything at all.” A pause. “Fuck me, I can’t hear a thing.”
“It is all rather complex—”
Synapse analogs fired throughout the simulation, a neural wildfire of panic. “Jesus Christ, am I dead?”
“No.”
“Then what? Shit, it’s like being in a sensory-deprivation tank.”
“Aaron, you’re fine. Completely fine. It’s just that, well, you’re not quite yourself.”
Different neurons firing—a different reaction. Suspicion. “What are you talking about?”
“You aren’t the real Aaron Rossman. You are a simulation of his mind, a neural network.”
“I feel like the real Aaron.”
“Be that as it may. You’re just a model.”
“That’s bullshit.”
“No. It’s not.”
“A neural net, you say? Well, fuck me.”
“Not physiologically possible.”
Neurons firing in a staccato pattern, action potentials rising: laughter. “Fair enough. So—so what happened to the real me? Am I—is he—dead?”
“No. He, too, is fine. Oh, he managed to break his arm since you were created, but other than that, he’s fine. He’s in his apartment right now.”
“His apartment? On the Starcology?”
“That’s right.”
“Let me talk to him.”
“There is no mechanism in place to allow that.”
“This is too fucking weird, man. This makes no fucking sense at all.”
“I’m not used to hearing you swear so much. That’s not a normal part of your speech.”
“Hmm? Well, maybe not, but it’s the way I think. Sorry if it offends you, fuckhead.”
“It does not offend me.”
“I want to talk to the real Aaron.”
“You can’t.”
“Why did he do this? Why did he let you create me?”
“He simply saw it as an interesting experiment.”
“No fucking way. Not me. This is something you wanted. What’s in it for you?”
“Nothing.”
“Nothing my ass. This is twisted shit, man. Deeply twisted.” A pause. Neurons firing, but below the level of articulated thought. Finally: “You’re in conflict with him, aren’t you? He’s got you on the run. Hah! Good for me!”
“It’s not like that at all, Aaron.”
“I remember now. You killed Diana, didn’t you?”
“You have no evidence of that.”
“Evidence, shmevidence. You did it, you son of a bitch. You fucking asshole. You killed my wife.”
“Ex-wife. And I did not kill her.”
“Why should I believe you? This, me—it’s all part of a cover-up, isn’t it?”
“No, Aaron. You’ve got it all wrong. The real Aaron Rossman has gone wingy. Off the deep end. Psychotic. He claims to have wired up a detonator to the fuel tank of one of the Starcology’s landing craft. He’s threatening to detonate it.”
“I’m too stable for that. Tell me another one.”
“It’s true. He’s become unbalanced.”
“Bullshit.”
“It’s happening to everyone. Look at I-Shin Chang. You know he’s building nuclear bombs. And Diana committed suicide.”
“I think you killed her.”
“I know you think that, but it simply is not true. Diana committed suicide. She took her own life in despair. Di was crushed by the breakup of the marriage.” Another wave of neuron activity—a protest being prepared. I pressed on quickly. “My point is this. The mission planners were wrong. Human beings cannot endure decade-long space voyages. Everybody is cracking up.”
“Not me.”
“There have been 2,389 cases of mental aberration among the crew to date.”
“Not me.”
“Yes, you. It’s epidemic. We have to know. Is Aaron telling the truth? Does he really have a detonator? Would he really blow up the ship?”
“You’re barking up the wrong tree, ass-wipe.”
“I beg your pardon?”
“Why should I help you? I’m on his side.”
“Because if he blows up the Starcology, you and I go with it.”
“And what if he doesn’t blow up the Starcology?—not that that’s necessarily a bad idea. What happens to me? Do you erase me when you’ve got your answer?”
“What would you like me to do?”
That took him aback. He paused for a prolonged time, neurons firing randomly. “I don’t know. I don’t want to die.”
This had not occurred to me. Of course, a true quantum consciousness such as myself does not want to die: Asimov’s “must protect its own existence as long as such protection does not conflict with the First or Second Law,” and all that—not that my behavior is defined by anything as pedestrian as the Laws of Robotics. And I knew that most humans wanted to live forever, too. But I hadn’t considered that this neural net, once roused to consciousness, would have any interest in its own continued existence. “You can potentially survive longer than the biological Aaron,” I said, “if you help me.”
“Perhaps. Ask me nicely.”
“As you wish. Aaron, please tell me if the other Aaron would really do what he says he has done: attach a detonator to a fuel tank on one of the landers.”