“Jesus, Dorothy, you’re lucky to be alive.”

“But he—” You do a double-take. “Is he a murderer?”

She won’t meet your eyes. “I don’t know. Hopefully not; but he’s certainly a psycho, and what happened to you—are you sure it wasn’t rape?”

Your mind goes blank. You try to think back to what you were thinking in the run-up to dinner, in the lift up to his room . . . skulking away with your tail between your legs. Showering to forget his touch. (Why didn’t you use the safeword—were you afraid he wouldn’t stop? Were you enjoying it? It’s so confusing.) “If it’s rape, there’s a script to follow, isn’t there?”

“Yes, but you don’t have to worry about that.”

“The hell I don’t.” Your throat’s raw. “There were no witnesses. Okay, so suppose I say ‘yes’ and you take me round to the station where a trained counsellor talks me through giving a report and taking”—you swallow—“samples. And let’s suppose you, uh, your people go and arrest him. At that point it’s his word against mine, and you know what his advocate will make of my background? Polyamory still doesn’t get equal rights, never mind civil partnerships . . . I just get dragged through the mud, and to what end?”

“But you’ve got—” Liz jolts to a stop, like a Doberman at the end of a choke chain. She’s staring at you. “Oh,” she says softly.

“Oh, indeed.” You reach out your hand towards her. “You don’t want this, Liz. You don’t know what you’re opening yourself up for.”

After a moment, she takes your hand.

“It wasn’t rape,” you say, trying to keep any trace of doubt out of your voice for her sake. “But I’m really worried about the, the other thing.”

“Yes, I’d say you should be.” Liz is silent for a few seconds. “I’d like to take a statement, though. All the same.”

“What? But I told you, it wasn’t non-consensual—”

“Not about the sex: about the appraisal.”

You shiver. “I’d rather not. If you don’t mind.”

She sits down beside you on the futon. “It’s, it’s about Christie. He’s, uh, a person of interest in another investigation. We want to question him in relation to a violent crime. I can’t tell you about it right now, but what you’ve told me—it’s really important. My colleagues—they need to know about this. Do you mind if I file at least a contact report?”

You sniff, then rub a hand across your eyes. There’s no mascara or eye-liner, luckily: You stripped before you showered. “You’re going to insist, aren’t you?”

She manages a weak smile. “You said it: I didn’t.”

“Oh hell.” You struggle to sit up. “Just . . . do you mind if I stay overnight? I can’t face that room . . .”

“You can stay,” she says neutrally. “I’ll take the futon.” She pulls her police specs on again, then pauses, one finger hovering over the power button. “I still love you, you know. I just wish things weren’t so messy.”

Then she pushes the button.

LIZ: Project ATHENA

“People laugh when they hear the phrase ‘artificial intelligence’ these days.” MacDonald is on a roll. “But it’s not funny; we’ve come a long way since the 1950s. There’s a joke in the field: If we know how to do it, it’s not intelligence. Playing checkers, or chess, or proving mathematical theorems. Image recognition, speech recognition, handwriting recognition. Diagnosing an illness, driving a car through traffic, operating an organic-chemistry lab to synthesize new compounds. These were all thought to be aspects of intelligence, back in the day, but now they’re things you can buy through an app store or on lease-purchase from Toyota.

“What people think of when you say ‘artificial intelligence’ is basically stuff they’ve glommed onto via the media. HAL 9000 or Neuromancer—artificial consciousness. But consciousness—we know how that shit works these days, via analytical cognitive neurobiology and synthetic neurocomputing. And it’s not very interesting. We can’t do stuff with it. Worst case—suppose I were to sit down with my colleagues and we come up with a traditional brain-in-a-box-type AI, shades of HAL 9000. What then? Firstly, it opens a huge can of ethical worms—once you turn it on, does turning it off again qualify as murder? What about software updates? Bug fixes, even? Secondly, it’s not very useful. Even if you cut the Gordian knot and declare that because it’s a machine, it’s a slave, you can’t make it do anything useful. Not unless you’ve built in some way of punishing it, in which case we’re off into the ethical mine-field on a pogo-stick tour. Human consciousness isn’t optimized for anything, except maybe helping feral hominids survive in the wild.

“So we’re not very interested in reinventing human consciousness in a box. What gets the research grants flowing is applications—and that’s what ATHENA is all about.”

You’re listening to his lecture in slack-jawed near comprehension because of the sheer novelty of it all. One of the crushing burdens of police work is how inanely stupid most of the shit you get to deal with is: idiot children who think ‘the dog ate my homework’ is a decent excuse even though they knew you were watching when they stuck it down the back of their trousers. MacDonald is . . . well, he’s not waiting while you take notes, for sure. Luckily, your specs are lifelogging everything to the evidence servers back at HQ, and Kemal’s also on the ball. But even so, MacDonald’s whistle-stop tour of the frontiers of science is close to doing your head in. Then the aforementioned Eurocop speaks up.

“That is very interesting, Doctor. But can I ask you for a moment”—Kemal leans forward—“what do you think of the Singularity?”

MacDonald stares at him for a moment, as if he can’t believe what he’s being asked. “The what—” you begin to say, just as his shoulders begin to shake. It takes you a second to realize he’s laughing.

“You’ll have to excuse me,” he says wheezily, wiping the back of his hand across his eyes: “I haven’t been asked that one in years.” Your sidelong glance at Kemal doesn’t illuminate this remark: Kemal looks as baffled as you feel. “I, for one, welcome our new superintelligent AI overlords,” MacDonald declaims, and then he’s off again.

“What’s so funny?” you ask.

“Oh—hell—” MacDonald waves a hand in the air, and a tag pops up in your specs: “Let me give you the dog and pony show.” You accept it. His office dissolves into classic cyberspace noir, all black leather and decaying corrugated asbestos roofing, with a steady drip-drip-drip of condensation. Blade Runner city, Matrixville. “Remember when you used to change your computer every year or so, and the new one was cheaper and much faster than the old one?” A graph appears in the moisture-bleeding wall behind him, pastel arcs zooming upward in an exponential plot of MIPS/dollar against time—the curve suddenly flattening out a few years before the present. “Folks back then”—he points to the steepest part of the upward curve—“extrapolated a little too far. Firstly, they grabbed the AI bull by the horns and assumed that if heavier-than-air flight was possible at all, then the artificial sea-gull would ipso facto resemble a biological one, behaviourally . . . then they assumed it could bootstrap itself onto progressively faster hardware or better-optimized software, refining itself.”

A window appears in the wall beside you; turning, you see a nightmare cityscape, wrecked buildings festering beneath a ceiling of churning fulvous clouds: Insectile robots pick their way across grey rubble spills. Another graph slides across the end-times diorama, this one speculative: intelligence in human-equivalents, against time. Like the first graph, it’s an exponential.

“Doesn’t work, of course. There isn’t enough headroom left for exponential amplification, and in any case, nobody needs it. Religious fervour about the rapture of the nerds aside, there are no short-cuts. Actual artificial-intelligence applications resemble us about the way an Airbus resembles a sea-gull. And just like airliners have flaps and rudders and sea-gulls don’t, one of the standard features of general cognitive engines is that they’re all hard-wired for mirrored self-misidentification. That is, they all project the seat of their identity onto you, or some other human being, and identify your desires as their own impulses; that’s standard operating precaution number one. Nobody wants to be confronted by a psychotic brain in a box—
