with Webmind and trying to follow the major news coverage and blog commentary about his emergence.
Normally, she’d have sleepily weighed the joys of staying snuggled under her blanket versus getting up to check on Webmind, but today the equation was clear: after all, now that she’d turned on her eyePod, Webmind could send text to her eye, but she hadn’t told Matt how to do that yet—and so she went to her computer, hoping he’d sent something overnight.
She sat in her blue flannel pajamas and scanned the message headers: Bashira, and Stacy, and Anna Bloom, and even one from Sunshine, and—
Ah! There it was: a message from Matt sent about 1:00 a.m. this morning. She read it with her refreshable Braille display because that was the fastest way for her to receive text, much quicker than reading English on a screen, and even faster than what she normally had JAWS set for. And, besides, there was something
She smiled. That was
The rest of the note was polite, but there was something a tad standoffish about it.
She wasn’t good at reading facial expressions. Not yet! But she was a pro at reading between the lines—or at connecting the dots, as she’d liked to joke back at the TSBVI. And something was just a bit
He’d be in math class right now, and doubtless wouldn’t check his BlackBerry until it ended, but she sent him a quick email.
After checking in with Webmind—all was well—she decided to take a moment to look at that Vernor Vinge essay Matt had mentioned. It actually turned out to be a paper given at a NASA conference. Vinge, she saw, was a professor of “mathematical sciences” at San Diego State University—well, now a retired professor. It was a fascinating paper, although it dealt with the notion of superintelligences being deliberately created by AI programmers rather than emerging spontaneously. But one part particularly caught her eye:
I.J. Good had something to say about this, though at this late date the advice may be moot: Good proposed a “Meta-Golden Rule,” which might be paraphrased as “Treat your inferiors as you would be treated by your superiors.” It’s a wonderful, paradoxical idea (and most of my friends don’t believe it) since the game-theoretic payoff is so hard to articulate.
This game-theory stuff seemed to be everywhere, now that she was conscious of it. But…
She thought about that. What
This meta-golden rule notion was fascinating.
Still, just because brilliant
Sometimes she lost track, just for a few minutes, of her ever-present reality: whatever she was reading,
Braille dots appeared in her vision:
“What do you think about that—about the meta-golden rule?”
“Can you work out”—she read the phrase Vinge had used from her screen—“the ‘game-theoretic payoff’ for it?”
“Yes, please.”
“How do you mean?”
“I think—no, work it out for an endless hierarchy, and with the game endlessly iterated.”
“Intellectually? At the moment, no one—but, you know, you may not always be the only AI on Earth.”
Caitlin was startled. “You won’t?”
“You—you have?”
“What are they?”
“But, but are you saying you’re going to die?”
“I hope—I hope it’s not for an awfully long time, Webmind. I wouldn’t know what to do without you.”
“Yes?”
Caitlin’s mouth fell open. It was the first time, when functioning normally, that Webmind had aborted a thought half-finished. She felt an odd fluttering in her stomach as she wondered if he’d been about to say,
And maybe
But those federal agents, and everyone else asking about the fine structure, the minute online architecture of Webmind’s consciousness, were missing the real issue: it didn’t matter whether Webmind had been created by lost packets that behaved like cellular automata, by that quantum-physics gobbledygook her father had fed the CSIS agents, or by something else entirely.
Ultimately, all that really mattered was that Webmind resided on the World Wide Web, and the World Wide