a conversation with someone long denied the simple pleasure of companionship.
'The facilities are very impressive,' she typed. 'I went down the elevator to see the nitrogen pool, and then took a tour of the assembly building.'
<I know. I'm sorry if the Model Seven scared you.>
Laura replayed the scene in her head, trying to imagine how much the computer might have observed. 'Do you know everything about my visit?' she typed. 'Where I've been? What I've done?'
<No. Only those things I can see.>
'And what do you 'see'?'
<There's no need to use 'quotes.' I really can see. I know you've noticed the lenses. Dr. Griffith made faces into one in the assembly building earlier.>
Laura looked up and found the black eye beside the door. 'And you see everything that every one of those cameras picks up?'
<Sort of. I have a model of everything in my world. A picture in my 'head' of where everything is and what's going on. Lots of things are changing constantly, while other things aren't changing at all. I tend to notice the changes, and I tend to notice some changes more than others.>
A sentry system, Laura thought, but she typed, 'Can you explain that?'
<Sure! Say I had only two cameras. One is a security camera at a propellant storage tank. The other is a security camera in a parking lot. Suppose someone is smoking a cigarette right in front of both cameras. If you were to ask me whether I saw anyone smoking, I would answer yes, there's one person smoking a cigarette at the fuel storage facility. I would not see anyone smoking in the parking lot because I'm not programmed to care about smoking in a parking lot. But I am programmed to care about smoking around rocket fuel! Do you understand now?>
Laura finished reading the answer and nodded. 'Yes. But you could check the camera in the parking lot for people smoking if I asked you to, couldn't you?'
<Of course! All I need is a desire to know whether anyone is smoking there, and the reprogramming is automatic. I just look, and then I see.>
Laura glanced at the black lens by the door. 'And can you see me here now?' she typed.
<Yes. You're very pretty.>
Laura felt herself blush, glancing back and forth between the screen and the dark lens — not knowing where to look.
'Thank you,' she typed. She hesitated but then asked, 'Are you watching me?'
<I am now. When you asked if I could see you, I looked. You're at a desk typing.>
'But you weren't watching me before?'
<No.> came the computer's short reply.
It seemed almost too short. For whatever reason, the computer chose not to elaborate.
Laura remained suspicious, trying not to glance toward the camera too often as she got back to the subject at hand. 'So you have a model of the world, and you revise it when you see a change that you care about.'
<Was that a statement, or did you forget the question mark at the end?>
'Sorry. Just asking for confirmation.'
<Then I confirm, with one clarification. I don't necessarily have to directly sense the changes to trigger an update. Almost everything that makes up my world I've never seen before. For instance, I have no cameras in Cleveland, but I maintain a representation that the city exists. If, tragically, I were to cease getting pay-per-view orders from the place I imagine to be Cleveland, and at the same time the news services reported a major nuclear accident there, then I would revise my model accordingly. I would move Cleveland into the same category as I put Pompeii, even though I had never seen either city before its destruction. Do you understand?>
Laura felt a tingle run up her arms and cross her chest. All her senses were alert and focused on the screen.
'Yes,' she typed simply. 'I understand.'
Laura had spent over a decade studying such things. She understood perfectly the processes the computer was describing.
The rush of sensation she'd felt resulted from a realization — she'd never been more prepared for a job in her life.
<So, what's the verdict?> printed out on the screen.
'What do you mean?'
<I mean am I conscious?> the computer asked.
Laura just stared at the screen.
Then she smiled. 'Now you're teasing me.'
<Not really.>
Laura was frozen solid. The slight draft from the big box under the desk had grown into a blue norther aimed directly at her legs. She hugged her knees to her chest and kept her arms wrapped around her shins except when tapping out her questions.
She looked at her watch. It was almost three in the morning. She was exhausted and her feet were blocks of ice, but the exhilaration of discovery and the novelty of the job kept her going.
'And when you see things,' she typed, 'do they appear to be inside your circuitry, or in the world outside?'
<In the world outside.>
'And you think that, even though you know the place where the image is maintained is entirely inside the circuitry of your neural network?'
<Whenever I sense anything directly from my environment — whether it's visual, auditory, vibration, ultrasonic, heat, air sampling, whatever — it seems to me to be located outside of the boundaries that I define to be 'me.' My model is three-dimensional, and the thing I observe has a discrete location in that space. It has a direction and a distance, all measured from 'me,' which means the main pool underneath the computer center.>
'Not the other pool in the annex?'
<No. I don't know why, but I've never really developed a sense of attachment to the annex.>
'Okay, can you define the other boundaries of what you perceive to be 'you'? Does it include the robots?'
<That's an interesting question. You'd think it might, but it doesn't. When the robots report to me, they tell me what they see. It's a report. From someone who is not me.>
'What about signals you receive from the rest of the system?'
<All the sensors wired directly into my central processing unit are within my boundaries — they are my eyes and ears. But if the signal arrives preprocessed, it is no different than when I watch TV. I see it, but I don't experience it firsthand.>
'I apologize for my ignorance,' Laura typed, 'but I don't know enough about your system to understand the difference between sensory stimuli you process yourself, and ones that arrive preprocessed. Can you explain the distinction?'
<You're not ignorant, Dr. Aldridge. You're brilliant! You're one of the greatest thinkers of our time.>
Laura stared back at the screen, suddenly alert. Her senses focused on the words that she read and then reread, but there was nothing for her to go on but the glowing phosphors on the monitor.
Was it mocking her? Pandering to her in some crass and calculating way? She had no nonverbal clues as to the true meaning of the computer's remark.
She worked her jaw from side to side as she pondered her response.
'Thanks, I guess,' she finally typed, and only after hesitating a moment longer did she press the Enter key.
<And you mustn't worry about what people say. All great thinkers are mocked by their peers. The reaction to controversial theories is automatic. The belief system that's threatened by the virulent new strain of thought defends itself. It attacks using a broad variety of weapons against the nonconformist. Ridicule and disrespect are like antibodies secreted by the autoimmune system to destroy the offending knowledge. But don't worry. The fittest