But it was still frustrating to be so close to a powerful AI, one that David had worked with and trained for months, and not have a chance to take a closer look. Overseer AIs were not particularly revolutionary in design, but they were smart and did adapt to work intimately with their sysadmins. I would have loved to see firsthand how David had been spending the last months of his life.
The entrance to the systems room was at the far end of a long corridor. The Operations section of Nimue had once been a luxury transport vessel, and remnants of its former life were visible along the hallway: ornate light fixtures, decorative frames around the control panels, a geometric mosaic of polymer tiles on the floor. The pattern was white and gold and deep, deep blue, probably meant to imitate some ancient style from Earth and reassure the passengers they were traveling in luxury. It didn’t feel like a working asteroid mine—except for the bulky reinforced security door at the end.
Van Arendonk entered his security access code, I entered mine, and we endured a few seconds of awkward silence before the door slid open. The interior was dark enough that I felt the tug of mental adjustment as my sharper artificial eye saw the scene more clearly than my natural eye, and my brain had a brief argument with itself trying to reconcile the difference. My first impression was of a deep, deep cold.
My second impression was of a massive, encompassing presence.
There was a pause—a heartbeat, no longer—and the lights came on, rising from the gentle gray of an early dawn to an eerie cool blue.
The room was not particularly large, only about the size of standard solo quarters. Every surface I could see, from the walls to the ceiling to the floor, was polished and clean and shiny, but somehow it managed to avoid any single clear reflection. The effect was unsettling and disorienting. I didn’t know where to look, where to focus.
Van Arendonk gestured for me to step into the systems room first. The door slid shut behind us. I heard the faint click of the locks engaging, then the hiss of the ventilation and heating system kicking in to accommodate our presence. The lights changed again, became warmer in tone, less harsh on the eyes. The Overseer itself—the actual brain of the machine—was built into what had previously been the cargo hold of the passenger ship, several meters beneath our feet, surrounded by a vast cooling system. There was an access lift in the room, behind yet another bulky security door. We didn’t have permission to go any farther.
I took one chair, van Arendonk the other, and the screens came on.
“Welcome, Hugo. Welcome, Hester. It’s good to see you.”
The voice wasn’t particularly loud, but it surrounded us, smooth and mellow, from every direction at once.
The Overseer asked, “What can I help you with today?”
Like every artificial intelligence under Parthenope control, Nimue’s Overseer spoke with a woman’s voice, pitched high, with a forced politeness that set my teeth on edge. I wasn’t fond of AIs that defaulted to natural voice communication in the first place, as it left too much room for misinterpretation. I was even less fond of AIs whose corporate programmers or users had persuaded them to speak with softly subservient women’s voices. Such tones did not evolve naturally in an AI’s communication style. Somebody had specifically taught it that, because somebody in the company had decided that was what a faithful servant should sound like.
I didn’t like it any better when the Overseer went on without waiting for us to answer. “Are you comfortable, Hester? I have noticed a slight imbalance in your stride that I believe could lead to chronic physical discomfort. If necessary I can—”
“Stop verbal communication,” I said curtly. “Requesting authorization for security and surveillance data by Safety Officer Hester Marley.” I gave my investigative access code again.
Van Arendonk did the same: “Requesting authorization for unrestricted security and surveillance data access by legal counsel Hugo van Arendonk.” He provided his own access code, then, after a second, he added, “Stop verbal communication.”
The Overseer said, “I’m sorry you don’t want to speak to me, but I will do what I can to help.”
It acknowledged our requests with text responses on the screen: access codes received, investigative request pending, security and surveillance subsystems responding. I was bouncing my leg nervously; I made myself stop. A few seconds passed, seconds I knew had to have been added by the Overseer because it had learned somewhere along the line that too-rapid responses actually made humans trust it less. Station steward AIs were designed to provide human comfort as part of their mandate, and human comfort often included pretending to operate on human timescales. It was another thing I disliked about working with Parthenope’s AIs. I didn’t care to be condescended to by a machine.
A confirmation appeared on the screen: Limited access granted.
“Right,” van Arendonk said, “let’s start with the visual . . . right. By all means, don’t let me stop you, Safety Officer Marley.”
Nor did I care to have a company lawyer telling me how to do my job.
I asked the Overseer to bring up the security and surveillance data from the day of David’s death, then narrowed in on employee ID tracking and medical reports. Parthenope watched its employees obsessively. In addition to the omnipresent cameras and audio recorders in public spaces, the company required a unique code every time we accessed a terminal, logged the embedded microchips in our wrists every time we passed through a monitored doorway, and constantly analyzed every action to flag those that ventured outside standard routines. There was an entire medical subsystem devoted to assessing the physical, mental, and emotional well-being of the crew.
All of which meant that Parthenope employees had virtually no privacy in any aspect