What didn’t matter much to Maxon was the shape the robots took externally. How to put a microscope in them. How to make them smaller, bigger, work in the human bloodstream, simplify bipedal mobility. He had an abundance of research assistants to task with these technical details. His job was coding, thinking, more coding, and the completion of lists. He moved through his labs back at Langley like a wraith, stained hair falling down around jagged cheekbones, hands dangling at the end of his long arms, spine convex. He rode his bicycle for hours, working out command sequences on the pavement in front of him, every square meter like an open stretch of whiteboard, there and then erased.
“Houston, we are go for this procedure,” said George Gompers, mission commander. “Standing by.”
Their screens wavered, and instead of the clear view of space they all saw a holographic projection, where the moon loomed large and they could see the cargo module, containing all the robots they would be taking down to the lunar surface. Their job, in orbit, was to dock with this cargo, extract the three containers, and then convert the command module into the lunar lander. While the pilot, the engineer, and the commander repeated orders, fired small rockets, repositioned, and aligned the rocket for the simulated docking, Maxon looked at his cargo module full of robots.
He wondered what they were doing in there, what they were dreaming.
All of Maxon’s robots, like Maxon, could dream. A randomly generated string of code gently stimulated the processors during their mandatory off modes, testing the chemobionic reactions while the official electronic pathways were shut down. It hadn’t even been hard, shattering this particular old saw. It had come apart like a clay pot. The robots remembered the events of their lives, the data they had recorded. In dreams, they transposed numbers, brought sets adjacent that were never meant to be interpreted together, and when they “woke up” they often had new “ideas” in the form of patterns and connections read in the chaos of their jumbled sleep.
The more like a human the better, whether the bot was as small as a fragment of nanotech cleaving the valves of the heart or as big as a sentient harbor crane. Humans work. They are an evolutionary success. The more they evolve, the more successful they become. Maxon had once thought that at this moment, when he was ready to land on the moon, his list of things that robots couldn’t do would have had every entry crossed out in a dark line. He had planned that the phrase “quintessentially human” would have been obviated by now. Indifferent to all protest, he had relentlessly made dreaming, faceless, laughing robots that were inexorably closing in on humanity.
The AI was startling, people had to admit. Maxon’s robots did what other robots could not do, thought what other robots could not think. That was the reason he held so many patents, and had such an astonishing bank account at such a young age. But the most important thing, the reason he was employed by NASA and on his way to the moon: Maxon’s robots could make other robots. Not just construct them, but actually conceive of them, and make them.
To create a moon colony, a lot of robots are needed. Robots to build the station, robots to run it, robots who don’t mind breathing moon atmosphere, who don’t mind moon temperatures, robots to take care of human visitors. The moon colony proposed would belong to the robots for many years to come; this was understood. Humans would be their guests. The problem was that no one could shoot a robot big enough to construct a moon colony up to the moon. There just wasn’t enough room in a rocket for diggers, cranes, stamping presses.
So the answer was to shoot up a robot that could make another robot big enough. Juno and Hera were the robot mothers: steely, gangly, whirring, spinning mothers, built to mine the materials and fabricate the real robots, the real builders, who would re-create the world on the moon. Only a laughing, crying, dreaming robot could be a mother. An awful thought, for some. A perversity—but this was the reason for everyone else’s failure. All this business of a human purview. As if it weren’t all electricity, in the end. Maxon couldn’t remember ever thinking that something a robot did was awful.
Maxon watched the simulated docking procedure, watched the holographic cargo module getting closer, the engineer and pilot arguing over angles and coefficients. He uncapped his pen and wrote in his notebook: “You are a weak, sick man, and your frailty in the darkness of space is a vile embarrassment to your species.”
He looked at the men and the way they talked to each other, the way Gompers preferred Tom Conrad, the pilot, over Phillips, the engineer. He saw the way they papered their personal areas with photographs, the way they listened to podcasts from their wives on their laptops, the way they prayed.
“GENIUS, WE JUST LOVE your robots so much. When are you going to make us a robot that will love us back, you know what I mean?” Phillips had said to him once, teasing him during training, while they sat waiting for the pod to start spinning them again, testing their reactions to g-forces. In a round room, the pod sat at the end of one of two arms on a central axle. Like a giant spinner in a game of Twister.
“It’s not impossible, Phillips,” Maxon answered. “The world is only electrical and magnetic.”
“Okay,” said Phillips. “So why not?”
“You don’t understand,” said Maxon. “It is all electricity. So the question is really: Why?”
“I am not following you, Genius,” said Phillips. “You’re making it sound easy, and then acting like it’s hard.”
The machine began to spin them. At first, it was slow.
“Can it, Lieutenant. Shut up, Dr. Mann,” said Gompers, always quick to remind him that he did not have a military title. But Maxon was already talking.
“Listen. From the smallest, deepest synapses in the human brain to the interactions of galaxies with the universe, it is all electricity. If you can shape the force of electricity, you can duplicate any other impulse in the world. A robot can yawn, it can desire, it can climax. It can do exactly what a human does, in exactly the same way. You really want a robot to love you? You want it to fuck you back, when you fuck it? Just like a woman? Let me tell you: There is no difference between carbon and steel, between water and ooze. With a number of conditional statements nearing infinity, any choice can be replicated, however random. The only hard thing about creating more sophisticated AI was acquiring the space needed to hold such a myriad of possibilities. There is nothing different in a human’s brain from a robot’s brain. Not one single thing.”
By this time the machine was spinning so fast, his cheeks were flapping. The other men in the module were quiet, intense. Their eyes were all open. Their faces looked skeletal, all the skin pulled back.
“GET IT?” Maxon screeched.
And even in the pressure of all that simulated gravity, Fred Phillips found it possible to roll his eyes.
When the machine stopped, Phillips said, “Mann, dude, I feel for your wife.”
“What do you feel for her?” said Maxon.
WHY DID THE ROBOTS not love? Why not feel good about themselves, just for once? Why not prefer one entity, one electrical epicenter, over all the others, for no other reason than that it felt good to do so? Maxon knew why. They could not love because he had not made them love. He had not made them love because he didn’t understand why they should love. He didn’t understand why he should love, why anyone should love. It wasn’t logical. It wasn’t rational, because it wasn’t beneficial. That was the truth of the matter. He chose for them not to, even though that choice defied his central principle: If humans do it, it must be right.
To show preference only for a good reason, to accept any choice made with the best use of available information, to suspect a source of giving incorrect data when incorrect data had been received from it in the past: these responses were beneficial to the robot, to the human. To love for no reason, to grieve over a choice that had been made rationally, to forgive, to show mercy, to trust a poisoned well: these were not beneficial, and potentially damaging. If humans do it, why do they do it?
He understood the value of a mother’s love for her child. That had a use. He understood the value of a soldier’s love for his brother-in-arms. That had a use. But the family structure was so integral to the foundation of a civilization, and the solidity of the family was so important to the civilization’s survival, that choosing a mate based