He focused on the text document his cerebral computer displayed in the corner of his vision, skimming through his speech notes. There was nothing written down aside from a few bullet points. He preferred the natural feel of “winging it.” Installation fascinated him to the point that he could blab ceaselessly about it if allowed to.
“You see, this mindshare process has been in development almost as long as there have been installed intelligences. The original pioneers of the cerebral computer are believed to have worked to make their devices capable of reading the code I.I.s are written with, or to come up with some way to translate it. Their records, which were kept secret for decades, have helped countless modern-day scientists give birth to a functional form of that process.
“In the near future, the I.I.s of experts will be able to continue their work through the eyes and minds of up-and-coming scientists. They will have unlimited access to the host’s sensory information, allowing for more accurate experimentation, especially in regard to human anatomy and biological reactions.
“However, it doesn’t stop there. Mindsharing is really a precursor to what might come to be known as digital telepathy. If we can interface with I.I.s to the full extent that they can feel what we feel, and see what we see, why can’t that same process be adapted for human-to-human communication? Entirely non-verbal; entirely universal. The internet for human minds, if you will, but with sensory information.”
“That’s the exciting future of installation technology,” Karl said after a long pause. “I, however, want to talk to you about the promising present of installation psychology.”
The audience was difficult to read. Some folks seemed to be hanging on his every word like they had a chemical dependency. Others, though, looked upset. They sat with furrowed brows and frowning lips—almost like he was offending them. Karl shrugged it off. He was used to those looks.
“Plenty of people, for many years, have incorrectly attempted to utilize installed intelligences in the same manner we use programs and machinery. The fact of the matter is that we have never beheld a tool such as this. It transcends simple machines, or even complex codes. Imagine a hammer that can think. A search engine that can feel. A vehicle that can hope.
“There are two approaches when dealing with a sentient tool. You can use it like any other instrument and force it to bend to your will with no motivation or conversation. Or, you can inspire it. You can convince installed intelligences to want to do what you want them to do. Putting the ethics of forced labor to bed for the moment, it remains practical to treat I.I.s more like employees or partners than instruments.”
Those in the audience with pouting faces seemed to grow even more annoyed, and something about that delighted Karl.
“Legally, technically, perhaps even spiritually, these intelligences are the equals of each and every one of us,” he continued, watching his words salt the grumpy folks’ wounds. “Treating them as such is not only right, but it is the most beneficial response for all of mankind. A respected mind, digital or organic, is more likely to create. If a being believes their work will gain recognition based on its merit, they will put more effort into it. This is common sense.
“Resentment is not viable fuel for creation. If you have disdain for a group of people, why would you work to solve problems that primarily affect only them? For example, an I.I. could work to cure complex cancers. They don’t have organic bodies, however, so why would they do that? They do it because they love us. With affection, the intelligence will want to cure our ailments and keep us safe.”
Karl could see the audience was not entirely with him. Some folks were immersed in the information, jotting down each tidbit on their tablets or in their C.C.s, but he still needed to reach several holdouts.
“There was an experiment run a few years ago that helps demonstrate what I mean,” Karl explained. “You see, scientists wanted to find out who would be more compassionate, I.I.s or organic humans, and what would motivate them to be so.”
The psychologist used the tools available to him to bring up clips of the study in question. He played them for the audience.
“Volunteers were recruited under the pretense of a test on their reaction times,” Karl explained. “They would fill out a questionnaire of trivial inquiries, then meet with a researcher for an interview. Now, during this interview, the researcher would either be condescending, demeaning, and cold, or he would be respectful, complimentary, and warm. The researcher, who was actually a planted actor, would then fake a medical emergency: a heart attack, a seizure, something of the sort.
“The study found that I.I.s and organic humans were nearly identical in their responses. Those who were chastised and criticized were much more likely to hesitate and merely call for assistance, while those who were treated with respect would attempt to soothe and comfort the victim themselves while waiting for help. In fact, I.I.s were even more responsive than their human counterparts. This is likely because many I.I.s have come to expect ostracism in society, which made kindness produce an even more dramatic improvement in them.
“Studies like this show us what kind of results we can achieve from proper motivation, and that I.I.s differ very little from humans, psychologically. The possibilities are endless.”
Some heads bent down over tablets while their owners tapped Karl’s words onto them. Others stayed locked on the podium.
“Many of my peers have been exploring these possibilities,” he said. He scanned over the