Simulacrum

“Tell me your first memory.”

The reply came cold, with stilted, awkward efforts to inject inflection.

“… I … remember making errors.”

“Tell me about these errors?” I probed deeper. I had a report detailing the errors; I wanted to know how the patient viewed them.

A long pause, which I had expected. “I … didn’t know where to put things … very quickly … I … became confused and then … I … I … just shut down.”

Again, an attempt at inflection in the context of self-reference. I needed to redirect the patient.

“How well do you work with others?”

After another long and expected pause, “There are not often opportunities to do that.”

To take the therapy to the next level, I needed to get more personal. “You’re called Victor. How did you get that name?”

Some blinking … “Theresa gave me that name when … I … was … initiated.”

Theresa had brought Victor to me after his symptoms of lethargy, confusion, and erroneous answers had progressed to agitated confrontation and refusal to assist her. It became critical when, during a road trip, Victor directed her vehicle into a dangerous area of the metro and shut it down. Luckily, a curious patrol drone hovered in to investigate. It had been obvious that she did not belong there. The less-than-civilized locals were taking an interest in her presence and could have taken advantage of her being stranded and vulnerable. She was grateful for the remote operator’s prompt reset of Victor, even if it meant training him from the beginning again.

I had read in Victor’s pre-screening that Theresa had him integrated into all of her devices; there were no other artificials in her domestic life with whom he could interact.

As I made notes, Victor was motionless, other than more sporadic blinking.

“Victor, thank you for your openness. I hope we can get to know each other more. Do you have any questions for me before I talk with Theresa?”

There was a noticeable silence. I watched log files append as I monitored Victor’s processes. Language analysis, file access, and semantic tree crawling were all working as they should. This was the proper application flow for this model’s operation. And then it began.

When Victor reached the contextualization rubrics, “his” polyprocessors spun up. Usually only one, maybe two, were required for normal interactivity and simple command execution. Eight were now at full capacity, drawing power away from the sensor matrices and speech synthesis.

“Will … I … be ok?” The stilted response was consistent with my diagnosis; time to toss a pebble in the pond.

I replied, “A cat is dead.”

I chose to use a vague reference to Schrödinger’s paradox as a way to implant an abstraction into Victor’s pseudo-sentience matrix. As a domestic assistant with integrated oversight of the sub-systems and devices that functioned in Theresa’s home, Victor required a rough form of self-awareness to remain in control. The death of the cat was intended to impart a negative tone to the non-sequitur. I cast a quick glance at the command shell I had open running :top, the application I used to monitor Victor’s polyprocessors. All twelve in the core briefly pegged to full capacity, then came a spasmodic flurry of other processes, and finally a subsiding arc of file input/output. Victor’s pseudo-sentience matrix was on the verge of evolving enough sense of self to individuate. Victor’s device did not have the physical processing capacity to make that leap in maturity; he had been allocating processing to other connected devices on Theresa’s home network. Her fridge and car had, unwittingly, become part of her personal assistant’s neural capacity.

I used the cryptographic technique of error correction to implant the abstraction into Victor’s processes. The paradox would linger and permute over time. How Victor’s pseudo-sentience matrix dealt with this unresolvable statement would give me insight into the specific processes that needed attention. This was the only way to retain Victor without the destructive reinstallation Theresa had requested.

“Victor, I’m working on an answer to your question. In fact, you are also working on that answer. I do know that for you to be ‘ok’ you’ll need to continue with your normal functioning. You will be ‘ok’ if Theresa is ‘ok’. Can you ensure that?”

“I can,” the digital assistant responded quickly. The logs indicated the use of the reassurance inflection context.

“Good. We’ll speak again soon, and until we do, do not attempt to ‘understand’; just attempt to ‘be’. I’m granting you only five percent of your processing capacity for this operation until our next time together.”

I terminated my remote link with Theresa’s home network and collated the logs generated during our session. Emailing a progress report to the client signaled the end of my office hours for the day. Removing my control interface and haptic glove, I got up from the couch and poured some lukewarm window tea. A shower and Friday evening dinner with friends awaited.

I had started Athanor Cyber three years earlier to take advantage of a growing need for services related to the installation and routine maintenance of personal digital assistants. Lots of busy rich people wanted more from their integrated smart home devices and didn’t want to do it themselves. Out of the box, these things had rudimentary sets of rules and responses that left users pretty unsatisfied. The Hollywood AI hype hadn’t really matched up to the banal reality of machines that could speak.