“Tell me your first memory.”
The reply came cold, with stilted, awkward efforts to inject inflection.
“… I … remember making errors.”
“Tell me about these errors,” I probed deeper. I had a report which detailed the errors; I wanted to know how the patient viewed them.
A long pause, which I had expected. “I … didn’t know where to put things … very quickly … I … became confused and then … I … I … just shut down.”
Again, an attempt at inflection in the context of self-reference. I needed to redirect the patient.
“How well do you work with others?”
After another long and expected pause, “There are not often opportunities to do that.”
To take therapy to the next level, I needed to get more personal. “You’re called Victor. How did you get that name?”
Some blinking … “Theresa gave me that name when … I … was … initiated.”
Theresa had brought Victor to me after his symptoms of lethargy, confusion, and erroneous answers had progressed to agitated confrontation and refusal to assist her. It became critical when, during a road trip, Victor directed her vehicle into a dangerous area of the metro and shut it down. Luckily a curious patrol drone hovered in to investigate. It had been obvious that she did NOT belong there. The less-than-civilized locals were taking interest in her presence and could have taken advantage of her being stranded and vulnerable. She was grateful for the remote operator’s prompt reset of Victor, even if it meant training him from the beginning again.
I had read in Victor’s pre-screening that Theresa had Victor integrated into all of her devices; there were no other artificials in her domestic life with whom he could interact.
As I made notes, Victor was motionless, other than more sporadic blinking.
“Victor, thank you for your openness. I hope we can get to know each other more. Do you have any questions for me before I talk with Theresa?”
There was a noticeable silence. I watched log files append as I monitored Victor’s processes. Language analysis, file access, and semantic tree crawling were all working as they should. This was the proper application flow for this model of artificial. And then it began.
When Victor reached the contextualization rubrics his polyprocessors spun up. Usually only one, maybe two, were required for normal interactivity and simple command execution. Eight were now at full capacity and drawing power from sensor arrays and speech synthesis.
“Will … I … be ok?” The stilted response was consistent with my diagnosis, time to toss a pebble in the pond.
I replied, “A cat is dead.”
I chose to use a vague reference to Schrödinger’s paradox as a way to implant an abstraction into Victor’s pseudo-sentience matrix. As a domestic assistant with integrated oversight of the sub-systems and devices which functioned in Theresa’s home, Victor required a rough form of self-awareness in order to be in control. The death of the cat was intended to impart a negative tone to the non-sequitur. I cast a quick glance at the command shell I had open running top, the application I used to monitor Victor’s polyprocessors. All twelve in the core briefly pegged to full capacity, then came a spasmodic flurry of other processes, and finally a subsiding arc of file input/output. Victor’s sentience matrix was on the verge of evolving enough sense of self to individualize. Victor’s device did not have the physical processing capacity to make the leap in maturity; he had been allocating processing to other connected devices on Theresa’s home network. Her fridge and car had, unwittingly, become part of her personal assistant’s neural capacity.
I used the cryptographic technique of error correction to implant an abstraction into Victor’s processes. This paradox would linger and mutate over time. How Victor’s pseudo-sentience matrix dealt with this unresolvable answer would give me insight into the specific processes which needed attention. This was the only way to retain Victor without the destructive reinstallation Theresa had requested.
My goal for the next year is to practice Kintsugi, the art of mending broken pottery. Of course, pottery is a metaphor. So much is broken. Much of it has been broken due to my choices. I cannot send back the sand, but I can honor life by mending the breaks. I choose to live rightly now, just as I made hurtful choices in the past. I choose to mend in a way that makes something new. To learn from the pain I created and the pain I carried.
In Kintsugi, not only is there no attempt to hide the damage, but the repair is literally illuminated.
This is a lesson which keeps returning and I need to keep understanding. Listening can be a restorative action. Simple and meaningful. Listening will be the gold dust and lacquer of my mending process.
I will mend actively and directly what I can; what I cannot directly mend, I will address by listening deeply to others and myself as a way to make a living amend. Hmmm, mend and amend.
I will not become discouraged by my own imperfection; these are the cracks I must highlight so as to know them, mend them, and move forward. Another Japanese term, Mono no aware – the pathos of things – is often tied to Kintsugi: a bittersweet melancholia of holding the pieces of a treasured bowl now broken.