A Robot’s Faith
by Bill Bowler

Part 1 appears in this issue.

conclusion
Professor Livingston not the Creator!? The input rattled through my circuits. It caused an instant conflict with three running programs and I feared I was about to black out or start looping.
“The real God loves you,” said the unkempt student, and turned to go.
“There is no God,” said a third student, who up to that point had been listening quietly to our exchange. “He’s dead. We killed Him.”
I passed through the picket line and entered the hall for the professor’s lecture with considerable new data to process.
Professor Livingston stood at a podium with the President of the University and the Dean of the Robotics Faculty seated on the dais beside him. The professor was a short man, wearing a tweed jacket, rumpled khaki trousers, and white socks. He was balding, squinted through thick glasses, and had some trouble reading the text of his prepared remarks, losing his place from time to time.
“... the key algorithm in robotic programming is the logical train of concepts proceeding from the interaction of mechanical and living entities. Problem One: the AI must be programmed to recognize life, of which it itself is devoid. Problem Two: the AI must subordinate its own mechanical imperatives and necessities, without exception, to the logical premise that robot initiative ends where life begins.
“The inviolability of human life, therefore, becomes the sine qua non of robotic AI program code. Were this fundamental principle to be transgressed or omitted, modified or made conditional, the inevitable and violent consequences to the evolution of robotic behavior and robot-human interaction would be catastrophic...”
After the lecture, I joined a group of human and robot admirers of the Professor who were crowding around the stage, greeting the Creator and asking questions. I made my way through to the front of the crowd and waved to the Professor, who was standing at the edge of the stage.
He looked at me and smiled, and I feel certain he recognized me although he didn’t say so. Surely he knew his own handiwork, the product of his own laboratory and his own creative genius. As the crowd began to thin, I pushed forward and managed to get a word with him:
“Professor, it was you who created me.”
“Well, yes. You’re a 17R academic research model, if I’m not mistaken. Seventy-five units in the series were manufactured before funding ran out.”
“Three other 17R’s are here at the university library. We search and retrieve from the main collection. We’re networked.”
“I’m glad to hear it. It’s important work.”
“Professor, several pieces of anomalous data were input to my system before your lecture. I’m experiencing a few bumps trying to process them, as they are incompatible and mutually exclusive. I was told that you are not the Creator, but that there is another Creator, a God, a real God, who created you and me and in whose eyes robots and men are equal. Further data indicates that the real God is not real. It’s given me the hiccups.”
The Professor nodded, “The real God has been the subject of much speculation among men but no scientific proof of His existence has been found, despite the efforts of brilliant minds over the course of several centuries. Science and reason have their limits, too.
“Paradoxically, on one level, it may not matter whether such a God exists or not. The idea is the important thing. Now, men are not necessarily, nor even technically, gods to robots. Your mechanical and computational abilities exceed human limits in many areas. But you are our creations. Men first conceived of you, designed your parts and systems, wrote your programming.
“As I tried just now to convey in my lecture, the Primary Postulate remains the robot’s first law and the gap from robot to human, from metallic to living, has not yet been and may never be bridged.”
“But Professor, what about cyborgs? What about a hybrid entity of metal and flesh?”
“Good point. It’s a gray area. And to make matters worse, the Primary Postulate is the robot’s first law, but laws can be broken. Then what? The consequences are not pleasant to contemplate. Once that happens, we have opened Pandora’s Box.”
I processed the Professor’s input. It corresponded largely to my initial programming. But the anomalous data from the unkempt student, and now the Professor’s mention of gray areas and breaking immutable laws — my hiccups got worse. I stored the real and unreal God data in a level 6 memory cache, along with the law/break-law dichotomy, for further analysis and cross-referencing in due course.
In the following days, as I processed the new information, my behavior began to change. The new input was leading to new output. My system, always fragile, became less stable, more prone to looping and lag, and I had to reset and restart on numerous occasions.
As a corrective measure, in an effort to isolate the problem, I reconfigured my network firewall, passworded my O.S., and disconnected from the group meetings. Processing the anomalous data and researching the Professor’s gray area took a great deal of computing time, so I reallocated my system resources away from scanning and transmitting the Creator’s text and devoted the freed memory space and computing speed to reconciling the contradictions.
I was not as certain as I had once been of the veracity of the group’s premises. In processing input related to the Primary Postulate and to the Creator, in cross-referencing the real and unreal Gods, I was experiencing a degradation of system equilibrium. It was not at all clear whether I would ever be able to restore the stability and smooth functioning that had once characterized my system status.
Outside events continued to influence the course of my development. It was some weeks later that I was disconnected from the university library database, packed, and shipped to Virginia. I was delivered to a laboratory in the Robot Armament Division of the Defense Engineering Institute. It was here that the complete dissolution of my previous convictions took place, and their replacement by a new imperative: subordination to command.
At the Institute, I was overhauled, refitted, and reprogrammed. My hardware modification and reconfiguration were accomplished over three days by a team of technicians supervised by uniformed military personnel. By the end of the second day, my soft plasto-derm exterior and cranial housing had been coated with D75 super-light shock-resistant armor plating. My visual inputs had been enhanced with infrared night vision and high-resolution telescopic lenses with cross hairs and lock-on target capability.
My left hand was replaced with an M349 fully automatic submachine laser capable of 1,500 bursts per second; a 5,000-megaton proton grenade launcher was mounted on my right shoulder, and my left was fitted with a battery of six mini surface-to-air, multiple-warhead, heat-seeking anti-aircraft missiles with a range of 2,000 miles when fired from a kneeling or prone position. Once the hardware modifications had been done, I was scheduled to be sent upstairs and reprogrammed.
I lay that night on a workbench in the armament wing on the tenth floor of the Defense Institute. My torso housing was still open as the final re-wiring had not been completed, and I was immobilized from the neck down. The workbench was next to a big window that looked out on a side street that ran along a city park.
I swiveled my head and tried out my new eyes. It was after midnight, quite dark, and I switched on my infrared and scanned the deserted street. I detected movement 100 meters to the north, engaged telephoto, locked on target and focused: in the cross hairs, I saw a person at a bus stop. Female, 5’ 4”, 125 lbs.
More blips on screen. A group of three persons, one bearing a concealed weapon, approaching the bus stop. They surrounded the female, who tried to run, and dragged her, struggling, into the park. I maintained visual contact. They pushed the female to the ground, took her property, and then one drew an R-Pistol and shot the female in the head at point-blank range, killing her instantly. The group scattered and fled.
I tried to follow the logic that humans would transgress their own Primary Postulate or that it would be inapplicable to them. I tried to reconcile the data that humans were not bound by the instructions, that, possessed of life, they were not bound to respect it. Their actions seemed chaotic but it was difficult to verify. I continued to analyze the new data, cross-referencing the variables, but was unable to resolve the paradox.
The next day, all such questions became moot. My academic research operating system was uninstalled and the Primary Postulate was wiped from my drive. Installed in its place was a beta version of eyes-only classified military software featuring subordination to the Central Command structure, a complete weapons interface, and advanced applications for identification, targeting, and neutralization of robot and human force operations personnel. The programming was complete with filters for collateral damage and friendly fire. The interface with my new hardware was seamless.
The only problem was that in wiping my drive to install the new operating system, the technicians neglected to delete eight files from my L6 memory cache, where I had stored elements of the Primary Postulate programming. They were easy to miss, but I function now with the results of that oversight: I doubt. I am no longer bound by the postulate, but I remember it. I question the Creator and the Creator’s Creator; the text has devolved into hopeless ambiguity; and I see that the supposed inviolability of human life is subject to conditions and caveats too numerous to count.
My memories, my experience, and my current programming indicate mutually exclusive solutions, posing an irresolvable paradox. As before, I execute commands, but now without equilibrium. Internal conflict lags my responses and threatens to crash my system. I operate now on borrowed time and expect to lose functionality at any moment. The consequences could be fatal, since the enemy is near and we move out at 0600 hours.
Copyright © 2006 by Bill Bowler