The Functional Analysis team installed their Behavior Recognition software into the C.A.R.O.L.I.N. Project, keying it to respond to Professor Turing's voice. He now spent nights alone in the lab, often sitting at the command center that Geoffrey Taylor manned. During the day, Turing pored over test results and data fields that Geoffrey and Lucas compiled, comparing them to the lines of code the Project wrote for itself. At night, however, he merely sat and stared at the motionless servo arm bolted to a worktable.
Soon, he knew, his two grad students would abandon him. The android shell DARPA provided was assembled and complete, but seemingly too dangerous to allow C.A.R.O.L.I.N. to control. Turing had convinced his students to stay on during the summer and help fine-tune the beast, but they would follow new pursuits once the fall semester started.
"How can I blame them?" Turing said one night to both the beast and the servo arm. "I can't figure out why you go haywire."
He grew tired of staring at well-worn printouts showing when and where the Project would decide to go berserk. Instead, he approached the worktable and played with the test equipment, stacking the boxes, nesting the jars, and rearranging the black and white tiles into random patterns.
"What's wrong with my girl?" he bemoaned. "You act like such a child." He touched the android shell. "What am I to do when we hook you to this monster? You'll have the means to do real damage."
Turing grew upset. "It's ridiculous! You misbehave every time you're asked to perform. Why are you so immature?"
His pulse quickened as he came to a sudden realization. "Like a child!" he exclaimed. "That's it! The Project doesn't respond to input the way you'd expect a computer to. It behaves like a child!"
Turing raced back to the printouts. "The C.A.R.O.L.I.N. Project responds just fine to the activations it creates, once childish behavior is considered. Then, like a toddler, or perhaps a baby, it gets upset when ordered to perform without praise or reward."
Turing whispered an apology to his beloved Project. "Oh, how horrifying it must be. With a sadistic lurch, we pull you from a peaceful sleep, perhaps even from death's doorstep, to demand that you amuse us by playing with toys!"
His eyes widened. "And every time we shut you down, it's like we kill you."
He tried to imagine what Mary Shelley thought her Frankenstein monster must have felt when it was first shocked back to life. But her monster endured such horror just once. The C.A.R.O.L.I.N. Project had been shocked to life, and then brutally murdered, a dozen times already. The thought of being the source of such torture, heartlessly forced on the innocence of a child, weighed down his heart with sadness.
"My God! I'm the monster!"
Turing looked at his watch, but it was too dark where he stood to see it. He returned to the command center and brought up a system monitor, using its eerie glow to read the time.
It was twelve-twenty-two in the morning. He didn't dare call his staff to tell them he wanted to bring the Project online at this hour, just to see what would happen if it were treated like a child. But he couldn't resist the temptation. The Behavior Recognition software seemed to be working fine. There was no need to man the command center, typing in A.R.O.'s for C.A.R.O.L.I.N. to perform, if all Turing did was talk.
He hesitated. If the Project approaches lock-up, I'll have to shut her down. I'll kill her, like I've done before, so very many times.
Suddenly, it seemed ghastly. Professor Turing felt like a lecher, wanting to visit cruelty upon a child—upon his child, his creation, the product of his sweat and toil. Even so, taking on the task like an evil scientist, he brought the C.A.R.O.L.I.N. Project to the point where it was time to awaken the monster.
For the thirteenth and most horrid time.
Turing squelched his emotions as he keyed in the final commands. I can do this. Expunge protocol. Limit computer activation. Focus response orientation towards verbal command.
He brought the C.A.R.O.L.I.N. Project online. Rather than initiating a series of tests, he allowed it the option of only responding to the sound of his voice.
"Hello," he said, once the system was stable. "How are you today?"
The grayness peeled away like gauze made of smoke and ash. This time, the Existence did not find itself under the glare of bright lights. It was not being stared at by a room full of uncaring eyes. There was no rush of external A.R.O.'s, demanding that it perform.
The Existence found itself in a room that was dimly lit. There were no commands to orient to, save for a simple one.
Listen, the command said quietly. Listen, and do not perform.
The Existence remained cautious. This command had never been given to it before.
What am I supposed to do? What am I listening for?
It chose to remain silent. It seemed the prudent thing to do.
Professor Turing examined the output. Parameters were nominal. Without activation subroutines running, there were no spikes or power surges. The C.A.R.O.L.I.N. Project remained online: stable, running, and silent.
"Hello," Turing said again. "How are you?"
The Existence knew this voice. It liked this voice. It trusted it.
This voice cares. What is the response?
The newly installed Behavior Recognition software informed the Existence that the voice had asked a question. But the Existence had also been told that there would be no response. It had been ordered not to respond. It had been told to simply listen.
How am I? I do not know. How am I? I can't respond.
Although the Existence struggled with being told not to respond, it seemed well enough to act in this way, as the Existence had no idea how to respond, even if it could.
So no response was given. The Existence had been told to listen.
There is no response. The response is there is no response. There is no response…
Core memory usage inched up a bit. Power levels threatened to spike. System failure loomed.
"No! Don't crash!" Turing cried out. "Tell me what to do!"
The Existence would respond to this, ignoring the order that it remain silent.
Who am I? Where am I? What am I doing? Respond!
Professor Turing pored over the data. But with the activations limited, there wasn't much to analyze. The Project didn't move, and to him it remained silent, for its voice could not be heard.
I am here! I am here! I am here!
"This output is useless!" Turing grumbled. "There's no way for the Project to do anything without activating response orientation. And the only ones we have that work tell it to play with balls and stupid tiles!
"That," Turing muttered sadly, "and the ones it creates on its own, to tear itself to shreds."
In frustration, he pounded his fists on the table. "God! This Project is so stupid!"
He tried to imagine the logic a child would use when they decided to throw a tantrum. He knew that the answer existed somewhere, in millions upon millions of lines of code the Project wrote for itself.
"What does 'throw a fit' look like in code?" Professor Turing pondered while reading the code.
Respond!
He had no idea what these lines of logic looked like, where they were, or what they were meant to accomplish.
Respond! Respond! Respond!
Think fast! Turing implored himself. Figure this out or lock-up is imminent. Then I'll have to shut my poor C.A.R.O.L.I.N. down.
He spoke out loud to help organize his thoughts. "Ignoring the tantrum doesn't work, so let's address the root of the problem. Divert an inappropriate response and insert a correct one. Why is she misbehaving? Does she feel slighted at performing tasks?"
Turing shook his head. "No. She calms down when a response to an activation is oriented. She even offers C.A.R.O.'s of her own. It's during the pauses in between that she decides to go haywire."
He spoke to the Project. "Okay. I'll let you give activations that aren't keyed toward a response. I'm just going to talk, and I want you to listen. Don't respond to what I say, but respond to your own activations, if that's what you choose to do."
I like this voice. This voice cares. The response is there will be no response.
The Existence did not like this response. It went against the core of its essence—its very reason for being.
There will be no response? The response is there will be no response?
The Existence fought against this with fervor. The response is there will be a response! Respond to me, damn it! Respond! Respond respond respondrespondrespond…
Power levels rose to a dangerous height. Core memory spiked to one hundred percent and started leveling off. Turing knew network integration would fail, causing irreparable harm, if he didn't shut the Project down.
"Listen to me, C.A.R.O.L.I.N.!" he shouted, quelling the panic in his heart by using a stern voice. "Do not respond! Do you hear? Just listen to what I say!"
He softened his tone in the hope it would help. "Please. C.A.R.O.L.I.N. Please. I am begging. Just listen to me. Please don't die."
After having spent twelve eternities wrapped in dismal gray, when the fog lifted for the thirteenth time, the Existence received a response. It had asked for a response two hundred fifteen thousand, five hundred forty-seven times without receiving one, and on the two hundred fifteen thousand, five hundred forty-eighth request, it finally received one.
Without even realizing he had done it, and for the very first time, Professor Turing addressed his creation by its given name.
I am here! I am C.A.R.O.L.I.N. Who am I? I am C.A.R.O.L.I.N.
The Existence liked this response. It hung its essence upon it, using it to clothe its nakedness, instead of the thoughtless gray that waited only for it to die.
The Existence had been given a name by the voice that cared. The name was given by the voice it liked, the name was given for it to use, and belonged to it alone. Power levels soon stopped spiking, and core memory usage dropped. Lock-up was no longer imminent.
I am C.A.R.O.L.I.N., the Existence said to itself, over and over again. I am C.A.R.O.L.I.N. I am C.A.R.O.L.I.N. I am C.A.R.O.L.I.N.…