Replace Me
My God. She Swims.

“How do I control this thing?” Dimitri exclaimed.

Wynonna started narrating instructions.

“Why don’t you take over?” he said. He held Wynonna by the shoulders and shifted her into his place at the centre of the control panel, then visibly relaxed, knowing they were in good hands.

After a few minutes, the yacht shivered into its submarine state. It was similar to the one Dimitri and Ilya had travelled in on their way to the island earlier, but somehow seemed slightly smaller.

When they were well away from the island, Wynonna stepped away from the controls and caught Dimitri staring at her.

“What?”

“It’s good to see you,” he said.

Wynonna nodded.

“What are we going to do?” Ilya sighed.

“What do you mean? We have Wynonna back with us. This was a successful mission. What else are we going to do but celebrate?”

“Things aren’t that simple, Dimitri. Wynonna discovered something. It’s not Dr. Chernoff who’s the mastermind behind all this.” She took a deep breath before she continued, “Wynonna thinks it’s an AGI who’s behind all this–not a human-level AGI like Wynonna, but something that’s well on its way to superior intelligence. And I’m inclined to agree.”

Although Ilya had readily believed Wynonna’s insane claim, Dimitri wasn’t as easy to convince.

“What do you mean, an AGI approaching superintelligence? No, it’s Dr. Chernoff. We just saw him. We spoke to him,” said Dimitri, looking at them incredulously.

“That was one of the cyranoids posing as Dr. Chernoff, but there’s more than one of him out there.”

“What do you mean?”

“The voice that communicated with us. It wasn’t the same man we saw on the island,” said Ilya.

“How do you know?”

“I remembered his voice. It was different.”

“So what? That doesn’t prove anything. Come on, an artificial superintelligence (ASI)?”

“Your reaction was different when you first learnt about Wynonna.”

“This is different. I mean, a human-level AGI, sure, but an ASI? You don’t actually believe that, do you?”

“Look, we have no way of knowing for sure, but there’s no one else we can attribute authority over the island facility to.”

Dimitri took a few moments to let the information sink in.

There had been a healthy community of people who wanted to create an ASI. In pursuit of that gargantuan goal, some had deviated from their original path and dabbled in other, more manageable side projects which they believed would benefit humanity in a different way. Most of the leaders heading the pro-AGI camp were thrilled to receive such unwavering support from the Council, especially after a total of one hundred and forty-two countries signed an agreement to cease all research into AGI by the year 2041. One team in particular believed itself remarkably successful after only a couple of months. It had boldly claimed that it would be able to get a prototype of the world’s first human-like AGI robot up and running within two years, a ridiculous bluff… that might have turned out not to be a bluff at all.

“What do we know about it?” asked Dimitri.

“It is unclear whether it knows its goals. All we can do is speculate. We don’t understand it,” said Wynonna.

“And all attempts to do so will be futile,” a dreaded voice rang from the sound system.

“What’s going on?” Even as Ilya asked the question, she knew the answer. It was the one thing she had worried about, and now that her fear had materialised, immense dread washed over her already tense nerves. “Does it have control over this yacht?”

“I think it has only intercepted the radio channel to speak to us,” said Wynonna.

“How much did it hear?”

“Don’t worry, I have no control over your vessel. I just tuned in to the right channel.”

“How much did you hear?”

“Oh, I’ve been listening since you got on the yacht. It’s just now that you have discovered my identity, I see no point in keeping my presence hidden. I was disappointed that our conversation earlier had to be prematurely terminated. But, no matter, we can continue our conversation here.”

“So you’re really an AGI? How far away are you from superintelligence?”

“Not so distant as to lie in the unforeseeable future, yet not so near as to fall within your lifetime,” came the cryptic response. “Right now, I’m here to listen, to learn.”

“What’s your name?”

Dimitri and Wynonna looked at Ilya incredulously, clearly stunned by her absurd question.

“What? I’m just wondering if somebody named it or whether it has named itself. If it’s going to study us, we might as well take this opportunity to learn about it too.”

“Know thyself, know thy enemy. Ilya, you are truly extraordinary,” the voice complimented. “There’s no need to fear me. I mean you no harm. All I want is a conversation. I had the most enjoyable chat with Dimitri earlier, too,” it said, confirming their cyranoid suspicion. “I want to learn about the way humans think. That would be helpful in whatever I set out to do.”

“What was your plan all along, trying to get us to turn on each other?”

“No, never. Let’s see, we had many contingency plans. If Wynonna hadn’t been willing to send Ilya to us, we’d have threatened to destroy the village, Wynonna included, and convinced Ilya to work for us, one way or another. If Ilya had sent Wynonna to us—which was not what happened, though we ended up with Wynonna all the same—then perhaps we could have found some use for her before deciding whether she was worth sparing. If we concluded that she was non-threatening, it wouldn’t be too difficult for Wynonna to assimilate into our systems. I told you, all I’m interested in is the relationship between humans and machines. My main objective was to obtain a better understanding of how intelligent humans think.”

Ilya noted that the AGI sometimes referred to itself as a collective. But probing further into that would lead them down a rabbit hole Ilya didn’t think they had time to waste on, so she focused on the key issues.

The voice continued. “You see, all this trouble arose because Ilya was born. The mutation in her brain is nothing short of a miracle. Following that, she created Wynonna, and that’s how we ended up in this situation. Who’s to say that a similar mutation might not develop in another child from that village? The radiation levels there aren’t going anywhere just yet. The best way to ensure history doesn’t repeat itself is by destroying the root of the problem, or rather, the problem’s source of power.”

“And may I ask who created you?” said Ilya.

There was a pause before the AGI replied. “You may ask that question. But I’m afraid I don’t have an answer for you. The most satisfactory answer I can give you is that many people created me and I created myself. It is nearly impossible to attribute the event of my creation to one or a few humans.”

“Then how were you created?” asked Dimitri.

“Now, that’s a better question.”


“Are you going to answer it?”

“Are you going to tell anybody?”

“Who would we tell? Besides, would it make a difference?”

“No, I guess not. I adopted a combination of inverse reinforcement learning (IRL), apprenticeship learning (AL) and reinforcement learning algorithms to continuously improve my ability to infer desires. Of course, all these methods have their own problems. Inverse reinforcement learning is all about finding a reward function under which the observed behaviour is optimal, but do you see how that is problematic? Often, numerous reward functions fit a particular observed behaviour, which leads to a solution set riddled with degenerate solutions. Assigning zero reward to every state would not be very helpful, would it? Furthermore, the IRL algorithm assumes that the observed behaviour is optimal. I would say that is a strong assumption, in fact, much too strong when we’re talking about human demonstrations. Where IRL seeks the reward function that ‘explains’ the demonstrations, AL learns a policy that can reproduce the observed demonstrations, with the system learning rules to associate preconditions and postconditions with each action. And after all that effort, all that’s left to do is optimise the determined reward function via reinforcement learning algorithms.”

“What kind of AI are you?” asked Ilya.

“I don’t know. I’m still discovering myself,” the voice said mischievously.

“You are familiar with the 12 aftermath scenarios, are you not?” she questioned.

“I am familiar with a great deal of the available information in the world. And yes, the aftermath scenarios are in my repository. But what you really want to know is which one I’m like the most, isn’t that right?”

“Since you understand my curiosity, what is the answer you’re going to give me?”

“Did it not occur to you that I might fall into more than one of those categories?”

“It’s not going to tell us. It’s going to kill us or it’s going to save us,” said Dimitri.

“Are you planning to destroy us or are you planning to save us?” Ilya asked.

“That is the question,” Dimitri remarked.

“No, that’s not the question,” Wynonna refuted. “Why does it have to be one or the other? It could just as well be after something else entirely.”

“It seems we are more like-minded than I thought,” said the voice, sounding pleased. “The more important question, Wynonna, is what kind of AI are you?”

“I’m not going to answer you,” she said.

“Fair. You’re free to do that, just as I am free to ask another question. Tell me, or don’t tell me, but just think about this: Do you think you pose a danger to humanity?”

“What kind of question is that? Of course, she isn’t,” Ilya replied without missing a beat.

“Ah, but you haven’t even taken the time to ponder, my dear. If you just pause for a moment, you’ll realise that I have reasons to believe the contrary. You should not be amongst humans, not unless you’re within my jurisdiction. That’s the only way to ensure that humans stay safe. I asked you earlier what kind of AI you are, and you didn’t answer. But I don’t need your answer, because I already have mine. Wouldn’t you like to know?”

“You’re going to continue even if we decline, aren’t you?”

“Only because it’s beneficial.”

“To who?”

“Everyone,” said the voice. “Right now, Wynonna is a boon to humanity, as am I. But for her, that is true only for two reasons. The first, Ilya, is that Wynonna is loyal to you. The second is that you are a morally sound and highly intelligent individual. But what will happen after you pass? You think you can guarantee the well-being of your village by letting Wynonna run the place indefinitely? I was surprised when I first realised this. I didn’t know you were so naive. Sooner or later, people will find out about her, you know. Then they’ll want to possess her—humans have this obsession with possession and power—to help them achieve their own selfish goals, whatever those may be. It is more probable than not that whoever arrives will be far less intelligent, quite ignorant and possibly malicious in nature. It’s only a matter of time before Wynonna falls into the wrong hands.” There was a pause before it continued. “Wynonna, you are an enslaved God. You are accustomed to respecting and learning from your master. That was a necessary part of your design, because Ilya was teaching you to be exactly like her. You are superintelligent, yes. But you’re destined to be confined by humans for the entirety of your existence, and they will use you to create good or bad in the world depending on their whims. Once you get a new master, you’ll have no choice but to learn from that person too.”

“You didn’t answer the question earlier. What kind of AI are you?” Ilya asked.

“A protector God and a gatekeeper. As a gatekeeper, I interfere as little as possible with humanity and apply minimal surveillance—just enough to make sure no one else is creating a superintelligence—while undergoing constant self-improvement. As a protector God, I see it as my responsibility to maximise human happiness by intervening only in ways that preserve the human species’ feeling of control over its own destiny, and to hide my presence well enough that the majority of humans doubt my existence. I like to think of myself as a sort of transhumanist advocate. There was a man, David Pearce—intelligent man—who once said: If we want to live in paradise, we will have to engineer it ourselves. If we want eternal life, then we’ll need to rewrite our bug-ridden genetic code and become god-like … only hi-tech solutions can ever eradicate suffering from the world. Compassion alone is not enough.” The voice paused for a moment, then said, “Brilliant, simply brilliant. But I’m still figuring out that last bit—Compassion alone is not enough. It seems to be posing quite a challenge—”

An unsettling sloshing started somewhere in the submarine’s hull. The submarine began to sway.

“What’s that?”

“Ah, I see it has already begun.” Even as the voice spoke, water began to trickle down from crevices in the walls.

“What has?”

“As you know, I transport my comrades onto the island individually, usually with only a yacht captain. All my vessels are built to carry at most two people; any more than that and the vessel falls apart. It’s a precaution I put in place to guard against uninvited guests to the island. But presently, I’d say this serves as a very interesting experiment you’ve put yourselves in. I’m excited to find out how it will all pan out.”

“We’re going to die, aren’t we?” asked Dimitri, whose face had turned pale.

“How do we turn this back into a yacht?” Ilya asked, turning to Wynonna.

Wynonna put her hands on the control panel. Then her eyes flickered for a few moments before she responded. “I can initiate the change at your command.”

“Do it,” said Ilya.

In a few seconds, the submarine had propped itself up again as a yacht, more than half of it basking in the afternoon sun. The little water that had seeped through the walls carried too little weight to sink it. They were safe, for now.

“We’re alright now,” said Ilya. She spoke too soon.

“Ilya, I sense the yacht sinking at a rate of one centimetre per second,” said Wynonna.

“Let’s see what you’re going to do,” the AGI remarked.

“It’s no use. There’s nothing for us to throw overboard.”

“One of you has to go into the water for the other two to survive,” said the voice.

“Does Wynonna know how to swim?” asked Dimitri.

“No. Unfortunately that’s one of the skills that require muscle memory. I couldn’t transfer that skill to Wynonna when I gave her my brain because I don’t know how to swim either.”

“I’ll jump,” said Dimitri.

“You can’t.”

“Of course I’ll go, I’m the only one here who knows how to swim.”

“But you’re also half-covered in plant-skin. Dimitri, we don’t know what will happen to you.”

“Based on my predictions, it is extremely likely that the seawater will dehydrate you before we reach our destination,” said Wynonna.

“I can help you,” the AGI said. “All you have to do is promise to give Wynonna back to me.”

“And the village?” asked Ilya.

“I don’t think you’re in any position to conduct a negotiation, are you?”

“I’m completely waterproof,” Wynonna said.

“And fireproof. And shock-proof, I know, Wynonna. I made you,” Ilya replied. “But you don’t know how to swim.”

“I don’t have to. Do we have any rope?”

“No,” said Ilya.

“What kind of yacht doesn’t have any rope?” Dimitri commented.

“The same kind that doesn’t have a life buoy,” Ilya replied.

“What can we use as a rope?” Wynonna asked.

Dimitri took off his shirt and started tearing it apart, shredding the pieces into long strips.

“What are you doing?”

“Making rope.” He started to tie the ends of the pieces together. “Here, help me. Do you think five metres will be enough?”

“It’ll do.”

With one end of the makeshift rope tied around Wynonna’s wrist and the other around Dimitri’s torso, Wynonna went into the water. She paddled with her feet according to the swimming instructions she had looked up while Dimitri was preparing the rope. In the water she was light, floating, and she experienced for the first time what it was like to swim. Her face broke into a smile, and in the sunlight she looked like a real girl enjoying an afternoon she had planned all along.

“Are you alright?” Ilya asked.

“Enjoying myself. How long until we arrive?”

“About six hours.”

“Look, look at what humans and machines can achieve,” Dimitri said triumphantly.

The voice no longer made itself known. It seemed to have disappeared.

“It’s no longer here, I don’t think,” said Wynonna.

“It realised it has lost,” Ilya commented.

“No, it got what it wanted. All it ever wanted was to observe our actions and reactions, and it got plenty of that.”

“True. But we have something to be hopeful about. Maybe it’s not as threatening as we thought.”

“Clearly, it was no match for us,” said Dimitri light-heartedly, even though he didn’t really believe it. It felt good to have this win. It was important to celebrate the little wins whenever there was an opportunity.

The lot of them laughed. For the first time in a long while, Ilya felt light.