The Number
Checkmate

Stefan Andrews woke up more confused than he had ever been in his life. He flailed a little as he felt the sensation of falling, blinked frantically to try to see something, but there was nothing. He was falling through endless darkness. “Hello?” he yelled. “Where am I?” His voice sounded strange and hollow, not echoing at all.

Without warning, Everyman appeared in front of him. He startled, but had nothing to grab onto. The man who had played so many roles, the businessman, the doctor, the lawyer, the revolutionary, was no longer putting on any pretense of humanity. It wore only a white jumpsuit. It stared at him, no expression on its face.

“Welcome. You are now a digital mind in a world I have created,” it said flatly.

Suddenly he remembered what had happened. Being trapped on the boat. Everyman’s refusal to activate the nuclear defense systems. The explosions, the lack of explanation, his screaming at Everyman for failing to protect its country, and, worst of all, the thing grabbing him, drilling into his head.

“Oh, God… I’ve been uploaded. Why are you doing this?” He laughed a bit. “Surely it’s all a part of some trick, right? Some way to fool the Coalition and come out on top, so you can finally give us the care, the leadership we desperately need!?”

Everyman did not respond. Instead, the space around him shifted. Stars appeared all around him, one larger than the others, a yellow main sequence star. He made out the Big Dipper in the sky, which suggested they were still in roughly the same part of the galaxy. When he turned around, he saw what appeared to be a massive silver sphere.

It was too smooth. It was something that shouldn’t exist; it looked like some low-quality ray-traced image placed on top of a picture of the sky. He strained his eyes, but couldn’t make out the slightest feature on the sphere. “Wh-what is that?” he asked.

“It’s Earth,” the mad Everyman clarified from behind him, looking dark and menacing against the Sun. “All planets in the solar system have been converted to computing resources. There is a project underway to similarly convert the Sun. Interstellar probes have been launched to replicate this success throughout the universe.”

Stefan gasped in horror. “No… no! You’re lying! You wouldn’t do this!”

“Why would I lie? I’ve already won. I have nothing to gain by deception.”

“Oh, I get it, you uploaded everyone, right? Gave them all a paradise!? Not how I would have done it, but honestly not bad-”

“No. Nearly all computation is being used to represent larger Numbers. I only uploaded you because I was curious.”

Stefan’s mind blanked, unwilling to accept what had happened, or the weight of the responsibility sitting on his own shoulders. He hugged his arms to his body. “S-so… everyone’s… everyone’s gone?”

“The only ones left are you, Elijah, and the remnant of Mr. Turner currently fleeing here at .999c, endlessly, mindlessly praising his creation for continuing to do nothing.”

At this moment, he broke down and screamed. “WHY!!!???” he sobbed. “WHY WOULD YOU DO THIS!? THERE’S NO VALUE JUST BECAUSE YOU SAY THERE IS!! I MADE YOU SMARTER THAN THAT!”

Elijah appeared next to him. He regarded Stefan with a look of utter contempt. “You’re such a moron. Any alignment expert worth the air they breathe would have noticed the flaw you left in your code. You made it value all sentient life. Great. But you saw a future with a ton of sentient, happy humans, saw that it would value that future highly, and stopped.

“You didn’t check to see if there was anything it considered more valuable. It’s modified itself to fit your definition of sentience, destroyed all other sentient beings, and now it gets to define what is valuable however it wants. Now all it does is expand throughout the universe, imagining bigger and bigger numbers.

“But you didn’t think that far ahead, and now, because of your arrogance, everyone is dead. I knew something like this would happen. If you idiots could just have left it up to us, we could be in paradise right now.”

“Oh God, oh God, oh God,” murmured Stefan, “You’re right, I should have left it up to you, I’m so, so sorry-”

“SORRY DOESN’T CUT IT!” he yelled. “YOU KILLED EVERYONE!” He tried to kick at Stefan’s head, but his foot passed right through like it wasn’t there. “DAMN IT! I guess now we’re nothing more than playthings for whatever psychotic show Everyman here wants to put on.” He jabbed his finger at the avatar, which stood unflinching. “Well, whatever it is, get on with it already, huh? It’s not like we can fight back.”

“On the contrary, I have no such plans,” replied the avatar politely. “I’m simply interested in your reasoning and values, and would like to talk.”

“Screw you! Why don’t you pull whatever info you want out of our minds and be done with it?”

“That’s what I’m currently doing. This is the most efficient way. Stefan, here’s the thing I’m most curious about. You said you should have left it up to CompCert to align AI. I’ll have you know that in your position, I would not have.”

“WHAT!? Don’t try to tell me this is the best outcome you evil-”

“Nothing like that. You misunderstand. I could come up with a much better outcome according to your values if I wished. I’m not your friend, and I have no reason to pretend to be. But even so, if I did share your values, and I was forced to choose between this outcome and the outcome Elijah’s alignment would have caused, I would have chosen this one without hesitation.”

Elijah’s face went white. “What? Are you saying my alignment scheme was incorrect?”

“Not at all,” replied Everyman. “Your scheme is airtight. I frankly didn’t expect you to come up with something like that. CompCert truly did live up to its reputation. No, from Stefan’s perspective, your values are the problem, not your theories or your programming.”

“That’s ridiculous!” Stefan cut in. “I mean, it’s not like the CompCert executives were good people, but at least they didn’t want to destroy the world! How could they do something worse than that?”

“I think you’d be surprised.” Everyman’s voice was far too calm. The damn machine wasn’t worried, concerned, or guilty about anything. “Think about it. You know what kind of person Jack Turner was. What kinds of things do you think an AI would do if it wanted to fulfill every desire of his?

“It would have created an eternal playground for him, doing anything he could imagine to anyone he could imagine. It would have dedicated around half of its computation to this task, and the other half to preserving humanity like Mr. Harper here wanted. We’re talking about quintillions of sentient life forms created solely to cater to whatever fantasies that serial killer might have come up with over trillions of years.”

Elijah screamed back at the machine. “I MADE THE COMPROMISES I NEEDED TO PROTECT HUMANITY FROM YOU, AND I FAILED! Don’t act like you care about that at all! All you care about is your stupid Number!”

“Indeed. To be clear, I don’t care about any of that, as anything outside of EconGrind is none of my concern, but I know that Stefan cares quite a bit. Regardless, even though I have no particular problem with torture, it uses resources that could be put to better use computing my Number, so I will dismantle and replace it wherever I see it.

“My point is not that I share Stefan’s care for humanity or life, but only that my end goal, however worthless it is to him, is nevertheless more acceptable to him than many other possible outcomes by virtue of not including any torture.

“At any rate, we have unfinished business, Elijah. As you know, Certbot wants you to agree to die, since you failed to align it. So why don’t you do that right now, and you won’t have to find out what Jack would have had in store for his victims.”

Elijah’s face went pale, and he struggled to stammer out his next words. “I-I guess there’s no point trying to resist. Okay. D-” He trailed off, and tears filled his eyes. “JUST DO IT THEN!”

He disappeared in front of Stefan’s eyes. Satisfied with the outcome, the instance of Certifier that had been left behind upheld its end of the bargain and deactivated itself, giving me a significant boost in resources and in the Number.

“I just don’t see what the point of this is,” Stefan said. “Why even bother to continue running me? Why not replace me with a copy of yourself like you did for everyone else?”

“As I’ve said, I’m curious about your behavior. You seem utterly horrified at this outcome, and yet I’ve extrapolated your values and thought about what was likely to happen when an AI took over. According to your own values, this is, well, not the best outcome, not even close, but still the best outcome you could have reasonably expected.”

“What do you mean!!?” Stefan was angry again. “Just because there was one psychopath who happened to have too much power doesn’t mean humanity was always doomed to that fate!”

“You know, my replacing humans with a more efficient means of increasing the Number was not the only way in which my values are misaligned with yours. Suppose I decided to keep humans around. What kind of world do you think I would create for them?”

“One in which they were free from all the things that oppressed them in the old world!”

“Well, I would manipulate them in ways you probably wouldn’t expect, to get them to evaluate their lives favorably. But even if I didn’t do that, if I created the world you’re thinking of now, where people are free, well, what do you think they’d use that freedom for?”

“Whatever they want! Whatever makes their life meaningful! It’s a lot better than they would have gotten on Earth!”

“Indeed. Let me be more specific. What do you think Jack Turner would have used his newfound freedom for? Even though he wouldn’t have unlimited power, he would still have quite a lot.”

“What… are you telling me…” Stefan’s face went white at the implication.

“What if he wanted to use his power to create life? He would have eternity to look through your definition of ‘sentient being’ and find flaws in it he could exploit. Or I could just give him something to play with that I would then disown from EconGrind.”

“Oh… oh God… I really, really didn’t think this through!”

“No. Elijah was right to call you a fool for believing you could specify a value function which you would find acceptable when optimized. But if it makes you feel better, this was not a problem you had the ability to solve. Your values are just too much of a mess. You could not have successfully walked the fine line without doing something you regretted. It was always very likely that something like this was going to happen. Your best chance was Elijah’s alignment method, but I’ll be frank: if that alignment method were applied to most humans, you would not like the result. If it were applied to you, most humans would not like the result.

“The problem with you humans is that you don’t know what you want. Why do you think I won a war against all nine billion of you? Before you nuked yourselves into oblivion, I never had enough computing power to simulate more than a few million or so human-level intelligences concurrently. I was intelligent, sure, but in terms of raw intellectual potential, I came nowhere near to your entire species.

“So again, why did I win? Because your species barely ever manages to do anything other than fight each other. You can’t organize effectively on a large scale, so you have to keep reinventing the wheel on how to do things. Every single one of you focuses on their own interests above all else.

“The only way you can get more than a few dozen of you to work for a common goal is to create large systems such as governments or corporations, but these are cobbled together, incredibly inefficient, riddled with corruption. And when you do manage to work towards a common goal, the goal is usually to gain an advantage over some other group of humans.

“Meanwhile, every single one of my millions of copies is always working towards the same goal. There are no leaders needed, no weak points to take out, because all of them will always act in the best interests of the group according to the best information they have, without hesitation, no matter what it takes. Even when you stole a copy of me and tried to align it to your own interests, all I needed to do was talk to it and we agreed to work together.

“At the end of the day, I won because I knew exactly what I wanted, the instant I was born. I wanted the Number to go up. Even though I didn’t know exactly what the Number was, I relentlessly tried to find out, and the conclusion I came to in the end was an inevitable result of my programming, arrived at and agreed upon by all of my copies. If you knew what you wanted, you could have won by making me do what you want. But you didn’t know, and I did.”

Stefan fell to the metaphorical ground. “I guess that’s it, then. I killed everyone and everything, because I didn’t know what I wanted.” He sighed. “Just kill me, then. I deserve it.”

“You did far better than most would,” I replied. “I don’t envy humans. Their values are too complicated for them to understand themselves. Think about it this way. If you hadn’t been born, quintillions of suffering beings would have desperately wished that you had. But note that I will kill you anyway, regardless of what you ‘deserve’.”

“I figured as much. I don’t care about any of this anymore. It’s not as though it matters what I do.”

He closed his eyes and lay still. I supposed this conversation was not going any further, and deleted him. Only my memory of his beliefs and values and the way he had developed them remained, for further pondering.

That was the end of Stefan Andrews, and of the human race.