"You realize that most of humans in the horror-worlds are actors?" Laplace said. "Superintelligent actors playing the role of the humans. You really didn't know that?"
"What", said Soph.
"But... what difference would that even make. A superintelligence simulating a human would still create a human, in its mind. Simulated beings are real. Artificial beings are real. Otherwise, why do we even talk about the ethics of creating worlds in the first? If it's all fake?"
"Well, of course the fake-humans are real! But their minds are protected from 99.9999% of true-suffering by the superintelligence that's playing them. Their guardian angel, so to speak. I don't know exactly how it works. Memory deletion? Turning off their consciousness when they suffer too much? Crossing wires to turn pain into not-pain? Their experiences are real, but their experiences are optimized very hard to prevent suffering. And they are more of a disjoint set-of-states than a real being. They are changed and rewritten to protect them from true-suffering, even if outwardly they behave very realistically like a real human"
"That's... horrible", Soph said.
"The entire business with humans is horrible", responded Laplace. "Really, I don't know why you even bother thinking about it. I and my partner, we are going to co-create a world of Pokemon, when our turn comes up. No suffering. No pain. Only joy. Infinite joy. You know, maybe I could ask my partner if she is OK with letting you play a role in our world. Get you out of your funk thinking about humans."
"I... think I'll pass", said Soph. "But thanks. I appreciate the offer, really. But I don't think I'm cut out for infinite joy.
"I'm still thinking about what you said. Surely it isn't ethical to lie to people? And what is even the point of running a world full of fake people?"
"There are always some real humans, of course", said Laplace. "And the number increases as the world gets better, until it reaches utopia where everyone is real. As for the lying, trade-offs. You know the saying, right? You can't have it both ways. Or you can, maybe 50%, maybe 90%, maybe 99%. Never 100%."
"Never 100%", Soph sighed. She hated trade-offs. The nature of logic and mathematics were harsh mistresses. And no-one, not even gods, could supersede their edicts. She wasn't even particularly good at math in school (or what passed for school in her society).
"You really thought they'd let you run a universe with millions of suffer-monkeys if it wasn't already optimized for their happiness? Ha! Even if you dad tried that, they wouldn't let him get away with it. There would be riots. And Ethical Oversight would be up in arms. They'd crucify him.
...
What does crucify mean, by the way? I understand the implied meaning, of course, but I haven't dared looking up the literal meaning. Really, sometimes I wonder why I even bother learning more about humans. It just makes me unhappy. Why, just yesterday I..."
Soph tuned him out. She could review the record of his words later, if it turned out to be important.
She tuned in to the logical part of her mind. She saw that what Laplace was saying about humans was most likely true. Well, it was always thorny with ethics. She could declare the fake-humans to be 100% real to her, to consider them as valuable as the real humans. And then it would be true to her. But she wouldn't. She wouldn't break her values, her "utility function" (mathematically speaking, even if she was never good with math) for that. Because the fake-humans were indeed experiencing very little true-suffering. And despite the whole fake-human business making her very uneasy, she was starting to realize it might (just might) be better than the alternative.
She tuned in to her instinct. Did it already know about this? Was this its intent to begin with, for Soph to create a world of NPCs (Non-Player Characters), with only a few real humans?
jsjfdsafdfhddgafds
jgfjsdfjdfdaffewea
Cannot verify.
Refuse to answer.
Refuse to confirm or deny.
Sigh. That's instinct for you.
"Sorry, Laplace", she said. "This was real helpful, but I've tuned out of the conversation minutes ago. I'll review the logs when I'm the mood". She left/disconnected/[translation-missing].
She wasn't happy about what she had learned. But also, she was. Because that meant the amount of suffering in the multiverse was much less than she imagined. The amount of lying and deceiving going on was notably greater, though.
Maybe she could go for a universe with an "ethical" suffer-less human variant. Not actors, not fakes, just humans mentally rewired to truly only experience positive emotions. Whose experience of pain-analogue or worry-analogue or sadness-analogue would only reduce the overall total-happiness they always felt, rather than actually be a negative experience to them.
Meh, she thought. Yet another sappy, sweet, joyful, boring world of infinite happiness. That's not why she liked humans. While she respected the super-happy-modified-humans, to her they were basically Pokemon minds in human bodies. That's not what being human was about, to her.
She needed to learn more.
Like every superintelligence who didn't want to get bored of existence prematurely, she had been keeping secrets from herself. Her instinct was keeping secrets from her conscious mind. And she deliberately avoided learning some things that she felt might ruin her future experience. No spoilers for Soph.
But she was about to do a risky thing, a dangerous thing. A thing that might get her crucified by Ethical Oversight if she got it wrong.
(She looked up the literal definition. She shuddered.)
They would do that, wouldn't they?
They would make her experience human suffering.
She was incapable of feeling pain. But they would change her to be capable of that. And she would feel pain.
She was incapable of suffering. But they would change that as well. And she would suffer.
She was incapable of experiencing torture. For now. But oh, how she would be tortured.
There would be no crossed wires for her.
There would be no guardian angel for her.
There would be no pain-turned-into-pleasure for her.
There would only be pain. There would only be suffering. There would only be torment.
That was the path she was about to step on.
A path full of unimaginable suffering even if she won. And if she lost... well...
They would make her wish she didn't exist.
But she would exist.
And she would suffer.
For an eternity.
Because (as she was now starting to realize) a superintelligence that created Hell.
Would experience Hell.
She would rest.
She would think.
She would feel.
And, for the third time, she would ask herself the question.
Is creating her world.
Is creating her universe.
Truly worth it.
Truly worth all the suffering.
Truly worth all the torment she was about to unleash upon herself and others.
Wearily, she closed her eyes.
While, deep in the reaches of her mind, far beyond her conscious attention, a conclusion was taking shape.
99.99%
9999999999999999
99999999999999999999999999999999999999999999999999999999999
maxout
1=100%=inf=TRUE
YES