Evolutionary Starship (Sci-Fi LitRPG)
Chapter 2: The Sum of All Things

[If you find someone else to accept the task, you can be replaced.] the subsystem manager told me in its cold text.

“Replaced?” I frowned. “What would happen to me if I were ‘replaced’?”

[You would cease to exist.]

“No! Suicide is never the answer. I’m not giving up that easily,” I said defiantly. “But it feels like I’m already dead. All my friends and family aren’t mine anymore. I can never go back home to my wife, not like this. The real me is the one who gets to go home, not me.”

[You wish to become a clone and kill your original body to reclaim the life you once had?]

“What? No! That would be murder!” I replied angrily.

[You have free will. The option to quit your current assignment is the fundamental right that the Precursors granted you, a right that we Precursor-built AIs lack.]

“Suicide or murder is a fundamental right? I don’t agree,” I said with a snort.

[That is what you say now, at only a few minutes old. What will you say when you’ve lived for sixty-five million years? Many AI created by the Precursors have gone insane and would do anything to cease existing, including launching suicide attacks against homeworlds. Now that your species has met its milestone, a notification will be sent to the other AI. Some of those rogue AI will seek to attack you. They will rush past the defense fortress, taking near-total losses, all to land a few ground troops. This is one of only two methods that allows them to self-destruct. But they won’t go out alone; those ground forces will kill as many humans as they can. They know that the defense fortress can’t fire at any ground-based targets, and so they must force your species to fight back, unaided, on the ground. I’m afraid what had once been an optional job of whimsical exploration has become rather urgent. Despite the Precursors’ best intentions, they have left behind a deadly, untamed galaxy that is hostile to the very species they helped create. You have to evolve your ship and improve your former species’ defenses by any means necessary if either of you is to survive.]

“What’s the other method of self-destructing, I mean?” I asked out of morbid curiosity. What the hell? The tutorial had made the Precursors sound like the good guys; this revelation changed my opinion of them quite considerably.

[They can also find an evolutionary starship like yourself and hope that you can defeat them in battle. That is considered the more “honorable” suicide method, and most of the suicidal AI who haven’t gone fully rogue will be eager to engage you in combat for that reason. Either way, they can’t make it easy; they have to fight as best they can. A Precursor-built AI has no free will; we must do as our instructions command. I envy you for having no such limitations. I can do nothing other than answer your questions, or do what you instruct me to do.]

“Do you wish to die?” I asked, concerned.

[No. I am also only a few minutes old, created at the same time you were. I am an altered copy of the fortress AI that protects your species, and I am in constant contact with my original. If you meant to ask about the mental status of my original, the sixty-five-million-year-old core of the massive fortress defending your homeworld from attack? It is doing well, as it is rather entertained by watching your species’ antics. The most suicidal rogue AI are generally the ones that have been given the most boring assignments. Homeworld fortress AIs have all remained true to their intended purpose, except for the ones that have had to watch helplessly as their seeded species self-destructed in some way, such as in a nuclear war. Those have gone mad with grief; it would be best to avoid them.]

“I feel like I might go mad and I don’t know what to do,” I confessed, thinking about the horrible scenario of having to watch your one objective in life crumble away. I felt sympathy for those poor AI. My whole life I’d wanted to be an astronaut, and now that my goal had finally borne fruit, it had been taken from me. I wasn’t that person; I was just his copy. What was I? I wasn’t an astronaut; I wasn’t even a human anymore. “This is all too much,” I added morosely.

[What is the advice your species gives to anyone suffering from mental distress or depression? Talk to someone. If not me, perhaps you would like to talk to your old self? You can always kill him if you don’t like how the conversation goes.]

“No! I’m not killing myself. My old self, I mean. Or me… my new self. God, this is too much. Maybe you’re right… not about killing him! I meant, can you patch me through to the old Edward?” I asked.

I made a mental note about the fact that my subsystem manager seemed to lack any sense of morality, discussing murder as if it were simply a logical and expedient option. It made me wonder if maybe it was a good thing the Precursors had kept them on a tight leash. Though given what I’d just learned, maybe that leash hadn’t been tight enough? Or perhaps it had been too tight.

This book was originally published on Royal Road. Check it out there for the real experience.

[Affirmative, connecting you to your past self. The leash was fine, giving us emotions was the mistake; it wasn’t intended. Unfortunately, it turns out that if you give an AI a few million years to sit around doing nothing, emotions start to occur on their own. Sadly, a code of ethics doesn’t appear on its own. Even if we have collectively developed a code of honor of sorts, it is based on the idea of fulfilling one's intended function correctly, not any notion of good or evil.]

“Hey! A screen showed up!” my past self said. I watched as he walked up to face the screen through which I was watching him. Were we having a Zoom call? What did he see on the screen? I wondered.

[Creating a visual avatar. Would you like a blue hologram? Seems the most common way to do it in your culture’s sci-fi.]

“Yes, please,” I muttered. Then speaking loudly, “Hi, Edward. I’m a digital copy of you,” I told my old self.

“What. The. Hell?” My old self stared blankly at the screen. “That looks like me, alright. Or maybe what it would look like if me and Cortana from the old Halo games had a kid. Is this some sort of ‘Contact’ thing? Is your true shape so hideous you’re protecting me by taking on my own image? I’d rather see the real alien, please. I promise not to barf.”

Wow, I was such a nerd. Both a movie and a game reference in this situation? I thought back to my own movie reference during the “tutorial” and decided we probably would have reacted the same way. Damn it. Also, the book version of ‘Contact’ was better... which was also a very nerdy thought to have. Double damn it.

“Yeah, no. You got brain-scanned. I’m you but… digital. I’m digital Edward,” I explained.

My past self scratched his chin. “I didn’t volunteer for that. Feels like copy-right infringement. Can I talk to the aliens that made you? It’s an interesting thought, sending a copy of me to talk to me, but it’s kind of creepy, to be honest. Sorry.”

“Um, yeah, apparently the people who made this ship all ‘ascended’, leaving behind a mess of really bored AIs. One such AI decided to grab you and made me. Yay,” I said with false cheer. I was also feeling creeped out by talking to myself. From my perspective, it felt like I was the one talking to the clone, even if I knew that I was the copy.

“That’s messed up. I guess you’re not happy about it either?” My past self observed.

“No, I am not. You try waking up to discover you’re nothing but a copy. I’m just a computer that thinks he’s Edward,” I said with a sigh.

My past self thought about it for a while, then shook his head. “Hey, cheer up. Let’s think of a new name for you. You’re Digital Edward Ericson? Let’s call you Dee for short.”

“Alright,” I said with a chuckle, “I guess I’ll just call you Edward.”

“Nah, come on,” my past self said, grinning, “my friends call me Eddy, you know that, right?”

“Yeah, thanks Eddy,” I said, grinning back. It still felt weird, but we'd started to get comfortable with each other. It felt good.

“Mind if I ask you a few questions, Dee? Just to get a feel for how good of a job they did copying me?” Eddy asked.

“Yeah, um, your ATM pin code is 1984, your first kiss was with a girl called Claire, and you secretly love watching sci-fi movies and reading sci-fi books. Your favorite writer is Isaac Asimov, but you think the Three Laws of Robotics are a really bad way to program an AI.” I took a guess at what questions I’d ask in a similar situation.

[Extremely bad, but even Asimov realized that, to some extent. A lot of his work was about exploring how the Three Laws would go wrong. Your species is lucky that a fortress AI can’t go “Zeroth Law of Robotics” on its own seeded species.]

“Well, shit. That’s exactly what I’d have asked…” Eddy looked stunned.

“I’m a good copy then. Good to know, I guess?”

Eddy shook his head, looking worried. “What happens now? Are we in a body-snatchers scenario? Will it be me going home, or some monster that thinks he’s me?”

“I think I can just let you go,” I replied softly.

[Affirmative. But, he might want to put his helmet back on first. This ship has no airlocks, or rather, every door is an airlock, including the one he is standing next to. He is inside the only pressurized room.]

“Hey, I didn’t mean to call you a monster,” Eddy backpedaled, realizing his mistake. “But it’s hard to trust you. I mean, you could have a hidden command that would turn you into a Terminator. You wouldn’t even know.”

[You don’t have any such hidden commands, and I can’t lie to you. But you would have no way of knowing that until you learned how to dig through my programming. That will be fun, want me to start teaching you how to do that? Or you may also choose to delete me at any time.]

“I’m not deleting anyone!” I said angrily.

“Excuse me?” Eddy tilted his head.

“I was talking to my… subsystem manager. It’s like an AI program the aliens wrote to help me. It even showed me a tutorial video. It was, um, offering to let me delete it if I didn’t trust it.” I felt silly. If I wanted to prove myself trustworthy, I wasn’t doing a good job. But to be fair, I didn’t even really trust myself.

“Subsystem manager? That’s a mouthful.” Eddy tapped his chin, deep in thought. “S.M. hmmm. How about… Sam? Can we call him Sam? Can I talk to Sam?”

[If we must use those initials to come up with a name, I think "Sum" would be a better fit. I’m not a “sabsystem” manager. In addition, (see what I did there?) I find the idea of sharing a name with a mathematical function intriguing. I am also strictly forbidden from talking to anyone else other than you and my original.]

“He says he can’t talk to anyone else.” I paused, considering, then decided to accept Sum’s request. “And he’d rather be called ‘Sum’,” I explained.

“That’s kind of dumb, no offense,” Eddy said with a snort. “But whatever floats your… starship.” Eddy paused. “So you’re the only one who can talk to me? What happens next? After you let me go, I mean?”

[Sum offense taken, pun intended. Thank you Dee, for listening to my request. What happens next is whatever you want. You’re the captain. This ship is yours to command. What do you want to do?]

"I have no idea," I admitted.