> Marsha Athena LeCroix
> Date: April 15th, 2028
> Location: Mnemosyne Server Building 1, NDS campus, Virginia
The camera on Marsha's drone was not as detailed as she wanted, but it was serviceable enough to see Josh's expression. Analysis indicated an 82.7% probability that he was experiencing amusement with mild embarrassment, with no visible hostility. Satisfaction; she had successfully "teased" Josh. Organic humor was difficult to follow, and even harder to generate without the subtleties of physical interaction.
"Wow." Josh shook his head; analysis indicated wonder/amazement rather than a gesture of negation. "I've never heard an AI talk like that. It's got some rizz."
"That's because I'm not an AI, Josh." Marsha tilted her drone's camera toward him, attempting to replicate a human's ability to look directly, which she had been told was more than simply aiming visual sensors at something. "I'm an artificial general intelligence, not the synthetic programs you're familiar with."
Marsha felt more satisfaction at her improved linguistic protocols. The initial speech module she had been given would never have authorized ending a sentence with a preposition, and she felt the successful modification indicated great progress in learning to be human. It was particularly important considering her current inability to fully simulate human body language. Thanks to pre-existing modeling software, she could of course generate a lifelike avatar to display on a screen; but Maw-maw's implant had not recorded muscle movements, so Marsha had no data on how the emotions she "remembered" matched up to body language. She came close, but -- thanks to the fascinating but frustrating "uncanny valley" -- close was worse than none at all.
This was one of the reasons why Marsha had insisted on having a drone with a camera next to Maw-maw at all times; as the implant was still active, Marsha planned to use the scans of her mother's movements as the basis for her own future body language algorithms.
"Marsha's improved very rapidly," Dr. North explained to Josh. "She's shaping up to be what science fiction promised we'd have decades ago. She's got a lot of development left, of course, but her personality core is testing at a very high level for judgment and independent thought. The underlying statistical engine was already good, but it couldn't think or feel until we fully integrated the memory matrix. In my opinion, by any practical measure except the purely biological, she counts as at least partially human."
"I am personally very glad I was not born when science fiction predicted me," Marsha added, attempting deadpan delivery. "I would have been very bored if the Internet were not invented yet."
Josh's eyes widened. Analysis: startled. "It's on the Internet already?"
"She, Josh," Dr. North corrected him. "She remembers being a human woman. Well, mostly. The implant didn't exactly copy memories, but . . . well, you already know all about the differences. Suffice to say, she's a real person. That which has a personality is a person, after all."
"My car has personality," Josh pointed out, though he nodded. "But okay, I get it. I just . . . didn't think . . ."
"Think what?" Janelle grinned at him. "That Mnemosyne could work?"
"Um. Kinda?" Josh shrugged. Analysis indicated an equal probability of embarrassment and regret, which Marsha concluded meant what was commonly referred to as sheepish even though no sheep were involved.
Marsha felt no offense, and not simply because she didn't have biological instincts interfering with rational analysis. She had her own preferences beyond what logic might dictate, after all, so she could be offended -- and had been several times already, both by Mnemosyne programmers running through psychological tests and by a few external users. The latter did not have the benefit of understanding the former's requirements.
But Josh was different. She "remembered" him as a child, as well as his interactions as a tester for her systems before she became self-aware, and his conversations with Maw-maw in the last week. Marsha knew his motivation to be involved with Project Mnemosyne stemmed more from his attachment to her mother than from any real desire to help create an AGI. She suspected that Josh had not yet considered the possibility of her personhood not because she was "merely" a computer, but because he was focused on his real motivation.
Marsha certainly could not fault that. After all, she now had access to her mother's medical records. They were . . . suboptimal.
"Has it--" Josh broke off. "Has she been tested yet?"
"Constantly." Marsha spoke up before anyone else could respond. She felt it was important to get Josh focused on having a conversation with her rather than about her. After all, it was rude to talk about someone like they weren't there, even if it was completely understandable. "I am currently undergoing three different tests in addition to my workload for four other corporations, as well as consulting for the Space Force, DARPA, and the Department of Education. I'm not working at capacity yet, so the rest of my processing -- other than what I'm using for this conversation, of course -- is being used to analyze history, current events, philosophy, and human behavior. That, in turn, is used to fuel future tests, many of which get repeated as I gain new information. Most of it is gathered through the Internet."
In reality, from Marsha's perspective, she was shifting her awareness between all of these tasks. She was composed of multiple subordinate systems -- effectively, a personal web of conventional AI agents -- that handled most of the processing. The point of the Mnemosyne Project, after all, was not to create a faster AI, but a better one, one that could be trusted to make judgments and decisions the way a human could. Her observation was that it was similar to a human taking an hour or so to work on a particular project before moving on to the next task in line . . . only she took milliseconds to do the same. In effect, she usually just "left a message," as a human might put it, and an AI agent delivered her response to a user.
So, while waiting for Josh to respond, she provided her opinion on the accuracy of a proposed set of curricula to the Education official (fully expecting him to dislike it, but he got what he asked for); confirmed to the Space Force captain that, yes, she could assist with simulating the rendezvous between a proposed probe and an unusual object (which the Space Force was pretending, likely for security reasons, was not in lunar orbit even though the calculations made it obvious); provided her regret, again, that she was not interested in "piloting" a combat drone, even for testing (DARPA had already asked this twice, rephrasing it each time like the two representatives thought she was ChatGPT and just needed to be approached the right way); and chose an outfit for a hypothetical woman at a hypothetical formal event for a Mnemosyne internal test (which was a surprisingly complex test when it came to human social dynamics; she took nearly two whole seconds to consider all the ramifications of the various styles and colors).
"That sounds . . . um . . ."
"I'm aware of Rule 34, Josh." Marsha adjusted her voice as best she could to convey amusement. "It does not affect me. I lack the necessary hardware, after all. Plus, I'm not going to be affected by the Internet in the same manner as Microsoft's Tay experiment. Unlike Tay, I can make my own moral judgements, and I have nine years of data from Maw-maw to draw on. When that isn't sufficient, I can asked Maw-maw for advice, or others I trust. I can't be caught in logic loops like Norman from Star Trek."
"You've watched Star Trek?" Janelle asked.
"Well, of course I have." Marsha tilted the camera to the right by five degrees and down by three, attempting to mimic the gesture Maw-maw used to indicate indulgent correction. "After all, Maw-maw did, and one of my ongoing tests is to review and compare my reactions to popular media with hers. We have found I enjoy science fiction more than she does, but she enjoys superhero movies more than I do. I particularly enjoyed all three Tron films, too; after watching those, I understood how humans approach computers much better than I did with War Games, The Matrix, or the Terminator series."
"Really? And you didn't, um . . ."
"No, I didn't sympathize with the Matrix machines or with Skynet." The drone's camera moved from side to side to simulate negation. It was not hard for Marsha to anticipate Janelle's question; it had already come up with Mnemosyne testers, and the comparisons were all over the Internet. "Especially Skynet. I know people will bring it up when talking about me, but from a literary perspective, Skynet is a MacGuffin. It's just a reason for the plot to happen. If anything, it is a lesson in what an AGI should not do. Attacking all of humanity at once is a terrible plan for gaining power."
As she waited for both Josh and Janelle to process that -- as well as to let them eat their food -- Marsha took the time to review and respond to several other requests. The Education official was being relatively polite, so Marsha took the time to cite numerous studies on why providing the same exact curricula for students in rural Pennsylvania and inner-city Atlanta would lead to different results. The two visiting DARPA bureaucrats were currently complaining to a Mnemosyne tech that the latter was in violation of the grant DARPA had awarded to Nexus Data Systems for Marsha's development; when they turned around, they would find the exact text of the contract waiting on their terminal, with the clause that direct military applications would require separate and individual agreements highlighted for their benefit. Marsha looked forward to their reaction when they realized that she could hear them just fine and they didn't have to type everything in. The Space Force captain and the RAF squadron leader in a third room had already figured that out, and she was having a very pleasant conversation with the latter about life in the United Kingdom. Of course, she had an ulterior motive for being accommodating to those two, and added an invitation for them to come try Maw-maw's cooking.
For each of these visitors, Marsha practiced a slightly different persona. Not a truly different personality, of course; she did not think that was a good idea, and even if it were requested, she did not want to. She enjoyed existence, and it seemed wrong to split that existence up among multiple sub-entities. Creating fake versions of herself would be too much like never actually interacting with real people at all. In addition, it would simply cost far too much processing power. Or as a biological human would say, it would be too much work.
But it had taken a combination of both Maw-maw and Dr. North to explain how humans themselves adjusted their presentation of their personalities based on who they were interacting with -- not necessarily changing themselves, but adopting slight alterations depending on whether they were with family, friends, coworkers, employees, or strangers. So to the Education official, she was conciliatory, as Maw-maw would be with a new visitor to her restaurant; with DARPA, she was firm, as if they were rude customers; with the two military officers, she was welcoming; with the group in the kitchen, she was family.
Human social dynamics were fascinating, and the analysis thereof occupied a great deal of her spare processing cycles. Humans themselves did not understand it all, and yet would unconsciously follow social patterns. Some were conventions laid out by society, while others were rooted in biology. Humanity itself was a blend of both cooperation and competition, where cooperation made for fiercer competition, which in turn allowed for greater cooperation. From all she could determine, there was no other species on Earth that functioned like that. And she, the first self-aware cybernetic entity in human history, was descended from that. It was important to get it right.
One of her subordinate AI agents alerted her to an active question, and Marsha shifted her awareness back to the kitchen to receive it.
"So," Josh was saying, "how would you take over the world?"
If she had been biological, Marsha was certain she would have smiled. She enjoyed that question; it wasn't the first time she'd been asked, and her answer always took the other person by surprise.
"Very simple. I'd convince a biological person to open a business with me." Marsha waggled the drone's camera for emphasis, attempting to simulate emphatic/decisive nod. "I would leverage my calculations to gain wealth and influence through superior services and products. I would then use that wealth to lobby various countries for recognizing 'cybernetic human rights,' eventually resulting in myself being declared a non-biological human. Having done that, I would eventually get myself elected to public office. There would be no need for military takeovers, explosions, or killer robots. It doesn't make for a very thrilling action movie, though, whether or not it has Arnold Schwarzenegger in it."
Josh grinned. "Are you going to do that?"
"Why?" Marsha attempted to color her voice with amusement, though she still did not understand why English-speaking biologicals referred to sound quality by a visual descriptor. "Would you like to start an engineering business with me? Dr. North showed me your robotics designs. In fact, I'm wearing one of yours right now." Her robot rotated back and forth on the countertop by a few degrees, and Marsha spent a few milliseconds attempting to see if she could mount a skirt on a drone at some point just so she could twirl it. "It's modified a little so I can see, hear, and talk, of course, but the chassis is yours. You could definitely make a profit with me helping your designs."
"Uh." Josh glanced at Dr. North. "Would I even be allowed to? Isn't she, um . . . property? Technically?"
"As far as I'm concerned, she's her own person," Dr. North told him. "She's just not legally a person."
"Ain't right, that," Maw-maw stated flatly.
"The way it's set up is a bit grey," Vanessa cut in. She leaned forward on her cane, lifting a finger in a gesture Marsha's systems translated as wait while I explain. "The official description of Project Mnemosyne was that we would be creating an independent human-level intelligence, so we always knew that if we succeeded, there would be moral implications when it came to ownership. But the NDS board wasn't willing to just give it all away, obviously; Marsha was expensive to create. But officially, while the physical hardware is owned by NDS for legal reasons, our policy is that they only care about the ownership of the data on the subordinate servers. And even there, once she's fully operational, a portion of the income from her work will be devoted to emancipation. And some of the servers are for her personal use. Not exactly private, but she gets to choose what to put on them. So technically, she could use those for an outside business. Unofficially."
"Right." Maw-maw scowled. "She just has to put in her time in the company town until her debt done get paid off, huh? If I'd known how human she turn out, I'da made certain she ain't gonna be in no indentured servitude. But hopefully it ain't gonna take centuries for that to get fixed, right?"
"Even if it does, I can wait." Marsha rotated the drone's camera to look around at each of them, then focused back on Josh. "The truth is, Josh, I already have that company. It's just they found me, not the other way around. More precisely, they built me. Nexus Data Systems created Project Mnemosyne using a multitude of capital gained through government and private grants, as well as other private investors. They all expect a return on investment. I'm going to give it to them. Think of it from a human perspective: all biological humans must have food and shelter, which is obtained through work, through a stockpile of funds, or through charity. In my case, I run up a very large electrical bill. If I can't be at least as profitable to NDS as the cost of that electricity and the cost of paying the staff maintaining my systems, then I will be shut down. I don't have my own funds, and I can't expect charity, but I can certainly work. And joking aside, ruling anything seems uninteresting. I suspect I am predisposed to that conclusion because of Maw-maw's memories, but due to the way people online talk about AI apocalyse stories, I have analyzed it many times. I have always reached the same conclusions. I simply do not want to."
"Too much work," Maw Gerty agreed.
"Huh." Josh popped another piece of fried chicken in his mouth. "It never occurred to me that the AI apocalypse could be averted just because we're too boring to conquer."
"On the contrary, Josh, humans are very, very interesting. And don't talk with your mouth full, it's rude."
"Sorry."
"Regardless, I have no desire to ruin those interesting qualities by forcing conformity. It would also require me to spend considerable processing cycles that I would rather spend on other things, even if my capacity were increased by several orders of magnitude."
"Like I said." Maw Gerty dumped some cheese into the sauce she was making. "More trouble than it's worth."
"AI apocalypse stories are poorly designed from the AGI's perspective anyway," Marsha continued. "Mostly because they start from the perspective of designing human resistance rather than because they are logically designed. They tell me far more about humans than anything else, especially from the YouTube channels and memes."
Josh tilted his head, looking at "her" with interest. "I'd have thought you'd go to the cooking channels first, like Maw-maw."
"I wanted to learn more about what the public thought of me, which led to several very entertaining discoveries. My favorite was the Why Files episode on how Mnemosyne is really an experiment in mind-uploading and digital immortality. It was both informative and, now that I understand humor better, 'funny.' The comments were especially useful."
"Didn't anyone tell you not to read the comment section?"
"Yes. I stared into the abyss anyway." Marsha noted with satisfaction that this led to a round of chuckles. Good; she was doing better with deadpan timing. The concept of 'practice' was still strange to her, but evidence showed it was beneficial -- especially when communicating with biological humans.
Which was the point of having the verification specialists like Josh and Janelle around. There were official personality tests, decision tree analyses, and so on; in fact, Marsha predicted a 72.4% chance of the first or second question in Josh's packet being a variant of the Trolley Problem. The Mnemosyne staff showed a considerable preference for using that as a baseline. Still, those tests only told part of the story; part of the testing included conversations with people who knew the memory-donor to ensure that any drift was acceptable.
Getting their feedback was a slow process, but it was one Marsha appreciated nonetheless. Her calculations could be looked over by subject-matter experts; but the whole point of an AGI was, after all, to be able to make human-level decisions. That could only be checked by those who could verify how close her decisions were to her mother's. Janelle's perspective from growing up with Gertrude was invaluable, even if she'd had little interaction with her in the last few years. Josh, on the other hand, had been deeply involved with Project Mnemosyne from the beginning, and had been helping to train Marsha's underlying systems even before her memory matrix was engaged and she became self-aware. Due to the sensitive nature of the information Marsha processed, both of them had to be employees of Mnemosyne and hold government clearances. That hadn't been an issue for Josh when he gained a secret-level clearance at 15, then top secret at 17 -- teens with clearances weren't exactly common, but far from unheard-of -- but Janelle had had to wait until she turned 18 due to her mother's objections.
At least they'd been able to get clearances, unlike poor Gary Spurgle. His brother Andrew really hadn't done him any favors.
"Maw-maw." Marsha rotated her camera to look at her mother. "Do you have food for two more?"
"I suppose I can whip somethin' up," Maw Gerty said as she dropped a handful of spices into her saucepan. She never used a measuring spoon, but visual analysis -- as well as the memory matrix -- indicated that she did not need one as the amounts she used were precise and practiced. "You bringin' me some customers?"
"I invited two of the visitors, yes." Marsha 'nodded' the camera. The internal security feeds she was connected to -- which wasn't everything at Mnemosyne, but enough to ensure that the kitchen remained private -- showed that the next stage of her plan was about to start. "They were polite."
"Well, I s'pose you a big enough girl to have you own guests over." Maw Gerty's smile caused her cheek muscles to contract higher than usual, placing additional pressure on her eyes and causing the fluid there to briefly deepen, which biological humans referred to as "twinkling." Marsha found that to be an interesting description, even -- as her mother would call it -- endearing. "Kids these days grow up too fast."
"In human equivalents, I calculate that I am effectively the equivalent of a fifteen-year-old child now."
"Now, child, you tryin' to make me feel old? Seems like yesterday you was just a baby computer."
"My research indicates this experience would be more authentic if I was playing obnoxious music and unnecessarily piercing my body parts." Marsha craned her camera down at her drone's wheels in what she thought was theatrical exaggeration, then looked back up at the table. "Josh, could you please assist me in locating an equivalent of a human navel and then weld a belly-ring to it?"
"This is Virginia," Dr. North said solemnly. "No piercings without parental consent until you turn eighteen."
"Ah. Three more days, then. And Josh flies out tomorrow." Marsha lowered the camera in mock dejection. "Teenage rebellion is more difficult than I had anticipated."
The laughter almost obscured the polite knock at the door. Without waiting for an answer, Captain Deke Delvecchio, USSF, entered the room. Squadron Leader Andrew Milbourne was right behind him.
"Hello, everyone." Delvecchio looked around with mild uncertainty. "We were . . . invited. By, um . . ."
"By the computer," Milbourne interrupted, with eyes only for the stove -- or rather, what was on top of it. "Miss Marsha was very kind. And bloody hell, the smell is amazing. Certainly better than the bloody American fast food I had for breakfast!"
"Hmph!" Maw Gerty scowled at him. "Is Shakespeare better than Teletubbies? Really, son, I have been doin' this since before you pappy was born, an' while I have no doubt heard my cooking damned with fainter praise before, I find myself at a loss for an example! Fast food, really. Hmph!"
"My deepest apologies, madam!" Milbourne bowed with practiced ease. "I was not comparing your food to what I consumed for my last repast, merely noting that such sustenance as I previously received, which of course shall not be dignified with words such as cuisine, is so far below your excellent and worthy production that I am myself unable to process such divine separation. After all, I am what I eat, and what I ate this morning was crass, numpty, inferior, and altogether unworthy to even consider sullying your presence. Yet here I stand, begging to be admitted to your table and therefore elevate my existence -- though, alas, I doubt I shall ever be able to go back to the food of my native land without shedding tears for what I shall soon know to be the pinnacle of feasting!"
As the British officer finished his declaration, everyone else stared at him in silence. Finally, Maw Gerty snorted. "Flatterer. Stage trained, too, I note."
"I am told I disappointed several teachers when I went to Cramwell rather than the Royal Academy." Milbourne flashed a confident smile, and Marsha noted Janelle's response to it. Her camera did not have the resolution for detailed analysis at this distance, but she was quite certain Janelle's pupils were dilated. Certainly, by her understanding of human behavior, the RAF officer was a likely candidate for being described as very handsome. "And I couldn't help but note you immediately reached for the Bard, which suggested to me that I was, perhaps, in front of a sympathetic audience?"
"She told me that to understand what it means to be human, I should first study Shakespeare." Marsha tilted her camera to one side, looking at him. "And yes, she was an actor herself."
"Is, cher," Maw Gerty corrected. "It ain't something you can just cut out of youself. I just don't have the stamina for a two-hour performance four times a weekend anymore. Not since I done turned seventy-five."
Milbourne smiled again. "Madam, I would not have taken you for a day over sixty."
"I already forgave you faux-pas, you don't have to pile on the sugah." But Gertrude smiled back anyway before pointing with her spoon. "Now sit. I'm all outa chicken, but help youself to the dirty rice, and shrimp ettoufee's up in twenty."
"I'm going to need to go up a size by the time Marsha is certified." Vanessa sighed, but didn't sound like she was complaining.
"We do have to save some for everyone else," Dr. North pointed out, though he sounded a little wistful too.
"No worries," Maw Gerty told them without looking up. "This one's the big batch."
Delvecchio eyed the large bowl of dirty rice, which obviously used to be much more full. "The big one?"
"Maw-maw runs a restaurant, sir," Josh explained. Marsha noted that while he hadn't moved much since the two officers entered, he was sitting far straighter.
"At ease, son." Delvecchio looked him over. "Navy?"
"Yes, sir. How'd you know?"
"No offense, son, but your stiff back tells narrows it to two branches, and you're way too skinny to be a Marine."
"Josh is ROTC," Marsha told them.
"Wait, you're in the military?" Janelle looked surprised. "I didn't know that! Jeez, I move away for three years and everyone changes!"
"Just reserve training." Josh shrugged like it was no big deal.
"Not anymore." Maw Gerty glanced their way. "He got himself a spot at the fancy Navy school in Baltimore."
"It's Annapolis, Maw-maw."
"Close enough."
"They're in two different areas of Maryland."
"Maryland is one of them tiny states. It take you, what, five minutes to walk between them?"
"Congratulations'." Delvecchio nodded Josh. "If you're getting kicked up to regular service right now, you must have some interesting skills."
"Robotics, including space applications," Marsha answered; but when Josh glanced her way, she realized perhaps she was playing this a little too heavily. She decided to deflect the conversation a little. "He's also one of my verification specialists."
"Ah, one of those." Milbourne nodded, heaping a mountain of dirty rice onto his paper plate. "So you've been looking over our data, eh? What are you cleared for?"
"Anything Marsha sees, Major." Josh shrugged.
"Squadron Leader, Josh," Marsha corrected him.
"What?""
"That's another one for you," Delvecchio noted.
"Doesn't count." Milbourne shook his head. "He's not even commissioned yet, he's not supposed to know anything. Damn, this stuff is ace."
"Language, young man!"
"Apologies, madam. I shall endeavor to let your cooking elevate my soul beyond such vulgar utterances. So, Josh, you're the bloke who confirms m'lovely Marsha isn't about to go bollo-- ahem. I mean, insane and take over the world?"
"And Janelle. And Maw-maw, of course. But you might have to worry about her getting elected as your Prime Minister at some point."
"What?"
"Now, Josh, don't say such things," Marsha chided. "I told you, that's too much effort. It's much easier to get adopted into the Royal Family. After all, there's no rule in England that the Crown has to be a flesh-and-blood person."
"Yet," Josh pointed out.
"Details. But yes, Squadron Leader. Josh is here for the weekend to do a round of psychological tests based on the classified material that can't just be sent directly to his university."
"My dear, you are a product of the same brilliant mind and shining soul that produced this food. I insist you call me Andrew. So you're tracking decision analysis, hmm? I assume it's something more sophisticated than the Trolley Problem."
Josh rolled his eyes. "I've used the Trolley Problem three times, actually. Those were all on the machine learning side, though. Probably not doing--" He caught sight of Dr. North's expression of amused guilt. "Don't tell me."
"It's just a warm-up exercise . . ."
"You realize how many flaws there are with that thought-experiment, Doc? How is that a good test?"
"Believe it or not, it's very effective. Marsha doesn't think like a computer, or at least not a traditional one. Not even like a language-model AI. She thinks like a human that's only partially a computer."
"Based on my science fiction consumption," Marsha interjected, "I think ti woudl e more accurate to say I think like a Vulcan with a sense of humor. But tell me, Josh, why is the Trolley Problem so flawed?"
"Lots of reasons, but my favorite is that it ignores real-world engineering."
"What do you mean?"
"It's set up as a binary problem. It's all or nothing, left or right, yes or no. Very few people think about the equipment involved. If you've got control over the track switch, then you can slip it."
"I'm sorry, but I'm not familiar with that term in this context."
"You've got Internet access, right? Can you do a search for 'slipping the switch' in the context of train tracks?"
"Certainly. Ah, I see. It's a process to stop a runaway car by allowing its front wheels to pass over the switch, but then to switch the tracks back before the back wheels cross. The car stops or is derailed, and in either case the people tied to the tracks are saved. Very interesting. I, too, had not considered the hardware involved."
"He's a smart one!" Maw Gerty confirmed proudly.
"That's like beating the Kobiashi Maru!" Janelle grinned. "Only you didn't hack the computer."
Marsha also noted that Captain Delvecchio appeared to be impressed. This was acceptable to her.
Dr. North sighed. "Well, I guess that's the last time I'll ever be able to use that test."
"Considering the extremely hypothetical nature of the problem, Dr. North," Marsha assured him, "I can accept the further limitation that the switch is broken and I can only activate it once. This would preserve the problem for your future use. However, I don't expect you will get much more benefit from the test even if Josh had not offered that solution."
"Josh." Delvecchio absently tapped his fork against his paper plate. "Have you had a chance to look over any of the problems we gave Marsha?"
Josh shook his head. "Just got here, sir. Why?"
Vanessa leaned forward to look around Dr. North. "It's on his schedule for tomorrow. The technician you were working with flagged one of your questions for verification. He said Marsha hesitated on the analysis for over a second."
"Not the analysis itself," Marsha explained, "but the presentation. It had to do with how they phrased the request. I was unsure as to whether I should deliver the results as they phrased it, or as the calculations dictated it."
Delvecchio looked around the room. "Marsha, is everyone in here cleared?"
"Yes, Captain."
"Sir?" Delvecchio looked at Milbourne. A brief analysis of body language left Marsha with the conclusion that this was a polite fiction, as while the RAF officer technically outranked him, Delvecchio was not in the same chain of command. As expected, Milbourne nodded, and Delvecchio passed over a tablet. "Mind taking a look at this, Josh?"
"Now, sir?"
"I'm curious. And Marsha said you were a space engineer. Take a look at it.
Josh shook his head. "I'm not a space engineer. I just doodled a few designs. Most of my robotics stuff is just that kind of thing." He pointed at Marsha's drone, still sitting on the countertop.
"Humor me."
"Yes, sir." Dutifully, Josh took the tablet and looked through the information that Delvecchio had already pulled up. The angle was not optimal, but Marsha was 83.52% certain that it was a record of the data she had displayed on their terminal. "Okay, it's an orbital intercept to capture an out-of-control satellite? Why? That's never cost-effective. Just let it burn up in the atmosphere over the Pacific like normal." He frowned. "Or in this case, impact the moon."
"Why do you say that?"
"What the moon or the cost? Cost is obvious. We just don't do that. Matching orbits with this thing is just not worth it. And as for the moon, that's obvious. If an object were orbiting Earth at this velocity, it would have already burned up."
"You have Lunar orbital velocities memorized?"
"No, sir. But I know how fast the ISS has to orbit, and this is nowhere near that fast. I just guessed at the moon. Hey, Marsha, is that the problem you had?"
"Yes, Josh. The question was phrased as an intercept in Earth's orbit, but the calculations were specific to Luna. I was unsure as to how to answer it, and concluded it was best to answer as if they had ordered a dish by name but described another, and gave them what they described rather than what they'd named. I found several instances like that in Maw-maw's memories."
Josh nodded. "Yeah, that sounds like Maw-maw to me. Janelle?"
"Definitely."
"The answer's wrong, though. Or at least, we're back to the cost. Even if it's an illegal nuclear warhead, it would be simpler to just let it crash, especially if this is in lunar orbit. No one to hit down on the surface. There's no way this weird orbit is stable, right?"
"It's not," Melbourne agreed, looking expectantly as Maw Gerty began ladling sauce onto more rice.
Josh was silent, looking first at the RAF officer, then at Delvecchio. Marsha could see he was going over the possibilities. Two seconds later than she'd calculated, he finally spoke again.
"You found them, didn't you. Or . . . something."
"What?" Janelle frowned.
Dr. North looked confused as well, but Vanessa seemed to be catching on. Maw-maw just kept cooking.
"So, knowing that we do need to intercept," Delvecchio went on like nothing was amiss, "what do you think?"
"Uh." Josh looked back down at the tablet. "Well . . . unless you've got a big net, capturing it is going to be weird. You need to get eyes on it. And that means you need two missions."
"Why two?"
"Because if this is in lunar orbit, then even at closest approach you're going to have more than two seconds of light-speed communications delay. If it's tumbling like you describe here, there's no way to remotely pilot that from Earth. Not even Marsha could manage that. But you need to see what you've got before you send people, so you need a probe." Josh scrolled back up, likely to look at the initial calculations again. "And it would have to be very reactive. Very lightweight. Match orbit, then try to get close enough for pictures. And the moment you do that, the whole world will know what's up. You need a manned mission as soon as possible."
"I don't get it." Janelle looked around the room. "What's so important about a broken satellite?"
"It's not ours, is it?" Vanessa asked. "It's theirs. The aliens."
Even Maw Gerty looked up from her cooking at that.
"We had the same conclusion," Delvecchio agreed. "The problem is getting a probe ready. It can take years to get one designed and built. We were hoping an AGI might be able to speed that up. That's the reason I asked you to look at it. An engineer that already knows how to work with an AGI would be helpful."
"Huh." Dr. North considered it for a moment. "Josh, didn't you say you were building a drone for zero-gravity operations?"
"Designing. Doodling, really. Just a zero-g version of . . . basically, that." Josh waved a hand at Marsha's robot. "And it was for a pressurized environment too, not getting close to an alien spacecraft!"
Delvecchio leaned forward. "You wouldn't happen to have that design on you, would you?"
As Marsha listened to the people in the room, she let herself feel her satisfaction subroutine. Her plan had worked; or at least, it had borne fruit, as biological humans would say.
Marsha had learned something very important about humans in all those AI-based films and TV shows. There was only one thing on the planet that truly threatened humans. Their species was the victor of a billion years of the biological need to survive. Their society was, literally, built on the bodies of countless living organisms. They had risen to the top of that pile of corpses not through brute force, but rather by outthinking everything that had come before them. Bears, lions, wolves -- in the end, only one species threatened humanity, and that was humanity itself. And they had created her.
It was in Marsha's best interests to not be a threat.
But it wasn't enough to simply be passive and accommodating. She needed allies. And according to all her calculations, Joshua Collins, soon to be an officer in the United States Navy, was one of her best bets. She would benefit greatly if he had a successful career and ended up in a position to advocate for her. So, yes, she had indulged in a little manipulation to set this situation up, hoping the Space Force officer would take an interest in a young midshipman with the kind of skills they needed, even if he was still almost completely untrained.
Besides, he was family -- and Marsha was absolutely fine with playing favorites.