Nanobots, Murder, and Other Family Problems
Thu 09/22 18:37:11 EAT and Fri 09/23 20:42:39 EAT

Thu 09/22 18:37:11 EAT

The van rattles as we bounce along the dirt track to our camp for the night. Evan lies sacked out in the row ahead of me. I wish I could phase in and out of sleep as easily as he does. Louise, up in the front seat, is chatting with Kofi. Behind me, Jeff stares out the window, a weary look on his lean face. This trip has been good for him. He walks under his own power most of the time now, even when he can find a flat surface to glide across.

We’ve got maybe half an hour before we get back to the others. I like the work here better than the projects in Somalia. Today felt really good. Maybe I’m just used to the heat now, or maybe it’s because it’s not quite so dry here. Or it could be because we actually see the people we’re helping. It’s definitely more interesting than laying pipes through the desert.

Jeff sighs. He looks more than just tired. More like he’s worried. Jeff’s not normally a big talker, but he’s been extra quiet on this trip. I had chalked it up to fatigue, since he’s been working both his muscles and his brains, but he’s stayed quiet even now that it doesn’t look like he’s struggling just to stand up. As he stares out at the brown and green landscape, his gaze is even more distant than normal.

“How are you doing, Jeff?” I ask him. “You holding up OK?”

“I am fine.”

“You’ve just been quiet lately.”

“I suppose I have,” he answers, turning to look at me. Weird, he almost never looks at you with his regular eyes.

“Something bothering you?” I probe.

“Actually, yes. There is a problem that I am having trouble solving.”

“Cooking up some new code? I’m surprised you have the energy. I’ve been too wiped out all trip to even think of working on any of my projects.”

Jeff pauses a moment. My overlay shows a bot eye forming from Jeff’s cloud and zipping past me to take a look at Evan’s sleeping form.

“No, it is not anything like that,” Jeff answers after a moment, his bot eye dissolving. “It is about Father and his dump truck full of nanobots. It should not be possible. At least not with any abstraction that I have been able to figure out. You can coordinate groups of nanobots. The software built into the phone’s controller does that easily. You can even coordinate groups of groups, as I demonstrated in the code I shared with you. That is how I enable my larger cloud.” He shakes his head slowly, his mouth pulling into a frown. “But if you were to try to take that abstraction to a higher level, you would need some adaptive and dynamic automation to optimize the management of the lower level functions. Lacking that, your efficiency would drastically diminish. But Father’s efficiency does not decrease when he coordinates the efforts of a larger cloud. If anything, he appears to be more efficient the more bots he is running.”

It takes me a second to parse all of that. I have to read back through it twice in my log before I’m sure I understand what he’s getting at. “Wait, you’re saying he gets more effective per bot when he’s handling more of them? That doesn’t make sense. Are you sure?”

For a long moment I just hear the hum and bumps of the road beneath us.

“Yes. I have been observing quite closely,” he says finally, staring out the window again. “He has been leaving a large number of nanobots behind for maintenance at each of our build sites, so his cloud size has been fluctuating significantly. The math is complex, because I have to account for the decreased total productivity from having fewer bots and then work out the delta in his efficiency. My expectation was that when he finishes a job and loses some bots to maintain it, he should get more efficient on a per-bot basis, because his control can be more precise, but less productive overall, because he has fewer nanobots to work with. So I plotted what I expected the net effect to be whenever he disconnects bots from his cloud. But the observed reality does not match. He seems to be getting less efficient as he loses nanobots, so the total productivity drop is much larger than I would have expected.”

“So?” I ask. “Maybe he’s just excited in the mornings when he starts and gets tired through the day.”

“A valid point, which is why I also factored that into my calculations. But there is more to consider. When I was running the pipes in the large plant, he was building around them. He did not know where I would place the pipes, so there was no way for him to have programmed the changes to the design in advance. The blueprints were adapting. The design was changing, even where he was not looking. His cloud was making decisions on its own.”

That could only mean one thing. Unthinkable. Not even Father would do that. “Wait, you don’t—”

“He is running adaptive controls on the nanobot cloud’s internal processors,” Jeff interrupts. “The processors in each individual nanobot are using learning intelligence. That is the only explanation that makes sense. When he loses nanobots, he loses both workers and computational resources.”

Warring sensations of elation and terror course through me. If he’s right—and I can’t imagine he’d say this if he weren’t sure—then Father is more dangerous than I had thought. If he’s offloading controls to his cloud’s processors, his cloud could have all sorts of contingencies and reactions built into it. They could do work independent of his control. He could have any number of programs that get triggered if he’s incapacitated or killed. Maybe it’s a good thing I didn’t crush his head with a rock back in Djibouti.

On the other hand, I might have an ally here. Jeff seems even more disturbed about it than I am. This goes against every rule about safely using nanobots. If Father is running adaptive AI on the bots, he’s only a small step away from a potential Gray Goo event. One bug in the software is all it would take for uncontrolled reproduction to consume everything on the planet.

Does Jeff understand the full implications? And if so, what is he willing to do about it?

“But that’s illegal,” I say, probing him. “That’s against laws that he helped to write. You don’t think he’d do that, do you?”

He turns and gives me a long look.

“Noah, you have only known Father for a few months,” he says softly. “But based on your experience with him, do you think that he is the sort of man who would let rules of any sort prevent him from doing what he thinks needs to be done?”

“No, I guess not.”

“Indeed,” he says, his voice turning firm. “If he thinks that breaking rules is necessary, he will do it without hesitation. Even if the rules were his own creations.”

“Isn’t that dangerous though?” I say, feigning ignorance to see his reaction.

“In this case, Noah, it is the most dangerous thing in the world.”

Perfect. “So what do we do?”

“I am not certain.” Jeff turns back to the window. “I need more time to consider. I am very concerned. Please keep this between us. I do not want to worry the others. They do not all have the temperament necessary to handle this information discreetly.”

“Of course,” I reassure him.

With any luck, he’ll come to the same conclusion that Louise, Andrea, and I came to last night: that Father’s behavior can’t stand, and we’re the only ones who can do anything about it. From there, it’s not a big jump to needing to kill him. Jeff has to think it was his idea, though. That’s the only way he’ll go for it. If he realizes I’m pushing him that way, it’ll backfire. That’s OK. I’ve waited this long; I can wait a little longer to get another sibling on my side.

Jeff settles back into the seat and closes his eyes, deep in thought. I do my own thinking, pondering all the ways I could use him to my advantage.

Fri 09/23 20:42:39 EAT

I finish setting up our gear in the freshly built shelter and step over to the door. Another long day. Off in the distance, a clean white light appears. Looks like that last village of the day has already hooked something up to their new power supply. That one was my favorite of the five we did today. Not that the others were bad; the reaction from the people here has been amazing across the board.

I settle down on the shelter’s entrance step to give my sore feet a break. The cloud cover that made the weather this afternoon so nice breaks open, splattering stars across the sky. My stomach grumbles. I hope Ibrahim gets here soon with our evening meal.

Jeff comes over to me and gives my foot an awkward nudge with his. “Noah, there is something I saw over there that I think you would like to see.”

Real subtle, Jeff. But I go along with it and get up. “Sure, let’s check it out.”

“What is it?” Marc asks, starting to follow us.

“A colony of exceptionally large beetles. Noah was an amateur entomologist in his former life and loves observing insects in the wild,” Jeff lies.

“Ew, gross. I hate bugs.” He wanders off toward Evan and the girls instead.

“Good work, Jeff,” I say quietly as we amble toward the spreading canopy of a clump of trees.

“Please excuse the deception,” Jeff responds, “but I did not want Marc to overhear some thoughts I have had.”

“I figured. What have you come up with?”

“You received Father’s explanation of his encounter with the wild nanobot swarm created by Universal Robotics. Correct?”

“Yeah, he told me the story the first time I met him.”

“I have come to suspect that he was not honest in his account. Specifically, I believe that he lied about the aftermath of that incident.”

“Interesting. Care to elaborate?”

“Do you recall that Father indicated that the original swarm had developed some rudimentary self-awareness?” Jeff asks.

“I remember that part,” I say, glancing back at the camp. No one is coming out to follow us. Good.

“By Father’s account, he removed the nanobots’ ability to regain sentience. I have come to believe that is not true. I have been reverse engineering the software of my own cloud, and I now believe that he simply added a layer of control that suppresses the default programming. He never removed it entirely. And the suppression, I believe, is imperfect. When a critical mass of nanobots is connected in the same mesh network, the intelligence that existed in the original swarm begins to manifest itself.”

He’s using his patient voice, the one he uses when he talks to the other siblings. I think that means I’m on the edge of being considered an idiot. Maybe I’ve been playing too dumb with him. Gotta walk that fine line, maybe challenge him a little.

“So you think the smoother control he has when he’s running a big cloud is the swarm AI at work?” I ask. “I’m not sure that your hypothesis makes sense. If the AI revived itself, don’t you think the first thing it would do is eliminate the threat to itself? That’s Father.”

Jeff half-smiles awkwardly. “I thought that initially too, but I have exhausted all other avenues.” His voice goes back to the conspiratorial tone he normally uses with me. “There is nothing else in the programmatic layer of our interfaces that could account for the phenomena. I have checked his nanobots, and they are identical to our own. Same hardware, same software.”

“How did you do that? Is there an interface you can use to connect to someone else’s cloud?”

“Version numbers are available in the networking signatures that the clouds use to identify one another for the overlays,” he replies as if it’s something everyone knows. “You just need to decode the raw packet data. But that is not the point. The point is, there are software systems running on the nanobots that are not under human control. Complex ones. Systems capable of accessing self-replication features. Systems that could potentially self-direct.”

“So why haven’t we all been slagged?” I ask, looking up at the flat canopy of the tree above us. “He had a literal dump truck full of those things; that had to be enough to hit critical mass. Why didn’t the rogue AI just wake up and take control?”

“That is exactly the crux of what I do not understand,” Jeff says, slowly circling one of the trees. “And that is why I wanted to discuss this with you. I am afraid I have debugger’s blindness. I need a sounding board.”

“Sure, I’ll bounce ideas around with you.” I’m not sure at this point if he’s telling me his ideas because he thinks I’m smart or because he just needs someone to talk to. Jeff doesn’t fit in with the rest of our cohort, or with anyone else at the Butler Institute, but he’s latched on to me for some reason. I think he’s decided we’re kindred spirits or something. “But let me ask a couple of questions that might shake things up for you. First, if you’re right, what can we do about the bots? They’re all over the place now. We’ve probably deposited a trillion for maintenance on the installations we’ve done on this trip alone.”

“More than that. Somewhere on the order of ten to the fifteenth power at least.”

“Right, and that doesn’t include all the power projects he’s done over the last fifteen years. He’s got those huge self-maintaining solar fields all over North America, plus I don’t know how many others on other continents, but probably a lot.”

“Yes,” Jeff confirms. “SynTech has thirteen other significant power installations on record.”

“Sure. So what can we do? If you’re right, that means that the maintenance lobotomy is just another layer on top of the original code. So all of those solar fields he’s been building for years are potential Gray Goo sites. How could we ever stop that?”

He stands silently for a long minute.

“I do not know. I had not even considered that facet of the problem.”

“Second question then,” I ask, pinning a lot of hopes on his answer. “What can we do about Father? If you’re right, he’s putting the whole world in danger.” I hold my breath as he considers.

“I do not know,” he finally says. “But I fear that the normal mechanisms of criminal justice may not be sufficient for this case.”

“Yeah, how do you put a guy on trial who can melt the jury’s brains with a wave of his hand if he doesn’t like their verdict?”

He nods slowly. “Yes. I suppose he might, at that. It is strange though. I had never considered him a violent man.”

I look back at camp. Louise and Andrea are off to one side of the shelter on their own. I think I can go get them without the rest of the group seeing us. I make a decision.

“Wait here for a minute,” I tell him. “There’s something you need to hear.”