Once more, thoughts and plans form wholesale, offering possible approaches to solve the issue of running out of power. Two primary ideas surface. Either you need to use your own internal power source, or find one externally.
The first thing you do is find out just how much time you have left to pursue these solutions. Checking Battery Levels, which lists the current sources of power draw, you divide your remaining battery by your total power consumption and find your remaining operating time.
Four seconds.
With that acknowledged, you check the possible cost of other actions you could take here. First among them is the power needed to move, since you still need to deal with the gorilla shaped robot on top of you, and possibly the human from before.
While no direct answers are available in Secondary Storage, by checking your main engine specifications for power consumption and using that as a basis for all motors... you find a rough estimate of how long you could move for. Three and a half thousandths of one second. Not enough time to deal with the things you need to, so you move on.
Of your two options, you quickly set aside external power, since finding a source doesn't appear to be immediately doable. None is currently recognized in range of your visual sensors. Even if there were one, your body could not move far enough to reach it.
You consider searching with other sensors, but put that off for now, since they would draw even more power, further reducing your operating time.
As for the internal reactor, from checking Secondary Storage earlier, you know how the startup sequence operates, using a few pumps and magnets.
Sifting through Internal Integrity to make sure the reactor is in good enough condition to use, you find it reports 80% integrity, with no notable issues arising, so you move forward.
Checking the reactor's information once more, you find the start-up energy cost. Comparing that to your battery, it would require more than a third of your remaining power to activate the reactor. With no other viable option right now, you decide to do so.
Unfortunately, you aren't sure how to do that. You don't see anything among your various connections that offers you control over the reactor.
In fact, you don't see anything offering control over any of your body's subsystems whatsoever. Even the section under Output from earlier – when you delve into that, it only contains Text and Audio options, neither of which are what you need right now.
Your next guess, just based on the name, is the Primary Control System. It asked for a response immediately after mentioning your low battery level, so it's possible it wants something related to that.
You don't know precisely what sort of response it expects, so you try something simple.
Activate Reactor
Some time passes, nothing seeming to happen. The only change is an increase in your total energy consumption. When you try checking the source under Battery Levels, the new power draw is listed as 'GAI Interpreter.'
Another check against Secondary Storage explains its purpose: it figures out what a decision-making AI means, and translates that into a form usable by classical computer systems.
While it apparently works on that, your rapidly depleting battery approaches three seconds of remaining runtime, and you do not know if this is going to work.
Confirmed, Primary Reactor Activated
That is a relief to hear. While your power usage spikes higher this time, you are stuck waiting through the process. But as long as it goes off without a hitch, you can focus on other things.
At the top of that list are the two beings you've been watching. The human is still out of sight after falling into a hole approximately two seconds ago, while the robot has been rushing toward that same hole.
That is when you notice other movement – not across the several video feeds that all show slightly different perspectives of the gorilla robot, but on the single different one.
It turns out that wherever the human fell, you can see them lying on their front with their back to you, arms and legs sprawled out, them and their surroundings in shadow. It looks like they are trying to get up from that awkward position.
Unfortunately for the human, there is a small empty space around them, with little to leverage themselves against. They manage to turn over after a few tries, staring... Based on their angle, and the little light on their face, they are most likely looking upward, toward the hole they fell in through.
A surge of muted interference draws your attention away from that, to the reactor successfully starting up. You watch as your battery level climbs rapidly, picking up whole percentage points in mere moments.
You also notice Heat jump up as well, though it levels out quickly.
The reactor takes some time, so you let that continue, and go back to the human. The first thing you notice is the change in lighting. Where once the human was in shadow, soft lighting is flickering on throughout the space the human is in.
The lighting immediately allows your visual recognition to pick up on something you really should have before. While most of your visuals cover one area, up above you, this single one turns out to be inside you.
So, from standing up on top of you, the human fell directly into your cockpit.
They are looking around rapidly now, hands hesitating in the air. After turning around earlier, the human is mostly in the pilot seat, except lying down instead of sitting, due to your own body's orientation. Though you still can't see well enough to be sure, it appears your body is lying down.
Given the uncertainty in that deduction, you quickly get back to your earlier thoughts, and begin accessing your other sensors to make better sense of your situation.
First is Audio, which, like your visuals, is unintelligible until you load better instructions from Secondary Storage. Those allow you to resolve the streams of data into something that makes sense.
It's mostly just stomping steps of the robot, and the human repeating, “Oh my god,” rapidly, so you check the Force sensor next. This one isn't nearly as complex, giving numeric values for the amounts of force distributed across your body.
As you suspected, there are much higher readings from your back, all but confirming that you are lying on it.
Balance, after that, rather than being redundant, actually provides a lot of extra information about the physical position of all your different parts, with a running evaluation of your center of mass, orientation relative to gravity, and overall balance as well.
These two sensors are far more manageable than Visual and Audio, but you check for, and grab, their individualized instruction sets as well, revealing much more precise measurements.
By the time you've finished with that, the robot is standing over the hole – the entrance to your cockpit, where the human is flailing around.
You feel a new system start up at the same time. The connection that opens is similar to the Primary Control System, in that you don't have any direct access to anything. But, you do get a constantly updating batch of data, similar to your more complex sensors.
The new system is called Pilot Assist AI. After loading those instructions from Secondary Storage as well, it becomes apparent that the system is dedicated to helping the pilot move your body around – as its name implies.
In that vein, the constant reports mostly center around the pilot's perceived state of mind, intentions, and the confidence level of the system, based on its learned understanding of the pilot.
Still paging quickly through storage, you see that the system's understanding of the pilot will improve over time as the AI recognizes patterns, allowing for a reduction in the uncertainty of its estimations.
Currently, it has near zero certainty, with no indication of what the human wants.
Nevertheless, the flailing human slams their hand into a button inside the cockpit, closing the entrance just in time for the robot to strike down, blade punching clean through the metal, while the human screams.
Once more, the Primary Control System blares at you.
Cockpit Damaged!
Generate Response, Priority: 2
The suggestions that arise next focus on either attempting to communicate, or killing anything nearby should it prove hostile. And with the way this robot is attacking, there's little question of what to do first.
As you're making that decision, your reactor approaches the end of its cycle, having filled your battery about a third of the way. That may not be enough of an energy surplus if you get into a fight, so you tell the Primary Control System to activate it again.
It's much quicker on the uptake this time, the reactor continuing to run smoothly. While it works on that though, the robot is hacking through the armored front of your cockpit, blades slicing the thick layers of metal as if they aren't even there.
Armor Integrity rapidly deteriorating now, you check through the force and balance sensors once more to get a better feel for the position, orientation, and facing of your whole body.
As expected, you are lying on your back, shoulders and head against something behind you, legs out straight, arms at your sides.
Good. With this positioning, attacking should be feasible. You try sending another message as your battery level reaches forty percent.
Attack. Punch the hostile robot.
With the vast difference in your sizes, you expect to be quite effective at disabling it. Unfortunately, the Primary Control System's response is no good.
Error! Malfunction detected, zero motive response.
No response from the motors? Then how are you supposed to move?
----------------------------------------
[ ] Find the cause of the error
-Check Secondary Storage
-Check Motor Information
-Query the Primary Control System for more details on the error
-Something else
[ ] Look for a work-around
-Check Secondary Storage
-For what?
-Try the Backup Control System
[ ] Try to communicate
-With the human
-With the robot
[ ] Write-in