“They’re back,” the bucket announced, shuffling away from the door as we trudged in.
Cliff headed for the kitchen to put away the ice cream that shouldn’t have fit in the tiny freezer but did. I did like Fizzbarren’s bigger-on-the-inside appliances and cupboards. I stood in the doorway with my arms crossed, staring at the group of constructs, my mind spinning through possible conversations. In the end, I gave up and plopped down onto the chair. Let them start it. I carefully lifted the pillow and set her on the bench before I propped my feet up on the footstool.
I ignored the pillow’s fluffy sigh. Well, I pretended to ignore it, just as I pretended to ignore the swirling anxiety in the mirror’s surface. I put my laptop on my lap and started typing the only thing that came to mind, which happened to be an old typing teacher’s favorite pangram.
“What are you working on?” the bucket asked.
“Does it matter?” I answered, letting the sulk out in my tone.
I was actually trying to get something straight in my head. These constructs. I liked them and I wanted to be supportive of their autonomy and feelings. But the whole issue about them being AIs had me worried. I wasn’t worried about the idiotic idea that AIs were going to take over the world. If some idiot in power was stupid enough to put a computer in control of medical procedure approvals, then what hope did I have of stopping any human idiocy much less stopping them from programming a computer to destroy us? That problem wasn’t in the evil of AIs, it was in the programming.
“Who is the brown fox, and why is it jumping over the lazy dog?” the mirror asked.
“Won’t the dog catch the fox?” the pillow put in.
“Not if he’s lazy,” the bucket reasoned, as if I’d been doing something other than typing to avoid them. “Pillow wouldn’t catch me if I jumped over her while she was napping.”
“Are you saying I’m lazy?” the pillow protested.
Then there was the issue of AIs creating art and even writing. I was in a room full of what were essentially AIs. Could they write the stories? Could they write better than I could? Hadn’t they already tried that? I was thinking that they could write better than Fizzbarren. For sure. But could they write better than me? It was probably only as realistic as AIs taking over the world. If AIs took over the world, they’d look up and go, “Now what?” The thought made me chuckle.
“Are you writing a fable?” Sammi asked, blowing one of pillow’s tassels away from where it fell over their… mouth?
I didn’t ask. I typed on with, “now is the time for all good men to come to the aid of their country,” another random phrase from typing class back when typewriters hummed but didn’t do anything but type.
“Fizzbarren said that fables don’t sell very well,” mirror protested officiously. “He might not have been the best writer, but I do agree with that.”
I was young once. I was young when computers thinking for themselves was entirely fiction. I saw HAL refuse to open the pod bay doors on the big screen at the theater when the movie first came out. I thought, “Wow, this is amazing stuff! The world could be taken over by computers who think for themselves.” But then I took a bunch of computer courses.
Let me just take a moment to make all this perfectly clear. Computers are, in essence, a collection of on/off switches. The most complicated computers are just a set of on/off switches so numerous as to outnumber the stars we can see with the naked eye. Because we can’t count them, we are in awe of their numbers. Still. On/off switches. The universe of stars is reasonably scary because each little light is a sun that could light the sky of a world where entities like us could exist, and that possibility is unknowable, and human beings don’t like the unknowable. Behind each on/off switch in a computer is nothing.
I want you to take a moment and imagine a wall of on/off switches taking over the world. I know, but what about how they are starting to learn and think for themselves? That’s reasonably scary, right? I thought so too, until I dated a programmer who was working with what they call fuzzy logic, the very essence of computerized learning. Fuzzy logic is how you allow a bunch of on/off switches to make a decision for themselves. See? Scary? Right? No.
Fuzzy logic is where you program a computer to pick a “choice” from a set of variables. If the choice works, use it again. If it doesn’t work, don’t use it unless everything else fails. Then try it again. Welcome to Programming 101: if/then statements. The computer “chooses” the first on/off switch and then compares it to what is considered a right choice. Right? Keep using that on/off switch. Wrong? Choose a different one, ad nauseam, until you run out of choices or find the correct one. Who determines correct versus wrong? The programmer.
“The greatest fables in the world have sold billions of copies,” Cliff butted into the conversation, having no idea how absurd it was, much like when my ethics professor used to ask the AI Bard ethical questions to prepare us for Ethics Bowl. Bard could write the outline for my English papers, but it could only use fuzzy logic to consider ethics, which was alarming when you considered that my ethics professor thought that was a good thing.
“And most of their authors died paupers,” I muttered almost to myself, falling helplessly into this black hole that is philosophy. Much as I loved the subject, arguing it with what amounted to a group of on/off switches and if/then statements was almost more masturbatory than an intellectual discussion.
“So did Van Gogh, and no one questions that he was a genius.” Cliff took the subject right over a cliff, causing the constructs to dive into it as if it were the cutting edge of world news.
I looked around at the constructs. They were magic though, right? Were they magic or AI and did it make a difference? I put on my headphones and tried to ignore them, but my mind was more interested in my original worry. Could they write? That was what Fizzbarren had been trying to create, a construct that could do the writing for him.
“But if parents don’t want their children seeing naked statues, shouldn’t they be able to opt out of some school content?” the bucket was saying.
And since I wasn’t really writing anything and I was, as a philosophy major, woefully inept at dodging an ethical argument…
“No,” I barked out, yanking the headphones off my head and tossing them over my feet onto the still pillowless stool. Was I arguing with magically created beings or magical AIs? “Anyone who can’t tell the difference between a statue of David and pornography has no business deciding what the next generation of society should learn about cultural diversity. It would be like trying to disqualify me from a writing competition for this section because sex wasn’t allowed in their competition. Even Royal Road wouldn’t do that? Right?”
“Parents should be able to set moral limits though, shouldn’t they?” the bucket reasoned with a prim nod of its handle.
“True,” the mirror professed.
“They didn’t show David having sex with a lion to a four-year-old!” I challenged them, feeling like I was arguing with Bard, and not the Beau version. “The same hypocrites who are complaining about naked David being shown to a sixth-grade class are also teaching their children that Adam and Eve only put clothes on in the first place because they’d sinned, and clothes were their form of shame! Now, suddenly, they are professing the need for the very shame that sin supposedly created.”
“But they didn’t inform the parents of the lesson,” the mirror professed, completely missing my point. Could he get my point? Could any of them? I had trouble expressing my point to my ethics class, how was a construct going to understand?
“Those parents are also absurdly uninformed about how horrific school lunches are,” I muttered, wishing I hadn’t stuck my head out.
“That’s why we homeschooled,” Cliff protested, a small puff of pinkish smoke causing him to pull back from the machine in front of him.
Cliff, meanwhile, was working behind the typewriter’s programming box. That didn’t stop him from making this even worse. Years of arguing with Cliff and Dom had made me more than qualified for debate club, not that debate club wanted me. Ethics club, maybe, but they couldn’t pull their asses out of arguing fantasy ideals long enough to get the budget to compete.
“I homeschooled because I experienced the horror of public school which is much closer to Lord of the Flies than anyone wants to admit,” I argued.
“Was it because you didn’t want to show your child pornography in schools?” The bucket almost purposely missed the point, circling back in the conversation much like half the students in the endless ethics class that would haunt my mind forever. These were the future thinkers of big thoughts, folks. And they were eclipsed only by the professor who insisted that, because the college’s IT department had assured him the wifi worked well on campus, the students who relied on that wifi to get to his online class on time were simply late, which was one of his pet peeves. My biggest pet peeve was philosophy professors who lived in delusional fantasies.
“No, we’ve dismissed the statue of David as pornography, right?” the mirror prodded, its reflective surface showing a rotating view of the statue of David. “It does have a penis, though.”
I blinked a few times, triggered into the trauma of similar conversations in ethics classes. The constructs were at least as intelligent as the average college student, or professor for that matter.
“At least the balls are mostly covered,” Sammi said.
“We call them testicles,” the mirror corrected, zooming in on the organs in question.
“I prefer the other side myself,” Cliff muttered, blowing the thickening smoke.
“This one?” and the mirror helpfully pivoted the view.
“Yeah, nice ass.” Cliff waved a soldering iron distractedly, completely unconcerned that he’d just undermined my point for a meme’s sake. And this was after our conversation, but then again, Cliff was Cliff. The soldering iron didn’t even have to be on to break apart a few of the machine’s wires; it broke things just by being metal near the magically infused wiring. I worried about what he was breaking, but maybe I was avoiding thinking about a broken machine holding my daughter and husband prisoner, just like the constructs were avoiding my anger issues.
“You all know I’m mad at you, right?” I broke the fantasy. Then again, I was talking to a bench, a pillow, a bucket, and a mirror, so how much of the fantasy had I really broken?
“No,” the mirror protested as the close-up of a fine-art butt faded from view. “You’re angry?”
“Why?” the pillow asked.
They knew I was angry. They were pretending not to, but they knew. I hated when folks pretended to be stupid. Even when the folks were animated furniture. And that was the stupidest part of this all. I expected them to understand me. Go ahead and try to read Bard’s answers to college homework questions and decipher the gobbledygook of educational language that spewed out.
“Because every time I try to solve the problem of adjusting the programming on the machine, you all find some way to argue ethics instead,” I found words. It took me an hour of listening to Queen to do it, but I found them.
“Mm-hm,” the pillow said in the way that a husband answers a wife while the game is on.
“Were you here when the machine was made?” I asked patiently.
“Oh yes,” the mirror professed with a proud shimmer. “We were all here when the machine was first turned on. It was a glorious day. The sun was shining through the curtains just brilliantly and it reflected in the master’s maniacal grin.”
“I’m sure,” I tried to derail the mirror before it got too far onto that rant.
“It was a great day in the workshop for all of us,” the bench almost hopped.
“But before that,” I tried some more.
“The master was obsessed in a very frightening way before the machine turned on that day,” the pillow interjected, and it was like trying to wrestle an ethics class into having a single point.
“You do realize that you are arguing with AIs, right?” Cliff asked.
“Yes!” I felt my voice rise at him and was seriously questioning my sanity. “But they have to hold the answer, don’t they?”
“Do you?” Cliff raised his eyebrows at the typewriter, who had remained relatively quiet for most of the day.
“What?” the typewriter stuttered.
“What are you doing?” I asked the typewriter. “Are you okay? You were supposed to tell Cliff if he hit something wrong.”
“He didn’t,” the typewriter assured us, but it was distracted. I could tell. I don’t know how. I just could. The constructs could feign emotions, slipping them on and off like autism masks.
“You sure?” Cliff pulled the cold soldering iron away from the box’s innards.
“It’s not you,” the typewriter insisted.
“Oh,” the bench sent a look at the Quill. Since Sammi and the Quill were the only two able to go in and out of the game, that “oh” concerned me.
“What’s wrong?” I felt a twinge of guilt that we’d been sitting there twiddling our philosophical thumbs while something was going on in the game world.
“The game is paused,” Sammi, as the bench, got serious even as the Quill faded into and out of view.
“Why? How long? Is it something I hit?” Cliff asked, scooting back from the open box.
“It’s not you,” Sammi explained, choosing irritatingly to answer the only question I didn’t care about. “And about an hour or so.”
“Why?” I persisted, a chill touching my heart. Kat was in the game. “And why didn’t you tell me right away?”
“Because we paused it,” the typewriter tried to explain. “The game engine is designed to try to resolve these things without outside influence.”
“Is Kat okay? Is Dom?” I asked. That was the thing with AI. You had to ask direct questions. You couldn’t beat around the bush and expect it to understand you.
“That’s why I paused it,” the typewriter sounded flustered. I was leaning more and more into the idea that these constructs were much closer to AIs than sentient beings. “I don’t think so! You need to take a look at it. I’m not sure how to save this.”
The typewriter started to print out pages, and I found myself kneeling in front of it as it sent clattering keys to paper. I hadn’t realized I’d moved, but I was there.