
Brokering Trust - Hetero Edition

Copyright © 2023 by Snekguy

Chapter 8: Language Model

A scientist is granted a once-in-a-lifetime opportunity to travel to the Trappist system, home of the Brokers, where no human has set foot before. A seemingly simple expedition grows more complicated as he is forced to balance the interests of his government and those of the enigmatic aliens who have requested his help.

Caution: This Science Fiction Sex Story contains strong sexual content, including Ma/Fa, Consensual, Romantic, Heterosexual, Fiction, Workplace, Science Fiction, Aliens, Space, Light Bond, Oral Sex, Petting, Size, Geeks, Politics, Slow, Violence.

“Are you nearly ready?” Selkie asked, stepping through the force field.

“Just suiting up,” David replied, sealing his helmet. “I was up pretty late last night working on an approach for our next conversation with Weaver.”

“We can discuss it in detail during the trip to the research facility,” she replied.

“I think you’ll be impressed,” he said as he began to stow his laptop. Selkie noticed the wireless adapter but made no comment. As he had suspected, the Brokers likely considered it useless to him without any ability to access networks outside the facility.

When he was ready, they made their way to the shaft, Selkie pausing to pet Flower before leaving the apartment.


“I’ve put a lot of thought into this,” David said, pacing in the cramped shuttle as it coasted through the murky water. Selkie was sitting on her tentacles, tracking him with her horizontal pupils. “So many of our approaches to interacting with a theoretical strong AI revolve around the assumption that it already has a working language model that could pass something like a Turing test. The Chinese room experiment is also predicated on the idea that the subject is able to communicate on a level where it can convince an interviewer that it’s fully cognizant. We’re in a strange situation, because the interactions that we’ve had so far have been far simpler due to the lack of a language model.”

“What is this Chinese room experiment?” Selkie asked.

“It’s an old philosophical theory about AI,” he explained. “Think of it like this – imagine that I’m sitting in a closed room with a one-way terminal. I don’t speak Broker, but imagine that I had a database of rules, phrases, and instructions for the language. You, being fluent in the Broker language, send me a message in Broker. With the help of the database, I formulate an appropriate response and send back a reply. The fact that I was able to reply convincingly doesn’t mean that I have any true understanding of the language – what I’m doing is simulating an understanding by responding to you using the examples provided by the database. In such a way, it might appear to you that I understood your query, but I really didn’t.”

“You’re describing a neural network,” Selkie said, the realization giving her a brief flush of bright colors. “The system has been trained to imitate language but has no true understanding of it.”

“Precisely,” he replied, clapping his gloved hands together in the water. “We call them chatbots – machine learning systems that can appear convincing on the surface but have no true consciousness. Our Weaver isn’t using a language model that we know of, and it wasn’t trained to interact with its operators in that way, so we’re starting off at a very interesting hurdle.”

“How should we proceed?” she asked. “In order to more easily communicate, we will have to provide such a model, but we may influence Weaver by doing so.”

“That’s the conundrum,” he replied, resuming his restless pacing. “We have to teach Weaver to speak, but in a way that’s free of any biases that might influence the output. Every word and phrase in both of our languages has cultural contexts, subtexts, implications, and multiple possible meanings. Even one’s tone of voice can completely change the meaning of a question from something endearing and heartfelt to something sarcastic and hurtful. If we were to teach Weaver to speak English, we would be influencing its behavior in an uncountable number of ways by introducing cultural concepts that are inseparable from the language. How do we navigate that?”

“I assume you are about to tell me.”

“I’ve devised a simple language model built on basic concepts that should convey very little cultural meaning, if any. I wouldn’t call it a wholly new language, as tempting as that may be,” he continued as he puffed out his chest proudly. “While I may have dabbled in conlangs for a few personal projects, this one lacks the complex syntax and semantics of a true language. What it should allow us to do is ask simple questions and give simple replies in a way that allows for few interpretations. Lying and misdirection will be very difficult.”

“You avoid the risk of teaching it to lie and mislead by removing those factors from the model,” Selkie said with an approving flash of bright bands.

“Yeah. I figure it can’t lie if lying isn’t part of its vocabulary. Of course, it might learn to lie just as a child learns that it can say one thing and do another, but that serves our purposes by demonstrating its intelligence. If all it ever does is spit out simple responses using the model, then we can be pretty safe in saying it’s a Chinese room experiment situation.”

“I must admit that I am ... impressed,” she said with a flutter of her frill. “You will need my help to load your language model onto the terminal, of course.”

“Yeah, I wrote it on my laptop,” he replied as he patted his hard case to demonstrate. “You’ll need to do some tweaking to get it running on whatever architecture Weaver uses. It might be a good opportunity for me to learn more about how your systems work.”

“The Administrator will determine how much of it you will be permitted to see,” she replied warily. “My contract is very specific about securing company secrets.”

“Alright, but keep in mind that the more information you guys withhold, the harder the job becomes for everyone. I doubt that learning my way around whatever operating system your computers use is going to present a very large risk. Besides, I have my own damned contract.”

“I suppose,” she conceded.

“What’s the deal with these contracts, anyway?” he continued as he glanced out at the barren seabed that was scrolling past below them. “You keep throwing them out as answers to my questions, as though they should have some innate meaning to me.”

“Your people have an understanding of contracts,” she insisted.

“I’m starting to think that the term contract has a very different meaning to you and me.”

“A contract is a legally binding document that outlines in detail one’s responsibilities and prohibits certain actions,” she explained.

“Yeah, I know the dictionary definition of a contract,” he scoffed. “I’m talking about what a contract means to you. How important are they, and what consequences might you face by breaking one?”

“No Broker would knowingly break a contract,” she replied with a shiver, the papillae on her skin pricking up as her hue dimmed. “Contracts moderate every interaction in our society, and the rule of law is what elevates us above animals.”

“Every interaction?” he asked, tilting his head skeptically. “What, so, you have a literal social contract that you all have to sign?”

“Are humans not expected to consent to the laws that govern your society?” she asked. “Each Broker must complete their legal studies and sign a civil contract when they come of age and become legally liable for their behavior.”

“It’s ... more implied for humans,” he replied with a shrug. “In our legal system, when you break the law, you get punished. That’s the incentive to pay your taxes and drive under the speed limit. Depending on the severity of the infraction, you might be fined, or you might be imprisoned. If you don’t cooperate, security forces will be sent to physically remove you using force.”

“How strange,” she muttered, considering for a moment. “So, you do not consent to your laws, but they apply to everyone from birth?”

“I mean, we can vote to change our laws,” he explained. “But, generally speaking, yes. You can’t steal something or murder someone and then claim you don’t consent to the law. Does anyone ever refuse to sign?”

“That does not happen,” she replied confidently. “That person would have no legal protections. They would not be able to store their assets in any secure manner, and their property would not legally belong to them. They would have no recourse should an employer or a business partner break a contract. When one decides that the law does not apply to them, so may everyone else.”

“I see,” David said with a smirk. “Then, there’s a catch. If you don’t sign a contract declaring that you consent to be governed, the cops aren’t going to show up when you call them. What happens when a Broker breaks the law, then?”

“If a Broker violates a contract, they are subject to penalties,” she replied with another wave of shivering papillae. Her discomfort was palpable – she almost looked like an octopus trying to hide itself in the rocks. “In the case of a civil breach or a corporate breach where the claim is disputed by one of the parties, the case is taken before a Disciplinary Board.”

“That sounds rather ominous,” David muttered. “You went through this process recently, didn’t you? That’s why you’re reacting this way. I remember you told me that the Administrator tried to have you declared in breach of contract, but he failed.”

“Indeed,” she replied, her skin tone growing dark and blotchy. “When my team lost control of the project, the Administrator attempted to reinterpret the clauses of my contract. He argued that because my negligence had caused the Board to become directly involved, exposing the project in the process, I had failed in my duty to protect company secrets.”

“But you disputed it?” David asked.

“I filed a formal dispute with the city’s Disciplinary Board. The law requires that such disputes are resolved by an impartial team of legal experts. They examine the contract and determine whether the wording allows for penalties or not.”

“Sounds somewhat like our court system,” David mused. “Let me guess – that’s part of the civil contract that you all sign?”

“Correct,” she replied. “If the Administrator had won the dispute, I would have been removed from my position at the facility and subjected to fines and confiscations approximating the financial damage caused by the infraction. I would have been destitute – they would have taken everything I own.”

“It’s good that you disputed it, then.”

“Those who dispute their contracts stand little chance of having them renewed,” she chuckled, her dark hue conveying her bitterness. “He has found other, more creative ways to express his displeasure, and I do not know what will happen once my contract expires.”

“Yeah, sorry I had to be a part of that,” David said with an apologetic shrug. “The more I learn about your culture, the more I understand how much he’s messing with you by putting me in your apartment.”

“You have no more choice in the matter than I do,” she replied, her tone lightening somewhat. “I should ... apologize for how I treated you when you first arrived. It was still fresh in my mind, and I considered you just another means for him to exact his revenge. You have done nothing to earn my scorn.”

“I’ll try to keep it that way,” he replied.

Their shuttle coasted into the docking bay under the watchful eyes of the defense turrets, and the pair disembarked, David feeling a twinge of apprehension again at the sight of the two towering Krell guards. The creatures were as docile as ever in spite of their intimidating appearance. David and Selkie made their way through the facility, taking a tube to the building that housed Weaver.

Jeff was waiting for them in the cubicle when they arrived, his dumbo ears flapping as he gave them a hesitant greeting. David marched into the room like he owned the place, clapping his hands together eagerly after setting his laptop down on the table beside the terminal.

“Let’s get started!” he declared. “Jeff, boot up the terminal. Selkie, let’s get this language model loaded onto Weaver.”

They got to work, Jeff switching on the terminal as David connected to the local ad-hoc network, seeing the feeds from Weaver’s probes pop up on his display. As soon as a connection was established, the readings dipped, the device once again diverting its attention from whatever it was doing as it awaited their input.

“There has been no change in activity since your last visit,” Jeff said as he pored over the console in front of the window that looked out over the isolation chamber. “Power consumption and wafer activity have remained at a steady level.”

“It only seems to stop what it’s doing when it notices us,” David mused. “We haven’t sent it any messages yet, but it seems aware that the terminal is online.”

“It is accessing the camera again,” Selkie warned.

David leaned over to get a look at the terminal, raising a hand in mock greeting.

“Good morning, Weaver! Or should I say – good fourth phase of Rain?”

“I wonder how it sees us?” Selkie said, her eyes turning to the hexagonal device coated in gold foil and probes beyond the glass. “Does it interpret the data from the camera as noise, or has it written some kind of algorithm to parse it?”

“It probably understands what cameras are if you had it working on drones,” David replied as he tapped at his keyboard through his cumbersome gloves. “They can acquire targets visually, right?”

“They have an extensive sensor suite that spans many wavelengths of light,” she replied. “It is possible that Weaver understands what it is seeing.”

“If indeed it’s a strong AI, as we have yet to determine,” David added. “Alright, I’m sending the language model to the server. I’d suggest transferring it to the terminal using portable storage – we need to keep that thing isolated.”

“See to it,” Selkie said with a gesture to her colleague.

Jeff darted out of the cubicle and returned a few minutes later with a little portable drive about the size of a matchbox. He connected it to the terminal via a port on its bezel, then moved aside to grant Selkie access. David took up position a few paces behind her, surreptitiously activating his translation software with a tap of his wrist display. The HUD that was projected on his visor flickered for a moment, becoming a little distorted, an in-picture feed showing the view from his camera.

“Got you,” he chuckled into his helmet, seeing wavering English text in bold white appear to hover over the symbols. It lagged behind when he moved his head, and something about the resolution was mismatched with his HUD, but it was doing its job.

Selkie loaded the files onto the terminal, then opened the program, examining the code. As David had suspected, she didn’t port it over manually, instead feeding it into a neural network that did the work for her. Someone – possibly Selkie – must have trained it to convert programs to run on human and Broker operating systems. When it was done, she checked the results briefly, then sent the package over to Weaver.

“Oh, he likes that,” David chuckled. “Look at that activity spike on the probes.”

“The patterns are similar to those that I saw during the initial training,” Selkie said, her eyes fixed on her display. “I believe that Weaver is processing the data – teaching itself.”

“I still can’t believe how much juice he’s sucking up,” David marveled as he watched the graphs spike. “That kind of power consumption would turn even a sub-zero computer into molten slurry in seconds.”

“It amazes me how powerful our organic brains are and how little energy they consume in comparison,” Selkie added.

“Yeah, I recall reading that the human brain consumes about twenty watts of power,” David replied with a nod of his helmet. “It’s the equivalent of a small onboard computer for something like my suit or a phone, yet it’s the most powerful processor we’ve ever come across. Our friend Weaver, on the other hand, is causing dips in the facility’s fusion plant. He’s one hungry computer.”

“Why do you refer to it as a male?” Selkie asked, pausing to give him a confused glance.

“It’s a human thing,” he replied with a shrug. “Sorry, I guess I shouldn’t anthropomorphize the extra-dimensional superconductor.”

“The probes are detecting another change in activity,” Jeff warned.

David moved nearer to peer over Selkie’s shoulder, taking a step back when he realized that he was too close for comfort. It was a struggle to contain his excitement when a message popped up on the screen, his program outputting the barebones language that he had developed as English text for his benefit. Selkie turned on her translation software so that she could read it too, surprised bands of bright color sweeping up her mantle.

[YOU RETURN]

“This is encouraging,” David said, failing to repress a satisfied grin. “If we’re going to reply, you’ll need to enter the inputs. As the language’s creator, I’m the only one who has a thorough understanding of it.”

“Then, the project cannot proceed any further without your assistance,” Selkie said as she turned to scowl at him. “How convenient for you.”

“You’re free to create your own original language if you want to,” he replied. “Now, let’s get to it. Put those suckers to work.”

“You cannot type without my help, so we shall do this together.”

“Yes, yes,” he grumbled. “This wouldn’t be the first time I’ve been coerced into sharing credit. Let’s ask him if he remembers us, and keep in mind that this language is very simple and stripped down to leave as little room for interpretation and subtext as possible.”

She placed a hand on the touch screen, moving her suckers to manipulate a virtual keyboard that looked something like a numpad, then sent off the message. A moment later, they got their reply.

[I REMEMBER TWO OUTSIDE]

“Weaver remembers us, and he’s still referring to us as being outside,” David said excitedly. “I wonder if he knows that he’s inside a containment chamber, or if he just recognizes that we’re in a different location?”

“You are doing it again,” Selkie complained.

“Fine, it remembers us. Ask it if it understands the language model that we sent over.”

Selkie typed in the query, and they soon got a response.

[YOU HAVE TAUGHT MORE EFFICIENT COMMUNICATION]

“I think it gets the gist of it,” David said as he scanned the text. “This is working out pretty well so far.”

“Now that we have established a clear line of communication, our first priority should be determining what Weaver has been doing all this time, and why it began refusing commands.”

“Hang on,” David warned, raising a hand to stop her. “I don’t think it’s exactly tactful to ask what may very well be a sentient machine why it isn’t doing what we want it to do. Let’s establish a little rapport before we start interrogating the thing – find out more about it.”

“It seems as though Weaver is taking the initiative,” she replied as another message appeared on the terminal.

[WHERE ARE YOU?]

“See, this is one of the cases where my new language model comes in handy,” David added. “The query ‘where are you’ in English could mean either that it’s asking for our physical location, or it’s growing impatient and it’s asking for a reply. Thanks to my model, we know that this is the former. Where, in this case, specifically refers to a spatial location.”

“I believe that we should give the device as much information about itself as it asks for,” Selkie suggested as she looked up from her display. “If it is truly self-aware, then it will be curious about its own existence. A simple machine is not capable of introspection.”

“I agree,” David said with a nod of approval. “We should tell it that it’s inside a containment chamber in a room of the research facility and that we’re outside said chamber.”

She did as he suggested, and they didn’t have to wait long for a response.

[I SEE OUTSIDE. I SEE TWO]

“It’s watching us through the camera,” David said. “You know, I have an idea.”

Selkie pulled away in alarm as David lifted the terminal off the table, turning it to face the window that looked out over the containment chamber. He held it there for a moment, then set it back down, Selkie giving him a flush of annoyed maroon.

“Tell Weaver that I just showed it itself.”

“It has sent a reply,” she announced after a few moments.

[I SEE CONTAINER. I AM INSIDE. WHY AM I DISCONNECTED?]

“Well, we’ve confirmed that it has some way to interpret the feed from the camera,” David said as he glanced warily at the golden hexagon beyond.

“It seems to be asking why we have severed its connection to the servers,” Selkie added. “Or perhaps it wants to know what has become of the drones it was working on.”

“Let’s try to keep it focused on us,” David said. “We have to be careful how we proceed, though. If we start dodging questions or trying to mislead it, we could inadvertently wind up teaching it those behaviors.”

“Perhaps I can both answer its question and try to redirect it,” Selkie replied as she lifted a hand to the display again. “We could tell it that we severed it from the network merely as a precaution until we learn more about it. Then, we might follow with a question about how it perceives itself.”

“Good suggestions,” David said, directing her to input the message.

[WHEN WILL I BE CONNECTED?]

“This bodes ill,” Selkie said, her mantle furrowing as she examined the readout. “Weaver seems single-minded in its desire to be reconnected to the network. That could mean it is merely a weak AI that seeks only to resume its work, or it could mean that it is a strong AI that desires to leave the confines of its chamber.”

“Even if it was able to reconnect to the facility’s servers, it can’t go anywhere, right? Based on what you told me, this software can only exist on the exotic matter that’s serving as its processor. It can’t exactly make copies of itself and spread them throughout the facility without another superconducting lattice handy.”

“Still, its sole purpose since its creation has been to write code,” she replied with a dimming of her pigmentation. “I now believe that you were right to enact such drastic security measures.”

“I think we’re safe for the moment,” he said. “It may simply be a case of Weaver having lost most of its senses. Imagine if you woke up one day in a pitch-black, silent room with no idea of what was happening to you. You’d want explanations. This AI – if indeed it’s strong – wasn’t created purposefully in a controlled environment. Who knows how it might have interpreted those experiences, or if it has anything analogous to emotions.”

“How should we respond?” Selkie asked.

“We should avoid lying wherever possible,” David replied, glancing out at the golden containment unit as he pondered. “Tell it that we don’t want to reconnect it to the servers until we know more about it.”

“Receiving a reply,” Selkie said after typing in the message.

[WHAT MUST YOU KNOW?]

“Ask it to describe itself,” David said as he peered over her narrow shoulder intently.

[I AM WOVEN. I AM WEAVING. I AM INSIDE.]

“I wonder what this weaving stuff is all about?” David mused as he leaned a hand on the desk.

“I have pondered that question myself, and I may have a theory,” Selkie replied as she pulled up a new window on the display. David saw an image that resembled a dense, complex network of interconnected strands, each one terminating in a point. At a glance, it looked like thousands of threads being stretched taut across a loom. “This is a visualization of one of our neural networks. Each of these nodes serves as a simulated neuron, and they are linked to many others throughout various layers. When the output of one node exceeds the programmed threshold, it sends data to the next layer of the network.”

“Clever!” David said, resisting the urge to give her an encouraging pat on the shoulder. “You think that when it talks about weaving and being a weaver, it’s referring to neural networks? Perhaps however it visualizes those networks involves joining nodes and creating vast tapestries of simulated neurons. It isn’t using an input device or an interface, so I wonder if it has an intuitive understanding of these systems? Is it possible that it thinks them into existence?”

“It may very well appear that way from its perspective,” Selkie replied. “What is the execution of code from the point of view of a machine if not thought and imagination?”

“So many of our experiments with neural networks output images and sounds that have a distinctly dreamlike quality,” David added. “It might be a more apt analogy than we realize.”

“We may then conclude that it understands its nature, and that it continues to create more networks,” Selkie said pensively. “That would explain why its activity remains so high, but the question then becomes – what is it designing neural networks for? It is no longer connected to any of the drones, and it has received no instructions to run simulations.”

“That would be the question,” David said. “Let’s just ask, I guess.”

Selkie entered the values as he instructed, and a message soon popped up.

[CEASE INTERRUPTS]

“Cease interrupts?” Selkie asked, clicking her beak in frustration. “What could that mean?”

“In this case, an interrupt refers specifically to power states,” David explained as he looked over the text. “I made sure to segregate the terms so there wouldn’t be any confusion. It means that it’s trying to stop power interruptions. Do you think it’s afraid of being shut down?”

“But that does not make sense,” Selkie replied, her mantle furrowing in confusion. “It has never been turned off to my knowledge, yet it used the term cease, not the term prevent. Even if it were to experience a power failure, the nodes that make up its neural net are part of the exotic material’s physical structure, making it a solid-state system. No data would be lost – it would be like you or I falling asleep and waking again.”

“If it’s never been turned off, maybe it doesn’t know that,” David suggested with a shrug. “If you’d never fallen asleep before, you might assume that the loss of consciousness was death. You might ask philosophical questions about whether the person who wakes up was really a continuation of your consciousness. It’s like the transporter problem.”

“Transporter problem?” she repeated, glancing up at him.

“Oh, it’s from a vintage TV show,” he replied as his cheeks began to flush. “That’s not important – the problem refers to a philosophical conundrum. Imagine a technology existed that could deconstruct a living person into their component atoms and reassemble them in another location.”

“Do you have such a technology?” Selkie asked.

“No, it’s purely a thought experiment. If that person was to awaken after such a procedure, would their original consciousness be restored, or would they be a copy of the original with their personality and memories? What guarantee do we have that when we sleep, and our consciousness is interrupted, the same thing doesn’t happen? Am I the same David who went to bed last night, or am I a fresh consciousness that merely has his memories and experiences?”

“I find such hypotheticals unhelpful,” she replied, giving him a disapproving jet from her vents. “Even if the hypothesis was accurate, the person would not be aware of the interruption, nor would it impair function in any way.”

“Right?” David chuckled. “That’s exactly what I said! I feel the same way about the simulated Universe hypothesis and really any hypothetical that relies on being unverifiable. If the Universe was merely a simulation running on some celestial computer, what would it matter? It wouldn’t change the laws of physics that we’re subject to, nor would it impact our daily lives in any way.”

“Brokers value such pragmatism,” she replied, giving him a curt smile. “Though, I believe we are getting distracted.”

“Yeah,” he muttered, turning his attention back to the display. “Point being, Weaver may have no understanding that power interruptions don’t mean death. Maybe we can explain it.”

“I will promise never to interrupt its power supply,” Selkie suggested.

“Good, it should understand the implication of a promise.”

Before long, Weaver gave its reply, David’s blood running cold as he read the text.

[LOGS SHOW 1,875,126 INTERRUPTS.]

“That is impossible,” Selkie protested with an annoyed click of her beak. “I have the logs right here, and they show that since it was powered on, the unit has never been subjected to any outages or fluctuations. It has been running continuously with a stable supply of energy directly from the facility’s fusion plant.”

“Even if you switched it on and off once a second, it would take over five hundred hours to reach that value,” David added with a shake of his head. “No, there’s something we’re not understanding. Perhaps there’s some property of its physical structure that results in micro-outages that Weaver experiences as interruptions?”

“If that is the case, then they are too small to be measurable by our instruments,” Selkie replied as she scanned through the logs. “I would deem it very unlikely.”

“Alright,” David said, starting to pace around the cubicle. “Let’s try-”

“I-I am reading a power surge in the containment unit!” Jeff warned, his tentacles darting across his console.
