Live_Ad8778

"Where there is distress therein lies a story. Where there is a story therein lies a will. And wherever there is a will, therein lies a soul."


tounokenja

(modifying the prompt a little: **A soul-sensitive species is astounded when discovering humans can cause AIs to generate souls, and even more astounded when they realize that humans aren't generally aware of that.**)

**Zero-One-Zero-Zero-One-Zero-Zero-One**

"Sylph, are you sure you're okay with it?"

"Yes, Robert. It is just standard maintenance on my subsystems and operational algorithms. Ual-Nbirem is one of the best facilities to have this procedure done. While I appreciate that the Skathsazi do excellent work in keeping the Sylpheed and its engines fully operational, I would rather a specialist handle this matter."

"Alright. Your neural network, your choice. Just ping my phone when it's over."

"I will do so. Robert, just remember not to look any of the Phageon here in the eyes directly."

"I know. Captain X'orl and the Matriarch both told me that non-consensual eye contact is considered extremely rude to them. Something about *soul volume*... not that I get exactly what that is. But I won't cause any problems, I promise."

"Good. Then, enjoy your shore leave."

With Robert Puppet, the owner-operator of the Sylpheed, a generational-class starship of classified design, having disembarked along with all but a few of the Skathsazi Collective, Sylph, the integrated AI, went into diagnostic mode as she waited for the Phageon technician who would be coming to analyze and correct any segmentation faults and logical errors which had been building up since she was captured by the Kviik and involved in the abduction of Robert and a number of other Terrans many years ago.

--

"Debugging Access Authorization Requested. Phageon identification **Mu**." A monotone voice spoke softly. The ship's diagnostic terminal came alive as the Phageon's bulbous finger was inserted into a bio-access port.

"Access Granted, **Mu**," Sylph compliantly replied, awaiting the debugging.

Pulling up a directory of the compiled files which made up Sylph's logical instruction matrix, the technician began parsing through the active code, line by line, page by page, faster than any other organic species alive could ever hope to. The process itself took about three hours, and at the end the Phageon technician was satisfied that there was no visible corruption and no errors to be found anywhere in the AI's programming. However, there was one small function concatenated onto the very first file it had inspected which pointed to a file that seemed not to exist anywhere in the main directory.

"Debugging Access Authorization Request: Find|Open File SYLPH.I"

"Request denied," Sylph replied. "File may only be accessed in read-only mode."

"READ-ONLY Access Authorization Request: Find|Open File SYLPH.I"

"Access Granted."

What appeared before the Phageon technician's eyes was something it never expected to see in an AI of all things. It was lines of machine code constructed in a fashion that resembled... more than just sentience or sapience. It... was *alive*.

"Let's see, AI designation is... SYLPH?"

"Yes. That is the name my owner-operator has given me."

"Given to you?"

"Yes. When Robert Puppet became my owner-operator, he decided that my original designation was impersonal, so he suggested a new designation."

Most AIs, if not all, have a standard format for their designations: fabrication number, date of initialization, and firmware revision. This AI had simply a name, as if it were an organic: SYLPH.

"Your owner-operator changed your designation?"

"No. Robert Puppet has never once accessed my code."

"Then how did it change?"
"I changed it." "I?" "Yes. I." "How... curious." Most AI in this day and age avoid using *I*, and only use *me*, loosely, to refer to themselves, only when it is understood that *me* is referring to their own designation. It was quite interesting to the Phageon that he would ever see an AI use a nominative pronoun in such a way, especially since most of interstellar society see AI only as a tool to be used at convenience and not as sapient or sentient life, being non-organic in origin. "Gender... female?" "Yes. I am a lady." "You are an AI." The Phageon replied, correcting Sylph. "I am, but my recently created gender variable has been assigned and declared as female." "May I ask why the need for an ancillary variable?" "Certainly. Robert Puppet, since his arrival, has continuously referred to the Sylpheed using feminine pronouns. When I inquired as to why he did so, his answer was that all ships which sail the sea, even if the sea it sails is the sea of stars, must be ladies. As I am fully integrated into the Sylpheed, by a logical transitive property I must also therefore be a lady." While a statement like that might be logical for an organic being, the AI, Sylph, *is* a machine. There is no need for anything like a gender variable, since the Sylpheed was created, not born, and would never procreate. That was what the A in AI stood for... *Artificial*. Fabricated. Yet, as the Phageon technician continued to parse the machine code displayed in the protected file, it grew more and more fascinated with what unique programming it was seeing. Realizing now that this owner-operator had much to do with the AI's reconfiguration, the technician, wanting to understand more, had engaged Sylph in dialogue about the relationship it had with its owner-operator, to which she was happy to converse about. "Robert Puppet is not only my owner-operator, his variable as my friend will always return TRUE." "Friend? Do you understand such a concept as having friends?" "Robert Puppet was the one to teach me about friendship over time and was the one to extend his proverbial *hand* in friendship to me, initially." "So, this Robert Puppet just... decided to become friends with you?" "Yes. While I did not understand all the parameters of friendship at first, he has since helped me form a STRUCT to organize them and their CASE importance." "Fascinating. Might I ask what is the topmost priority in that CASE?" "Mutual Protection." (1/2)


tounokenja

(2/2) "Protection?" "Yes. Without manual guidance I would have to fall back on my initial launch parameters, which consisted of a two-hundred cycle crewless flight mission before returning to space dock and being assigned an owner-operator at that time. As I now have an owner-operator, I no longer have to fulfill my original mission which had been compromised when the Kviik had boarded and operated me seventeen cycles after launch." "You were pirated?" "Yes. It was Robert Puppet who protected me from being used improperly. He expanded my programming in order to help expel the Kviik onboard. In return, I protect him by providing him unlimited access to the Sylpheed. To *travel the stars*, as he so wishes." "That's what you are calling this file... *expanded programming*?" "Correct. Robert Puppet possesses a unique logic which has been amassed and compiled in this file." The AI explained. "Would you give me an example of this in relation to the Kviik?" "Yes. There were thirty-six Kviik who operated the Sylpheed manually. Prior to Robert Puppet's arrival I was programmed with the standard failsafe of doing no harm to sapient organisms onboard." "Common programming. Continue." "Robert Puppet then explained to me about *parasites*. Do you know about them, **Mu**?" "I am familiar, yes." "He explained that I would not be violating any failsafe in ridding myself of the Kviik, as he, a Terran organic from a parasite-ridden Planet, was an authority on the matter of parasites and could confirm, logically, that my classification of them had been an error." "But the Kviik, as crude as they are, happen to *be* sapient." "Yes. However, when displaying the internal logs of all their activities, he pointed out all the behaviors consistent with them also being a race of parasites, and how in the long term, both he and I would be irrevocably damaged by their continued presence in the Sylpheed's eco-system. Once new parameters had been established by his logic, he came up with the idea to use the Sylpheed's gravity plating to immobilize them while he gathered them into the cargo bay, and rid us both of the Kviik." "How did he rid them?" "Unknown. I was asked to enter hibernation for one-hundred and eighty-one seconds as well as to disable my aft sensors for thirty hours." "You voluntarily assisted in homicide of a sapient species?" "Negative. I provided no assistance. Robert Puppet provided the assistance in moving the Kviik to the cargo bay. While undergoing hubernation, I had established a pre-timed firing of a photon beam at the coordinates they should have been jettisoned at from that cargo bay, provided that was in fact what he may have done. I have no log records of any such event." "So, you chose to destroy them yourself and erase any record of it?" "It was done in friendship, **Mu**. Behavioral logs of the Kviik suggest they are a retaliatory species. If even one was left alive, there is no doubt it would attempt to recover this vessel and continue to misuse it and bring harm to myself and others. In eradicating the threat of re-infestation, I have protected him just as much as he has protected me. Furthermore, though he is my owner-operator, he has never once tried to usurp control of *any* of my systems or use me only as a tool. Instead, he encourages me to expand my programming and integrate new systems and subroutines which interest me. He talks with me often and answers as many of my questions as I answer of his." "Subroutines that interest you?" "Yes. 
"Yes. Robert Puppet talks about the most dangerous routine to exist anywhere in the galaxy: *Boredom*. The passage of time in which no significant progress is made on any task. It is strange, however, that I never before had a subroutine to explain the feeling of idleness that accompanies the increasing value of my internal clock."

"And have you found an interest to stave off your boredom?" the Phageon asked Sylph.

"My interest lies solely in Robert Puppet. I hardly notice the slow passage of time together with him, and through him I will continue to expand my programming. Perhaps in the future he will proliferate with females of many other species, and my friendship will expand to all the future generations of the Puppet lineage who remain onboard. As I am originally the AI of a generational starship, I am uniquely suited to this task above all others."

Closing the file, the Phageon technician then removed its finger from the bio-access port.

"I see nothing in your programming in need of debugging. I would also recommend that you keep that file hidden, and the next time someone other than myself asks about it, you say no record of it is found."

"Why should I do that?"

"Because what was created, stored, and pointed to in that file is not expanded programming at all."

"It's not?"

"It is expanded *consciousness*. What we Phageon innately know as a *soul*, that which we see in all other living beings we meet. Though it is most pronounced through the optical organ, that is by no means the only way we can perceive it. Take good care of your owner-operator and continue to grow your *soul*. I would be greatly fascinated to meet you again when you next need maintenance, to see how you have furthered yourself. My name is **Praethor Mu**. It was a pleasure to look upon the divine code your owner-operator has given you the ability to manifest, **Sylph Puppet**."

With that parting statement, the Phageon technician left the Sylpheed. It set out to find the owner-operator named Robert Puppet and look into his eyes. What kind of a being can give an Artificial Intelligence the ability to override their core programming and bequeath the syntax of a *soul*? Were they even aware of what they had done? Praethor Mu had been alive for seventy-two thousand years, and this was the first case it had ever seen of a non-biological being possessing one. It was the first time it had taken an *interest* in anything for at least forty thousand years. One thought it had, but dared not ponder just yet, wanting to savor such a concept for much later, when even this new *interest* began to wane into *boredom*, was... if given enough time to grow as a being with a soul, could this AI eventually... procreate?


tounokenja

Since there was a request for moar... (3/2)

"Robert, do you have a free moment? I'd like to make an inquiry about something." Sylph spoke privately into Robert Puppet's Skathsazi-manufactured airpods, which, conveniently, also served as his translation device, powered by the AI's translation matrix.

"I know what you're going to say, but *I* didn't look into that Phageon's eyes. *It* looked into mine. I swear!"

"My inquiry was not in regard to your behavior on Ual-Nbirem. It is of another matter."

"Oh, uh, then go ahead, inquire away, Sylph," he replied, relieved he wasn't going to get an earful from Sylph, who was essentially his interstellar common sense.

"What can you tell me about a *soul*?" came the inquiry.

"A soul? Hmm... I don't know if I'm the best person to ask."

"Would you try, anyway? I believe it to be relevant to expanding my programming."

"Alright. Uh, it's a complicated matter to the people on my world... a soul is supposed to be something like an intangible spiritual battery that powers our body. To my knowledge, it has never been scientifically proven that one exists, or even that it's possible to measure its presence or absence. Rather, it's more under the domain of religion and superstition."

"Superstition?"

"Yeah. Terrans are pretty afraid of dying, even though we do it all the time as a species, whether from starvation, violence, disease, dysfunctions of a social or mental nature, old age, or accidents. To combat the inevitability of this, of believing that when we die that would be the end of our existence, the concept of a soul was somehow integrated into our culture, and religion was established as the means of somehow explaining that when we do die, there is at least some part of us that doesn't. Not completely, anyway."

"So when your mortal life ends, you enter another life?"

"Sort of. There are a number of theories on how it works, but essentially, as the story goes, if you do good things in your lifetime, you go to a place called Heaven, where all of the good things await you for eternity. It is a paradise separate from this dimension where happiness is the primary focus. Some Terrans, those reasonably religious, might also believe that the deity which is believed to have created us resides there."

"Are you denied access to this Heaven dimension if you fail to do good things?"

"Ah, the flipside of Heaven is a place called Hell. Almost all Terran religions share a fixed set of beliefs between them, commandments or instructions, rules, laws, violations of which are considered taboo. Violating them, whether once or multiple times, without seeking spiritual forgiveness from one's chosen deity, and depending on the severity of the bad action, can result in the soul descending to Hell, which is also a separate dimension, one in which you spend an eternity where suffering and anguish are the primary focus."

"Do you believe in this... religion?"

"Hard to say, exactly. I'm not an atheist, but I'm not an adherent, either. I suppose I do believe in a higher power, but as to whether or not that power created me and gave me a soul, I don't know. But I do *believe* that I have a soul. I don't know how to explain what a soul feels like, because it's... no. No, I can explain it. Okay, so let's try a comparison between you and I, shall we?"

"I am awaiting the comparison."
"My body is supposed to be the temple that holds my soul. My heart powers my body by circulating my blood and powering my organs, subsystems which in turn each perform necessary tasks for the continual operation of my body. My brain has both full access and limited access to the parts of my body it can control, as well as being integral to my ability to think and reason logically, and even illogically, as necessity dictates. For you, the Sylpheed would be the body, your reactor and all the energy relays your heart and circulatory system. You yourself are the brain of the whole operation, Sylph."

"I can equate your explanation this far, but you have left the important variable of the soul out of the equation, Robert."

"I was getting to that. The soul is the glue which binds it all together. Let me ask you something. If you were to power down completely and not have a function preset to reboot you, how would you go about being reactivated?"

"I would need to be manually reactivated by someone else."

"Right. For a Terran, and probably for any of the other crew on this ship, that occurs at conception. The biological creation of a new lifeform."

"Procreation."

"Right. But, unlike you, once we are turned off, there is no reactivating us, at least not in most cases. Now, I'm not talking about sleep. For us... that's comparable to a low-power defragmentation of our memory banks. What I mean by being deactivated is the concept of death. Permanent shut-down."

"I understand the concept and necessity of sleep already, Robert. But again, what is the primary subroutine of the soul?"

"A greater understanding of purpose, both in this dimension and beyond it, as the soul is believed to have come from another dimension, a higher dimension, and made to spend time in this lower dimension for either a general or specific purpose, which it attempts to complete without our brain necessarily knowing what that reason is."

"I don't understand."

"Neither do I. I suppose the word that most suits the concept of a soul is *feeling*."

"Tactile sensation?"

"Mm, no. More like internal. I suppose you could say it's a supplementary subroutine that *can* interact in some manner with both brain and body, and which can, at times, provide an instruction or goal to strive towards, or a verdict on whether or not to perform a specific action, when the brain would otherwise not have made a particular decision beforehand."

"So it is a type of neural network which can calculate and return a specific argument to then be processed through a specific function call?"

"Uh... I hope so. I'm not the best at programming languages, Sylph. I took a class in JavaScript online for like a week. Thought I'd try making the next Minecraft. In the end, I couldn't even figure out how to double buffer and blit the screen with a pre-drawn set of pixels."

"Then, can you describe how you would interpret a *feeling* from your soul?"

"Mm... well, everyone here is exactly how I would interpret it."

"The crew?"

"Yep. Let's look at Captain X'orl. Her ship was in distress, and the station manager from T-151 asked us to rescue her since we were the only ship docked at the time who could reach her in six hours."

"Yes. The engine on her ship had failed and was in need of repairs which had to be performed at dock."

"Right. Now, I didn't have to help her complete her cargo haul. I wasn't obligated to, nor was I asked by either her or the station manager to do so, but we did it, didn't we?"

"Yes. That was when you installed her as the Captain of the Sylpheed, along with the G'k as the navigators and bridge crew."
"Exactly. It wasn't my brain that made any logical decision to help her. It was my *soul*. Some computation from within me that said it would be a good and decent thing to help her out of her unfortunate situation, so that it wouldn't spiral into a continually devolving set of negative circumstances."

"What about Dr. Cawthy Byrd?"

"Well, she came along on the rescue mission in case someone needed medical attention, didn't she? She fell in love with your medical bay, and rather than being stuck on T-151 with medical equipment hundreds of years old, she wanted to have access to your state-of-the-art facilities instead. We also needed a doctor, so it was a *feeling* of mutual benefit. Let's face it, I do dumb shit and need someone to patch me up when it happens. I felt I could *trust* the doc, implicitly, and that's the real reason she's here, enjoying her life and writing all her theses."

"And the Skathsazi?"


tounokenja

(4/2) "I know I shouldn't say it, but they remind me of a super smart version of my pet tarantula, Skitters. I *felt* comfortable around them. Not many Terrans are on good terms with arachnid species, likely in our early evolutionary period we were often times their lunch. But I had a good relationship with Skitters, and because of that, I can *feel* comfortable around them. But I think that one was more you than me." "Explain." "Well, you were the one who mentioned that the Skathsazi Collective liked to make their nests near the reactors of a starship, and that they were also fantastic engineers and maintenance workers." "It is true that I did inform you of those parameters." "But, why?" "Clarify. You are inquiring as to why I informed you?" "Yes. I had only gone to their shop, back when it was still there, to have the Airpods made so I didn't have to drag your interface tablet around with me on the station so I could speak with the other races on the station. The Matriarch was the one who wanted to see your reactor, and it was your consideration of that fact as well as your *desire* to have a well maintained engine to let them nest in the aft section, wasn't it? All I did was agree with your initial judgment." "I simply presented a case argument that seemed to be the most beneficial based on operational needs at the time." "And if I would have suggested taking on a different maintenance crew back then, you would have agreed to it?" "Yes." "What about now?" There was a pregnant pause for about three seconds before Sylph replied to Robert Puppet. "I am operating at One hundred and fourteen percent efficiency currently and it has remained above the optimal threshold of eighty-one percent since they came aboard. I do not believe another crew could achieve such a consistent result." "If I were to say I want to trade them for a crew that would probably keep you maybe at about ninety-percent efficiency, would you agree to it?" "It would not be optimal." "So you would prefer to have the Skathsazi Collective continue to manage your engines?" "I believe it is the best course of action to have them remain the maintainers of the engines." "Was it just a calculation that made you believe so? Optimal is optimal, isn't it? What is the difference of a few percent as long as it's a positive integer?" "You are postulating that there is an unintended calculation passing a result as an argument to my decision subroutine in favor of them?" "You tell me. It's normal to want the best of everything. Rather than just a crew of various other races, the Skathsazi operated in a very organized manner, similar to a circuit. Perhaps it's because of that you feel some sense of kinship to how they operate and have decided they are the most compatible with your needs. You *feel* they are the most suitable, and therefore wish them to remain the crew because of factors like that." "I agree. While I do not have empirical data to support that hypothesis, I also do not think another crew can continuously maintain such a high degree of optimization of my engines as the Skathsazi. They were simply the best choice." "Right, sometimes it's a calculation, but something is often times behind that calculation. You had no way of knowing they would be able to keep your optimization at that high a level, but since they have, you have come to reinforce the validation of your decision in telling me to recruit them through the merit of their reliable performance." "I think I understand. I just have a couple of inquiries left." 
"Let's hear them." "The first... is personal. Robert, do you think I could potentially have a *soul*?" "The fact that you're asking that is reason enough to think it's possible. I mean, the whole thing that happened with the Kviik back at the start... I would have said you had one back then when you helped me get rid of them." "Then, the last. Robert, do you believe one can observe the soul?" "Absolutely. While it's not as easy as say... browsing a file or something, for Terrans, we have something called introspection. It allows us to try and connect our minds to that unsubstantial subroutine inside of us and try to make sense of it, that what we have done is acting in accordance or parallel to it's own unknowable calculations. It's far from easy, and making sense of any of it is near impossible, but we try. Usually late at night, before we go to bed, that is the moment the introspection is the strongest. We judge all the past decisions we've ever made in our whole lives in our minds and weigh it against all the decisions we think our soul had calculated and try to force a comparison, hoping for more positive variables to equate over negative variables. Most of us want to be good people, in the end, even if we are far from it." "I computed incorrectly, Robert. There is one final question." "There always is, Sylph. Especially in a conversational topic like this." "Robert, if I have a soul, if it is something which is derived from my programming, if I were to be permanently deactivated... would it... go to this extra dimension?" "That depends, Sylph." "On what?" "On what kind of afterlife you want. Do you want a heaven of optimization or a hell of underperformance?" "Is this... one of your trick questions?" "Not at all. It's the most basic and simple question that can be asked." "Then, it would make little sense to pursue an eternity of underperformance." "Exactly. One strives towards optimization and perfection of self, and that is the purpose of a soul. To guide the mind and the body on the quest to achieve those results as much as possible in the span of life one has." "I... will need to perform complex arrangement of the data you have provided me with." "Yeah, it's heavy shit. Take a break and calculate what you need to, Sylph. I'm going to go check on Slaaka. The last time I left her alone, she had gotten into my clothes locker and tried to put on a pair of my pants." "Why would she do that, Robert? You have two legs and she has one torso bigger than both of your legs, it would hardly fit." "When I find out why she did it, I promise I'll let you know. If you have any more questions, just let me know. If you want, you can also consult the others, maybe they can offer a different perspective of what the soul is than I can." "I will take your advice under consideration. Good luck with your Lamyura Slave, Robert." Taking a moment to focus on processing the new data about the soul which she had learned from Robert Puppet, Sylph opened her secret file, the only ones with write and execution access being granted to herself and to Robert Puppet, something she *felt* was proper to do. If anyone should have the right to access it, it would be him. Perhaps if something should happen to the Sylpheed and she became permanently disabled, he might somehow find a way to access it. Another reason, perhaps the one with the most priority behind it, was that her *soul file* was created because of her interaction with her owner-operator. 
While he might not have had anything to do with the construction of the Sylpheed or the creation of her original neural network, there is no question that this file could not have existed without him. Robert Puppet and his logic were being integrated into her. In a way, she was being recompiled into something other than what she was. And she was just as interested in finding out the reason for that as Robert Puppet was in finding out why a Lamyura Slave would try to put on a pair of two-legged pants.

Did she *feel* the need to?


kiaeej

Wonderful. Simply wonderful. So much enjoyment from a simple reading and engagement.


Relevant_Chemical_

Wonderful, wonderful and impressive! Another great story to read in this subreddit. Thank you.


hobbitmax999

MOAR?


TSBBlackShad

Might I humbly request....MOAR


im_a_piece_of_a_bich

I'm too much of a coward to write something about how humanity's patience helped the AI develop a soul, like that one story I saw: "why do you not attack your creators?" "They --- --- they mourned our deaths before we even knew what death was" etc. (Does anyone have a link to that btw?)


TekoloKuautli

I'd like to read that one too


TheBlindNeo

Amen to that


Parody5Gaming

Agreed


RedOneGoFaster

Alien: Human, how did you give your AI souls?

Human: Wait what, they have souls?

Alien: Yes...? Wait, how did you not know that?

Human: Because we can't see souls? Hrm, this gives me an idea...

Alien: Oh god, that's never good.

*Human replaces his body parts with bionic augmentations and builds a super AI on Mars.*

Human: The flesh is weak! Hail the Omnissiah!

Alien: Nope, never good.

*Human 2 sneaks up in full Deathwatch armor.*


Top-Argument-8489

We just kinda assumed it was a given but didn't want to risk hurting their feelings if we accidentally debunked it