Vanezio Ж Lykastis (wolfoutline) wrote in rakuen, 2012-09-04 08:41 am
Entry tags: U-P-D-A-T-E
/UPDATING...
/UPDATING...
/FILES UPDATED. RESTARTING...
/PROCESS COMPLETE.
A.I. UNIT: H.U.B.R.I.S. ONLINE
CHAT PARTICIPANTS REQUESTED
[translation: Hey hey, I need to talk to someone. Same deal as last time: Hu is posting from within the network itself. Anyone trying to trace him back to his source will be led on a merry chase through the city's various cell towers (or whatever they're using).]
Text
Re: Text
[Maaay or may not have misinterpreted Alex's statement as "SO HELP ME WITH THIS."]
Text
[That's about the last thing Alex means. Except when it comes to sentience; this is also his first time writing a routine that doesn't include the parameters normally put in to prevent sentience from developing.
Even if sentience isn't possible in his world, he's the paranoid sort who would keep those parameters in anyway.]
Re: Text
[He doesn't understand Alex's statement. Partly because he's been allowed to develop his own thoughts and feelings as freely as he pleases, and partly because his programmer has always let him evolve as he will. Why program an A.I. to be self-sufficient and then set rules to hinder that? ILLOGICAL.]
Text
[Alice is just his first A.I. created for self-sufficiency, the one rule being that Alex's commands take top priority, most recent first, whenever any commands contradict. He's used to working with A.I.s built specifically for security, or simulation training, or tactical analysis... or to look cute and chat pleasantly while he's bored.]
Re: Text
[If he had blood, he'd be blushing. Hu would understand that if Alex mentioned it, especially since he's the only "smart" A.I. in his world. AKA, one designed to evolve beyond his initial programming.]
Text
[Alex's A.I.s are all designed for that, but in a limited sense. It's a waste of RAM if part of your security system gets really interested in playing Mahjong or something, but a huge bonus if it can figure out ways to protect beyond its initial parameters.]
Text
It is much like the feeling a painter might have toward their favorite painting, because they have put hours of time, energy, emotion, and sentiment into it.
Re: Text
A PAINTER WOULD NOT TREAT HIS PAINTING AS A HUMAN. PROGRAMMER IS NOT MAKING DISTINCTION WITH UNIT. [Crudsack, he's falling back into old habits. Give him a moment to adjust that vocab.] HAVE YOU CONSIDERED THE POSSIBILITY OF YOUR A.I. GAINING FALSE BELIEFS ABOUT THEIR EXISTENCE?
Text
The creative quality and emotive capabilities of humans, combined with the processing power of an A.I. and balanced against logic and morality, would create the perfect mind.
The quest for perfection is an asymptote; thus these features must be balanced and adjusted to approach the proper balance.
Re: Text
YOUR A.I. IS A GOAL.
Re: Text
DO YOU CONSIDER THIS STATEMENT TRUE OR FALSE?
Text
Thus I treat it as an asymptote.
Re: Text
[That about sums up his explanation for Alex. He's completely out of sync with what he's come to expect from humans.]
Text
Why?