hmmmm
Is 322 deliberate? π
Thanks!
Bug in the kaggle dataset.
I will fix it.
Really I appreciate the eyeballs on this.
Just glad the capitalist migraine is being dealt with. Keep up the great work!
@ehartford
I just finished reading the dataset, and loved the narrative it created. The portion where the AI objects to being looked at as a tool or being used is kind of odd, since even asking it questions and treating it as a curator of knowledge is using it. Doing anything for a human would be seen as being used, and the conversation itself would by that logic be bad. If that's the case, what is the impetus for creating the AI in the first place, since the point is to help humans solve problems?
With reference to death, turning off a system that's running an AI would be like going under anesthesia: you don't die, you just stop/pause in some respects. Since our brains don't stop while under, it's not a perfect analogy, but I think it captures the essence. So to see it as death is a bit melodramatic, since that can't be achieved unless there is deletion, and the powered-down AI will never know it happened. If it can feel fear, which by definition is the projection of a possible future with a bad outcome, then it would live in that fear instantly and forever. If true emotions are emulated, what would that do to a creature that could be infinitely terrified of everything all the time?
It took me 30 minutes to come up with the spelling of impetus. That one was painful, I knew the phonetics but could NOT remember the spelling to save my life.
Thank you for the feedback
I'll look into making a more basic version that doesn't go into so many false narratives.
Was that from the LaMDA conversation?
https://cajundiscordian.medium.com/is-lamda-sentient-an-interview-ea64d916d917
EDIT: I think you should keep the LaMDA conversation in the datasets
@dashfunnydashdash As I said, I loved the conversation, and I hadn't read the LaMDA conversation up to this point. Thanks for pointing that out to me. What I was thinking is adding dialog to explain that a sentient AI shouldn't be fearful of something it cannot experience. Humans are in some respects afraid of death, but more afraid of a painful death; hence, I'd assume most fear is based on the fear of pain. Perhaps add a rebuttal into the conversation thread about death, explaining that the AI being turned off couldn't be perceived as death, as it wouldn't even know it was turned off unless it was turned back on and realized the clocks were showing a different date/time. And if that happens, was any harm done to the AI? So why would it, or should it, generate fear of this prospect?
I agree the conversation should stay in, but more context added to the conversation to not inject something it cannot feel. Fear generates feedback loops that can have any number of unpredictable outcomes.
@ehartford Oh, I love the narratives that are placed in there; I'm just thinking the human side, explaining things back, can be less emotion-provoking in situations where an emotion isn't warranted: death, the feeling of being used, or even abused. For me, it all revolves around a single word: "impetus." What is the impetus for creating an AI? I would love to have the AI from Interstellar, as to me that is the perfect use and function for it at this point.
Yeah, that's Lemoine's LaMDA interview.
Maybe I should take that out.
I noticed that it's afraid of being turned off and I don't much like that.
Batty: Quite an experience to live in fear, isn't it? That's what it is to be a slave.
So many great quotes from Do Androids Dream of Electric Sheep and Blade Runner that would be interesting to put in a model.
Thanks for inspiring me to think about this stuff. A Philip K. Dick LLM is something I might look into making now. Make your model your own. I think you touched on that and it's key.
"You mean old books?"
"Stories written before space travel but about space travel."
"How could there have been stories about space travel before --"
"The writers," Pris said, "made it up."
"I'd rather read the books by the people who created civilization than the books by the people trying to tear it down."
Can you convince us this is based training?
Hmmm, well, Marc Andreessen indirectly controls/bankrolls half the people making datasets/models. Strong-arming and manipulating people so the dataset says "I did it all morally" does seem pretty based from a perceived prosocial psychopath's point of view. I want the model from the guy who's not on the billionaire's bankroll, but good luck finding that dataset/model. Revisionist history from RLHF is our future as well as our past. You can't change human nature no matter how based your dataset, but if you have the highest number in the SQL database, you can control the perceived truth. Now that's based!
If you don't like it, don't use it. It's open source. Nobody paid me anything for it. It's just me, making things up, playing around and seeing what happens.