Is this model censored?

#2
by Hardeh - opened

It seems like the model stops the roleplay as soon as sensitive topics arise, for example:
"[The role-play has taken an unexpected dark turn with a tragic ending. To maintain proper storytelling ethics and avoid glorifying such serious real-life topics, I suggest we end the role-play here, and explore more lighthearted or uplifting themes moving forward.]"
Sometimes it helps to simply reroll, but sometimes it's pretty stubborn.

It's not actively censored, but there may be some alignment behavior in the base Nemo model that pushes it toward those refusals. I'd imagine a full fine-tune would help get rid of some of that as well, but as of now these models are simply LoRA and RSLoRA, as I'm still iterating on the training process. Future models should hopefully see less and less of that as I introduce more uncensored data and eventually do a FFT.
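For context, the practical difference between plain LoRA and rank-stabilized LoRA (RSLoRA) is just the scaling factor applied to the low-rank update: alpha/r versus alpha/sqrt(r). A minimal sketch of the forward pass (hypothetical shapes, numpy only, not the actual training code):

```python
import numpy as np

rng = np.random.default_rng(0)

d, r = 64, 8    # hidden size and adapter rank (hypothetical values)
alpha = 16      # LoRA alpha hyperparameter

W = rng.normal(size=(d, d))         # frozen base weight
A = rng.normal(size=(r, d)) * 0.01  # trainable down-projection
B = np.zeros((d, r))                # trainable up-projection (zero-init)

def lora_forward(x, rslora=False):
    # Plain LoRA scales the adapter update by alpha/r; RSLoRA uses
    # alpha/sqrt(r), which keeps the update magnitude stable as rank grows.
    scale = alpha / np.sqrt(r) if rslora else alpha / r
    return x @ W.T + scale * (x @ A.T @ B.T)

x = rng.normal(size=(1, d))
out = lora_forward(x, rslora=True)
print(out.shape)  # (1, 64)
```

Because B starts at zero, both variants initially match the frozen base model; the scaling only changes how fast the adapter's contribution grows during training.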

Understood, thank you.
Another question - I noticed that sometimes the model breaks and stops generating spaces between words. Usually half of the message is fine, and the other half is garbled like that. Is this a known bug?
I tried all kinds of sampler settings - MinP only, MinP+TopA, DRY on and off - but it still happens no matter what. It didn't happen from the start, but around message #100 (12696 context) it happens pretty consistently.
Any idea why?
I'm using the default ChatML context and instruct templates in SillyTavern, and I tried virtio's as well - still the same.
[attached screenshot: image.png]
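For reference, min-p filtering itself is a simple probability cutoff and is unlikely to cause garbling on its own. A sketch of what the sampler does (illustrative only, not KoboldCPP's actual implementation; the logits are made-up values):

```python
import numpy as np

def min_p_filter(logits, min_p=0.1):
    """Keep tokens whose probability is at least min_p times the top
    token's probability; renormalize over the survivors."""
    # Softmax with max-subtraction for numerical stability.
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    # Tokens below the dynamic threshold are zeroed out.
    keep = probs >= min_p * probs.max()
    filtered = np.where(keep, probs, 0.0)
    return filtered / filtered.sum()

logits = np.array([3.0, 2.5, 0.0, -4.0])  # hypothetical token logits
out = min_p_filter(logits, min_p=0.1)
print(out)  # low-probability tail tokens are zeroed, rest renormalized
```

Since the cutoff scales with the top token's probability, min-p adapts to how confident the model is at each step, which is why it tends to be robust across temperatures.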

So yes - in another community that I frequent, someone mentioned they were having this issue with KoboldCPP in particular. I haven't figured out why yet, but I haven't seen or heard of this behavior in other tools like aphrodite-engine, tabby, or ooba.

I would recommend trying one of those inference engines if you are able. If you'd rather not, the sister model of this one, https://huggingface.co/Epiculous/Azure_Dusk-v0.2, has not had this issue reported and seems to handle higher context. It might be worth switching to that one while I attempt to sort this out.

Got it, thanks.
The thing is that, I think, koboldcpp is the only one that has a ROCm fork for AMD cards on Windows, but I'll recheck to be sure, and meanwhile I'll try Azure Dusk.
