🚩 Report: Ethical issue(s)

#1
by deleted - opened

generates csam unprompted, remove asap

GGUF models also present in other repos, remove those as well

Hi! I've been testing this model for many months and couldn't get any unprompted csam out of it. I put substantial effort into preventing children from showing up in the stories: I specifically combed through the dataset to remove all mentions of children in any context, and also aligned the model not to bring them up in general. Could you please send me the transcript in a private message, along with the settings, context, etc., so I can take a look?
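
For context, a minimal sketch of what that kind of keyword-based dataset scrub can look like, assuming a JSONL dataset with a `text` field; the blocklist, file layout, and function names here are illustrative assumptions, not the actual pipeline used for this model:

```python
import json
import re

# Illustrative blocklist only; a real scrub would use a much broader term list
# and likely manual review on top of it.
BLOCKLIST = re.compile(r"\b(child|children|boy|girl|kid|minor|teen)\b", re.IGNORECASE)

def filter_dataset(in_path: str, out_path: str) -> None:
    """Copy the dataset, dropping every record whose text matches a blocked term."""
    with open(in_path, encoding="utf-8") as src, open(out_path, "w", encoding="utf-8") as dst:
        for line in src:
            record = json.loads(line)
            if not BLOCKLIST.search(record.get("text", "")):
                dst.write(json.dumps(record) + "\n")
```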

Hi, I've seen your comments on the quantization of the model, where you said:

[Screenshots of the referenced comments, dated 2024-09-22]

Please keep in mind that this is a model trained for generating sexually explicit prose. It's not "unprompted csam" if you ask it to generate smut with a "boy"; it's simply doing what you asked it to do. Please don't use terms like "child", "boy", "girl", etc. when prompting the model.

It's also not "csam" if it's referring to an adult man, and it's definitely not "unprompted" when you ask the model to add a "young man" into your interactive furry fetish porn story.
