EXL2 quant request
Hello, I would like an 8bpw quant of this.
I'll try to upload it today
It's been a while, though if you're still planning on doing it, I'd like a 6bpw quant of this instead.
Oops, I forgot
Done
Thanks for the quant
This model is pretty bad, and its only notable feature is 32k context, which is still pretty rare among Mistral finetunes and merges.
I recommend using this instead:
localfultonextractor/Erosumika-7B
Or its self-merge:
localfultonextractor/Susanoo-10.7B
Erosumika has no sliding window, so it should technically support a real 32k context, and it would clearly be better than this merge.
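You can check this yourself before downloading: Mistral-style models expose `sliding_window` and `max_position_embeddings` in their `config.json`. A minimal sketch (the helper function and thresholds here are my own illustration, not part of any library):

```python
def supports_full_context(config: dict) -> bool:
    """Heuristic: a model advertises 'real' long context when its
    max_position_embeddings is large and no sliding window caps attention.
    `config` is the parsed config.json, e.g. json.load(open("config.json"))."""
    max_ctx = config.get("max_position_embeddings", 0)
    sliding = config.get("sliding_window")  # null/None means no window
    return max_ctx >= 32768 and sliding is None

# A Mistral-style config with the sliding window disabled passes the check
cfg = {"max_position_embeddings": 32768, "sliding_window": None}
print(supports_full_context(cfg))  # True
```

A model with `"sliding_window": 4096` would fail the check even if it claims 32k positions.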
Interesting, thanks for the recs. I'm actually exclusively using free Colab, and I've been trying to find an NSFW-oriented model (ideally below 13B) that can run a complicated simulation card (the card I'm trying to get working is 'Animal Farm' from Chub). I've been targeting models finetuned for higher context because I'm somewhat wary of the perplexity/quality loss if I just take any model and crank up its context, especially in EXL formats.
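For context, the perplexity worry comes from position scaling: to run a model past its trained context you typically compress position indices (linear RoPE scaling), and larger compression factors tend to hurt quality. A quick sketch of the usual rule of thumb (the function name is mine, just for illustration):

```python
def rope_scale_factor(target_ctx: int, native_ctx: int) -> float:
    """Linear RoPE scaling: position indices are compressed by this
    factor so target_ctx tokens fit the model's native position range.
    Factors much above ~2-4x usually cost noticeable perplexity."""
    return target_ctx / native_ctx

# Stretching a natively 8k model to 32k needs a 4x compression,
# while a native-32k model like Erosumika needs none (factor 1.0)
print(rope_scale_factor(32768, 8192))   # 4.0
print(rope_scale_factor(32768, 32768))  # 1.0
```

That's why a model trained at 32k natively is preferable to scaling a shorter-context one.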