Multilingual, uncensored, and with an extensive vocabulary.
I want to give my deepest thanks to the Arcee-ai team for this incredible model. I tried it with low expectations for my language (Spanish) and was very pleasantly surprised by how well it handles it, and by how wide its vocabulary is, richer than other models of the same size. It is far superior to the latest Phi Medium in multilingual (Spanish) capabilities. The second thing I loved, and also didn't expect, is that it strikes the right balance on censorship: it rejected almost none of my requests while still adhering to ethics, as it should. Saying they did an excellent job would be an understatement. It's an epic piece of work, and the creation process is as interesting as it is innovative. Thank you very much, keep it up!
It's truly one of the best models I've seen for its parameter count, especially for multilingual tasks. I'd love to see this distill-merge process applied to a Qwen 2.5 Coder model!
or distill-merge the Qwen 2.5 Coder and SuperNova-Medius together! 🤯
I'm curious about the amount of compute needed to do exactly that!
Not uncensored, unfortunately. It still gives refusals. Such a shame, because I really wanted a fully uncensored, capable 14B Qwen model.
Yeah, the censorship is infuriating. So are the "fluffiness" and the whole "political correctness" thing.