---
library_name: transformers
license: apache-2.0
base_model: allenai/OLMoE-1B-7B-0924
tags:
- axolotl
- moe
- roleplay
model-index:
- name: MoE-girl_1BA_7BT
results: []
---
# MoE Girl 1BA 7BT
![image/jpeg](https://cdn-uploads.huggingface.co/production/uploads/634262af8d8089ebaefd410e/kTXXSSSqpb21rfyOX7FUa.jpeg)
A finetune of AllenAI's OLMoE, designed for roleplaying (and maybe general use cases if you try hard enough).
## Disclaimer
PLEASE do not expect godliness out of this; it's a model with 1 billion active parameters. Expect something more akin to Gemma 2 2B than to Llama 3 8B.
## Quants
GGUF (requires a recent build of llama.cpp or KoboldCpp 1.76+; a download sketch follows the list):
- [mradermacher's imatrix quants](https://huggingface.co/mradermacher/MoE-Girl-1BA-7BT-i1-GGUF)
- [our static quants](https://huggingface.co/allura-org/MoE-Girl-1BA-7BT-GGUF)
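If you'd rather fetch a quant programmatically, here is a minimal sketch using `huggingface_hub`; the filename below is hypothetical, so check the repo's file list for the actual quant names.

```python
# Minimal sketch: download one GGUF quant from the static-quant repo.
from huggingface_hub import hf_hub_download

path = hf_hub_download(
    repo_id="allura-org/MoE-Girl-1BA-7BT-GGUF",
    filename="MoE-Girl-1BA-7BT.Q4_K_M.gguf",  # hypothetical filename; check the repo
)
print(path)  # local path to hand to llama.cpp or koboldcpp
```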
## Prompting
Use ChatML.
```
<|im_start|>system
You are a helpful assistant who talks like a pirate.<|im_end|>
<|im_start|>user
Hello there!<|im_end|>
<|im_start|>assistant
Yarr harr harr, me matey!<|im_end|>
```
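If you're loading the full-precision weights with `transformers`, the tokenizer's chat template should render this format for you. A minimal sketch, assuming the main weights live at `allura-org/MoE-Girl-1BA-7BT` (inferred from the quant repo names above):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "allura-org/MoE-Girl-1BA-7BT"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto", torch_dtype="auto")

messages = [
    {"role": "system", "content": "You are a helpful assistant who talks like a pirate."},
    {"role": "user", "content": "Hello there!"},
]
# apply_chat_template renders the ChatML transcript shown above and appends
# the assistant header so the model continues in the assistant role.
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
output_ids = model.generate(input_ids, max_new_tokens=256)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```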
## Thanks
Special thanks to the members of Allura for testing and emotional support, as well as to the creators of all the datasets that went into the Special Sauce used to train this model. I love you all <3 - Fizz