---
license: apache-2.0
---

This is an experimental model, yet it is the most powerful RNN model in the world.

# Mobius RWKV r6 chat 12B 16k

Mobius is an RWKV v6 arch chat model, benefiting from [Matrix-Valued States and Dynamic Recurrence](https://arxiv.org/abs/2404.05892).

## Introduction

Mobius is an RWKV v6 arch model: a state-based RNN+CNN+Transformer mixed language model pretrained on a certain amount of data.

In comparison with the previously released Mobius, the improvements include:

* Only 24 GB of VRAM needed to run this model locally in fp16;
* Significant performance improvements;
* Multilingual support;
* Stable support for a 16K context length;
* Function call support.

## Usage

We encourage you to use few-shot prompting with this model; although directly using `User: xxxx\n\nAssistant: xxxx\n\n` works well too, few-shot examples can bring out the model's full potential.
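As a minimal sketch of the prompt layout described above (the helper name and the example turns are hypothetical, not part of this card):

```python
# Minimal sketch of assembling a few-shot prompt in the
# `User: ...\n\nAssistant: ...\n\n` format described above.
# The helper name and the example turns are hypothetical.
def build_prompt(examples, question):
    """examples: list of (user, assistant) few-shot pairs."""
    parts = [f"User: {u}\n\nAssistant: {a}\n\n" for u, a in examples]
    # The final turn ends at "Assistant:" so the model completes it.
    parts.append(f"User: {question}\n\nAssistant:")
    return "".join(parts)

prompt = build_prompt(
    [("What is 2 + 2?", "2 + 2 = 4.")],
    "What is 3 + 5?",
)
print(prompt)
```

Leaving the prompt hanging at `Assistant:` is what lets the model generate the answer as a continuation.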

Recommended sampling settings (temperature/top-p): 0.7/0.6, 1/0.3, 1.5/0.3, 0.2/0.8.
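Reading the recommendations above as (temperature, top-p) pairs — an interpretation, since the original listing is terse — the presets could be kept in code as:

```python
# Candidate (temperature, top_p) presets, read off the recommended
# settings above; the exact pairing is an interpretation, not
# something this card states explicitly.
SAMPLING_PRESETS = [
    {"temperature": 0.7, "top_p": 0.6},
    {"temperature": 1.0, "top_p": 0.3},
    {"temperature": 1.5, "top_p": 0.3},
    {"temperature": 0.2, "top_p": 0.8},
]

for p in SAMPLING_PRESETS:
    print(f"temperature={p['temperature']}, top_p={p['top_p']}")
```

Lower-temperature presets favor deterministic answers; higher-temperature ones favor more varied output.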

Function call format:

System: func xxxx

User: xxxx

Assistant: xxxx

Observation: xxxx

Assistant: xxxx
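A sketch of assembling that transcript shape in code follows; the role names match this card's format, but the weather function, its argument, and the observation text are all hypothetical placeholders:

```python
# Sketch of rendering the function-call transcript format above:
# System declares the available function, the Assistant emits a call,
# the tool result comes back as an Observation turn, and the
# Assistant answers. All concrete strings here are hypothetical.
def render(turns):
    """turns: list of (role, text) pairs -> flat chat transcript."""
    return "\n\n".join(f"{role}: {text}" for role, text in turns)

transcript = render([
    ("System", "func get_weather(city)"),
    ("User", "What's the weather in Paris?"),
    ("Assistant", 'get_weather("Paris")'),
    ("Observation", "sunny, 22 C"),
    ("Assistant", "It is sunny and 22 C in Paris."),
])
print(transcript)
```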

## More details

Mobius 12B 16k is based on the RWKV v6 arch, a leading state-based RNN+CNN+Transformer mixed large language model focused on the open-source community:

* 10~100x lower training/inference cost;
* State-based, selective memory, which makes it good at grokking context;
* Community support.

## Requirements

24 GB of VRAM to run in fp16, 12 GB for int8, and 6 GB for nf4 with the Ai00 server.
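These figures line up with a back-of-the-envelope weight-memory estimate (parameter count times bytes per parameter; activations and the recurrent state are extra and not counted here):

```python
# Rough weight-only VRAM estimate: params * bytes per parameter.
# For ~12e9 parameters: fp16 (2 B) -> ~24 GB, int8 (1 B) -> ~12 GB,
# nf4 (0.5 B) -> ~6 GB, matching the figures above. Activations and
# the recurrent state add overhead on top of this.
PARAMS = 12e9  # approximate parameter count

def weight_vram_gb(bytes_per_param, params=PARAMS):
    return params * bytes_per_param / 1e9

print(weight_vram_gb(2))    # fp16
print(weight_vram_gb(1))    # int8
print(weight_vram_gb(0.5))  # nf4
```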

* [RWKV Runner](https://github.com/josStorer/RWKV-Runner)
* [Ai00 server](https://github.com/cgisky1980/ai00_rwkv_server)

## Benchmark

Ceval:

Cmmlu: