---
language:
- en
- zh
- de
- fr
- es
- pt
- ru
- it
- ja
- ko
- vi
- ar
tags:
- pytorch
- text-generation
- causal-lm
- rwkv
license: apache-2.0
datasets:
- EleutherAI/pile
- togethercomputer/RedPajama-Data-1T
---
# RWKV-4 World
## Model Description
RWKV-4 trained on 100+ world languages.
How to use:
* Use the latest `rwkv` pip package (0.7.4+); see the loading sketch at the end of this section.
* Use the latest ChatRWKV `v2/chat.py`.
* Set `pipeline = PIPELINE(model, "rwkv_vocab_v20230424")` instead of `20B_tokenizer.json` (use this string exactly as written; `rwkv_vocab_v20230424` is included in rwkv 0.7.4+).
* Use Q/A, User/AI, or Human/Bot prompts instead of Bob/Alice.
Example:
```
Q: hi
A: Hi. I am your assistant and I will provide expert full response in full details. Please feel free to ask any question and I will always answer it.
Q: xxxxxx
A:
```
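A minimal loading sketch with the `rwkv` pip package; the checkpoint file name, strategy, and sampling settings below are placeholders, so adjust them to your local setup:
```
# Minimal sketch, assuming rwkv 0.7.4+ and a locally downloaded RWKV-4 World checkpoint.
import os
os.environ["RWKV_JIT_ON"] = "1"   # enable the JIT kernel before importing rwkv
os.environ["RWKV_CUDA_ON"] = "0"  # set to "1" to build the optional CUDA kernel

from rwkv.model import RWKV
from rwkv.utils import PIPELINE, PIPELINE_ARGS

# Hypothetical checkpoint path (placeholder, point this at your downloaded file).
model = RWKV(model="RWKV-4-World-3B-v1-20230619-ctx4096", strategy="cuda fp16")

# Use the bundled world tokenizer instead of 20B_tokenizer.json.
pipeline = PIPELINE(model, "rwkv_vocab_v20230424")

# Q/A prompt format, as recommended above.
prompt = "Q: hi\n\nA:"
args = PIPELINE_ARGS(temperature=1.0, top_p=0.3)
print(pipeline.generate(prompt, token_count=200, args=args))
```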