Text Generation
PyTorch
causal-lm
rwkv
BlinkDL committed on
Commit
dd2f5c9
1 Parent(s): 512186e

Update README.md

Files changed (1):
  README.md +1 -0
README.md CHANGED
@@ -32,6 +32,7 @@ RWKV-4 trained on 100+ world languages (70% English, 15% multilang, 15% code).
  How to use:
  * use latest rwkv pip package (0.7.4+)
  * use latest ChatRWKV v2/benchmark_world.py to test
+ * larger models are stronger even though not fully trained yet
 
  The difference between World & Raven:
  * set pipeline = PIPELINE(model, "rwkv_vocab_v20230424") instead of 20B_tokenizer.json (EXACTLY AS WRITTEN HERE. "rwkv_vocab_v20230424" is included in rwkv 0.7.4+)