Text Generation · PyTorch · causal-lm · rwkv
BlinkDL committed on
Commit aee797f
1 Parent(s): 3957ef8

Update README.md

Files changed (1)
  1. README.md +9 -0
README.md CHANGED
@@ -34,3 +34,12 @@ How to use:
 * use latest ChatRWKV v2/chat.py
 * set pipeline = PIPELINE(model, "rwkv_vocab_v20230424") instead of 20B_tokenizer.json (EXACTLY AS WRITTEN HERE. "rwkv_vocab_v20230424" is included in rwkv 0.7.4+)
 * Use Q/A or User/AI or Human/Bot prompt, instead of Bob/Alice
+
+Example:
+```Q: hi
+
+A: Hi. I am your assistant and I will provide expert full response in full details. Please feel free to ask any question and I will always answer it.
+
+Q: xxxxxx
+
+A:```
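
For context, a minimal sketch of what the instructions in this diff amount to when calling the `rwkv` pip package (0.7.4+) directly instead of running ChatRWKV's v2/chat.py. The checkpoint path, `strategy`, and sampling settings below are illustrative assumptions, not values taken from this repo:

```python
import os
os.environ["RWKV_JIT_ON"] = "1"   # use the TorchScript path of the rwkv package
os.environ["RWKV_CUDA_ON"] = "0"  # set to "1" only if the custom CUDA kernel is built

from rwkv.model import RWKV
from rwkv.utils import PIPELINE, PIPELINE_ARGS

# Placeholder path: point this at a downloaded RWKV "world" .pth checkpoint.
model = RWKV(model="path/to/RWKV-world-model", strategy="cpu fp32")

# New world tokenizer, exactly as in the README -- not 20B_tokenizer.json.
pipeline = PIPELINE(model, "rwkv_vocab_v20230424")

# Q/A prompt format (not Bob/Alice), matching the example added in this commit.
prompt = "Q: hi\n\nA:"

# Example sampling settings; tune as needed.
args = PIPELINE_ARGS(temperature=1.0, top_p=0.7)
print(pipeline.generate(prompt, token_count=200, args=args))
```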