DanielHesslow committed commit 316a050 (parent: 53a23f5): Update README.md

README.md CHANGED
Model | #Params | d_model | layers | lm loss uniref-100
--- | --- | --- | --- | ---
[Large](https://huggingface.co/lightonai/RITA_l) | 680M | 1536 | 24 | 1.82
[XLarge](https://huggingface.co/lightonai/RITA_xl) | 1.2B | 2048 | 24 | 1.70

For full results see our preprint: https://arxiv.org/abs/2205.05789
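The `lm loss` column in the table above is the usual autoregressive language-modeling objective: the average negative log-likelihood (in nats) the model assigns to each next token of held-out UniRef-100 sequences; lower is better. A minimal, model-free sketch of that computation (the per-token probabilities below are made up for illustration):

``` python
import math

def lm_loss(token_probs):
    """Average negative log-likelihood (in nats) over the next-token
    probabilities a model assigned to a held-out sequence."""
    return -sum(math.log(p) for p in token_probs) / len(token_probs)

# Hypothetical probabilities the model gave to four consecutive tokens.
probs = [0.30, 0.12, 0.45, 0.20]
print(round(lm_loss(probs), 2))  # → 1.43
```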

## Usage

Instantiate a model like so:

``` python
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("lightonai/RITA_s", trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained("lightonai/RITA_s")
```

For generation we support pipelines:

``` python
from transformers import pipeline

rita_gen = pipeline('text-generation', model=model, tokenizer=tokenizer)
sequences = rita_gen("MAB", max_length=20, do_sample=True, top_k=950, repetition_penalty=1.2,
                     num_return_sequences=2, eos_token_id=2)
for seq in sequences:
    print(f"seq: {seq['generated_text'].replace(' ', '')}")
```

Or see `example.py`.
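The sampling arguments passed above correspond to standard decoding tricks: `top_k` restricts sampling to the k highest-scoring tokens, and `repetition_penalty` down-weights tokens that were already generated. A minimal, framework-free sketch of that logit reshaping (toy numbers, not RITA's actual vocabulary or logits):

``` python
import math

def adjust_logits(logits, generated_ids, top_k, repetition_penalty):
    # Penalize already-generated tokens (HF-style: divide positive logits,
    # multiply negative ones, so the token always becomes less likely).
    adjusted = list(logits)
    for tok in set(generated_ids):
        if adjusted[tok] > 0:
            adjusted[tok] /= repetition_penalty
        else:
            adjusted[tok] *= repetition_penalty
    # Keep only the top_k highest logits; mask the rest out before sampling.
    threshold = sorted(adjusted, reverse=True)[top_k - 1]
    return [l if l >= threshold else -math.inf for l in adjusted]

logits = [2.0, 1.0, 0.5, -1.0]  # toy 4-token vocabulary
out = adjust_logits(logits, generated_ids=[0], top_k=2, repetition_penalty=1.2)
print(out)  # token 0 penalized; only the top 2 remain sampleable
```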
## How to cite
``` bibtex
@misc{RITA2022,
  doi = {10.48550/ARXIV.2205.05789},
  url = {https://arxiv.org/abs/2205.05789},
  author = {Hesslow, Daniel and Zanichelli, Niccoló and Notin, Pascal and Poli, Iacopo and Marks, Debora},
  title = {RITA: a Study on Scaling Up Generative Protein Sequence Models},
  publisher = {arXiv},
  year = {2022},
}
```