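Example usage with the Transformers library, generating a summary from a paper abstract (the checkpoint is loaded with `use_auth_token=True`, so you need to be authenticated with the Hugging Face Hub):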
```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Load the BART tokenizer and the fine-tuned SciTLDR summarization checkpoint
tokenizer = AutoTokenizer.from_pretrained("facebook/bart-large")
model = AutoModelForSeq2SeqLM.from_pretrained("vadis/bart_scitldr", use_auth_token=True)

# Tokenize the input abstract, generate a summary, and decode it
text = "Abstract of a paper."
batch = tokenizer(text, return_tensors="pt")
generated_ids = model.generate(batch["input_ids"])
print(tokenizer.batch_decode(generated_ids, skip_special_tokens=True))
```