Ransaka committed on
Commit
086c481
1 Parent(s): bb48876

Update README.md

Files changed (1): README.md +1 -1
README.md CHANGED
@@ -4,7 +4,7 @@ This is the 8-bit quantized version of Facebook's mbart model.
 
 According to the abstract, MBART is a sequence-to-sequence denoising auto-encoder pretrained on large-scale monolingual corpora in many languages using the BART objective. mBART is one of the first methods for pretraining a complete sequence-to-sequence model by denoising full texts in multiple languages, while previous approaches have focused only on the encoder, decoder, or reconstructing parts of the text.
 
-This model was contributed by [valhalla](https://huggingface.co/valhalla). The Authors’ code can be found [here](https://github.com/facebookresearch/fairseq/tree/main/examples/mbart)
+The Authors’ code can be found [here](https://github.com/facebookresearch/fairseq/tree/main/examples/mbart)
 
 ## Usage info