izhx committed
Commit 5d8e84a
Parent: 313ffc8

Update README.md

Files changed (1): README.md (+3, -3)
README.md CHANGED
@@ -6,14 +6,14 @@ language:
 - en
 pipeline_tag: fill-mask
 ---
-## gte-multilingual-mlm-base
+## gte-en-mlm-base
 
 
 We introduce the `GTE-v1.5` series, new generalized text encoder, embedding, and reranking models that support a context length of up to 8192.
 The models are built upon the transformer++ encoder backbone (BERT + RoPE + GLU; code at [`Alibaba-NLP/new-impl`](https://huggingface.co/Alibaba-NLP/new-impl))
 as well as the vocabulary of `bert-base-uncased`.
 
-This text encoder is the `GTEv1.5-en-MLM-8192` in table 13 of our [paper](https://arxiv.org/pdf/2407.19669).
+This text encoder is the `GTEv1.5-en-MLM-base-8192` in table 13 of our [paper](https://arxiv.org/pdf/2407.19669).
 
 - **Developed by**: Institute for Intelligent Computing, Alibaba Group
 - **Model type**: Text Encoder
@@ -69,7 +69,7 @@ The entire training process is as follows:
 If you find our paper or models helpful, please consider citing them as follows:
 
 ```
-@misc{zhang2024mgtegeneralizedlongcontexttext,
+@misc{zhang2024mgte,
 title={mGTE: Generalized Long-Context Text Representation and Reranking Models for Multilingual Text Retrieval},
 author={Xin Zhang and Yanzhao Zhang and Dingkun Long and Wen Xie and Ziqi Dai and Jialong Tang and Huan Lin and Baosong Yang and Pengjun Xie and Fei Huang and Meishan Zhang and Wenjie Li and Min Zhang},
 year={2024},
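
For orientation, a minimal sketch of loading the encoder this README describes for its declared `fill-mask` pipeline tag. The repo id `Alibaba-NLP/gte-en-mlm-base` is assumed from the renamed heading, and `trust_remote_code=True` reflects that the backbone implementation lives at `Alibaba-NLP/new-impl` rather than in the `transformers` library; this is a sketch, not part of the commit.

```python
# Sketch (not from this commit): run the fill-mask pipeline with the encoder.
# The repo id below is assumed from the README heading "gte-en-mlm-base".
from transformers import pipeline

fill_mask = pipeline(
    "fill-mask",
    model="Alibaba-NLP/gte-en-mlm-base",  # assumed repo id
    trust_remote_code=True,  # modeling code is hosted in Alibaba-NLP/new-impl
)

# bert-base-uncased vocabulary, so the mask token is [MASK]
print(fill_mask("The capital of France is [MASK]."))
```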