---

[X-ALMA](https://arxiv.org/pdf/2410.03115) builds upon [ALMA-R](https://arxiv.org/pdf/2401.08417) by expanding support from 6 to 50 languages. It utilizes a plug-and-play architecture with language-specific modules, complemented by a carefully designed training recipe. This release includes the **language-specific X-ALMA LoRA module and a merged model that supports the languages in Group 5: English (en), Hungarian (hu), Greek (el), Czech (cs), Polish (pl), Lithuanian (lt), and Latvian (lv)**.

```
@misc{xu2024xalmaplugplay,
      title={X-ALMA: Plug & Play Modules and Adaptive Rejection for Quality Translation at Scale},
      author={Haoran Xu and Kenton Murray and Philipp Koehn and Hieu Hoang and Akiko Eriguchi and Huda Khayrallah},
      year={2024},
      eprint={2410.03115},
      archivePrefix={arXiv},
      primaryClass={cs.CL},
      url={https://arxiv.org/abs/2410.03115},
}
```

All X-ALMA checkpoints are released at huggingface:

| Models | Model Link | Description |
|:-------------:|:---------------:|:---------------:|
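As a minimal usage sketch with `transformers`: the repo id `haoranxu/X-ALMA-13B-Group5` and the ALMA-style translation prompt below are assumptions, not confirmed by this card — check the checkpoint table above for the actual model link and the official examples for the exact prompt format.

```python
def build_prompt(src_lang: str, tgt_lang: str, text: str) -> str:
    """ALMA-style translation prompt (an assumption; verify against the official examples)."""
    return f"Translate this from {src_lang} to {tgt_lang}:\n{src_lang}: {text}\n{tgt_lang}:"


def translate(text: str, src_lang: str = "English", tgt_lang: str = "Czech",
              repo: str = "haoranxu/X-ALMA-13B-Group5") -> str:
    """Greedy-free beam-search translation with the merged Group 5 model (hypothetical repo id)."""
    # Heavy imports are kept local so the prompt helper stays usable without them.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(repo, padding_side="left")
    model = AutoModelForCausalLM.from_pretrained(
        repo, torch_dtype=torch.float16, device_map="auto"
    )
    inputs = tokenizer(build_prompt(src_lang, tgt_lang, text), return_tensors="pt").to(model.device)
    with torch.no_grad():
        out = model.generate(**inputs, max_new_tokens=128, num_beams=5, do_sample=False)
    # Drop the prompt tokens and keep only the generated continuation.
    return tokenizer.decode(out[0][inputs.input_ids.shape[1]:], skip_special_tokens=True).strip()
```

For example, `translate("I love machine translation.")` would download the checkpoint and return its Czech translation. Loading the LoRA module onto a base model instead would go through `peft` rather than the merged checkpoint.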