Update README.md
README.md CHANGED

@@ -13,7 +13,9 @@ This is a **RoBERTa-base** model trained from scratch in Spanish.
 
 The training dataset is [mc4](https://huggingface.co/datasets/bertin-project/mc4-es-sampled) subsampling documents to a total of about 50 million examples. Sampling is random.
 
-This model has been trained for
+This model has been trained for 230,000 steps (early-stopped before the intended 250,000 steps).
+
+Please see our main [card](https://huggingface.co/bertin-project/bertin-roberta-base-spanish) for more information.
 
 This is part of the
 [Flax/Jax Community Week](https://discuss.huggingface.co/t/open-to-the-community-community-week-using-jax-flax-for-nlp-cv/7104), organised by [HuggingFace](https://huggingface.co/) and TPU usage sponsored by Google.
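The random subsampling the README describes (drawing about 50 million mc4-es documents at random) can be sketched as follows. This is a minimal illustration, not the project's actual sampling script; the `subsample` helper, the seed, and the toy document list are all hypothetical, with only the idea of seeded random document selection taken from the text.

```python
import random

def subsample(docs, target, seed=0):
    """Randomly draw `target` documents from `docs`, reproducibly via `seed`."""
    rng = random.Random(seed)
    return rng.sample(docs, target)

# Toy corpus standing in for the mc4-es document stream.
docs = [f"doc-{i}" for i in range(200)]
sampled = subsample(docs, target=50)
print(len(sampled))  # 50
```

In practice one would stream the real dataset and sample by document index rather than materialize it in memory, but the reproducible-seed idea is the same.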