Indonesian RoBERTa Collection
Various downstream fine-tuning of Indonesian RoBERTa (7 items).
This model is a fine-tuned version of flax-community/indonesian-roberta-base on the indonlu dataset. Its per-epoch results on the evaluation set are listed in the training results table below.

Model description
More information needed

Intended uses & limitations
More information needed

Training and evaluation data
More information needed
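As a usage sketch only: the card does not name the fine-tuned checkpoint or its task head, but the precision/recall/F1 metrics suggest token classification (e.g. NER) on IndoNLU, so the model could be loaded with the transformers pipeline roughly like this. The model ID below is a placeholder, not taken from the card.

```python
from transformers import pipeline

# Hypothetical checkpoint ID; substitute the actual fine-tuned model.
# Metrics in this card suggest a token-classification (NER-style) head.
ner = pipeline(
    "token-classification",
    model="username/indonesian-roberta-base-indonlu-ner",  # assumed, not from the card
    aggregation_strategy="simple",  # merge sub-word tokens into entity spans
)

print(ner("Joko Widodo lahir di Surakarta, Jawa Tengah."))
```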
Training hyperparameters
The following hyperparameters were used during training:

Training results
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|---|---|---|---|---|---|---|---|
| No log | 1.0 | 420 | 0.1419 | 0.7491 | 0.8034 | 0.7753 | 0.9551 |
| 0.2261 | 2.0 | 840 | 0.1317 | 0.7889 | 0.7983 | 0.7936 | 0.9569 |
| 0.1081 | 3.0 | 1260 | 0.1430 | 0.7587 | 0.8300 | 0.7927 | 0.9546 |
| 0.0777 | 4.0 | 1680 | 0.1459 | 0.7848 | 0.8266 | 0.8052 | 0.9577 |
| 0.0563 | 5.0 | 2100 | 0.1525 | 0.7923 | 0.8125 | 0.8022 | 0.9579 |
| 0.0441 | 6.0 | 2520 | 0.1552 | 0.7986 | 0.8176 | 0.8080 | 0.9584 |
| 0.0441 | 7.0 | 2940 | 0.1692 | 0.7910 | 0.8232 | 0.8068 | 0.9584 |
| 0.0387 | 8.0 | 3360 | 0.1677 | 0.7894 | 0.8306 | 0.8095 | 0.9588 |
| 0.032 | 9.0 | 3780 | 0.1784 | 0.7939 | 0.8249 | 0.8091 | 0.9586 |
| 0.0284 | 10.0 | 4200 | 0.1817 | 0.7950 | 0.8261 | 0.8102 | 0.9588 |
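For context, per-epoch validation metrics of this shape are typically produced by a seqeval-based `compute_metrics` function passed to the `Trainer`. The sketch below assumes that setup; the label list, the `-100` padding convention, and the function name are illustrative and are not taken from this card.

```python
import numpy as np
import evaluate

# seqeval reports entity-level precision/recall/F1 plus token accuracy,
# matching the columns of the training results table above.
seqeval = evaluate.load("seqeval")

def compute_metrics(eval_pred, label_list):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)

    true_labels, true_preds = [], []
    for pred_row, label_row in zip(predictions, labels):
        # -100 marks special tokens / sub-word pieces ignored in the loss.
        true_labels.append([label_list[l] for l in label_row if l != -100])
        true_preds.append(
            [label_list[p] for p, l in zip(pred_row, label_row) if l != -100]
        )

    results = seqeval.compute(predictions=true_preds, references=true_labels)
    return {
        "precision": results["overall_precision"],
        "recall": results["overall_recall"],
        "f1": results["overall_f1"],
        "accuracy": results["overall_accuracy"],
    }
```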
Base model: flax-community/indonesian-roberta-base