IndoBERT Lite Base IndoNLI Multilingual NLI Distil mDeBERTa

IndoBERT Lite Base IndoNLI Multilingual NLI Distil mDeBERTa is a natural language inference (NLI) model based on the ALBERT architecture. It starts from the pre-trained indobenchmark/indobert-lite-base-p1 checkpoint, which was then fine-tuned on IndoNLI and the Indonesian subsets of MoritzLaurer/multilingual-NLI-26lang-2mil7 while being distilled from the MoritzLaurer/mDeBERTa-v3-base-xnli-multilingual-nli-2mil7 teacher model.
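
Below is a minimal inference sketch using the Hugging Face transformers library. The premise/hypothesis pair is an invented example, and label names are read from model.config.id2label rather than assumed; none of this is taken from the original card.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Hedged usage sketch: load the model and score a premise/hypothesis pair.
model_name = "LazarusNLP/indobert-lite-base-p1-indonli-multilingual-nli-distil-mdeberta"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name)
model.eval()

premise = "Keanu Reeves terkenal lewat perannya di film The Matrix."  # invented example
hypothesis = "Keanu Reeves pernah bermain dalam sebuah film."         # invented example

inputs = tokenizer(premise, hypothesis, truncation=True, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Map probabilities to label names via the model config rather than assuming a fixed order.
probs = torch.softmax(logits, dim=-1)[0]
for idx, p in enumerate(probs.tolist()):
    print(f"{model.config.id2label[idx]}: {p:.3f}")
```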

Evaluation Results

| Dataset | dev Acc. | test_lay Acc. | test_expert Acc. |
|---------|----------|---------------|------------------|
| IndoNLI | 78.60    | 74.69         | 65.55            |
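
For reference, accuracy on one of the IndoNLI test splits could be recomputed along the following lines. This is a hedged sketch, not the original evaluation script: the dataset id ("indonli"), split name, and column names are assumptions, and predictions are compared to gold labels by name to avoid relying on a particular label ordering.

```python
import torch
from datasets import load_dataset
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_name = "LazarusNLP/indobert-lite-base-p1-indonli-multilingual-nli-distil-mdeberta"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name).eval()

# Dataset id, split, and column names are assumptions based on the IndoNLI release.
dataset = load_dataset("indonli", split="test_lay")
label_names = dataset.features["label"].names  # gold label id -> label name

correct = 0
for example in dataset:
    inputs = tokenizer(example["premise"], example["hypothesis"],
                       truncation=True, return_tensors="pt")
    with torch.no_grad():
        pred_id = model(**inputs).logits.argmax(dim=-1).item()
    # Compare by label name so prediction and gold need not share the same id ordering.
    if model.config.id2label[pred_id].lower() == label_names[example["label"]].lower():
        correct += 1

print(f"test_lay accuracy: {correct / len(dataset):.4f}")
```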

Model

| Model | #params | Arch. | Training/Validation data (text) |
|-------|---------|-------|---------------------------------|
| indobert-lite-base-p1-indonli-multilingual-nli-distil-mdeberta | 11.7M | ALBERT Base | IndoNLI, Multilingual NLI (id) |

Training procedure

Training hyperparameters

The following hyperparameters were used during training; a corresponding TrainingArguments sketch is shown after the list:

  • learning_rate: 2e-05
  • train_batch_size: 64
  • eval_batch_size: 64
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 5
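
As a rough illustration, these hyperparameters map onto the transformers TrainingArguments as follows. This is a hedged reconstruction rather than the original training script: output_dir is a placeholder, and the distillation against the mDeBERTa teacher (which requires a custom loss) is not shown.

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters listed above. output_dir is a placeholder; the actual run
# also distilled from the mDeBERTa teacher, which a plain Trainer setup does not cover.
training_args = TrainingArguments(
    output_dir="indobert-lite-base-p1-indonli-distil",  # placeholder path
    learning_rate=2e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=5,
    evaluation_strategy="epoch",  # assumption: per-epoch validation, matching the table below
)
```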

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1     | Precision | Recall |
|---------------|-------|------|-----------------|----------|--------|-----------|--------|
| 0.4808        | 1.0   | 1803 | 0.4418          | 0.7683   | 0.7593 | 0.7904    | 0.7554 |
| 0.4529        | 2.0   | 3606 | 0.4343          | 0.7738   | 0.7648 | 0.7893    | 0.7619 |
| 0.4263        | 3.0   | 5409 | 0.4383          | 0.7861   | 0.7828 | 0.7874    | 0.7807 |
| 0.3980        | 4.0   | 7212 | 0.4456          | 0.7792   | 0.7767 | 0.7792    | 0.7756 |
| 0.3772        | 5.0   | 9015 | 0.4499          | 0.7711   | 0.7674 | 0.7700    | 0.7661 |

Framework versions

  • Transformers 4.36.2
  • PyTorch 2.1.2+cu121
  • Datasets 2.16.1
  • Tokenizers 0.15.0

References

[1] Mahendra, R., Aji, A. F., Louvan, S., Rahman, F., & Vania, C. (2021, November). IndoNLI: A Natural Language Inference Dataset for Indonesian. Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing. Association for Computational Linguistics.
