---
library_name: transformers
license: apache-2.0
base_model: moussaKam/AraBART
tags:
  - generated_from_trainer
model-index:
  - name: resultsara_bertscore
    results: []
---

# resultsara_bertscore

This model is a fine-tuned version of [moussaKam/AraBART](https://huggingface.co/moussaKam/AraBART) on an unknown dataset.
It achieves the following results on the evaluation set:

- Loss: 0.5783
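
The card ships without a usage snippet, so the sketch below shows one plausible way to load and run the model with `transformers`. The hub id `hiba2/resultsara_bertscore` is an assumption inferred from the card's metadata, and since the training data and task are undocumented, the input text and generation settings are purely illustrative.

```python
# Hypothetical usage sketch -- the repo id below is inferred, not confirmed.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "hiba2/resultsara_bertscore"  # assumed hub path
tokenizer = AutoTokenizer.from_pretrained(model_id)
# AraBART is a BART-style encoder-decoder, so the seq2seq auto class applies.
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

text = "..."  # Arabic input text; the target task is not documented
inputs = tokenizer(text, return_tensors="pt", truncation=True)
output_ids = model.generate(**inputs, max_new_tokens=128, num_beams=4)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```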

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 5e-05
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 8
- mixed_precision_training: Native AMP
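
For reference, these hyperparameters map onto `transformers`' `Seq2SeqTrainingArguments` roughly as sketched below. This is a reconstruction, not the author's script: `output_dir` is assumed from the model name, the 500-step evaluation interval is inferred from the results table, and the Adam betas/epsilon listed above are the library defaults, so they need no explicit arguments.

```python
# Hedged reconstruction of the listed hyperparameters; output_dir and the
# 500-step eval interval are assumptions, not confirmed by the card.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="resultsara_bertscore",  # assumed from the model name
    learning_rate=5e-5,
    per_device_train_batch_size=128,
    per_device_eval_batch_size=128,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=8,
    fp16=True,                          # "Native AMP" mixed precision
    eval_strategy="steps",              # inferred from the results table
    eval_steps=500,
)
```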

### Training results

| Training Loss | Epoch  | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 0.7346        | 0.4263 | 500  | 0.5939          |
| 0.7367        | 0.8525 | 1000 | 0.5851          |
| 0.7042        | 1.2788 | 1500 | 0.5825          |
| 0.6854        | 1.7050 | 2000 | 0.5783          |
| 0.6767        | 2.1313 | 2500 | 0.5775          |
| 0.6561        | 2.5575 | 3000 | 0.5758          |
| 0.6562        | 2.9838 | 3500 | 0.5748          |
| 0.6457        | 3.4101 | 4000 | 0.5774          |
| 0.6519        | 3.8363 | 4500 | 0.5755          |
| 0.6396        | 4.2626 | 5000 | 0.5774          |
| 0.626         | 4.6888 | 5500 | 0.5773          |
| 0.6201        | 5.1151 | 6000 | 0.5789          |
| 0.605         | 5.5413 | 6500 | 0.5776          |
| 0.6044        | 5.9676 | 7000 | 0.5770          |
| 0.5899        | 6.3939 | 7500 | 0.5786          |
| 0.5917        | 6.8201 | 8000 | 0.5779          |
| 0.5913        | 7.2464 | 8500 | 0.5783          |

### Framework versions

- Transformers 4.45.0.dev0
- Pytorch 2.4.0+cu121
- Datasets 2.21.0
- Tokenizers 0.19.1