---
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: tun_msa_wav2vec3
  results: []
---

# tun_msa_wav2vec3

This model was trained from scratch on an unknown dataset. It achieves the following results on the evaluation set (a sketch of how such metrics are computed follows the list):

- Loss: 0.5827
- Wer: 0.5757
- Cer: 0.1836
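
WER and CER are word- and character-level error rates. As a point of reference only, here is a minimal sketch of how such metrics are typically computed with the Hugging Face `evaluate` library; the transcripts below are placeholders, not data from this model's evaluation set.

```python
# Minimal sketch: computing WER/CER with the `evaluate` library.
# The reference/prediction strings are placeholders, not real evaluation data.
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

references = ["an example reference transcript"]
predictions = ["an example predicted transcript"]

print("WER:", wer_metric.compute(references=references, predictions=predictions))
print("CER:", cer_metric.compute(references=references, predictions=predictions))
```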

## Model description

More information needed

## Intended uses & limitations

More information needed
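
No usage guidance is documented yet. Assuming this is a wav2vec2-style CTC model for automatic speech recognition (suggested by the model name and the WER/CER metrics), a hypothetical loading sketch might look like the following; the repository id and audio path are placeholders.

```python
# Hypothetical sketch: load the checkpoint with the ASR pipeline.
# Assumes a wav2vec2-style CTC model; the model id and audio path are placeholders.
from transformers import pipeline

asr = pipeline("automatic-speech-recognition", model="Myriam123/tun_msa_wav2vec3")
print(asr("path/to/audio.wav")["text"])
```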

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a hedged `TrainingArguments` sketch follows the list):

- learning_rate: 1e-06
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 700
- num_epochs: 50
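
A sketch of `transformers.TrainingArguments` matching the values above; `output_dir` and any argument not listed in this card (mixed precision, gradient accumulation, logging, etc.) are assumptions.

```python
# Sketch only: reconstructs the listed hyperparameters as TrainingArguments.
# output_dir and anything not listed in the card are assumptions.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="tun_msa_wav2vec3",  # assumed
    learning_rate=1e-6,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=700,
    num_train_epochs=50,
)
```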

### Training results

| Training Loss | Epoch   | Step  | Validation Loss | Wer    | Cer    |
|:-------------:|:-------:|:-----:|:---------------:|:------:|:------:|
| 5.268         | 2.9221  | 900   | 2.2839          | 1.0005 | 0.5919 |
| 2.0283        | 5.8442  | 1800  | 0.9714          | 0.6901 | 0.2320 |
| 1.4551        | 8.7662  | 2700  | 0.7691          | 0.6608 | 0.2154 |
| 1.2854        | 11.6883 | 3600  | 0.7028          | 0.6369 | 0.2057 |
| 1.1381        | 14.6104 | 4500  | 0.6529          | 0.6172 | 0.1991 |
| 1.1017        | 17.5325 | 5400  | 0.6325          | 0.6050 | 0.1952 |
| 1.0674        | 20.4545 | 6300  | 0.6189          | 0.5958 | 0.1914 |
| 0.9982        | 23.3766 | 7200  | 0.6089          | 0.5918 | 0.1895 |
| 0.9585        | 26.2987 | 8100  | 0.5986          | 0.5860 | 0.1877 |
| 0.9073        | 29.2208 | 9000  | 0.5948          | 0.5822 | 0.1866 |
| 0.91          | 32.1429 | 9900  | 0.5915          | 0.5804 | 0.1854 |
| 0.8775        | 35.0649 | 10800 | 0.5885          | 0.5787 | 0.1849 |
| 0.8973        | 37.9870 | 11700 | 0.5877          | 0.5775 | 0.1844 |
| 0.8908        | 40.9091 | 12600 | 0.5857          | 0.5763 | 0.1841 |
| 0.8503        | 43.8312 | 13500 | 0.5831          | 0.5764 | 0.1837 |
| 0.8843        | 46.7532 | 14400 | 0.5831          | 0.5765 | 0.1838 |
| 0.8554        | 49.6753 | 15300 | 0.5827          | 0.5757 | 0.1836 |

### Framework versions

- Transformers 4.41.2
- Pytorch 2.1.2
- Datasets 2.19.2
- Tokenizers 0.19.1