---
license: cc-by-nc-4.0
base_model: nguyenvulebinh/wav2vec2-base-vietnamese-250h
tags:
  - generated_from_trainer
metrics:
  - wer
model-index:
  - name: >-
      w2v2_ablation_focal_ctc_a0.75_g2.0-best_on-ling_head-tp0.025_tl10_fp0.001_fl16
    results: []
---

# w2v2_ablation_focal_ctc_a0.75_g2.0-best_on-ling_head-tp0.025_tl10_fp0.001_fl16

This model is a fine-tuned version of [nguyenvulebinh/wav2vec2-base-vietnamese-250h](https://huggingface.co/nguyenvulebinh/wav2vec2-base-vietnamese-250h) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 2.8829
- Wer: 0.0879
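
Since this is a CTC-based wav2vec2 checkpoint, transcription should follow the standard `transformers` pattern. A minimal inference sketch, assuming the checkpoint exposes the usual `Wav2Vec2` processor and CTC head; the repo id below is an assumption and may need adjusting:

```python
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

# Hypothetical repo id; replace with the actual hub path of this model.
model_id = "tuanio/w2v2_ablation_focal_ctc_a0.75_g2.0-best_on-ling_head-tp0.025_tl10_fp0.001_fl16"

processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id).eval()

# Load any mono audio clip and resample to the 16 kHz the model expects.
waveform, sr = torchaudio.load("example.wav")
if sr != 16_000:
    waveform = torchaudio.functional.resample(waveform, sr, 16_000)

inputs = processor(waveform.squeeze().numpy(), sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: argmax per frame, then collapse repeats and blanks.
pred_ids = logits.argmax(dim=-1)
print(processor.batch_decode(pred_ids)[0])
```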

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure
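
The model name encodes the training objective: a focal CTC loss with alpha = 0.75 and gamma = 2.0, which down-weights utterances the model already transcribes easily so training focuses on hard ones. The exact implementation is not published here; the following is a hedged sketch of the common focal reweighting of a per-sample CTC loss (the function name and blank index are assumptions):

```python
import torch
import torch.nn.functional as F

def focal_ctc_loss(log_probs, targets, input_lengths, target_lengths,
                   alpha=0.75, gamma=2.0, blank=0):
    """Focal-weighted CTC: scale each sample's CTC loss by alpha * (1 - p)**gamma,
    where p = exp(-ctc) approximates the probability of the target sequence."""
    ctc = F.ctc_loss(log_probs, targets, input_lengths, target_lengths,
                     blank=blank, reduction="none", zero_infinity=True)
    p = torch.exp(-ctc)
    return (alpha * (1.0 - p) ** gamma * ctc).mean()
```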

### Training hyperparameters

The following hyperparameters were used during training (an equivalent `TrainingArguments` sketch follows the list):

- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 16
- seed: 42
- distributed_type: multi-GPU
- num_devices: 4
- total_train_batch_size: 32
- total_eval_batch_size: 64
- optimizer: Adam with betas=(0.9,0.98) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
- mixed_precision_training: Native AMP
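
These map directly onto Hugging Face `TrainingArguments`. A minimal sketch, assuming the per-device batch sizes above across the 4 GPUs; the output path is hypothetical:

```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="w2v2_ablation_focal_ctc",  # hypothetical path
    learning_rate=2e-5,
    per_device_train_batch_size=8,   # x4 GPUs -> total 32
    per_device_eval_batch_size=16,   # x4 GPUs -> total 64
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.98,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    num_train_epochs=100,
    fp16=True,  # "Native AMP" mixed precision
)
```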

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Wer     |
|:-------------:|:-----:|:-----:|:---------------:|:-------:|
| 1337.3802     | 0.94  | 100   | 875.6535        | 18.6404 |
| 928.4498      | 1.89  | 200   | 336.8592        | 17.0854 |
| 159.8141      | 2.83  | 300   | 65.9143         | 1.0     |
| 84.4352       | 3.77  | 400   | 60.3730         | 1.0     |
| 77.6086       | 4.72  | 500   | 57.3593         | 1.0     |
| 74.6091       | 5.66  | 600   | 56.1616         | 1.0     |
| 73.5983       | 6.6   | 700   | 55.2774         | 1.0     |
| 72.9967       | 7.55  | 800   | 54.6511         | 1.0     |
| 71.2266       | 8.49  | 900   | 54.5362         | 1.0     |
| 69.7741       | 9.43  | 1000  | 51.8718         | 0.9648  |
| 58.1878       | 10.38 | 1100  | 28.9001         | 0.5655  |
| 32.9238       | 11.32 | 1200  | 12.7097         | 0.2391  |
| 21.0735       | 12.26 | 1300  | 8.5885          | 0.1785  |
| 15.9281       | 13.21 | 1400  | 6.8959          | 0.1529  |
| 13.7108       | 14.15 | 1500  | 5.7514          | 0.1392  |
| 11.2293       | 15.09 | 1600  | 4.9739          | 0.1244  |
| 10.3682       | 16.04 | 1700  | 4.5084          | 0.1237  |
| 9.6654        | 16.98 | 1800  | 4.3703          | 0.1259  |
| 8.816         | 17.92 | 1900  | 4.1278          | 0.1143  |
| 8.8608        | 18.87 | 2000  | 3.9105          | 0.1074  |
| 7.8629        | 19.81 | 2100  | 3.9114          | 0.1237  |
| 7.8569        | 20.75 | 2200  | 3.7354          | 0.1121  |
| 7.3392        | 21.7  | 2300  | 3.6668          | 0.1056  |
| 7.2164        | 22.64 | 2400  | 3.5747          | 0.1128  |
| 7.2758        | 23.58 | 2500  | 3.4933          | 0.1016  |
| 6.4516        | 24.53 | 2600  | 3.4821          | 0.0988  |
| 6.45          | 25.47 | 2700  | 3.3720          | 0.0996  |
| 6.0068        | 26.42 | 2800  | 3.4425          | 0.1044  |
| 5.5781        | 27.36 | 2900  | 3.3221          | 0.1014  |
| 5.5837        | 28.3  | 3000  | 3.4974          | 0.1041  |
| 5.7895        | 29.25 | 3100  | 3.3536          | 0.0950  |
| 5.6272        | 30.19 | 3200  | 3.2036          | 0.0960  |
| 5.594         | 31.13 | 3300  | 3.1747          | 0.0913  |
| 4.791         | 32.08 | 3400  | 3.1225          | 0.1038  |
| 5.0596        | 33.02 | 3500  | 3.2113          | 0.1095  |
| 4.985         | 33.96 | 3600  | 3.0622          | 0.0929  |
| 4.731         | 34.91 | 3700  | 3.0940          | 0.0956  |
| 4.6287        | 35.85 | 3800  | 3.0453          | 0.0961  |
| 4.5235        | 36.79 | 3900  | 3.0351          | 0.1019  |
| 4.7715        | 37.74 | 4000  | 3.0237          | 0.0928  |
| 4.7101        | 38.68 | 4100  | 3.0250          | 0.0943  |
| 4.243         | 39.62 | 4200  | 2.9704          | 0.0980  |
| 4.4015        | 40.57 | 4300  | 2.9600          | 0.0871  |
| 4.4545        | 41.51 | 4400  | 2.9806          | 0.0858  |
| 4.662         | 42.45 | 4500  | 2.9668          | 0.0969  |
| 4.0696        | 43.4  | 4600  | 2.9349          | 0.0935  |
| 3.5668        | 44.34 | 4700  | 2.9190          | 0.0917  |
| 3.8214        | 45.28 | 4800  | 2.9490          | 0.0901  |
| 3.8215        | 46.23 | 4900  | 2.9371          | 0.0912  |
| 3.6593        | 47.17 | 5000  | 2.9408          | 0.0875  |
| 3.3709        | 48.11 | 5100  | 2.9577          | 0.0920  |
| 3.5768        | 49.06 | 5200  | 2.9863          | 0.0940  |
| 3.3018        | 50.0  | 5300  | 2.9437          | 0.1003  |
| 3.2921        | 50.94 | 5400  | 2.9195          | 0.0923  |
| 3.4551        | 51.89 | 5500  | 2.9410          | 0.0950  |
| 3.6576        | 52.83 | 5600  | 2.9520          | 0.1011  |
| 3.5078        | 53.77 | 5700  | 2.8926          | 0.0937  |
| 3.0777        | 54.72 | 5800  | 2.8971          | 0.0913  |
| 3.0572        | 55.66 | 5900  | 2.8693          | 0.0891  |
| 3.0486        | 56.6  | 6000  | 2.8876          | 0.0882  |
| 3.1283        | 57.55 | 6100  | 2.8597          | 0.0913  |
| 2.8705        | 58.49 | 6200  | 2.9080          | 0.0904  |
| 3.0644        | 59.43 | 6300  | 2.9106          | 0.0917  |
| 2.8822        | 60.38 | 6400  | 2.9231          | 0.0891  |
| 3.2338        | 61.32 | 6500  | 2.9511          | 0.0903  |
| 3.048         | 62.26 | 6600  | 2.9539          | 0.0898  |
| 3.094         | 63.21 | 6700  | 2.9490          | 0.0908  |
| 3.0581        | 64.15 | 6800  | 2.8952          | 0.0886  |
| 2.9343        | 65.09 | 6900  | 2.8926          | 0.0883  |
| 2.9497        | 66.04 | 7000  | 2.8732          | 0.0888  |
| 2.7788        | 66.98 | 7100  | 2.8837          | 0.0904  |
| 2.7765        | 67.92 | 7200  | 2.9169          | 0.0951  |
| 3.134         | 68.87 | 7300  | 2.9030          | 0.0926  |
| 2.8812        | 69.81 | 7400  | 2.9045          | 0.0921  |
| 2.615         | 70.75 | 7500  | 2.9148          | 0.0871  |
| 2.5678        | 71.7  | 7600  | 2.9435          | 0.0922  |
| 2.4858        | 72.64 | 7700  | 2.9050          | 0.0928  |
| 2.5367        | 73.58 | 7800  | 2.8948          | 0.0878  |
| 2.3228        | 74.53 | 7900  | 2.8995          | 0.0891  |
| 2.5849        | 75.47 | 8000  | 2.9289          | 0.0928  |
| 2.6645        | 76.42 | 8100  | 2.8950          | 0.0884  |
| 2.6634        | 77.36 | 8200  | 2.9194          | 0.0922  |
| 2.393         | 78.3  | 8300  | 2.9074          | 0.0919  |
| 3.0675        | 79.25 | 8400  | 2.8927          | 0.0908  |
| 2.6344        | 80.19 | 8500  | 2.8768          | 0.0891  |
| 2.5742        | 81.13 | 8600  | 2.8809          | 0.0911  |
| 2.6523        | 82.08 | 8700  | 2.8639          | 0.0863  |
| 2.2657        | 83.02 | 8800  | 2.8809          | 0.0912  |
| 2.3238        | 83.96 | 8900  | 2.8764          | 0.0893  |
| 2.3664        | 84.91 | 9000  | 2.8738          | 0.0913  |
| 2.5655        | 85.85 | 9100  | 2.8876          | 0.0904  |
| 2.4372        | 86.79 | 9200  | 2.9024          | 0.0910  |
| 2.5267        | 87.74 | 9300  | 2.8922          | 0.0898  |
| 2.471         | 88.68 | 9400  | 2.8893          | 0.0884  |
| 2.5225        | 89.62 | 9500  | 2.8852          | 0.0888  |
| 2.4752        | 90.57 | 9600  | 2.8876          | 0.0892  |
| 2.5029        | 91.51 | 9700  | 2.8883          | 0.0885  |
| 2.7052        | 92.45 | 9800  | 2.8825          | 0.0871  |
| 2.4682        | 93.4  | 9900  | 2.8780          | 0.0870  |
| 2.3672        | 94.34 | 10000 | 2.8810          | 0.0872  |
| 2.5325        | 95.28 | 10100 | 2.8842          | 0.0884  |
| 2.4877        | 96.23 | 10200 | 2.8833          | 0.0884  |
| 2.7373        | 97.17 | 10300 | 2.8825          | 0.0882  |
| 2.5574        | 98.11 | 10400 | 2.8833          | 0.0881  |
| 2.2097        | 99.06 | 10500 | 2.8823          | 0.0883  |
| 2.5919        | 100.0 | 10600 | 2.8829          | 0.0879  |

### Framework versions

- Transformers 4.35.2
- Pytorch 1.13.1+cu117
- Datasets 2.12.0
- Tokenizers 0.14.1