
fine-w2v2base-bs16-ep100-lr2e-05-linguistic-rmsnorm-focal_ctc_a0.5_g1.0-0.05_10_0.004_40

This model is a fine-tuned version of nguyenvulebinh/wav2vec2-base-vietnamese-250h on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 2.0619
  • WER: 0.0997
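WER (word error rate) is the word-level edit distance between the hypothesis and the reference, divided by the number of reference words; 0.0997 means roughly one word in ten is substituted, inserted, or deleted. A minimal sketch of the standard Levenshtein-based definition (for illustration only; it is not the exact evaluation script used for this card):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance / reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edit distance between ref[:i] and hyp[:j]
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i  # delete all remaining reference words
    for j in range(len(hyp) + 1):
        dp[0][j] = j  # insert all remaining hypothesis words
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,          # deletion
                           dp[i][j - 1] + 1,          # insertion
                           dp[i - 1][j - 1] + cost)   # substitution / match
    return dp[len(ref)][len(hyp)] / len(ref)
```

For example, `wer("xin chao cac ban", "xin chao cac bun")` is 0.25: one substitution over four reference words.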

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 16
  • eval_batch_size: 8
  • seed: 42
  • distributed_type: multi-GPU
  • num_devices: 4
  • total_train_batch_size: 64
  • total_eval_batch_size: 32
  • optimizer: Adam with betas=(0.9,0.98) and epsilon=1e-08
  • lr_scheduler_type: cosine
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 100
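The aggregate batch sizes and warmup length follow directly from these values: 100 epochs correspond to 5300 optimizer steps (the final row of the results table below), so a warmup ratio of 0.1 gives 530 warmup steps before the cosine decay. A quick check of the arithmetic:

```python
# Per-device settings from the card, scaled across 4 GPUs.
train_batch_size = 16
eval_batch_size = 8
num_devices = 4
total_train_batch_size = train_batch_size * num_devices  # 64, matches the card
total_eval_batch_size = eval_batch_size * num_devices    # 32, matches the card

# Warmup length implied by the schedule and the results table.
num_epochs = 100
total_steps = 5300                           # final step in the results table
steps_per_epoch = total_steps / num_epochs   # 53 optimizer steps per epoch
warmup_ratio = 0.1
warmup_steps = int(warmup_ratio * total_steps)  # 530 warmup steps
```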

Training results

Training Loss  Epoch  Step  Validation Loss  WER
1068.4723 0.94 50 533.5247 15.8344
704.929 1.89 100 149.4564 0.9983
104.7041 2.83 150 45.2165 1.0
57.2527 3.77 200 42.7504 1.0
55.1203 4.72 250 41.4518 1.0
52.9285 5.66 300 39.8048 1.0
51.1361 6.6 350 38.6911 1.0
49.1867 7.55 400 37.8983 1.0
48.293 8.49 450 37.5179 1.0
48.8025 9.43 500 37.2562 1.0
48.0603 10.38 550 37.0740 1.0
48.0837 11.32 600 37.0175 0.9999
45.7671 12.26 650 33.4394 0.9620
38.2468 13.21 700 20.5908 0.5614
22.0048 14.15 750 9.7715 0.2622
13.6453 15.09 800 6.5392 0.1925
10.3565 16.04 850 5.1822 0.1627
8.4776 16.98 900 4.4310 0.1547
7.2782 17.92 950 3.9109 0.1441
6.6759 18.87 1000 3.5788 0.1371
6.0682 19.81 1050 3.3775 0.1336
5.5782 20.75 1100 3.1172 0.1222
5.4805 21.7 1150 3.0142 0.1225
5.0893 22.64 1200 2.9002 0.1234
4.9178 23.58 1250 2.9029 0.1257
4.5324 24.53 1300 2.7464 0.1149
4.4924 25.47 1350 2.5754 0.1104
4.1324 26.42 1400 2.6028 0.1099
4.2581 27.36 1450 2.5399 0.1049
3.8897 28.3 1500 2.4484 0.1062
3.8507 29.25 1550 2.4717 0.1081
3.7424 30.19 1600 2.4559 0.1114
3.4716 31.13 1650 2.3895 0.1043
3.5385 32.08 1700 2.4023 0.1079
3.4308 33.02 1750 2.3014 0.1022
3.3027 33.96 1800 2.3091 0.1054
3.078 34.91 1850 2.2783 0.1000
3.1628 35.85 1900 2.2364 0.1029
3.1191 36.79 1950 2.1291 0.0963
2.9528 37.74 2000 2.1785 0.0975
2.9116 38.68 2050 2.1666 0.1006
2.7249 39.62 2100 2.1878 0.1053
2.7466 40.57 2150 2.1900 0.0997
2.6349 41.51 2200 2.1549 0.0963
2.6933 42.45 2250 2.1418 0.1030
2.5316 43.4 2300 2.1705 0.0982
2.5175 44.34 2350 2.1444 0.0991
2.5374 45.28 2400 2.1134 0.0970
2.4234 46.23 2450 2.1473 0.1052
2.318 47.17 2500 2.1129 0.1016
2.2632 48.11 2550 2.1011 0.0908
2.3666 49.06 2600 2.1168 0.0976
2.2127 50.0 2650 2.1183 0.0968
2.132 50.94 2700 2.0882 0.0943
2.2458 51.89 2750 2.0710 0.0934
1.9839 52.83 2800 2.0990 0.1026
2.147 53.77 2850 2.0917 0.1017
2.1353 54.72 2900 2.1009 0.1002
1.9557 55.66 2950 2.1425 0.1057
1.8819 56.6 3000 2.1140 0.0979
2.0495 57.55 3050 2.1637 0.1020
2.027 58.49 3100 2.1385 0.1025
1.9783 59.43 3150 2.1003 0.1002
1.9553 60.38 3200 2.1139 0.1043
1.7827 61.32 3250 2.1029 0.0967
1.9633 62.26 3300 2.0796 0.0941
1.7306 63.21 3350 2.0947 0.1009
1.8145 64.15 3400 2.1027 0.1029
1.7772 65.09 3450 2.1160 0.1014
1.784 66.04 3500 2.1080 0.1038
1.8016 66.98 3550 2.1155 0.0991
1.7837 67.92 3600 2.1112 0.1004
1.7027 68.87 3650 2.0888 0.0955
1.6968 69.81 3700 2.0739 0.0977
1.6873 70.75 3750 2.0948 0.0972
1.7168 71.7 3800 2.1186 0.0989
1.6195 72.64 3850 2.0967 0.0969
1.6414 73.58 3900 2.0811 0.1018
1.5118 74.53 3950 2.0674 0.0987
1.6768 75.47 4000 2.0616 0.0959
1.5945 76.42 4050 2.0632 0.1009
1.6417 77.36 4100 2.1003 0.1040
1.6208 78.3 4150 2.0939 0.1023
1.5037 79.25 4200 2.0788 0.0998
1.6181 80.19 4250 2.0641 0.0955
1.5608 81.13 4300 2.0864 0.1023
1.5658 82.08 4350 2.0802 0.1000
1.5369 83.02 4400 2.0750 0.0984
1.5474 83.96 4450 2.0582 0.0976
1.6031 84.91 4500 2.0666 0.0998
1.5224 85.85 4550 2.0695 0.0984
1.5687 86.79 4600 2.0645 0.0972
1.5393 87.74 4650 2.0702 0.0995
1.6074 88.68 4700 2.0673 0.0975
1.5601 89.62 4750 2.0622 0.0991
1.4178 90.57 4800 2.0666 0.0998
1.6219 91.51 4850 2.0620 0.1004
1.4044 92.45 4900 2.0572 0.0990
1.628 93.4 4950 2.0611 0.0993
1.5058 94.34 5000 2.0633 0.0995
1.4636 95.28 5050 2.0628 0.0998
1.5394 96.23 5100 2.0618 0.0999
1.4808 97.17 5150 2.0625 0.1004
1.5651 98.11 5200 2.0627 0.0998
1.499 99.06 5250 2.0618 0.0997
1.5463 100.0 5300 2.0619 0.0997
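The run name suggests a focal-CTC objective with alpha = 0.5 and gamma = 1.0 (`focal_ctc_a0.5_g1.0`). The exact loss used for this run is not documented; a common focal reweighting of CTC (an assumption here, following the focal-loss recipe) scales the per-utterance CTC loss by alpha * (1 - p)^gamma, where p = exp(-ctc_loss) is the model's probability of the label sequence:

```python
import math

def focal_ctc(ctc_loss: float, alpha: float = 0.5, gamma: float = 1.0) -> float:
    """Focal reweighting of a per-utterance CTC loss (hypothetical sketch).

    p = exp(-ctc_loss) is the sequence probability under the model:
    easy utterances (p near 1) are down-weighted, hard utterances
    (p near 0) keep almost their full alpha-scaled CTC loss.
    """
    p = math.exp(-ctc_loss)
    return alpha * (1.0 - p) ** gamma * ctc_loss
```

With gamma = 0 this reduces to plain alpha-scaled CTC, so gamma controls how aggressively well-recognized utterances are discounted during training.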

Framework versions

  • Transformers 4.34.0
  • PyTorch 2.0.1
  • Datasets 2.14.5
  • Tokenizers 0.14.1