---
license: apache-2.0
base_model: facebook/wav2vec2-large-xlsr-53
tags:
  - generated_from_trainer
datasets:
  - fleurs
metrics:
  - wer
model-index:
  - name: wav2vec2_fleurs
    results:
      - task:
          name: Automatic Speech Recognition
          type: automatic-speech-recognition
        dataset:
          name: fleurs
          type: fleurs
          config: ar_eg
          split: test
          args: ar_eg
        metrics:
          - name: Wer
            type: wer
            value: 0.3367091772943236
---

# wav2vec2_fleurs

This model is a fine-tuned version of [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on the fleurs dataset. It achieves the following results on the evaluation set:

- Loss: 0.4033
- Wer: 0.3367
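
The card does not include a usage example, so here is a minimal inference sketch. The repository id `hiba2/wav2vec2_fleurs` and the use of `librosa` for audio loading are assumptions; any loader that yields 16 kHz mono float audio will do.

```python
import torch
import librosa
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

# Repo id is an assumption; replace with the actual model id or a local path.
MODEL_ID = "hiba2/wav2vec2_fleurs"

processor = Wav2Vec2Processor.from_pretrained(MODEL_ID)
model = Wav2Vec2ForCTC.from_pretrained(MODEL_ID)
model.eval()

# wav2vec 2.0 XLSR expects 16 kHz mono input.
speech, _ = librosa.load("example.wav", sr=16_000)

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: pick the most likely token per frame, then collapse repeats/blanks.
predicted_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(predicted_ids)[0]
print(transcription)
```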

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

- learning_rate: 4e-05
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 100
- num_epochs: 20
- mixed_precision_training: Native AMP
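
For reference, the hyperparameters above roughly map onto a `transformers.TrainingArguments` configuration like the sketch below. The output directory and the evaluation/logging cadence (inferred from the 100-step intervals in the results table) are assumptions, not confirmed values from this run; the Adam betas and epsilon listed above are the library defaults and so are not repeated here.

```python
from transformers import TrainingArguments

# Sketch only: reproduces the listed hyperparameters; unlisted arguments are assumptions.
training_args = TrainingArguments(
    output_dir="wav2vec2_fleurs",   # assumed output directory
    learning_rate=4e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=100,
    num_train_epochs=20,
    fp16=True,                      # "Native AMP" mixed-precision training
    evaluation_strategy="steps",    # assumed: the results table logs every 100 steps
    eval_steps=100,
    logging_steps=100,
)
```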

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Wer    |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 18.6221       | 0.17  | 100   | 10.1383         | 1.0    |
| 6.5078        | 0.33  | 200   | 4.0182          | 1.0    |
| 3.632         | 0.5   | 300   | 3.2678          | 1.0    |
| 3.2359        | 0.67  | 400   | 3.1984          | 1.0    |
| 3.2014        | 0.83  | 500   | 3.1752          | 1.0    |
| 3.1857        | 1.0   | 600   | 3.1671          | 1.0    |
| 3.1816        | 1.17  | 700   | 3.1657          | 1.0    |
| 3.1912        | 1.33  | 800   | 3.1570          | 1.0    |
| 3.186         | 1.5   | 900   | 3.1548          | 1.0    |
| 3.1554        | 1.67  | 1000  | 3.1478          | 1.0    |
| 3.1521        | 1.83  | 1100  | 3.1442          | 1.0    |
| 3.1584        | 2.0   | 1200  | 3.1369          | 1.0    |
| 3.1554        | 2.17  | 1300  | 3.1340          | 1.0    |
| 3.172         | 2.33  | 1400  | 3.1304          | 1.0    |
| 3.1479        | 2.5   | 1500  | 3.1303          | 1.0    |
| 3.1359        | 2.67  | 1600  | 3.0864          | 1.0    |
| 3.0757        | 2.83  | 1700  | 2.9191          | 1.0    |
| 2.8491        | 3.0   | 1800  | 2.5490          | 1.0    |
| 2.4969        | 3.17  | 1900  | 1.9998          | 0.9785 |
| 2.048         | 3.33  | 2000  | 1.5004          | 0.9297 |
| 1.7632        | 3.5   | 2100  | 1.2369          | 0.8613 |
| 1.5885        | 3.67  | 2200  | 1.0752          | 0.7953 |
| 1.3712        | 3.83  | 2300  | 0.9573          | 0.7519 |
| 1.2916        | 4.0   | 2400  | 0.9038          | 0.7089 |
| 1.2559        | 4.17  | 2500  | 0.8269          | 0.6853 |
| 1.1625        | 4.33  | 2600  | 0.7781          | 0.6539 |
| 1.1264        | 4.5   | 2700  | 0.7555          | 0.6337 |
| 1.032         | 4.67  | 2800  | 0.7215          | 0.6032 |
| 1.0592        | 4.83  | 2900  | 0.6883          | 0.5734 |
| 0.9682        | 5.0   | 3000  | 0.6657          | 0.5504 |
| 0.9851        | 5.17  | 3100  | 0.6518          | 0.5448 |
| 0.9515        | 5.33  | 3200  | 0.6382          | 0.5403 |
| 0.9009        | 5.5   | 3300  | 0.6226          | 0.5296 |
| 0.9048        | 5.67  | 3400  | 0.6123          | 0.5161 |
| 0.8882        | 5.83  | 3500  | 0.6047          | 0.5098 |
| 0.8749        | 6.0   | 3600  | 0.5909          | 0.5006 |
| 0.7939        | 6.17  | 3700  | 0.5804          | 0.4931 |
| 0.8363        | 6.33  | 3800  | 0.5744          | 0.4877 |
| 0.8605        | 6.5   | 3900  | 0.5776          | 0.4884 |
| 0.8358        | 6.67  | 4000  | 0.5497          | 0.4745 |
| 0.7744        | 6.83  | 4100  | 0.5549          | 0.4664 |
| 0.7867        | 7.0   | 4200  | 0.5429          | 0.4629 |
| 0.7166        | 7.17  | 4300  | 0.5306          | 0.4465 |
| 0.7347        | 7.33  | 4400  | 0.5363          | 0.4521 |
| 0.7173        | 7.5   | 4500  | 0.5289          | 0.4429 |
| 0.7653        | 7.67  | 4600  | 0.5240          | 0.4389 |
| 0.7388        | 7.83  | 4700  | 0.5062          | 0.4304 |
| 0.7326        | 8.0   | 4800  | 0.5073          | 0.4290 |
| 0.6622        | 8.17  | 4900  | 0.5049          | 0.4236 |
| 0.7495        | 8.33  | 5000  | 0.5094          | 0.4254 |
| 0.6898        | 8.5   | 5100  | 0.4874          | 0.4216 |
| 0.6664        | 8.67  | 5200  | 0.4948          | 0.4225 |
| 0.6783        | 8.83  | 5300  | 0.4879          | 0.4131 |
| 0.7205        | 9.0   | 5400  | 0.4751          | 0.4136 |
| 0.6182        | 9.17  | 5500  | 0.4795          | 0.4085 |
| 0.6895        | 9.33  | 5600  | 0.4730          | 0.4099 |
| 0.6503        | 9.5   | 5700  | 0.4713          | 0.4029 |
| 0.624         | 9.67  | 5800  | 0.4699          | 0.4024 |
| 0.6268        | 9.83  | 5900  | 0.4726          | 0.4069 |
| 0.6525        | 10.0  | 6000  | 0.4593          | 0.3953 |
| 0.6112        | 10.17 | 6100  | 0.4558          | 0.3922 |
| 0.657         | 10.33 | 6200  | 0.4621          | 0.3940 |
| 0.6445        | 10.5  | 6300  | 0.4579          | 0.3906 |
| 0.5869        | 10.67 | 6400  | 0.4548          | 0.3903 |
| 0.5855        | 10.83 | 6500  | 0.4433          | 0.3840 |
| 0.5538        | 11.0  | 6600  | 0.4514          | 0.3897 |
| 0.5599        | 11.17 | 6700  | 0.4403          | 0.3786 |
| 0.5691        | 11.33 | 6800  | 0.4411          | 0.3800 |
| 0.5731        | 11.5  | 6900  | 0.4396          | 0.3768 |
| 0.5707        | 11.67 | 7000  | 0.4492          | 0.3770 |
| 0.5504        | 11.83 | 7100  | 0.4391          | 0.3690 |
| 0.6058        | 12.0  | 7200  | 0.4344          | 0.3717 |
| 0.5676        | 12.17 | 7300  | 0.4354          | 0.3758 |
| 0.5684        | 12.33 | 7400  | 0.4351          | 0.3656 |
| 0.5404        | 12.5  | 7500  | 0.4324          | 0.3636 |
| 0.5504        | 12.67 | 7600  | 0.4313          | 0.3658 |
| 0.5596        | 12.83 | 7700  | 0.4268          | 0.3632 |
| 0.5246        | 13.0  | 7800  | 0.4316          | 0.3633 |
| 0.5441        | 13.17 | 7900  | 0.4233          | 0.3648 |
| 0.5318        | 13.33 | 8000  | 0.4260          | 0.3597 |
| 0.5116        | 13.5  | 8100  | 0.4279          | 0.3591 |
| 0.5299        | 13.67 | 8200  | 0.4233          | 0.3606 |
| 0.5519        | 13.83 | 8300  | 0.4166          | 0.3567 |
| 0.5452        | 14.0  | 8400  | 0.4233          | 0.3573 |
| 0.5111        | 14.17 | 8500  | 0.4203          | 0.3580 |
| 0.5365        | 14.33 | 8600  | 0.4163          | 0.3577 |
| 0.5023        | 14.5  | 8700  | 0.4135          | 0.3552 |
| 0.5189        | 14.67 | 8800  | 0.4133          | 0.3485 |
| 0.5492        | 14.83 | 8900  | 0.4133          | 0.3478 |
| 0.5128        | 15.0  | 9000  | 0.4114          | 0.3478 |
| 0.486         | 15.17 | 9100  | 0.4222          | 0.3472 |
| 0.5015        | 15.33 | 9200  | 0.4129          | 0.3515 |
| 0.4871        | 15.5  | 9300  | 0.4132          | 0.3430 |
| 0.5267        | 15.67 | 9400  | 0.4109          | 0.3481 |
| 0.4814        | 15.83 | 9500  | 0.4109          | 0.3461 |
| 0.4801        | 16.0  | 9600  | 0.4140          | 0.3453 |
| 0.4894        | 16.17 | 9700  | 0.4074          | 0.3433 |
| 0.4756        | 16.33 | 9800  | 0.4070          | 0.3410 |
| 0.4446        | 16.5  | 9900  | 0.4088          | 0.3412 |
| 0.4838        | 16.67 | 10000 | 0.4070          | 0.3407 |
| 0.5087        | 16.83 | 10100 | 0.4048          | 0.3422 |
| 0.4994        | 17.0  | 10200 | 0.4043          | 0.3442 |
| 0.5421        | 17.17 | 10300 | 0.4088          | 0.3483 |
| 0.489         | 17.33 | 10400 | 0.4097          | 0.3450 |
| 0.4618        | 17.5  | 10500 | 0.4077          | 0.3430 |
| 0.4734        | 17.67 | 10600 | 0.4028          | 0.3433 |
| 0.4882        | 17.83 | 10700 | 0.4040          | 0.3393 |
| 0.4804        | 18.0  | 10800 | 0.4045          | 0.3385 |
| 0.483         | 18.17 | 10900 | 0.4055          | 0.3366 |
| 0.4916        | 18.33 | 11000 | 0.4077          | 0.3375 |
| 0.4933        | 18.5  | 11100 | 0.4056          | 0.3365 |
| 0.4881        | 18.67 | 11200 | 0.4023          | 0.3375 |
| 0.4869        | 18.83 | 11300 | 0.4031          | 0.3378 |
| 0.4649        | 19.0  | 11400 | 0.4026          | 0.3382 |
| 0.4793        | 19.17 | 11500 | 0.4035          | 0.3376 |
| 0.5252        | 19.33 | 11600 | 0.4019          | 0.3375 |
| 0.4681        | 19.5  | 11700 | 0.4026          | 0.3382 |
| 0.4311        | 19.67 | 11800 | 0.4026          | 0.3368 |
| 0.4799        | 19.83 | 11900 | 0.4034          | 0.3372 |
| 0.4323        | 20.0  | 12000 | 0.4033          | 0.3367 |
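
The Wer column above is the word error rate; values like the final 0.3367 can be reproduced from model predictions and reference transcripts with the `evaluate` library's `wer` metric, as in the sketch below. The example strings are illustrative only and are not taken from this dataset.

```python
import evaluate

# Word error rate = (substitutions + insertions + deletions) / number of reference words.
wer_metric = evaluate.load("wer")

predictions = ["hello world", "good night moon"]   # illustrative model outputs
references = ["hello duck", "good night moon"]     # illustrative ground truth

wer = wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.4f}")  # 0.2 for this toy example: 1 error over 5 reference words
```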

### Framework versions

- Transformers 4.36.0.dev0
- Pytorch 2.1.0+cu118
- Datasets 2.15.0
- Tokenizers 0.15.0