---
library_name: transformers
license: apache-2.0
base_model: facebook/wav2vec2-large-xlsr-53
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: xlsr-aiish-clp
  results: []
---

# xlsr-aiish-clp

This model is a fine-tuned version of [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0000
- Wer: 0.3154

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0004
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 132
- num_epochs: 100
- mixed_precision_training: Native AMP

### Training results

| Training Loss | Epoch   | Step | Validation Loss | Wer    |
|:-------------:|:-------:|:----:|:---------------:|:------:|
| 5.176         | 2.4845  | 200  | 2.7099          | 1.0    |
| 2.5595        | 4.9689  | 400  | 1.7974          | 1.0    |
| 1.1648        | 7.4534  | 600  | 0.1544          | 0.6039 |
| 0.3992        | 9.9379  | 800  | 0.0722          | 0.5465 |
| 0.2507        | 12.4224 | 1000 | 0.0451          | 0.4670 |
| 0.176         | 14.9068 | 1200 | 0.0316          | 0.3729 |
| 0.1324        | 17.3913 | 1400 | 0.0155          | 0.3655 |
| 0.1193        | 19.8758 | 1600 | 0.0075          | 0.3313 |
| 0.0897        | 22.3602 | 1800 | 0.0201          | 0.3313 |
| 0.0825        | 24.8447 | 2000 | 0.0035          | 0.3117 |
| 0.054         | 27.3292 | 2200 | 0.0058          | 0.3154 |
| 0.0579        | 29.8137 | 2400 | 0.0013          | 0.3081 |
| 0.0529        | 32.2981 | 2600 | 0.0028          | 0.3093 |
| 0.0418        | 34.7826 | 2800 | 0.0043          | 0.3081 |
| 0.0402        | 37.2671 | 3000 | 0.0084          | 0.3117 |
| 0.0366        | 39.7516 | 3200 | 0.0006          | 0.3093 |
| 0.0326        | 42.2360 | 3400 | 0.0031          | 0.3117 |
| 0.0266        | 44.7205 | 3600 | 0.0029          | 0.3166 |
| 0.0268        | 47.2050 | 3800 | 0.0009          | 0.3081 |
| 0.024         | 49.6894 | 4000 | 0.0014          | 0.3093 |
| 0.0185        | 52.1739 | 4200 | 0.0001          | 0.3081 |
| 0.0277        | 54.6584 | 4400 | 0.0011          | 0.3068 |
| 0.0179        | 57.1429 | 4600 | 0.0026          | 0.3093 |
| 0.0161        | 59.6273 | 4800 | 0.0086          | 0.3142 |
| 0.0177        | 62.1118 | 5000 | 0.0004          | 0.3130 |
| 0.0184        | 64.5963 | 5200 | 0.0009          | 0.3130 |
| 0.0131        | 67.0807 | 5400 | 0.0022          | 0.3166 |
| 0.017         | 69.5652 | 5600 | 0.0026          | 0.3105 |
| 0.0083        | 72.0497 | 5800 | 0.0023          | 0.3081 |
| 0.0082        | 74.5342 | 6000 | 0.0001          | 0.3093 |
| 0.0068        | 77.0186 | 6200 | 0.0001          | 0.3142 |
| 0.0072        | 79.5031 | 6400 | 0.0002          | 0.3142 |
| 0.0097        | 81.9876 | 6600 | 0.0001          | 0.3142 |
| 0.0087        | 84.4720 | 6800 | 0.0001          | 0.3154 |
| 0.0065        | 86.9565 | 7000 | 0.0000          | 0.3154 |
| 0.0072        | 89.4410 | 7200 | 0.0000          | 0.3154 |
| 0.0065        | 91.9255 | 7400 | 0.0001          | 0.3154 |
| 0.0044        | 94.4099 | 7600 | 0.0001          | 0.3154 |
| 0.0053        | 96.8944 | 7800 | 0.0000          | 0.3154 |
| 0.005         | 99.3789 | 8000 | 0.0000          | 0.3154 |

### Framework versions

- Transformers 4.45.0.dev0
- Pytorch 2.4.0
- Datasets 2.21.0
- Tokenizers 0.19.1
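
### Evaluation metric

The Wer column above is the word error rate: the word-level edit distance (substitutions + insertions + deletions) between the model transcript and the reference, divided by the number of reference words. A Wer of 0.3154 therefore means roughly 32 word edits per 100 reference words. A minimal sketch of the computation, assuming plain whitespace tokenization (evaluation pipelines typically use the `evaluate` or `jiwer` packages instead):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate = (substitutions + insertions + deletions) / len(ref)."""
    ref, hyp = reference.split(), hypothesis.split()
    # d[i][j] = minimum edits turning the first i reference words
    # into the first j hypothesis words (Levenshtein distance).
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,        # deletion
                          d[i][j - 1] + 1,        # insertion
                          d[i - 1][j - 1] + cost)  # substitution / match
    return d[len(ref)][len(hyp)] / len(ref)

# One substitution ("sat" -> "sit") and one deletion ("the") over 6 reference
# words gives 2/6 ~ 0.333, close to this model's reported 0.3154.
print(wer("the cat sat on the mat", "the cat sit on mat"))
```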