
wav2vec2-large-xlsr-faroese-100h-30k-steps

This model is a fine-tuned version of facebook/wav2vec2-large-xlsr-53. The training dataset is not documented, though the model name suggests roughly 100 hours of Faroese speech. It achieves the following results on the evaluation set (a minimal inference sketch follows the list):

  • Loss: 0.1354
  • WER: 25.2668 (word error rate, in percent)
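
The snippet below is a minimal inference sketch using the standard transformers CTC interface for wav2vec2 checkpoints; it is not the author's published script. The audio path is a placeholder, and XLSR models expect 16 kHz mono input.

```python
# Minimal inference sketch (not the author's published script).
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

model_id = "davidilag/wav2vec2-large-xlsr-faroese-100h-30k-steps"
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# "speech.wav" is a placeholder path; XLSR checkpoints expect 16 kHz mono audio.
waveform, sample_rate = torchaudio.load("speech.wav")
if sample_rate != 16_000:
    waveform = torchaudio.functional.resample(waveform, sample_rate, 16_000)

inputs = processor(waveform.squeeze(0).numpy(), sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: take the argmax token per frame, then collapse repeats/blanks.
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids)[0])
```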

Model description

Limited information is available. The checkpoint is a wav2vec2-large model (~315M parameters, float32 safetensors weights) fine-tuned for Faroese automatic speech recognition.

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (they are mapped onto the Trainer API in the sketch after this list):

  • learning_rate: 0.0003
  • train_batch_size: 16
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 32
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 500
  • training_steps: 30000
  • mixed_precision_training: Native AMP
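
For reference, the listed hyperparameters correspond roughly to the transformers TrainingArguments below. The actual training script is not published, so this is only a sketch: the output directory is a placeholder, and the Adam settings restate the values listed above.

```python
# Sketch of TrainingArguments matching the listed hyperparameters.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-large-xlsr-faroese-100h-30k-steps",  # placeholder
    learning_rate=3e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,  # effective train batch size: 16 * 2 = 32
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=500,
    max_steps=30_000,
    fp16=True,  # "Native AMP" mixed-precision training
)
```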

Training results

| Training Loss | Epoch   | Step  | Validation Loss | WER (%) |
|:-------------:|:-------:|:-----:|:---------------:|:-------:|
| 0.6576        | 0.4640  | 1000  | 0.4914          | 63.2323 |
| 0.4625        | 0.9281  | 2000  | 0.3354          | 47.7149 |
| 0.3475        | 1.3921  | 3000  | 0.2567          | 41.2321 |
| 0.2979        | 1.8561  | 4000  | 0.2330          | 38.5595 |
| 0.235         | 2.3202  | 5000  | 0.2173          | 37.0143 |
| 0.2737        | 2.7842  | 6000  | 0.2089          | 35.6704 |
| 0.2095        | 3.2483  | 7000  | 0.1939          | 33.8484 |
| 0.1916        | 3.7123  | 8000  | 0.1836          | 33.3904 |
| 0.176         | 4.1763  | 9000  | 0.1794          | 31.9609 |
| 0.1609        | 4.6404  | 10000 | 0.1709          | 31.3771 |
| 0.1941        | 5.1044  | 11000 | 0.1693          | 30.9392 |
| 0.1517        | 5.5684  | 12000 | 0.1693          | 30.9493 |
| 0.1583        | 6.0325  | 13000 | 0.1532          | 29.6859 |
| 0.14          | 6.4965  | 14000 | 0.1604          | 29.3185 |
| 0.1688        | 6.9606  | 15000 | 0.1488          | 29.4041 |
| 0.1553        | 7.4246  | 16000 | 0.1607          | 28.6893 |
| 0.1483        | 7.8886  | 17000 | 0.1526          | 28.0552 |
| 0.1442        | 8.3527  | 18000 | 0.1537          | 28.2615 |
| 0.1304        | 8.8167  | 19000 | 0.1497          | 27.5569 |
| 0.1104        | 9.2807  | 20000 | 0.1622          | 27.5871 |
| 0.1225        | 9.7448  | 21000 | 0.1493          | 26.8724 |
| 0.1014        | 10.2088 | 22000 | 0.1433          | 26.7516 |
| 0.1087        | 10.6729 | 23000 | 0.1365          | 26.2130 |
| 0.0855        | 11.1369 | 24000 | 0.1421          | 26.2432 |
| 0.0865        | 11.6009 | 25000 | 0.1339          | 25.9714 |
| 0.0603        | 12.0650 | 26000 | 0.1364          | 25.6694 |
| 0.0663        | 12.5290 | 27000 | 0.1362          | 25.3876 |
| 0.0648        | 12.9930 | 28000 | 0.1358          | 25.4681 |
| 0.0638        | 13.4571 | 29000 | 0.1366          | 25.3423 |
| 0.0629        | 13.9211 | 30000 | 0.1354          | 25.2668 |
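
The WER column is a percentage. For reference, it can be computed from transcripts with the evaluate library, as in this illustrative sketch (the example strings are invented, not drawn from the evaluation data):

```python
# Illustrative WER computation with the `evaluate` library.
import evaluate

wer = evaluate.load("wer")
references = ["eg eri úr føroyum"]   # ground-truth transcript (invented example)
predictions = ["eg eri úr foroyum"]  # hypothetical model output
print(100 * wer.compute(predictions=predictions, references=references))  # WER in percent
```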

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.1+cu121
  • Datasets 3.0.0
  • Tokenizers 0.19.1