---
license: apache-2.0
base_model: facebook/wav2vec2-large-xlsr-53
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: xlsr-big-kinnn
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# xlsr-big-kinnn

This model is a fine-tuned version of [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0000
- Wer: 0.0510

## Model description

More information needed

## Intended uses & limitations

More information needed
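
Since no usage example is documented, the following is a minimal inference sketch, assuming the checkpoint is a standard CTC fine-tune loadable with `Wav2Vec2ForCTC` and `Wav2Vec2Processor`. The repository id `xlsr-big-kinnn`, the file name `sample.wav`, and the use of `librosa` for loading 16 kHz mono audio are assumptions, not taken from this card.

```python
import torch
import librosa
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

# Assumed repository id; replace with the actual model path if different.
model_id = "xlsr-big-kinnn"

processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# wav2vec2-large-xlsr-53 expects 16 kHz mono audio.
speech, _ = librosa.load("sample.wav", sr=16_000, mono=True)

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding.
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```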

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0004
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 132
- num_epochs: 100
- mixed_precision_training: Native AMP
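
The list above maps roughly onto a `TrainingArguments` configuration. The sketch below is a hedged reconstruction assuming the standard `Trainer` API, not the author's actual script; the output directory, evaluation cadence (every 200 steps, inferred from the results table), and logging settings are assumptions. Model, dataset, and data collator setup are omitted.

```python
from transformers import TrainingArguments

# Hedged reconstruction of the hyperparameters listed above (not the original script).
training_args = TrainingArguments(
    output_dir="xlsr-big-kinnn",      # assumed output directory
    learning_rate=4e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,    # effective train batch size of 16
    num_train_epochs=100,
    warmup_steps=132,
    lr_scheduler_type="linear",
    seed=42,
    fp16=True,                        # "Native AMP" mixed precision
    eval_strategy="steps",            # evaluation every 200 steps per the table below
    eval_steps=200,
    logging_steps=200,
)
```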

### Training results

| Training Loss | Epoch   | Step | Validation Loss | Wer    |
|:-------------:|:-------:|:----:|:---------------:|:------:|
| 2.0626        | 2.1505  | 200  | 0.6872          | 0.5597 |
| 0.3624        | 4.3011  | 400  | 0.0468          | 0.1067 |
| 0.106         | 6.4516  | 600  | 0.0157          | 0.0641 |
| 0.0637        | 8.6022  | 800  | 0.0137          | 0.0625 |
| 0.0448        | 10.7527 | 1000 | 0.0086          | 0.0699 |
| 0.03          | 12.9032 | 1200 | 0.0027          | 0.0523 |
| 0.0264        | 15.0538 | 1400 | 0.0072          | 0.0559 |
| 0.0265        | 17.2043 | 1600 | 0.0042          | 0.0543 |
| 0.0266        | 19.3548 | 1800 | 0.0018          | 0.0553 |
| 0.0199        | 21.5054 | 2000 | 0.0083          | 0.0766 |
| 0.0139        | 23.6559 | 2200 | 0.0021          | 0.0569 |
| 0.0154        | 25.8065 | 2400 | 0.0033          | 0.0545 |
| 0.0145        | 27.9570 | 2600 | 0.0005          | 0.0510 |
| 0.0153        | 30.1075 | 2800 | 0.0017          | 0.0539 |
| 0.0144        | 32.2581 | 3000 | 0.0002          | 0.0527 |
| 0.0118        | 34.4086 | 3200 | 0.0024          | 0.0836 |
| 0.0118        | 36.5591 | 3400 | 0.0012          | 0.0575 |
| 0.0149        | 38.7097 | 3600 | 0.0046          | 0.0583 |
| 0.0125        | 40.8602 | 3800 | 0.0005          | 0.0571 |
| 0.0108        | 43.0108 | 4000 | 0.0003          | 0.0615 |
| 0.011         | 45.1613 | 4200 | 0.0010          | 0.0585 |
| 0.0085        | 47.3118 | 4400 | 0.0002          | 0.0510 |
| 0.0082        | 49.4624 | 4600 | 0.0003          | 0.0571 |
| 0.0076        | 51.6129 | 4800 | 0.0004          | 0.0607 |
| 0.0065        | 53.7634 | 5000 | 0.0001          | 0.0553 |
| 0.0058        | 55.9140 | 5200 | 0.0006          | 0.0512 |
| 0.0069        | 58.0645 | 5400 | 0.0020          | 0.0539 |
| 0.0046        | 60.2151 | 5600 | 0.0001          | 0.0537 |
| 0.0049        | 62.3656 | 5800 | 0.0001          | 0.0577 |
| 0.0038        | 64.5161 | 6000 | 0.0000          | 0.0615 |
| 0.0033        | 66.6667 | 6200 | 0.0000          | 0.0703 |
| 0.0048        | 68.8172 | 6400 | 0.0002          | 0.0711 |
| 0.0039        | 70.9677 | 6600 | 0.0000          | 0.0520 |
| 0.0048        | 73.1183 | 6800 | 0.0003          | 0.0523 |
| 0.0021        | 75.2688 | 7000 | 0.0001          | 0.0516 |
| 0.0024        | 77.4194 | 7200 | 0.0000          | 0.0510 |
| 0.0027        | 79.5699 | 7400 | 0.0000          | 0.0508 |
| 0.0014        | 81.7204 | 7600 | 0.0000          | 0.0504 |
| 0.0022        | 83.8710 | 7800 | 0.0000          | 0.0506 |
| 0.0019        | 86.0215 | 8000 | 0.0000          | 0.0520 |
| 0.0022        | 88.1720 | 8200 | 0.0000          | 0.0510 |
| 0.001         | 90.3226 | 8400 | 0.0000          | 0.0510 |
| 0.0012        | 92.4731 | 8600 | 0.0000          | 0.0510 |
| 0.0011        | 94.6237 | 8800 | 0.0000          | 0.0510 |
| 0.0014        | 96.7742 | 9000 | 0.0000          | 0.0510 |
| 0.0008        | 98.9247 | 9200 | 0.0000          | 0.0510 |
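
For reference, the Wer column is the word error rate, which can be computed with the `evaluate` library. The snippet below is a generic illustration of the metric; the transcripts are placeholders, not samples from the evaluation set.

```python
import evaluate

wer_metric = evaluate.load("wer")

# Placeholder transcripts purely to illustrate the metric, not real evaluation data.
predictions = ["this is a test", "hello world"]
references = ["this is the test", "hello world"]

# WER = (substitutions + insertions + deletions) / reference words
print(wer_metric.compute(predictions=predictions, references=references))
```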


### Framework versions

- Transformers 4.45.0.dev0
- Pytorch 2.1.2
- Datasets 2.20.0
- Tokenizers 0.19.1
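
A quick way to compare a local environment against the versions listed above (a convenience sketch, assuming the packages are installed; newer releases will likely also work):

```python
import transformers, torch, datasets, tokenizers

# Versions reported in this card, for comparison.
print("transformers:", transformers.__version__)  # 4.45.0.dev0
print("torch:", torch.__version__)                # 2.1.2
print("datasets:", datasets.__version__)          # 2.20.0
print("tokenizers:", tokenizers.__version__)      # 0.19.1
```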