jadasdn committed on
Commit d5e0fe9
1 Parent(s): 9e7710a

End of training

Files changed (1): README.md (+121 lines)

README.md ADDED
---
license: apache-2.0
base_model: jadasdn/wav2vec2-1
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: wav2vec2-2
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# wav2vec2-2

This model is a fine-tuned version of [jadasdn/wav2vec2-1](https://huggingface.co/jadasdn/wav2vec2-1) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8353
- WER: 0.3593

## Model description

More information needed

## Intended uses & limitations

More information needed
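
Pending a fuller description, here is a minimal inference sketch, assuming the checkpoint is a CTC-based wav2vec2 model hosted on the Hub under this repo's name (`jadasdn/wav2vec2-2`); the audio file path is a hypothetical placeholder:

```python
from transformers import pipeline

# Load the fine-tuned checkpoint named in this card from the Hub.
asr = pipeline("automatic-speech-recognition", model="jadasdn/wav2vec2-2")

# Transcribe a local file; "sample.wav" is hypothetical. wav2vec2 expects
# 16 kHz input, and the pipeline resamples via ffmpeg when needed.
result = asr("sample.wav")
print(result["text"])
```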

## Training and evaluation data

More information needed
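
Since the training data is not recorded here, the following is only a generic preparation sketch for wav2vec2-style fine-tuning; the dataset name and split are hypothetical, and only the 16 kHz resampling step is dictated by the model family:

```python
from datasets import Audio, load_dataset

# Hypothetical corpus; the actual training data is not documented in this card.
ds = load_dataset("mozilla-foundation/common_voice_11_0", "en", split="train")

# wav2vec2 checkpoints are trained on 16 kHz audio, so resample on access.
ds = ds.cast_column("audio", Audio(sampling_rate=16_000))
```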

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (restated as a `TrainingArguments` sketch after this list):
- learning_rate: 0.0001
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- num_epochs: 30
- mixed_precision_training: Native AMP
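
A sketch of these values as Transformers `TrainingArguments`; `output_dir` and the evaluation cadence are assumptions (the 500-step cadence is inferred from the results table below):

```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="wav2vec2-2",       # hypothetical; not recorded in the card
    learning_rate=1e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,                # Adam betas/epsilon from the list above
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=1000,
    num_train_epochs=30,
    fp16=True,                     # "Native AMP" mixed-precision training
    evaluation_strategy="steps",   # assumed: results below report every 500 steps
    eval_steps=500,
)
```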

### Training results

| Training Loss | Epoch | Step  | Validation Loss | WER    |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 0.4516        | 0.5   | 500   | 0.4973          | 0.3748 |
| 0.4624        | 1.0   | 1000  | 0.4486          | 0.3958 |
| 0.5211        | 1.5   | 1500  | 0.5173          | 0.3916 |
| 0.5317        | 2.0   | 2000  | 0.4713          | 0.3992 |
| 0.4277        | 2.5   | 2500  | 0.4859          | 0.3888 |
| 0.4495        | 3.0   | 3000  | 0.4962          | 0.3862 |
| 0.3712        | 3.5   | 3500  | 0.5237          | 0.3899 |
| 0.3855        | 4.0   | 4000  | 0.4975          | 0.3850 |
| 0.3254        | 4.5   | 4500  | 0.5405          | 0.3897 |
| 0.331         | 5.0   | 5000  | 0.5255          | 0.3950 |
| 0.2907        | 5.5   | 5500  | 0.5646          | 0.3852 |
| 0.2949        | 6.0   | 6000  | 0.5782          | 0.3965 |
| 0.2521        | 6.5   | 6500  | 0.5563          | 0.3879 |
| 0.2663        | 7.0   | 7000  | 0.5627          | 0.3829 |
| 0.2342        | 7.5   | 7500  | 0.6145          | 0.3872 |
| 0.2374        | 8.0   | 8000  | 0.5860          | 0.3883 |
| 0.2099        | 8.5   | 8500  | 0.6920          | 0.3810 |
| 0.2133        | 9.0   | 9000  | 0.6354          | 0.3895 |
| 0.1887        | 9.5   | 9500  | 0.6618          | 0.3813 |
| 0.1924        | 10.0  | 10000 | 0.6522          | 0.3850 |
| 0.1728        | 10.5  | 10500 | 0.6324          | 0.3813 |
| 0.1797        | 11.0  | 11000 | 0.6637          | 0.3882 |
| 0.163         | 11.5  | 11500 | 0.6806          | 0.3799 |
| 0.1623        | 12.0  | 12000 | 0.6801          | 0.3811 |
| 0.149         | 12.5  | 12500 | 0.6723          | 0.3832 |
| 0.1493        | 13.0  | 13000 | 0.7032          | 0.3888 |
| 0.1389        | 13.5  | 13500 | 0.7294          | 0.3793 |
| 0.1383        | 14.0  | 14000 | 0.7311          | 0.3800 |
| 0.127         | 14.5  | 14500 | 0.7088          | 0.3773 |
| 0.127         | 15.0  | 15000 | 0.7352          | 0.3775 |
| 0.1159        | 15.5  | 15500 | 0.7886          | 0.3792 |
| 0.114         | 16.0  | 16000 | 0.7582          | 0.3802 |
| 0.1103        | 16.5  | 16500 | 0.7662          | 0.3717 |
| 0.1088        | 17.0  | 17000 | 0.7855          | 0.3704 |
| 0.1021        | 17.5  | 17500 | 0.7326          | 0.3717 |
| 0.104         | 18.0  | 18000 | 0.7518          | 0.3723 |
| 0.096         | 18.5  | 18500 | 0.7468          | 0.3743 |
| 0.0914        | 19.0  | 19000 | 0.7906          | 0.3741 |
| 0.0881        | 19.5  | 19500 | 0.7879          | 0.3740 |
| 0.0908        | 20.0  | 20000 | 0.8111          | 0.3676 |
| 0.0832        | 20.5  | 20500 | 0.8114          | 0.3681 |
| 0.0848        | 21.0  | 21000 | 0.8178          | 0.3651 |
| 0.0762        | 21.5  | 21500 | 0.8212          | 0.3686 |
| 0.0728        | 22.0  | 22000 | 0.8142          | 0.3673 |
| 0.074         | 22.5  | 22500 | 0.8177          | 0.3666 |
| 0.0691        | 23.0  | 23000 | 0.8323          | 0.3662 |
| 0.0689        | 23.5  | 23500 | 0.8020          | 0.3678 |
| 0.0643        | 24.0  | 24000 | 0.8145          | 0.3653 |
| 0.0647        | 24.5  | 24500 | 0.8376          | 0.3594 |
| 0.0654        | 25.0  | 25000 | 0.8307          | 0.3608 |
| 0.061         | 25.5  | 25500 | 0.8432          | 0.3600 |
| 0.0573        | 26.0  | 26000 | 0.8361          | 0.3629 |
| 0.0583        | 26.5  | 26500 | 0.8363          | 0.3625 |
| 0.054         | 27.0  | 27000 | 0.8277          | 0.3625 |
| 0.058         | 27.5  | 27500 | 0.8354          | 0.3614 |
| 0.0531        | 28.0  | 28000 | 0.8363          | 0.3595 |
| 0.0522        | 28.5  | 28500 | 0.8429          | 0.3588 |
| 0.0503        | 29.0  | 29000 | 0.8267          | 0.3595 |
| 0.0504        | 29.5  | 29500 | 0.8401          | 0.3597 |
| 0.0511        | 30.0  | 30000 | 0.8353          | 0.3593 |
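
As a reference for the WER column, a minimal sketch of computing word error rate with the `evaluate` package (not listed under framework versions, so its presence is an assumption; the sentence pair is a made-up example):

```python
import evaluate

wer_metric = evaluate.load("wer")

# Hypothetical reference/prediction pair, for illustration only.
wer = wer_metric.compute(
    references=["the cat sat on the mat"],
    predictions=["the cat sat on mat"],
)
print(round(wer, 4))  # 1 error / 6 reference words ≈ 0.1667
```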

### Framework versions

- Transformers 4.35.2
- PyTorch 2.1.0+cu118
- Datasets 2.15.0
- Tokenizers 0.15.0