---
license: apache-2.0
base_model: facebook/wav2vec2-base
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: wav2vec2-1
  results: []
---


# wav2vec2-1

This model is a fine-tuned version of [facebook/wav2vec2-base](https://huggingface.co/facebook/wav2vec2-base) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8870
- WER: 0.3805
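
For reference, word error rate can be computed with the `evaluate` library; the strings below are placeholders, not outputs of this model:

```python
import evaluate  # pip install evaluate jiwer

wer_metric = evaluate.load("wer")
# Placeholder strings; in practice use model transcriptions and reference texts.
score = wer_metric.compute(
    predictions=["the quick brown fox"],
    references=["the quick brown fox jumps"],
)
print(score)  # word-level edit rate: 1 deletion / 5 reference words = 0.2
```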

## Model description

A wav2vec2 acoustic model for automatic speech recognition, fine-tuned from [facebook/wav2vec2-base](https://huggingface.co/facebook/wav2vec2-base) and evaluated with word error rate (WER). The fine-tuning presumably uses the standard CTC objective for wav2vec2; the target language and domain are not documented.

## Intended uses & limitations

The model is intended for transcribing 16 kHz mono speech audio to text. Because the training data is not documented, its language coverage, domain robustness, and potential biases are unknown; validate it on representative data before relying on it. A minimal inference sketch follows.
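
A minimal inference sketch, assuming the checkpoint is available under the hypothetical path `wav2vec2-1` (substitute the real local or Hub path):

```python
import numpy as np
import torch
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

# Hypothetical checkpoint path; replace with the actual one.
model_id = "wav2vec2-1"

processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# One second of silence as a stand-in for real 16 kHz mono audio.
speech = np.zeros(16_000, dtype=np.float32)

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: argmax over the vocabulary, then collapse/decode.
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids))
```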

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- num_epochs: 30
- mixed_precision_training: Native AMP
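
A sketch of the `TrainingArguments` these values imply; the output directory and evaluation cadence are assumptions (the 500-step interval is inferred from the results table below), and Adam's betas/epsilon match the Trainer defaults:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-1",        # assumption, not stated in the card
    learning_rate=1e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=1000,
    num_train_epochs=30,
    fp16=True,                      # "Native AMP" mixed precision
    evaluation_strategy="steps",    # assumption: matches the 500-step eval log
    eval_steps=500,
)
```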

### Training results

| Training Loss | Epoch | Step  | Validation Loss | WER    |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 4.6457        | 0.5   | 500   | 2.8866          | 0.9999 |
| 2.863         | 1.0   | 1000  | 2.8676          | 1.0    |
| 1.8085        | 1.5   | 1500  | 0.9396          | 0.6602 |
| 0.8828        | 2.0   | 2000  | 0.7278          | 0.5699 |
| 0.6659        | 2.5   | 2500  | 0.7000          | 0.5401 |
| 0.6085        | 3.0   | 3000  | 0.7143          | 0.4939 |
| 0.4878        | 3.5   | 3500  | 0.5845          | 0.4717 |
| 0.4888        | 4.0   | 4000  | 0.6201          | 0.4677 |
| 0.4022        | 4.5   | 4500  | 0.5984          | 0.4532 |
| 0.3947        | 5.0   | 5000  | 0.5874          | 0.4378 |
| 0.3415        | 5.5   | 5500  | 0.6486          | 0.4405 |
| 0.3413        | 6.0   | 6000  | 0.5988          | 0.4355 |
| 0.2903        | 6.5   | 6500  | 0.6584          | 0.4304 |
| 0.3046        | 7.0   | 7000  | 0.6602          | 0.4189 |
| 0.2625        | 7.5   | 7500  | 0.5924          | 0.4235 |
| 0.2625        | 8.0   | 8000  | 0.6541          | 0.4212 |
| 0.2341        | 8.5   | 8500  | 0.6365          | 0.4171 |
| 0.2384        | 9.0   | 9000  | 0.6095          | 0.4182 |
| 0.2052        | 9.5   | 9500  | 0.6675          | 0.4091 |
| 0.2124        | 10.0  | 10000 | 0.6524          | 0.4110 |
| 0.1915        | 10.5  | 10500 | 0.6877          | 0.4122 |
| 0.1922        | 11.0  | 11000 | 0.6857          | 0.4122 |
| 0.1719        | 11.5  | 11500 | 0.6881          | 0.4056 |
| 0.1811        | 12.0  | 12000 | 0.6832          | 0.4083 |
| 0.1554        | 12.5  | 12500 | 0.7378          | 0.4103 |
| 0.163         | 13.0  | 13000 | 0.6940          | 0.4019 |
| 0.1452        | 13.5  | 13500 | 0.6811          | 0.3993 |
| 0.1457        | 14.0  | 14000 | 0.7216          | 0.4007 |
| 0.1319        | 14.5  | 14500 | 0.7243          | 0.3996 |
| 0.1367        | 15.0  | 15000 | 0.7332          | 0.4006 |
| 0.118         | 15.5  | 15500 | 0.7609          | 0.4050 |
| 0.121         | 16.0  | 16000 | 0.7585          | 0.4021 |
| 0.1096        | 16.5  | 16500 | 0.7583          | 0.4003 |
| 0.112         | 17.0  | 17000 | 0.7928          | 0.4011 |
| 0.1063        | 17.5  | 17500 | 0.7794          | 0.4038 |
| 0.1009        | 18.0  | 18000 | 0.7474          | 0.3982 |
| 0.0931        | 18.5  | 18500 | 0.8143          | 0.3980 |
| 0.0943        | 19.0  | 19000 | 0.7873          | 0.4000 |
| 0.0847        | 19.5  | 19500 | 0.8064          | 0.3991 |
| 0.0831        | 20.0  | 20000 | 0.8564          | 0.3967 |
| 0.0821        | 20.5  | 20500 | 0.8632          | 0.3956 |
| 0.0807        | 21.0  | 21000 | 0.8250          | 0.3928 |
| 0.0748        | 21.5  | 21500 | 0.8389          | 0.3949 |
| 0.0751        | 22.0  | 22000 | 0.8355          | 0.3943 |
| 0.072         | 22.5  | 22500 | 0.8568          | 0.3930 |
| 0.0696        | 23.0  | 23000 | 0.8396          | 0.3912 |
| 0.0678        | 23.5  | 23500 | 0.8634          | 0.3901 |
| 0.0671        | 24.0  | 24000 | 0.8576          | 0.3880 |
| 0.063         | 24.5  | 24500 | 0.8303          | 0.3876 |
| 0.0575        | 25.0  | 25000 | 0.9125          | 0.3847 |
| 0.0572        | 25.5  | 25500 | 0.8745          | 0.3839 |
| 0.0572        | 26.0  | 26000 | 0.8714          | 0.3844 |
| 0.0533        | 26.5  | 26500 | 0.8824          | 0.3840 |
| 0.0496        | 27.0  | 27000 | 0.8993          | 0.3830 |
| 0.0525        | 27.5  | 27500 | 0.8818          | 0.3830 |
| 0.0514        | 28.0  | 28000 | 0.8874          | 0.3819 |
| 0.0464        | 28.5  | 28500 | 0.8947          | 0.3802 |
| 0.0473        | 29.0  | 29000 | 0.9028          | 0.3805 |
| 0.048         | 29.5  | 29500 | 0.8899          | 0.3801 |
| 0.0458        | 30.0  | 30000 | 0.8870          | 0.3805 |


### Framework versions

- Transformers 4.35.2
- Pytorch 2.1.0+cu118
- Datasets 2.15.0
- Tokenizers 0.15.0
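
To reproduce this environment, the versions above can be pinned; the cu118 index URL assumes a CUDA 11.8 setup, matching the PyTorch build listed:

```bash
pip install transformers==4.35.2 datasets==2.15.0 tokenizers==0.15.0
pip install torch==2.1.0 --index-url https://download.pytorch.org/whl/cu118
```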