emilios committed on
Commit
2bbc195
1 Parent(s): 3ef32cc
Files changed (2)
  1. README.md +14 -33
  2. README.md.new +82 -0
README.md CHANGED
@@ -3,38 +3,30 @@ language:
 - el
 license: apache-2.0
 tags:
-- whisper-event
+- hf-asr-leaderboard, whisper-medium, mozilla-foundation/common_voice_11_0, greek,
+  whisper-event
 - generated_from_trainer
 datasets:
-- mozilla-foundation/common_voice_11_0,google/fleurs
-metrics:
-- wer
+- mozilla-foundation/common_voice_11_0
 model-index:
-- name: Whisper Medium El Greco
-  results:
-  - task:
-      name: Automatic Speech Recognition
-      type: automatic-speech-recognition
-    dataset:
-      name: mozilla-foundation/common_voice_11_0,google/fleurs
-      type: mozilla-foundation/common_voice_11_0,google/fleurs
-      config: null
-      split: None
-    metrics:
-    - name: Wer
-      type: wer
-      value: 11.199851411589897
+- name: Whisper Medium El Greco Greek
+  results: []
 ---

 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
 should probably proofread and complete it, then remove this comment. -->

-# Whisper Medium El Greco
+# Whisper Medium El Greco Greek

-This model is a fine-tuned version of [emilios/whisper-medium-el](https://huggingface.co/emilios/whisper-medium-el) on the mozilla-foundation/common_voice_11_0,google/fleurs el,el_gr dataset.
+This model is a fine-tuned version of [openai/whisper-medium](https://huggingface.co/openai/whisper-medium) on the Common Voice 11.0 dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.3801
-- Wer: 11.1999
+- eval_loss: 0.3924
+- eval_wer: 12.4443
+- eval_runtime: 1211.1631
+- eval_samples_per_second: 1.4
+- eval_steps_per_second: 0.088
+- epoch: 4.04
+- step: 5000

 ## Model description

@@ -63,17 +55,6 @@ The following hyperparameters were used during training:
 - training_steps: 5000
 - mixed_precision_training: Native AMP

-### Training results
-
-| Training Loss | Epoch | Step | Validation Loss | Wer     |
-|:-------------:|:-----:|:----:|:---------------:|:-------:|
-| 0.0176        | 2.49  | 1000 | 0.2945          | 12.6114 |
-| 0.0064        | 4.98  | 2000 | 0.3423          | 12.2307 |
-| 0.0022        | 7.46  | 3000 | 0.3632          | 11.5899 |
-| 0.0014        | 9.95  | 4000 | 0.3788          | 11.2556 |
-| 0.0008        | 12.44 | 5000 | 0.3801          | 11.1999 |
-
-
 ### Framework versions

 - Transformers 4.26.0.dev0
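The updated card reports `eval_wer: 12.4443` on Common Voice 11.0. For context, below is a minimal inference sketch; it assumes the checkpoint is published under the repo id `emilios/whisper-medium-el` (taken from the link in the earlier card) and uses only the standard `transformers` Whisper API, nothing specific to this commit:

```python
# Hypothetical usage sketch, not part of the commit. The repo id is an assumption
# based on the link in the earlier card; substitute the actual model id.
import librosa
from transformers import WhisperForConditionalGeneration, WhisperProcessor

model_id = "emilios/whisper-medium-el"  # assumed repo id
processor = WhisperProcessor.from_pretrained(model_id)
model = WhisperForConditionalGeneration.from_pretrained(model_id)

# Force Greek transcription (Whisper is multilingual; without this it auto-detects).
forced_ids = processor.get_decoder_prompt_ids(language="greek", task="transcribe")

# Load 16 kHz mono audio, extract log-Mel features, and generate a transcript.
audio, _ = librosa.load("sample_greek.wav", sr=16000)
inputs = processor(audio, sampling_rate=16000, return_tensors="pt")
predicted_ids = model.generate(inputs.input_features, forced_decoder_ids=forced_ids)
print(processor.batch_decode(predicted_ids, skip_special_tokens=True)[0])
```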
README.md.new ADDED
@@ -0,0 +1,82 @@
+---
+language:
+- el
+license: apache-2.0
+tags:
+- whisper-event
+- generated_from_trainer
+datasets:
+- mozilla-foundation/common_voice_11_0,google/fleurs
+metrics:
+- wer
+model-index:
+- name: Whisper Medium El Greco
+  results:
+  - task:
+      name: Automatic Speech Recognition
+      type: automatic-speech-recognition
+    dataset:
+      name: mozilla-foundation/common_voice_11_0,google/fleurs
+      type: mozilla-foundation/common_voice_11_0,google/fleurs
+      config: null
+      split: None
+    metrics:
+    - name: Wer
+      type: wer
+      value: 11.199851411589897
+---
+
+<!-- This model card has been generated automatically according to the information the Trainer had access to. You
+should probably proofread and complete it, then remove this comment. -->
+
+# Whisper Medium El Greco
+
+This model is a fine-tuned version of [emilios/whisper-medium-el](https://huggingface.co/emilios/whisper-medium-el) on the mozilla-foundation/common_voice_11_0,google/fleurs el,el_gr dataset.
+It achieves the following results on the evaluation set:
+- Loss: 0.3801
+- Wer: 11.1999
+
+## Model description
+
+More information needed
+
+## Intended uses & limitations
+
+More information needed
+
+## Training and evaluation data
+
+More information needed
+
+## Training procedure
+
+### Training hyperparameters
+
+The following hyperparameters were used during training:
+- learning_rate: 1e-05
+- train_batch_size: 32
+- eval_batch_size: 16
+- seed: 42
+- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
+- lr_scheduler_type: linear
+- lr_scheduler_warmup_steps: 500
+- training_steps: 5000
+- mixed_precision_training: Native AMP
+
+### Training results
+
+| Training Loss | Epoch | Step | Validation Loss | Wer     |
+|:-------------:|:-----:|:----:|:---------------:|:-------:|
+| 0.0176        | 2.49  | 1000 | 0.2945          | 12.6114 |
+| 0.0064        | 4.98  | 2000 | 0.3423          | 12.2307 |
+| 0.0022        | 7.46  | 3000 | 0.3632          | 11.5899 |
+| 0.0014        | 9.95  | 4000 | 0.3788          | 11.2556 |
+| 0.0008        | 12.44 | 5000 | 0.3801          | 11.1999 |
+
+
+### Framework versions
+
+- Transformers 4.26.0.dev0
+- Pytorch 1.13.0+cu117
+- Datasets 2.7.1.dev0
+- Tokenizers 0.13.2
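As a rough illustration only (the training script itself is not part of this commit), the hyperparameter list in the card maps onto `transformers`' `Seq2SeqTrainingArguments` roughly as sketched below; the output directory, fp16 flag, and evaluation cadence are assumptions, with `fp16=True` standing in for "Native AMP" and `eval_steps=1000` inferred from the 1000-step intervals in the training-results table:

```python
# Hypothetical sketch mapping the card's hyperparameters onto Seq2SeqTrainingArguments.
# output_dir is invented; fp16=True stands in for "Native AMP"; the evaluation cadence
# is an assumption based on the 1000-step intervals in the training-results table.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-medium-el",   # assumed
    learning_rate=1e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=500,
    max_steps=5000,
    fp16=True,                          # mixed_precision_training: Native AMP
    evaluation_strategy="steps",        # assumed
    eval_steps=1000,                    # assumed from the results table
    predict_with_generate=True,         # needed to compute WER on generated transcripts
)
```

The Adam betas and epsilon listed in the card match the library defaults, so they are not set explicitly here; the WER figures in both cards appear to be percentages (e.g. `11.1999` ≈ an 11.2% word error rate).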