Oblivion208 committed
Commit 9bfd74d
1 Parent(s): 82f03fe

Update README.md

Files changed (1)
  1. README.md +34 -14
README.md CHANGED
@@ -1,21 +1,41 @@
  ---
  library_name: peft
  ---
- ## Training procedure
-
- The following `bitsandbytes` quantization config was used during training:
- - quant_method: bitsandbytes
- - load_in_8bit: True
- - load_in_4bit: False
- - llm_int8_threshold: 6.0
- - llm_int8_skip_modules: None
- - llm_int8_enable_fp32_cpu_offload: False
- - llm_int8_has_fp16_weight: False
- - bnb_4bit_quant_type: fp4
- - bnb_4bit_use_double_quant: False
- - bnb_4bit_compute_dtype: float32
- ### Framework versions
-
- - PEFT 0.6.0.dev0
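
For reference, the `bitsandbytes` settings removed above map onto a `transformers` `BitsAndBytesConfig` roughly as in the sketch below; the resulting object would be passed to the model's `from_pretrained` call via `quantization_config`. This is an illustration, not the repository's actual training script.

```python
# Hedged reconstruction of the removed training-time quantization config.
import torch
from transformers import BitsAndBytesConfig

# quant_method: bitsandbytes is implied by using BitsAndBytesConfig itself.
bnb_config = BitsAndBytesConfig(
    load_in_8bit=True,                       # load_in_8bit: True
    load_in_4bit=False,                      # load_in_4bit: False
    llm_int8_threshold=6.0,                  # llm_int8_threshold: 6.0
    llm_int8_skip_modules=None,              # llm_int8_skip_modules: None
    llm_int8_enable_fp32_cpu_offload=False,  # llm_int8_enable_fp32_cpu_offload: False
    llm_int8_has_fp16_weight=False,          # llm_int8_has_fp16_weight: False
    bnb_4bit_quant_type="fp4",               # bnb_4bit_quant_type: fp4
    bnb_4bit_use_double_quant=False,         # bnb_4bit_use_double_quant: False
    bnb_4bit_compute_dtype=torch.float32,    # bnb_4bit_compute_dtype: float32
)
```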
  ---
  library_name: peft
+ license: apache-2.0
+ datasets:
+ - mozilla-foundation/common_voice_11_0
+ language:
+ - yue
+ metrics:
+ - cer
+ pipeline_tag: automatic-speech-recognition
  ---
+
+ <p align="left">
+ 🤗 <a href="https://huggingface.co/Oblivion208" target="_blank">HF Repo</a> •🐱 <a href="https://github.com/fengredrum/finetune-whisper-lora" target="_blank">Github Repo</a>
+ </p>
+
+ ## Approximate Performance Evaluation
+
+ The following models were all trained and evaluated on a single RTX 3090 GPU.
+
+ ### Cantonese Test Results Comparison
+
+ #### MDCC
+
+ | Model name                      | Parameters | Finetune Steps | Time Spent | Training Loss | Validation Loss | CER % | Finetuned Model |
+ | ------------------------------- | ---------- | -------------- | ---------- | ------------- | --------------- | ----- | --------------- |
+ | whisper-tiny-cantonese          | 39 M       | 3200           | 4h 34m     | 0.0485        | 0.771           | 11.10 | [Link](https://huggingface.co/Oblivion208/whisper-tiny-cantonese "Oblivion208/whisper-tiny-cantonese") |
+ | whisper-base-cantonese          | 74 M       | 7200           | 13h 32m    | 0.0186        | 0.477           | 7.66  | [Link](https://huggingface.co/Oblivion208/whisper-base-cantonese "Oblivion208/whisper-base-cantonese") |
+ | whisper-small-cantonese         | 244 M      | 3600           | 6h 38m     | 0.0266        | 0.137           | 6.16  | [Link](https://huggingface.co/Oblivion208/whisper-small-cantonese "Oblivion208/whisper-small-cantonese") |
+ | whisper-small-lora-cantonese    | 3.5 M      | 8000           | 21h 27m    | 0.0687        | 0.382           | 7.40  | [Link](https://huggingface.co/Oblivion208/whisper-small-lora-cantonese "Oblivion208/whisper-small-lora-cantonese") |
+ | whisper-large-v2-lora-cantonese | 15 M       | 10000          | 33h 40m    | 0.0046        | 0.277           | 3.77  | [Link](https://huggingface.co/Oblivion208/whisper-large-v2-lora-cantonese "Oblivion208/whisper-large-v2-lora-cantonese") |
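
The `*-lora-*` rows above are PEFT LoRA adapters rather than full model weights. A minimal inference sketch, assuming the adapter is loaded on top of the matching `openai/whisper-*` base checkpoint with `peft` and `transformers` (the base model ID below is inferred from the adapter name, not stated in the table):

```python
# Hedged sketch: apply a LoRA adapter from the table above to its assumed base model.
import torch
from peft import PeftModel
from transformers import WhisperForConditionalGeneration, WhisperProcessor

BASE_ID = "openai/whisper-large-v2"                         # assumed base checkpoint
ADAPTER_ID = "Oblivion208/whisper-large-v2-lora-cantonese"  # adapter from the table above

processor = WhisperProcessor.from_pretrained(BASE_ID)
model = WhisperForConditionalGeneration.from_pretrained(BASE_ID)
model = PeftModel.from_pretrained(model, ADAPTER_ID)        # attach the LoRA weights
model.eval()

def transcribe(audio):
    """Transcribe a 16 kHz mono waveform (1-D float array) to text."""
    features = processor(audio, sampling_rate=16000, return_tensors="pt").input_features
    with torch.no_grad():
        predicted_ids = model.generate(input_features=features)
    return processor.batch_decode(predicted_ids, skip_special_tokens=True)[0]
```

Memory permitting, the base checkpoint could also be loaded in 8-bit by passing a quantization config like the one sketched earlier to `from_pretrained`.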
+
+ #### Common Voice Corpus 11.0
+
+ | Model name                      | Original CER % | w/o Finetuning CER % | Jointly Finetuned CER % |
+ | ------------------------------- | -------------- | -------------------- | ----------------------- |
+ | whisper-tiny-cantonese          | 124.03         | 66.85                | 35.87                   |
+ | whisper-base-cantonese          | 78.24          | 61.42                | 16.73                   |
+ | whisper-small-cantonese         | 52.83          | 31.23                | /                       |
+ | whisper-small-lora-cantonese    | 37.53          | 19.38                | 14.73                   |
+ | whisper-large-v2-lora-cantonese | 37.53          | 19.38                | 9.63                    |
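
The CER figures in both tables are character error rates; a standard way to compute them is via the Hugging Face `evaluate` library, as in the sketch below (not necessarily the exact evaluation script used for these numbers).

```python
# Sketch: character error rate (CER), reported as a percentage as in the tables above.
import evaluate

cer_metric = evaluate.load("cer")

# Hypothetical prediction/reference pair purely for illustration; in practice the
# predictions come from model.generate() and the references from the test set.
predictions = ["我今日去公園"]
references = ["我哋今日去咗公園"]

cer = cer_metric.compute(predictions=predictions, references=references)
print(f"CER: {100 * cer:.2f}%")
```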