
whisper-small-yoruba-07-15

This model is a fine-tuned version of openai/whisper-small on an unspecified dataset. It achieves the following results on the evaluation set (a brief usage sketch follows the metrics):

  • Loss: 0.4112
  • Wer Ortho: 68.9201
  • Wer: 61.9420
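
A minimal sketch of loading this checkpoint for transcription with the transformers pipeline; "sample.wav" is a hypothetical local file, not something shipped with the model:

```python
# Minimal usage sketch; "sample.wav" is a placeholder path.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="ccibeekeoc42/whisper-small-yoruba-07-15",
)

# The pipeline decodes file inputs to the 16 kHz mono audio Whisper expects.
print(asr("sample.wav")["text"])
```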

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the sketch after this list):

  • learning_rate: 1e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 50
  • num_epochs: 1
  • mixed_precision_training: Native AMP
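
A hedged sketch of how these settings map onto transformers' Seq2SeqTrainingArguments; the output directory and evaluation cadence are assumptions, not stated in this card:

```python
# Sketch only: reconstructs the listed hyperparameters as training arguments.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-small-yoruba-07-15",  # hypothetical path
    learning_rate=1e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=50,
    num_train_epochs=1,
    fp16=True,  # "Native AMP" mixed-precision training
    eval_strategy="steps",
    eval_steps=250,  # assumed from the 250-step cadence in the results table
)
```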

Training results

| Training Loss | Epoch  | Step | Validation Loss | Wer Ortho | Wer     |
|:-------------:|:------:|:----:|:---------------:|:---------:|:-------:|
| 0.7803        | 0.0805 | 250  | 0.9028          | 78.2451   | 72.5162 |
| 0.5693        | 0.1610 | 500  | 0.7365          | 85.1996   | 82.5191 |
| 0.504         | 0.2415 | 750  | 0.6444          | 73.1001   | 69.8969 |
| 0.463         | 0.3220 | 1000 | 0.5931          | 78.2923   | 71.3930 |
| 0.4036        | 0.4024 | 1250 | 0.5471          | 68.4638   | 62.0921 |
| 0.3496        | 0.4829 | 1500 | 0.5171          | 73.5459   | 71.9933 |
| 0.3346        | 0.5634 | 1750 | 0.4908          | 67.8109   | 65.7669 |
| 0.34          | 0.6439 | 2000 | 0.4612          | 70.3336   | 65.5394 |
| 0.3153        | 0.7244 | 2250 | 0.4380          | 66.8799   | 60.1118 |
| 0.3061        | 0.8049 | 2500 | 0.4228          | 67.9499   | 60.2982 |
| 0.2877        | 0.8854 | 2750 | 0.4164          | 67.9735   | 59.7051 |
| 0.2892        | 0.9659 | 3000 | 0.4112          | 68.9201   | 61.9420 |
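
The two WER columns likely follow the common Whisper fine-tuning recipe: Wer Ortho is computed on raw orthographic text, while Wer is computed after basic text normalization. A toy sketch of that distinction, assuming the evaluate library and transformers' BasicTextNormalizer (the example strings are made up):

```python
# Toy sketch of orthographic vs. normalized WER; the recipe is an assumption,
# not confirmed by this card.
import evaluate
from transformers.models.whisper.english_normalizer import BasicTextNormalizer

wer_metric = evaluate.load("wer")
normalizer = BasicTextNormalizer()  # lowercases and strips punctuation

references = ["Báwo ni o ṣe wà?"]
predictions = ["báwo ni o ṣe wà"]

# Orthographic WER: casing and punctuation differences count as errors.
wer_ortho = 100 * wer_metric.compute(references=references, predictions=predictions)

# Normalized WER: compare after normalizing both sides.
wer = 100 * wer_metric.compute(
    references=[normalizer(r) for r in references],
    predictions=[normalizer(p) for p in predictions],
)
print(f"WER ortho: {wer_ortho:.2f}, WER: {wer:.2f}")
```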

Framework versions

  • Transformers 4.42.4
  • PyTorch 2.3.0+cu121
  • Datasets 2.20.0
  • Tokenizers 0.19.1