
mms-1b-bem-female-sv

This model is a fine-tuned version of facebook/mms-1b-all on the BembaSpeech (BEM) dataset. It achieves the following results on the evaluation set:

  • Loss: 0.2132
  • WER: 0.3557
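
The snippet below is a minimal transcription sketch. It assumes this checkpoint loads through the standard MMS/Wav2Vec2 CTC interface in transformers, as the base facebook/mms-1b-all model does; the audio file path is a placeholder.

```python
# Minimal inference sketch (assumes the standard Wav2Vec2/MMS CTC interface;
# "audio.wav" is a placeholder path).
import torch
import librosa
from transformers import AutoProcessor, Wav2Vec2ForCTC

model_id = "csikasote/mms-1b-bem-female-sv"
processor = AutoProcessor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)

# MMS models expect 16 kHz mono audio.
speech, sr = librosa.load("audio.wav", sr=16_000)

inputs = processor(speech, sampling_rate=sr, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Greedy CTC decoding: take the most likely token at each frame.
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids)[0])
```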

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (mirrored in the TrainingArguments sketch after the list):

  • learning_rate: 0.001
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 500
  • num_epochs: 5.0
  • mixed_precision_training: Native AMP
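
For reference, here is a sketch of equivalent transformers TrainingArguments. It is not the authors' actual training script, and output_dir is a placeholder.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="mms-1b-bem-female-sv",  # placeholder output path
    learning_rate=1e-3,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=5.0,
    fp16=True,  # Native AMP mixed-precision training
)
```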

Training results

| Training Loss | Epoch  | Step | Validation Loss | WER    |
|:--------------|:------:|:----:|:---------------:|:------:|
| No log        | 0.3992 | 200  | 0.3566          | 0.5019 |
| No log        | 0.7984 | 400  | 0.2620          | 0.4029 |
| 1.7214        | 1.1976 | 600  | 0.2546          | 0.4100 |
| 1.7214        | 1.5968 | 800  | 0.2359          | 0.3965 |
| 0.2801        | 1.9960 | 1000 | 0.2322          | 0.3810 |
| 0.2801        | 2.3952 | 1200 | 0.2305          | 0.3746 |
| 0.2801        | 2.7944 | 1400 | 0.2258          | 0.3336 |
| 0.2528        | 3.1936 | 1600 | 0.2262          | 0.4309 |
| 0.2528        | 3.5928 | 1800 | 0.2164          | 0.3514 |
| 0.2351        | 3.9920 | 2000 | 0.2215          | 0.3894 |
| 0.2351        | 4.3912 | 2200 | 0.2165          | 0.3624 |
| 0.2351        | 4.7904 | 2400 | 0.2132          | 0.3557 |
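
The WER column reports word error rate. A minimal sketch of computing it with the evaluate library is shown below; the card does not state which tooling was used, and the strings are placeholders rather than samples from the evaluation set.

```python
import evaluate

wer_metric = evaluate.load("wer")

predictions = ["the transcribed hypothesis"]  # hypothetical model output
references = ["the reference transcription"]  # hypothetical ground truth

wer = wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.4f}")
```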

Framework versions

  • Transformers 4.43.0
  • Pytorch 2.3.1+cu121
  • Datasets 2.21.0
  • Tokenizers 0.19.1
