FNST_trad_k

This model is a fine-tuned version of dccuchile/bert-base-spanish-wwm-cased on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 3.0507
  • Accuracy: 0.6275
  • F1: 0.6275

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-06
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 36
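
These values map onto a standard 🤗 Transformers `TrainingArguments` setup. A minimal sketch (argument names follow the `transformers` API; the hyperparameter values are from the card, while the output directory is a hypothetical placeholder):

```python
from transformers import TrainingArguments

# Hyperparameters taken from the list above; Adam betas/epsilon shown
# are the transformers defaults, matching the values the card reports.
training_args = TrainingArguments(
    output_dir="FNST_trad_k",        # hypothetical output path
    learning_rate=1e-6,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=36,
)
```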

Training results

| Training Loss | Epoch | Step  | Validation Loss | Accuracy | F1     |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|:------:|
| 1.1468        | 1.0   | 1500  | 1.0961          | 0.52     | 0.4256 |
| 1.0016        | 2.0   | 3000  | 0.9821          | 0.5742   | 0.5705 |
| 0.9255        | 3.0   | 4500  | 0.9492          | 0.5833   | 0.5865 |
| 0.8837        | 4.0   | 6000  | 0.9318          | 0.5825   | 0.5837 |
| 0.8417        | 5.0   | 7500  | 0.9306          | 0.5992   | 0.5978 |
| 0.8013        | 6.0   | 9000  | 0.9318          | 0.6017   | 0.6045 |
| 0.7621        | 7.0   | 10500 | 0.9359          | 0.605    | 0.6052 |
| 0.7504        | 8.0   | 12000 | 0.9394          | 0.61     | 0.6121 |
| 0.6951        | 9.0   | 13500 | 0.9744          | 0.6092   | 0.6063 |
| 0.6763        | 10.0  | 15000 | 0.9820          | 0.61     | 0.6099 |
| 0.612         | 11.0  | 16500 | 1.0162          | 0.6158   | 0.6116 |
| 0.593         | 12.0  | 18000 | 1.0400          | 0.6158   | 0.6193 |
| 0.5561        | 13.0  | 19500 | 1.0735          | 0.6158   | 0.6167 |
| 0.5342        | 14.0  | 21000 | 1.0789          | 0.6158   | 0.6132 |
| 0.4931        | 15.0  | 22500 | 1.1443          | 0.6167   | 0.6136 |
| 0.4758        | 16.0  | 24000 | 1.1832          | 0.6192   | 0.6195 |
| 0.4346        | 17.0  | 25500 | 1.2587          | 0.62     | 0.6196 |
| 0.3959        | 18.0  | 27000 | 1.3334          | 0.6167   | 0.6178 |
| 0.3848        | 19.0  | 28500 | 1.3624          | 0.6258   | 0.6245 |
| 0.35          | 20.0  | 30000 | 1.4552          | 0.6233   | 0.6227 |
| 0.3094        | 21.0  | 31500 | 1.5021          | 0.6208   | 0.6206 |
| 0.3221        | 22.0  | 33000 | 1.6168          | 0.6242   | 0.6228 |
| 0.2803        | 23.0  | 34500 | 1.6995          | 0.6225   | 0.6201 |
| 0.2722        | 24.0  | 36000 | 1.8134          | 0.625    | 0.6232 |
| 0.2355        | 25.0  | 37500 | 1.9296          | 0.6167   | 0.6137 |
| 0.2285        | 26.0  | 39000 | 2.0198          | 0.6283   | 0.6268 |
| 0.211         | 27.0  | 40500 | 2.1630          | 0.6208   | 0.6220 |
| 0.1857        | 28.0  | 42000 | 2.2532          | 0.6275   | 0.6244 |
| 0.188         | 29.0  | 43500 | 2.4117          | 0.625    | 0.6228 |
| 0.1787        | 30.0  | 45000 | 2.4971          | 0.6275   | 0.6257 |
| 0.1687        | 31.0  | 46500 | 2.6493          | 0.6217   | 0.6191 |
| 0.1534        | 32.0  | 48000 | 2.7295          | 0.6217   | 0.6169 |
| 0.1606        | 33.0  | 49500 | 2.9021          | 0.6208   | 0.6198 |
| 0.1537        | 34.0  | 51000 | 2.9315          | 0.6167   | 0.6162 |
| 0.1284        | 35.0  | 52500 | 3.0047          | 0.6208   | 0.6217 |
| 0.1359        | 36.0  | 54000 | 3.0507          | 0.6275   | 0.6275 |
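
Accuracy and F1 track each other closely throughout training, which is consistent with an averaged F1 on a fairly balanced label set. The card does not state which averaging was used; as a reference for the metric definitions, here is a plain-Python sketch of accuracy and macro-averaged F1 (hypothetical helpers, not the author's evaluation code):

```python
def accuracy(preds, labels):
    # Fraction of predictions that match the reference labels.
    return sum(p == l for p, l in zip(preds, labels)) / len(labels)

def macro_f1(preds, labels):
    # Per-class F1, averaged uniformly over every class that appears.
    classes = sorted(set(labels) | set(preds))
    f1s = []
    for c in classes:
        tp = sum(1 for p, l in zip(preds, labels) if p == c and l == c)
        fp = sum(1 for p, l in zip(preds, labels) if p == c and l != c)
        fn = sum(1 for p, l in zip(preds, labels) if p != c and l == c)
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        f1s.append(2 * precision * recall / (precision + recall)
                   if precision + recall else 0.0)
    return sum(f1s) / len(f1s)
```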

Framework versions

  • Transformers 4.37.2
  • Pytorch 2.1.0+cu121
  • Datasets 2.16.1
  • Tokenizers 0.15.1
