Prototipo_5_EMI

This model is a fine-tuned version of dccuchile/bert-base-spanish-wwm-uncased on an unspecified dataset. It achieves the following results on the evaluation set (a brief usage sketch follows the results):

  • Loss: 1.4215
  • Accuracy: 0.538
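
The card does not state the task. Since accuracy is the reported metric, a sequence-classification head is assumed in the minimal loading sketch below; the label set and its meaning are not documented and come from the hosted config.

```python
# Minimal usage sketch. Assumes a sequence-classification head (not stated
# on the card); labels are whatever the hosted config defines.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo = "Armandodelca/Prototipo_5_EMI"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSequenceClassification.from_pretrained(repo)

inputs = tokenizer("Texto de ejemplo en español.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.argmax(dim=-1).item())  # predicted label id
```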

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 20
  • eval_batch_size: 20
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 500
  • num_epochs: 5
  • mixed_precision_training: Native AMP
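
For reference, a hedged sketch of how these hyperparameters map onto Hugging Face `TrainingArguments`; `output_dir` is an illustrative placeholder, not taken from the card:

```python
# Sketch only: reproduces the listed hyperparameters one-to-one.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="prototipo_5_emi",   # placeholder, not from the card
    learning_rate=2e-5,
    per_device_train_batch_size=20,
    per_device_eval_batch_size=20,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=5,
    fp16=True,  # "Native AMP" mixed-precision training
)
```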

Training results

| Training Loss | Epoch  | Step | Validation Loss | Accuracy |
|:-------------:|:------:|:----:|:---------------:|:--------:|
| 1.2459        | 0.1481 | 200  | 1.2168          | 0.4493   |
| 1.1445        | 0.2963 | 400  | 1.0823          | 0.512    |
| 1.1117        | 0.4444 | 600  | 1.0979          | 0.5053   |
| 1.0618        | 0.5926 | 800  | 1.0457          | 0.5273   |
| 1.0343        | 0.7407 | 1000 | 1.0219          | 0.537    |
| 1.1239        | 0.8889 | 1200 | 1.0353          | 0.5257   |
| 0.9012        | 1.0370 | 1400 | 1.0637          | 0.5383   |
| 0.86          | 1.1852 | 1600 | 1.0682          | 0.5333   |
| 0.898         | 1.3333 | 1800 | 1.0341          | 0.5483   |
| 0.929         | 1.4815 | 2000 | 1.0437          | 0.5363   |
| 0.9921        | 1.6296 | 2200 | 0.9968          | 0.5473   |
| 0.9776        | 1.7778 | 2400 | 1.0418          | 0.5553   |
| 0.9166        | 1.9259 | 2600 | 0.9874          | 0.5573   |
| 0.703         | 2.0741 | 2800 | 1.0564          | 0.556    |
| 0.8123        | 2.2222 | 3000 | 1.0582          | 0.561    |
| 0.6727        | 2.3704 | 3200 | 1.0942          | 0.5483   |
| 0.6843        | 2.5185 | 3400 | 1.1128          | 0.558    |
| 0.7528        | 2.6667 | 3600 | 1.0823          | 0.5547   |
| 0.7747        | 2.8148 | 3800 | 1.0744          | 0.5497   |
| 0.7471        | 2.9630 | 4000 | 1.0749          | 0.5527   |
| 0.5774        | 3.1111 | 4200 | 1.1422          | 0.552    |
| 0.6105        | 3.2593 | 4400 | 1.2226          | 0.543    |
| 0.573         | 3.4074 | 4600 | 1.2427          | 0.5417   |
| 0.6047        | 3.5556 | 4800 | 1.2403          | 0.537    |
| 0.5334        | 3.7037 | 5000 | 1.2470          | 0.5413   |
| 0.5688        | 3.8519 | 5200 | 1.2585          | 0.5507   |
| 0.4928        | 4.0    | 5400 | 1.2653          | 0.5437   |
| 0.4314        | 4.1481 | 5600 | 1.3419          | 0.541    |
| 0.4556        | 4.2963 | 5800 | 1.3677          | 0.5413   |
| 0.4815        | 4.4444 | 6000 | 1.3912          | 0.5407   |
| 0.4431        | 4.5926 | 6200 | 1.4004          | 0.5347   |
| 0.4312        | 4.7407 | 6400 | 1.4161          | 0.5397   |
| 0.459         | 4.8889 | 6600 | 1.4215          | 0.538    |
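
Validation loss bottoms out at 0.9874 (step 2600, epoch ~1.93) and climbs steadily afterward while training loss keeps falling, a typical overfitting pattern. If retraining, selecting the checkpoint with the best validation loss would recover that point. A hedged sketch using standard `Trainer` options follows; the card does not say these were used in the original run:

```python
# Sketch only: keep the checkpoint with the lowest validation loss and stop
# once it has not improved for three consecutive evaluations. Not something
# the card states was done in the original run.
from transformers import TrainingArguments, EarlyStoppingCallback

args = TrainingArguments(
    output_dir="prototipo_5_emi",   # placeholder, not from the card
    evaluation_strategy="steps",
    eval_steps=200,                 # matches the evaluation cadence above
    save_strategy="steps",
    save_steps=200,
    load_best_model_at_end=True,    # required for checkpoint selection
    metric_for_best_model="eval_loss",
    greater_is_better=False,
)
early_stop = EarlyStoppingCallback(early_stopping_patience=3)
# Pass to Trainer: Trainer(model=..., args=args, callbacks=[early_stop], ...)
```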

Framework versions

  • Transformers 4.40.1
  • Pytorch 2.2.1+cu121
  • Datasets 2.19.0
  • Tokenizers 0.19.1