---
license: apache-2.0
base_model: distilbert/distilbert-base-uncased-finetuned-sst-2-english
tags:
  - generated_from_trainer
metrics:
  - accuracy
  - f1
model-index:
  - name: >-
      distilbert-base-uncased-finetuned-sst-2-english-finetuned-abstract_classification
    results: []
---

# distilbert-base-uncased-finetuned-sst-2-english-finetuned-abstract_classification

This model is a fine-tuned version of [distilbert/distilbert-base-uncased-finetuned-sst-2-english](https://huggingface.co/distilbert/distilbert-base-uncased-finetuned-sst-2-english) on an unspecified abstract-classification dataset. It achieves the following results on the evaluation set (a usage sketch follows the list):

- Loss: 0.3820
- Accuracy: 0.9803
- F1: 0.9709
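
Since the card does not yet include usage details, here is a minimal inference sketch. The repository id and the example input are assumptions, and the label names depend on the (undocumented) fine-tuning dataset.

```python
from transformers import pipeline

# Hypothetical repo id for this fine-tune; replace with the actual model location.
model_id = "distilbert-base-uncased-finetuned-sst-2-english-finetuned-abstract_classification"

classifier = pipeline("text-classification", model=model_id)

# Illustrative input; the model was fine-tuned for abstract classification.
print(classifier("We propose a transformer-based approach to paper abstract screening."))
# -> [{'label': ..., 'score': ...}]  (labels come from the fine-tuning dataset)
```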

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):

- learning_rate: 1e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
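
As a rough reconstruction, the sketch below wires these values into `TrainingArguments`. The output directory and dataset objects are assumptions, and the per-epoch evaluation strategy is inferred from the per-epoch metrics reported under "Training results".

```python
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

base = "distilbert/distilbert-base-uncased-finetuned-sst-2-english"
tokenizer = AutoTokenizer.from_pretrained(base)
# num_labels may need adjusting (with ignore_mismatched_sizes=True) if the
# abstract-classification label set differs from SST-2's two labels.
model = AutoModelForSequenceClassification.from_pretrained(base)

args = TrainingArguments(
    output_dir="abstract-classifier",    # assumed; not stated in the card
    learning_rate=1e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    lr_scheduler_type="linear",          # the Adam betas/epsilon above are the defaults
    num_train_epochs=100,
    evaluation_strategy="epoch",         # inferred from the per-epoch results table
)

# trainer = Trainer(model=model, args=args, tokenizer=tokenizer,
#                   train_dataset=train_ds, eval_dataset=eval_ds,  # datasets not described in the card
#                   compute_metrics=compute_metrics)               # see the sketch after the results table
# trainer.train()
```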

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1     |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|
| 1.4187        | 1.0   | 4    | 1.1950          | 0.8071   | 0.7275 |
| 1.215         | 2.0   | 8    | 1.1708          | 0.8150   | 0.7405 |
| 1.2073        | 3.0   | 12   | 1.1419          | 0.8110   | 0.7359 |
| 1.2722        | 4.0   | 16   | 1.1119          | 0.8110   | 0.7360 |
| 1.1215        | 5.0   | 20   | 1.0880          | 0.8189   | 0.7488 |
| 1.1604        | 6.0   | 24   | 1.0609          | 0.8268   | 0.7587 |
| 1.1658        | 7.0   | 28   | 1.0354          | 0.8346   | 0.7683 |
| 1.1585        | 8.0   | 32   | 1.0155          | 0.8307   | 0.7639 |
| 1.1995        | 9.0   | 36   | 0.9936          | 0.8268   | 0.7596 |
| 1.084         | 10.0  | 40   | 0.9698          | 0.8268   | 0.7598 |
| 1.208         | 11.0  | 44   | 0.9477          | 0.8386   | 0.7755 |
| 1.0951        | 12.0  | 48   | 0.9297          | 0.8583   | 0.7979 |
| 1.042         | 13.0  | 52   | 0.9119          | 0.8543   | 0.7924 |
| 1.0197        | 14.0  | 56   | 0.8913          | 0.8543   | 0.7924 |
| 1.0083        | 15.0  | 60   | 0.8761          | 0.8583   | 0.7979 |
| 0.9577        | 16.0  | 64   | 0.8606          | 0.8543   | 0.7927 |
| 0.9542        | 17.0  | 68   | 0.8418          | 0.8543   | 0.7929 |
| 0.9632        | 18.0  | 72   | 0.8262          | 0.8543   | 0.7925 |
| 0.9265        | 19.0  | 76   | 0.8122          | 0.8543   | 0.7924 |
| 0.978         | 20.0  | 80   | 0.7951          | 0.8622   | 0.8070 |
| 0.8984        | 21.0  | 84   | 0.7810          | 0.8661   | 0.8124 |
| 0.8813        | 22.0  | 88   | 0.7684          | 0.8740   | 0.8227 |
| 0.8821        | 23.0  | 92   | 0.7550          | 0.8819   | 0.8328 |
| 0.8303        | 24.0  | 96   | 0.7419          | 0.8819   | 0.8341 |
| 0.833         | 25.0  | 100  | 0.7327          | 0.8898   | 0.8456 |
| 0.9008        | 26.0  | 104  | 0.7151          | 0.8976   | 0.8559 |
| 0.838         | 27.0  | 108  | 0.7035          | 0.9016   | 0.8592 |
| 0.7202        | 28.0  | 112  | 0.6964          | 0.9055   | 0.8641 |
| 0.7998        | 29.0  | 116  | 0.6803          | 0.9094   | 0.8711 |
| 0.7539        | 30.0  | 120  | 0.6693          | 0.9055   | 0.8656 |
| 0.7137        | 31.0  | 124  | 0.6625          | 0.9134   | 0.8766 |
| 0.8068        | 32.0  | 128  | 0.6536          | 0.9173   | 0.8824 |
| 0.7688        | 33.0  | 132  | 0.6393          | 0.9173   | 0.8806 |
| 0.7516        | 34.0  | 136  | 0.6308          | 0.9134   | 0.8777 |
| 0.7908        | 35.0  | 140  | 0.6251          | 0.9134   | 0.8764 |
| 0.6659        | 36.0  | 144  | 0.6141          | 0.9134   | 0.8761 |
| 0.7202        | 37.0  | 148  | 0.6043          | 0.9291   | 0.8986 |
| 0.6657        | 38.0  | 152  | 0.5966          | 0.9370   | 0.9099 |
| 0.6988        | 39.0  | 156  | 0.5886          | 0.9409   | 0.9142 |
| 0.7726        | 40.0  | 160  | 0.5799          | 0.9370   | 0.9100 |
| 0.5252        | 41.0  | 164  | 0.5716          | 0.9409   | 0.9141 |
| 0.6311        | 42.0  | 168  | 0.5650          | 0.9409   | 0.9142 |
| 0.6402        | 43.0  | 172  | 0.5583          | 0.9409   | 0.9147 |
| 0.6468        | 44.0  | 176  | 0.5513          | 0.9409   | 0.9147 |
| 0.6197        | 45.0  | 180  | 0.5437          | 0.9449   | 0.9200 |
| 0.6282        | 46.0  | 184  | 0.5371          | 0.9449   | 0.9200 |
| 0.6579        | 47.0  | 188  | 0.5313          | 0.9409   | 0.9142 |
| 0.6682        | 48.0  | 192  | 0.5237          | 0.9409   | 0.9142 |
| 0.6592        | 49.0  | 196  | 0.5168          | 0.9488   | 0.9258 |
| 0.547         | 50.0  | 200  | 0.5104          | 0.9488   | 0.9257 |
| 0.5069        | 51.0  | 204  | 0.5042          | 0.9488   | 0.9257 |
| 0.6015        | 52.0  | 208  | 0.4995          | 0.9567   | 0.9367 |
| 0.549         | 53.0  | 212  | 0.4935          | 0.9606   | 0.9425 |
| 0.6206        | 54.0  | 216  | 0.4870          | 0.9646   | 0.9482 |
| 0.5396        | 55.0  | 220  | 0.4821          | 0.9685   | 0.9541 |
| 0.5753        | 56.0  | 224  | 0.4773          | 0.9646   | 0.9482 |
| 0.5867        | 57.0  | 228  | 0.4732          | 0.9685   | 0.9542 |
| 0.5553        | 58.0  | 232  | 0.4685          | 0.9724   | 0.9596 |
| 0.4751        | 59.0  | 236  | 0.4641          | 0.9724   | 0.9596 |
| 0.5857        | 60.0  | 240  | 0.4588          | 0.9685   | 0.9538 |
| 0.5199        | 61.0  | 244  | 0.4563          | 0.9724   | 0.9596 |
| 0.5616        | 62.0  | 248  | 0.4535          | 0.9685   | 0.9538 |
| 0.5698        | 63.0  | 252  | 0.4481          | 0.9685   | 0.9538 |
| 0.5302        | 64.0  | 256  | 0.4435          | 0.9646   | 0.9479 |
| 0.5311        | 65.0  | 260  | 0.4405          | 0.9685   | 0.9537 |
| 0.5204        | 66.0  | 264  | 0.4385          | 0.9685   | 0.9537 |
| 0.4678        | 67.0  | 268  | 0.4334          | 0.9764   | 0.9653 |
| 0.5635        | 68.0  | 272  | 0.4297          | 0.9724   | 0.9595 |
| 0.5404        | 69.0  | 276  | 0.4275          | 0.9764   | 0.9653 |
| 0.5246        | 70.0  | 280  | 0.4256          | 0.9764   | 0.9653 |
| 0.4557        | 71.0  | 284  | 0.4236          | 0.9764   | 0.9653 |
| 0.5924        | 72.0  | 288  | 0.4215          | 0.9764   | 0.9653 |
| 0.5166        | 73.0  | 292  | 0.4178          | 0.9764   | 0.9653 |
| 0.375         | 74.0  | 296  | 0.4141          | 0.9764   | 0.9653 |
| 0.5337        | 75.0  | 300  | 0.4111          | 0.9764   | 0.9653 |
| 0.4728        | 76.0  | 304  | 0.4088          | 0.9764   | 0.9653 |
| 0.516         | 77.0  | 308  | 0.4070          | 0.9764   | 0.9653 |
| 0.4553        | 78.0  | 312  | 0.4051          | 0.9764   | 0.9653 |
| 0.4761        | 79.0  | 316  | 0.4034          | 0.9764   | 0.9653 |
| 0.4672        | 80.0  | 320  | 0.4011          | 0.9724   | 0.9595 |
| 0.5029        | 81.0  | 324  | 0.3990          | 0.9764   | 0.9653 |
| 0.4754        | 82.0  | 328  | 0.3973          | 0.9764   | 0.9653 |
| 0.4678        | 83.0  | 332  | 0.3962          | 0.9764   | 0.9653 |
| 0.4717        | 84.0  | 336  | 0.3950          | 0.9803   | 0.9708 |
| 0.4518        | 85.0  | 340  | 0.3935          | 0.9803   | 0.9709 |
| 0.5682        | 86.0  | 344  | 0.3916          | 0.9803   | 0.9709 |
| 0.4313        | 87.0  | 348  | 0.3900          | 0.9803   | 0.9709 |
| 0.4528        | 88.0  | 352  | 0.3883          | 0.9803   | 0.9709 |
| 0.5075        | 89.0  | 356  | 0.3871          | 0.9803   | 0.9709 |
| 0.4255        | 90.0  | 360  | 0.3865          | 0.9803   | 0.9709 |
| 0.4278        | 91.0  | 364  | 0.3860          | 0.9803   | 0.9709 |
| 0.5074        | 92.0  | 368  | 0.3855          | 0.9803   | 0.9709 |
| 0.5244        | 93.0  | 372  | 0.3848          | 0.9803   | 0.9709 |
| 0.4806        | 94.0  | 376  | 0.3839          | 0.9803   | 0.9709 |
| 0.4271        | 95.0  | 380  | 0.3832          | 0.9803   | 0.9709 |
| 0.4829        | 96.0  | 384  | 0.3827          | 0.9803   | 0.9709 |
| 0.4356        | 97.0  | 388  | 0.3823          | 0.9803   | 0.9709 |
| 0.5412        | 98.0  | 392  | 0.3821          | 0.9803   | 0.9709 |
| 0.4539        | 99.0  | 396  | 0.3820          | 0.9803   | 0.9709 |
| 0.4462        | 100.0 | 400  | 0.3820          | 0.9803   | 0.9709 |
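
The Accuracy and F1 columns above are the kind of values a `compute_metrics` callback returns to the `Trainer`. A minimal sketch using the `evaluate` library follows; the macro averaging for F1 is an assumption, since the card does not state how F1 was computed.

```python
import numpy as np
import evaluate

accuracy = evaluate.load("accuracy")
f1 = evaluate.load("f1")

def compute_metrics(eval_pred):
    # eval_pred is a (logits, labels) pair supplied by the Trainer.
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    return {
        "accuracy": accuracy.compute(predictions=preds, references=labels)["accuracy"],
        # "macro" is an assumption; the card does not say how F1 was averaged.
        "f1": f1.compute(predictions=preds, references=labels, average="macro")["f1"],
    }
```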

### Framework versions

- Transformers 4.41.2
- Pytorch 2.0.1+cu117
- Datasets 2.19.1
- Tokenizers 0.19.1