---
base_model: bert-base-chinese
tags:
  - generated_from_trainer
metrics:
  - accuracy
model-index:
  - name: bert-finetuned-weibo-luobokuaipao
    results: []
---

# bert-finetuned-weibo-luobokuaipao

This model is a fine-tuned version of [bert-base-chinese](https://huggingface.co/bert-base-chinese) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 3.2017
- Accuracy: 0.5815
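
Since the card reports accuracy, the checkpoint is presumably a sequence-classification head on top of bert-base-chinese. A minimal loading sketch follows; the repository id `wsqstar/bert-finetuned-weibo-luobokuaipao` and the example text are assumptions, and the label set is not documented on this card:

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Assumed repository id; the actual id may differ.
repo = "wsqstar/bert-finetuned-weibo-luobokuaipao"

tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSequenceClassification.from_pretrained(repo)

# Example Weibo-style input; the label names below fall back to the
# generic LABEL_0/LABEL_1/... ids since the card documents no label map.
text = "萝卜快跑的服务体验很好"
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
pred = logits.argmax(dim=-1).item()
print(model.config.id2label.get(pred, pred))
```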

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 40
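
For reproducibility, here is a hedged sketch of a 🤗 `Trainer` setup matching the hyperparameters above. Only the argument values come from this card; the dataset, `num_labels`, and the accuracy helper are placeholder assumptions, since the actual training data is not documented:

```python
import numpy as np
from datasets import Dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

tokenizer = AutoTokenizer.from_pretrained("bert-base-chinese")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-chinese", num_labels=3)  # num_labels is an assumption

# Placeholder data; the actual Weibo dataset is not documented on this card.
raw = Dataset.from_dict({"text": ["示例文本一", "示例文本二"], "label": [0, 1]})
tokenized = raw.map(lambda b: tokenizer(b["text"], truncation=True), batched=True)

def compute_metrics(eval_pred):
    # Plain accuracy, matching the card's reported metric.
    logits, labels = eval_pred
    return {"accuracy": (np.argmax(logits, axis=-1) == labels).mean()}

# Optimizer defaults (AdamW, betas=(0.9,0.999), eps=1e-8) already match the card.
args = TrainingArguments(
    output_dir="bert-finetuned-weibo-luobokuaipao",
    learning_rate=1e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=40,
    eval_strategy="epoch",  # the results table logs validation once per epoch
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    eval_dataset=tokenized,  # stand-in; the real train/eval split is unknown
    tokenizer=tokenizer,     # enables dynamic padding via DataCollatorWithPadding
    compute_metrics=compute_metrics,
)
trainer.train()
```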

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log        | 1.0   | 243  | 1.0137          | 0.5852   |
| No log        | 2.0   | 486  | 0.9878          | 0.5815   |
| 1.0413        | 3.0   | 729  | 1.0463          | 0.6056   |
| 1.0413        | 4.0   | 972  | 1.1697          | 0.6130   |
| 0.5875        | 5.0   | 1215 | 1.3835          | 0.5852   |
| 0.5875        | 6.0   | 1458 | 1.5941          | 0.5722   |
| 0.3069        | 7.0   | 1701 | 1.9360          | 0.5796   |
| 0.3069        | 8.0   | 1944 | 2.0863          | 0.6093   |
| 0.1782        | 9.0   | 2187 | 2.2601          | 0.5833   |
| 0.1782        | 10.0  | 2430 | 2.4810          | 0.5926   |
| 0.12          | 11.0  | 2673 | 2.5233          | 0.6000   |
| 0.12          | 12.0  | 2916 | 2.5486          | 0.5833   |
| 0.089         | 13.0  | 3159 | 2.6555          | 0.5704   |
| 0.089         | 14.0  | 3402 | 2.6093          | 0.6019   |
| 0.0718        | 15.0  | 3645 | 2.6888          | 0.5907   |
| 0.0718        | 16.0  | 3888 | 2.9839          | 0.5722   |
| 0.061         | 17.0  | 4131 | 2.8104          | 0.5778   |
| 0.061         | 18.0  | 4374 | 2.9843          | 0.5685   |
| 0.062         | 19.0  | 4617 | 3.1577          | 0.5648   |
| 0.062         | 20.0  | 4860 | 3.1641          | 0.5722   |
| 0.0553        | 21.0  | 5103 | 3.1004          | 0.5611   |
| 0.0553        | 22.0  | 5346 | 3.0974          | 0.5778   |
| 0.0417        | 23.0  | 5589 | 3.0206          | 0.5759   |
| 0.0417        | 24.0  | 5832 | 3.0191          | 0.5667   |
| 0.0374        | 25.0  | 6075 | 3.0920          | 0.5722   |
| 0.0374        | 26.0  | 6318 | 2.9696          | 0.5852   |
| 0.0335        | 27.0  | 6561 | 3.0100          | 0.5889   |
| 0.0335        | 28.0  | 6804 | 3.1014          | 0.5667   |
| 0.0313        | 29.0  | 7047 | 3.2620          | 0.5574   |
| 0.0313        | 30.0  | 7290 | 3.0502          | 0.5889   |
| 0.032         | 31.0  | 7533 | 3.0984          | 0.5833   |
| 0.032         | 32.0  | 7776 | 3.1546          | 0.5704   |
| 0.0329        | 33.0  | 8019 | 3.0977          | 0.5741   |
| 0.0329        | 34.0  | 8262 | 3.0975          | 0.5796   |
| 0.0276        | 35.0  | 8505 | 3.1124          | 0.5870   |
| 0.0276        | 36.0  | 8748 | 3.1204          | 0.5926   |
| 0.0276        | 37.0  | 8991 | 3.1556          | 0.5833   |
| 0.026         | 38.0  | 9234 | 3.1909          | 0.5815   |
| 0.026         | 39.0  | 9477 | 3.1959          | 0.5815   |
| 0.0245        | 40.0  | 9720 | 3.2017          | 0.5815   |

### Framework versions

- Transformers 4.42.4
- Pytorch 2.3.1+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1
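
To approximate this environment, the versions above can be pinned directly; the CUDA 12.1 wheel index for PyTorch is inferred from the `+cu121` build tag:

```bash
pip install transformers==4.42.4 datasets==2.20.0 tokenizers==0.19.1
pip install torch==2.3.1 --index-url https://download.pytorch.org/whl/cu121
```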