swin-transformer3

This model is a fine-tuned version of microsoft/swin-large-patch4-window12-384 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 4.1081
  • Accuracy: 0.5667
  • F1: 0.5667
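
For reference, a minimal inference sketch (assuming the checkpoint is published as masafresh/swin-transformer3 with a standard image-classification head; the input file name is a placeholder):

```python
# Minimal inference sketch; "example.jpg" is a placeholder path or URL.
from transformers import pipeline

classifier = pipeline("image-classification", model="masafresh/swin-transformer3")
print(classifier("example.jpg"))  # list of {"label": ..., "score": ...} dicts
```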

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 4
  • eval_batch_size: 4
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 16
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 100
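
These settings map directly onto transformers' TrainingArguments; a hedged sketch follows (output_dir is a placeholder, and the listed Adam betas/epsilon are the library defaults, so they are not set explicitly):

```python
# Sketch of TrainingArguments mirroring the hyperparameters listed above.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="swin-transformer3",    # placeholder
    learning_rate=5e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    gradient_accumulation_steps=4,     # effective train batch size: 4 * 4 = 16
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=100,
    # Adam betas=(0.9, 0.999) and epsilon=1e-8 are the defaults.
)
```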

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|
| 1.6659 | 0.9925 | 33 | 1.0639 | 0.6333 | 0.6250 |
| 0.7561 | 1.9850 | 66 | 0.7258 | 0.5167 | 0.3520 |
| 0.7106 | 2.9774 | 99 | 0.7334 | 0.5 | 0.3755 |
| 0.6749 | 4.0 | 133 | 0.7088 | 0.4833 | 0.3661 |
| 0.751 | 4.9925 | 166 | 0.7356 | 0.4833 | 0.3661 |
| 0.7146 | 5.9850 | 199 | 0.7837 | 0.4833 | 0.3150 |
| 0.6699 | 6.9774 | 232 | 0.7569 | 0.4833 | 0.3424 |
| 0.6521 | 8.0 | 266 | 0.7255 | 0.5333 | 0.4674 |
| 0.6885 | 8.9925 | 299 | 0.7253 | 0.5167 | 0.4070 |
| 0.6407 | 9.9850 | 332 | 0.6506 | 0.6 | 0.5909 |
| 0.6436 | 10.9774 | 365 | 0.6720 | 0.55 | 0.4442 |
| 0.7865 | 12.0 | 399 | 0.6606 | 0.55 | 0.4792 |
| 0.7191 | 12.9925 | 432 | 0.6407 | 0.65 | 0.6466 |
| 0.5889 | 13.9850 | 465 | 0.8008 | 0.4833 | 0.3619 |
| 0.5489 | 14.9774 | 498 | 0.7298 | 0.5333 | 0.4674 |
| 0.596 | 16.0 | 532 | 0.7465 | 0.6667 | 0.6591 |
| 0.6136 | 16.9925 | 565 | 0.9118 | 0.5333 | 0.4692 |
| 0.5961 | 17.9850 | 598 | 0.6902 | 0.65 | 0.6298 |
| 0.6327 | 18.9774 | 631 | 0.8260 | 0.5667 | 0.5190 |
| 0.6518 | 20.0 | 665 | 0.6919 | 0.5833 | 0.5715 |
| 0.5551 | 20.9925 | 698 | 1.1780 | 0.55 | 0.516 |
| 0.511 | 21.9850 | 731 | 0.7414 | 0.6 | 0.6 |
| 0.4749 | 22.9774 | 764 | 0.7978 | 0.6167 | 0.6129 |
| 0.4607 | 24.0 | 798 | 0.8087 | 0.55 | 0.5420 |
| 0.5837 | 24.9925 | 831 | 0.8271 | 0.5667 | 0.5456 |
| 0.4608 | 25.9850 | 864 | 0.8539 | 0.6 | 0.5863 |
| 0.536 | 26.9774 | 897 | 0.9802 | 0.5333 | 0.5026 |
| 0.4225 | 28.0 | 931 | 0.9275 | 0.6 | 0.5910 |
| 0.4325 | 28.9925 | 964 | 0.8834 | 0.6167 | 0.6099 |
| 0.4874 | 29.9850 | 997 | 0.8721 | 0.6167 | 0.6168 |
| 0.4165 | 30.9774 | 1030 | 1.0360 | 0.6167 | 0.6163 |
| 0.4773 | 32.0 | 1064 | 1.2210 | 0.5833 | 0.5759 |
| 0.3756 | 32.9925 | 1097 | 1.1291 | 0.5833 | 0.5830 |
| 0.636 | 33.9850 | 1130 | 1.0178 | 0.5833 | 0.5830 |
| 0.5474 | 34.9774 | 1163 | 0.9479 | 0.5667 | 0.5608 |
| 0.3462 | 36.0 | 1197 | 0.9585 | 0.6167 | 0.6163 |
| 0.3057 | 36.9925 | 1230 | 1.2014 | 0.6167 | 0.6163 |
| 0.2304 | 37.9850 | 1263 | 1.1975 | 0.6333 | 0.6333 |
| 0.2628 | 38.9774 | 1296 | 1.5224 | 0.5833 | 0.5793 |
| 0.3774 | 40.0 | 1330 | 1.2903 | 0.5667 | 0.5516 |
| 0.2604 | 40.9925 | 1363 | 1.4082 | 0.5667 | 0.5608 |
| 0.2522 | 41.9850 | 1396 | 1.1783 | 0.6167 | 0.6163 |
| 0.1925 | 42.9774 | 1429 | 1.3613 | 0.6167 | 0.6163 |
| 0.3436 | 44.0 | 1463 | 1.6383 | 0.5333 | 0.5173 |
| 0.1955 | 44.9925 | 1496 | 1.8947 | 0.5 | 0.4829 |
| 0.2206 | 45.9850 | 1529 | 1.4390 | 0.6 | 0.6 |
| 0.1912 | 46.9774 | 1562 | 1.5288 | 0.65 | 0.6400 |
| 0.2794 | 48.0 | 1596 | 1.7393 | 0.55 | 0.5420 |
| 0.3166 | 48.9925 | 1629 | 2.0414 | 0.5667 | 0.5608 |
| 0.173 | 49.9850 | 1662 | 1.6377 | 0.6 | 0.5991 |
| 0.1375 | 50.9774 | 1695 | 1.6228 | 0.6 | 0.6 |
| 0.2659 | 52.0 | 1729 | 1.6452 | 0.6333 | 0.6333 |
| 0.2045 | 52.9925 | 1762 | 1.9706 | 0.5667 | 0.5608 |
| 0.1081 | 53.9850 | 1795 | 1.9546 | 0.6167 | 0.6009 |
| 0.1782 | 54.9774 | 1828 | 2.1268 | 0.5667 | 0.5608 |
| 0.244 | 56.0 | 1862 | 1.8301 | 0.6167 | 0.6098 |
| 0.1783 | 56.9925 | 1895 | 2.5808 | 0.5667 | 0.5071 |
| 0.2429 | 57.9850 | 1928 | 2.1214 | 0.6167 | 0.6059 |
| 0.2 | 58.9774 | 1961 | 2.2282 | 0.5667 | 0.5657 |
| 0.1646 | 60.0 | 1995 | 2.3272 | 0.5833 | 0.5662 |
| 0.1663 | 60.9925 | 2028 | 2.4723 | 0.5333 | 0.5323 |
| 0.1935 | 61.9850 | 2061 | 2.3384 | 0.6 | 0.5973 |
| 0.2079 | 62.9774 | 2094 | 1.9271 | 0.5833 | 0.5830 |
| 0.1797 | 64.0 | 2128 | 1.8707 | 0.6167 | 0.6151 |
| 0.173 | 64.9925 | 2161 | 2.6292 | 0.5167 | 0.5031 |
| 0.1815 | 65.9850 | 2194 | 2.6567 | 0.6 | 0.5973 |
| 0.0665 | 66.9774 | 2227 | 3.2104 | 0.5167 | 0.5031 |
| 0.1084 | 68.0 | 2261 | 3.6692 | 0.5333 | 0.5228 |
| 0.1298 | 68.9925 | 2294 | 3.4104 | 0.55 | 0.5373 |
| 0.1338 | 69.9850 | 2327 | 2.8215 | 0.6 | 0.5973 |
| 0.0795 | 70.9774 | 2360 | 2.9208 | 0.5833 | 0.5830 |
| 0.1138 | 72.0 | 2394 | 3.4277 | 0.5333 | 0.5302 |
| 0.1644 | 72.9925 | 2427 | 2.8141 | 0.5833 | 0.5830 |
| 0.1659 | 73.9850 | 2460 | 2.8723 | 0.6 | 0.6 |
| 0.0453 | 74.9774 | 2493 | 2.8769 | 0.6333 | 0.6309 |
| 0.0956 | 76.0 | 2527 | 3.2970 | 0.6167 | 0.6098 |
| 0.1581 | 76.9925 | 2560 | 3.6672 | 0.5833 | 0.5816 |
| 0.157 | 77.9850 | 2593 | 3.5317 | 0.55 | 0.5501 |
| 0.0662 | 78.9774 | 2626 | 3.9003 | 0.55 | 0.5456 |
| 0.1954 | 80.0 | 2660 | 3.3000 | 0.5833 | 0.5834 |
| 0.0527 | 80.9925 | 2693 | 3.9596 | 0.5667 | 0.5638 |
| 0.1578 | 81.9850 | 2726 | 3.6724 | 0.55 | 0.5481 |
| 0.0737 | 82.9774 | 2759 | 4.0222 | 0.5167 | 0.5119 |
| 0.0617 | 84.0 | 2793 | 3.5510 | 0.5833 | 0.5834 |
| 0.0531 | 84.9925 | 2826 | 3.5110 | 0.6 | 0.6 |
| 0.0993 | 85.9850 | 2859 | 4.0699 | 0.55 | 0.5481 |
| 0.1545 | 86.9774 | 2892 | 3.6860 | 0.5667 | 0.5667 |
| 0.0554 | 88.0 | 2926 | 3.4409 | 0.6 | 0.6 |
| 0.0641 | 88.9925 | 2959 | 3.8304 | 0.55 | 0.5496 |
| 0.0633 | 89.9850 | 2992 | 4.0899 | 0.55 | 0.5456 |
| 0.0991 | 90.9774 | 3025 | 3.7344 | 0.6 | 0.6 |
| 0.0772 | 92.0 | 3059 | 3.8448 | 0.6 | 0.5991 |
| 0.0646 | 92.9925 | 3092 | 3.7794 | 0.6 | 0.5991 |
| 0.0562 | 93.9850 | 3125 | 3.9340 | 0.5833 | 0.5830 |
| 0.0475 | 94.9774 | 3158 | 4.2388 | 0.55 | 0.5481 |
| 0.0715 | 96.0 | 3192 | 4.2732 | 0.5333 | 0.5302 |
| 0.0875 | 96.9925 | 3225 | 4.1521 | 0.5667 | 0.5657 |
| 0.0253 | 97.9850 | 3258 | 4.0813 | 0.5667 | 0.5667 |
| 0.1037 | 98.9774 | 3291 | 4.1074 | 0.5667 | 0.5667 |
| 0.1094 | 99.2481 | 3300 | 4.1081 | 0.5667 | 0.5667 |
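
The Accuracy and F1 columns suggest a multi-class evaluation with an averaged F1. A hypothetical compute_metrics function in the usual Trainer style is sketched below; the averaging mode (weighted here) is an assumption, as the original metric configuration is not documented:

```python
# Hypothetical compute_metrics sketch; the F1 averaging mode is an assumption.
import numpy as np
import evaluate

accuracy = evaluate.load("accuracy")
f1 = evaluate.load("f1")

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    return {
        "accuracy": accuracy.compute(predictions=preds, references=labels)["accuracy"],
        "f1": f1.compute(predictions=preds, references=labels, average="weighted")["f1"],
    }
```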

Framework versions

  • Transformers 4.45.2
  • Pytorch 2.4.0+cu121
  • Datasets 3.1.0
  • Tokenizers 0.20.1
