---
license: apache-2.0
base_model: microsoft/resnet-50
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: resnet101-base_tobacco-cnn_tobacco3482_kd_CEKD_t5.0_a0.9
  results: []
---
|
|
|
|
|
|
# resnet101-base_tobacco-cnn_tobacco3482_kd_CEKD_t5.0_a0.9 |
|
|
|
This model is a fine-tuned version of [microsoft/resnet-50](https://huggingface.co/microsoft/resnet-50) on the Tobacco3482 dataset (inferred from the model name; the Trainer recorded no dataset).
|
It achieves the following results on the evaluation set (the calibration metrics are sketched in code after this list):
- Loss: 0.8809
- Accuracy: 0.7
- Brier loss: 0.4126
- NLL (negative log-likelihood): 2.4279
- F1 micro: 0.7
- F1 macro: 0.6279
- ECE (expected calibration error): 0.2569
- AURC (area under the risk-coverage curve): 0.1111
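
For reference, a minimal sketch of the two calibration metrics above, assuming the standard definitions of the multi-class Brier score and ECE; this is illustrative, not the exact evaluation code the Trainer used:

```python
import numpy as np

def brier_score(probs, labels):
    """Multi-class Brier score: mean squared error between the predicted
    probability vectors and the one-hot label vectors."""
    onehot = np.eye(probs.shape[1])[labels]
    return float(np.mean(np.sum((probs - onehot) ** 2, axis=1)))

def expected_calibration_error(probs, labels, n_bins=10):
    """ECE: |accuracy - mean confidence| per equal-width confidence bin,
    weighted by the fraction of samples in each bin."""
    conf = probs.max(axis=1)      # top-1 confidence
    pred = probs.argmax(axis=1)   # top-1 prediction
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (conf > lo) & (conf <= hi)
        if mask.any():
            acc = (pred[mask] == labels[mask]).mean()
            ece += mask.mean() * abs(acc - conf[mask].mean())
    return float(ece)
```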
|
|
|
## Model description |
|
|
|
Judging by the model name, this is a ResNet-50 student fine-tuned on Tobacco3482 via knowledge distillation from a ResNet-101-based teacher, combining cross-entropy with a KD term (CEKD) at temperature t = 5.0 and distillation weight a = 0.9. Details beyond the name were not recorded by the Trainer.
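
A minimal sketch of what a CEKD objective with these settings typically looks like, assuming the standard Hinton-style softened-softmax distillation loss; the actual training code was not recorded:

```python
import torch.nn.functional as F

def cekd_loss(student_logits, teacher_logits, labels, T=5.0, a=0.9):
    # Soft-target term: KL divergence between the temperature-softened
    # teacher and student distributions, rescaled by T^2 (Hinton et al., 2015).
    kd = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    # Hard-target term: plain cross-entropy against the ground-truth labels.
    ce = F.cross_entropy(student_logits, labels)
    # The name suggests a = 0.9 weights the distillation term.
    return a * kd + (1.0 - a) * ce
```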
|
|
|
## Intended uses & limitations |
|
|
|
The model targets document-image classification on Tobacco3482-style data. Given the evaluation results above (0.7 accuracy, ECE 0.2569), its confidence scores are noticeably miscalibrated, so consider recalibration (e.g., temperature scaling) before relying on predicted probabilities. No further guidance was recorded by the Trainer.
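
A minimal inference sketch using the standard `transformers` image-classification API; the hub id below is a placeholder for wherever this checkpoint is hosted:

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Placeholder id: replace <user> with the namespace hosting this checkpoint.
repo = "<user>/resnet101-base_tobacco-cnn_tobacco3482_kd_CEKD_t5.0_a0.9"
processor = AutoImageProcessor.from_pretrained(repo)
model = AutoModelForImageClassification.from_pretrained(repo)

image = Image.open("document.png").convert("RGB")  # any document image
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(model.config.id2label[logits.argmax(-1).item()])
```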
|
|
|
## Training and evaluation data |
|
|
|
Not recorded by the Trainer. The model name points to Tobacco3482, a document-image classification benchmark of 3,482 images in 10 classes. The logs above are consistent with a small split: 13 optimizer steps per epoch at batch size 64 imply roughly 800 training images, and accuracy values in increments of 0.005 imply a 200-image evaluation set.
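
A hedged sketch of loading such data with `datasets` (the local path and split size are assumptions; there is no single canonical hub id for Tobacco3482):

```python
from datasets import load_dataset

# Assumed local layout: tobacco3482/<class_name>/<image>.jpg
ds = load_dataset("imagefolder", data_dir="tobacco3482")["train"]
# 200-image evaluation set is inferred from the metrics, not documented.
splits = ds.train_test_split(test_size=200, seed=42)
print(splits["train"].num_rows, splits["test"].num_rows)
```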
|
|
|
## Training procedure |
|
|
|
### Training hyperparameters |
|
|
|
The following hyperparameters were used during training (mirrored in the sketch after this list):
- learning_rate: 0.0001
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 50
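
These settings map directly onto `transformers.TrainingArguments`; a hedged reconstruction, where the output directory and evaluation strategy are assumptions and everything else mirrors the list above:

```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="resnet101-base_tobacco-cnn_tobacco3482_kd_CEKD_t5.0_a0.9",  # assumed
    learning_rate=1e-4,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50,
    evaluation_strategy="epoch",  # assumed; the table below logs metrics per epoch
)
```

Note that `TrainingArguments` does not cover the distillation-specific settings (teacher model, T, a); those would live in a custom loss such as the one sketched under Model description.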
|
|
|
### Training results |
|
|
|
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | NLL | F1 Micro | F1 Macro | ECE | AURC |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 13 | 2.1185 | 0.165 | 0.8967 | 8.5399 | 0.165 | 0.1130 | 0.2151 | 0.8331 |
| No log | 2.0 | 26 | 2.1127 | 0.13 | 0.8958 | 8.1152 | 0.13 | 0.0842 | 0.1816 | 0.8392 |
| No log | 3.0 | 39 | 2.0781 | 0.165 | 0.8888 | 6.8828 | 0.165 | 0.0878 | 0.2150 | 0.8082 |
| No log | 4.0 | 52 | 2.0197 | 0.22 | 0.8762 | 5.7578 | 0.22 | 0.1155 | 0.2521 | 0.7521 |
| No log | 5.0 | 65 | 1.9499 | 0.205 | 0.8601 | 6.0641 | 0.205 | 0.0951 | 0.2567 | 0.7355 |
| No log | 6.0 | 78 | 1.9019 | 0.25 | 0.8483 | 5.8930 | 0.25 | 0.1178 | 0.2728 | 0.6862 |
| No log | 7.0 | 91 | 1.8252 | 0.28 | 0.8301 | 5.8062 | 0.28 | 0.1660 | 0.2890 | 0.6982 |
| No log | 8.0 | 104 | 1.8194 | 0.28 | 0.8275 | 5.2642 | 0.28 | 0.1625 | 0.2874 | 0.6935 |
| No log | 9.0 | 117 | 1.7671 | 0.355 | 0.8109 | 5.1326 | 0.3550 | 0.2211 | 0.3018 | 0.5678 |
| No log | 10.0 | 130 | 1.6582 | 0.355 | 0.7774 | 5.2226 | 0.3550 | 0.2200 | 0.2991 | 0.5305 |
| No log | 11.0 | 143 | 1.5849 | 0.395 | 0.7422 | 5.0239 | 0.395 | 0.2436 | 0.2979 | 0.3974 |
| No log | 12.0 | 156 | 1.4908 | 0.46 | 0.7001 | 4.2790 | 0.46 | 0.3169 | 0.3091 | 0.3003 |
| No log | 13.0 | 169 | 1.6016 | 0.395 | 0.7496 | 4.2149 | 0.395 | 0.2793 | 0.2929 | 0.4640 |
| No log | 14.0 | 182 | 1.4714 | 0.475 | 0.6971 | 4.0742 | 0.4750 | 0.3299 | 0.3177 | 0.3613 |
| No log | 15.0 | 195 | 1.5007 | 0.46 | 0.7119 | 3.8252 | 0.46 | 0.3145 | 0.3111 | 0.3954 |
| No log | 16.0 | 208 | 1.4352 | 0.515 | 0.6776 | 3.4028 | 0.515 | 0.3948 | 0.3376 | 0.2993 |
| No log | 17.0 | 221 | 1.2890 | 0.575 | 0.6104 | 3.4453 | 0.575 | 0.4478 | 0.2940 | 0.2119 |
| No log | 18.0 | 234 | 1.2190 | 0.595 | 0.5719 | 3.2413 | 0.595 | 0.4662 | 0.2608 | 0.1981 |
| No log | 19.0 | 247 | 1.2287 | 0.59 | 0.5764 | 3.2303 | 0.59 | 0.4857 | 0.2811 | 0.2020 |
| No log | 20.0 | 260 | 1.1726 | 0.64 | 0.5494 | 2.9544 | 0.64 | 0.5307 | 0.2993 | 0.1708 |
| No log | 21.0 | 273 | 1.1305 | 0.61 | 0.5384 | 2.9557 | 0.61 | 0.5170 | 0.2771 | 0.1949 |
| No log | 22.0 | 286 | 1.1256 | 0.645 | 0.5295 | 2.7934 | 0.645 | 0.5381 | 0.3181 | 0.1629 |
| No log | 23.0 | 299 | 1.1209 | 0.645 | 0.5217 | 2.8697 | 0.645 | 0.5432 | 0.3055 | 0.1687 |
| No log | 24.0 | 312 | 1.2513 | 0.685 | 0.5917 | 2.7262 | 0.685 | 0.5639 | 0.3779 | 0.1833 |
| No log | 25.0 | 325 | 1.0321 | 0.695 | 0.4819 | 2.7202 | 0.695 | 0.5896 | 0.2810 | 0.1280 |
| No log | 26.0 | 338 | 1.0405 | 0.645 | 0.4957 | 2.6116 | 0.645 | 0.5661 | 0.2515 | 0.1700 |
| No log | 27.0 | 351 | 1.0580 | 0.695 | 0.4933 | 2.7436 | 0.695 | 0.5996 | 0.2967 | 0.1339 |
| No log | 28.0 | 364 | 0.9740 | 0.65 | 0.4575 | 2.5682 | 0.65 | 0.5731 | 0.2513 | 0.1384 |
| No log | 29.0 | 377 | 0.9934 | 0.695 | 0.4651 | 2.5753 | 0.695 | 0.6108 | 0.2775 | 0.1171 |
| No log | 30.0 | 390 | 0.9900 | 0.645 | 0.4695 | 2.6280 | 0.645 | 0.5668 | 0.2459 | 0.1558 |
| No log | 31.0 | 403 | 0.9671 | 0.695 | 0.4504 | 2.8174 | 0.695 | 0.6094 | 0.2505 | 0.1188 |
| No log | 32.0 | 416 | 0.9327 | 0.715 | 0.4324 | 2.5285 | 0.715 | 0.6415 | 0.2565 | 0.1086 |
| No log | 33.0 | 429 | 0.9628 | 0.71 | 0.4464 | 2.5876 | 0.7100 | 0.6435 | 0.2709 | 0.1152 |
| No log | 34.0 | 442 | 0.9316 | 0.715 | 0.4353 | 2.7111 | 0.715 | 0.6334 | 0.2361 | 0.1078 |
| No log | 35.0 | 455 | 0.9275 | 0.7 | 0.4364 | 2.5226 | 0.7 | 0.6251 | 0.2586 | 0.1207 |
| No log | 36.0 | 468 | 0.9301 | 0.7 | 0.4346 | 2.6464 | 0.7 | 0.6232 | 0.2482 | 0.1142 |
| No log | 37.0 | 481 | 0.9013 | 0.695 | 0.4194 | 2.5575 | 0.695 | 0.6197 | 0.2554 | 0.1098 |
| No log | 38.0 | 494 | 0.9008 | 0.695 | 0.4196 | 2.6270 | 0.695 | 0.6156 | 0.2246 | 0.1063 |
| 1.0903 | 39.0 | 507 | 0.9185 | 0.71 | 0.4311 | 2.6290 | 0.7100 | 0.6362 | 0.2626 | 0.1165 |
| 1.0903 | 40.0 | 520 | 0.9053 | 0.685 | 0.4254 | 2.5057 | 0.685 | 0.6239 | 0.2210 | 0.1171 |
| 1.0903 | 41.0 | 533 | 0.8955 | 0.7 | 0.4189 | 2.4823 | 0.7 | 0.6291 | 0.1995 | 0.1103 |
| 1.0903 | 42.0 | 546 | 0.9012 | 0.69 | 0.4223 | 2.5377 | 0.69 | 0.6195 | 0.2486 | 0.1119 |
| 1.0903 | 43.0 | 559 | 0.8894 | 0.71 | 0.4138 | 2.6167 | 0.7100 | 0.6382 | 0.2459 | 0.1022 |
| 1.0903 | 44.0 | 572 | 0.8846 | 0.695 | 0.4132 | 2.5130 | 0.695 | 0.6265 | 0.2198 | 0.1093 |
| 1.0903 | 45.0 | 585 | 0.8946 | 0.69 | 0.4190 | 2.6357 | 0.69 | 0.6230 | 0.2375 | 0.1145 |
| 1.0903 | 46.0 | 598 | 0.8931 | 0.705 | 0.4168 | 2.6306 | 0.705 | 0.6342 | 0.2555 | 0.1102 |
| 1.0903 | 47.0 | 611 | 0.8842 | 0.71 | 0.4160 | 2.3021 | 0.7100 | 0.6347 | 0.2096 | 0.1120 |
| 1.0903 | 48.0 | 624 | 0.8805 | 0.695 | 0.4140 | 2.3447 | 0.695 | 0.6237 | 0.2181 | 0.1128 |
| 1.0903 | 49.0 | 637 | 0.8816 | 0.7 | 0.4142 | 2.4358 | 0.7 | 0.6295 | 0.2550 | 0.1112 |
| 1.0903 | 50.0 | 650 | 0.8809 | 0.7 | 0.4126 | 2.4279 | 0.7 | 0.6279 | 0.2569 | 0.1111 |

`No log` means no running training loss had been reported yet; the first value appears once logging starts at the Trainer's default interval of 500 steps.
|
|
|
|
|
### Framework versions |
|
|
|
- Transformers 4.33.3
- Pytorch 2.2.0.dev20231002
- Datasets 2.7.1
- Tokenizers 0.13.3
|
|