---
license: apache-2.0
base_model: google/vit-base-patch16-224-in21k
tags:
  - generated_from_trainer
datasets:
  - imagefolder
metrics:
  - accuracy
  - precision
  - recall
  - f1
model-index:
  - name: visual-emotion-recognition
    results:
      - task:
          name: Image Classification
          type: image-classification
        dataset:
          name: imagefolder
          type: imagefolder
          config: default
          split: train
          args: default
        metrics:
          - name: Accuracy
            type: accuracy
            value: 0.6375
          - name: Precision
            type: precision
            value: 0.6498416164333246
          - name: Recall
            type: recall
            value: 0.6375
          - name: F1
            type: f1
            value: 0.6340720916258936
---

# visual-emotion-recognition

This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the imagefolder dataset. It achieves the following results on the evaluation set (a usage sketch follows the list):

- Loss: 1.1334
- Accuracy: 0.6375
- Precision: 0.6498
- Recall: 0.6375
- F1: 0.6341
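
For inference, the checkpoint can be loaded with the `transformers` image-classification pipeline. This is a minimal sketch assuming the model is published on the Hub; the repo id below is a placeholder, not the confirmed path:

```python
from transformers import pipeline

# Placeholder repo id -- substitute the actual Hub path of this checkpoint.
classifier = pipeline(
    "image-classification",
    model="<user>/visual-emotion-recognition",
)

# Returns a list of {"label": ..., "score": ...} dicts, highest score first.
predictions = classifier("path/to/image.jpg")
print(predictions)
```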

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the `TrainingArguments` sketch after the list):

- learning_rate: 1e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 3
- total_train_batch_size: 48
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
- mixed_precision_training: Native AMP
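
The list above maps directly onto `transformers.TrainingArguments`. The sketch below is an illustrative reconstruction, not the original training script; `output_dir` and `evaluation_strategy` are assumptions:

```python
from transformers import TrainingArguments

# Illustrative reconstruction of the hyperparameters listed above.
# output_dir and evaluation_strategy are assumptions, not taken from the card.
training_args = TrainingArguments(
    output_dir="visual-emotion-recognition",
    learning_rate=1e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    gradient_accumulation_steps=3,  # 16 * 3 = 48 total train batch size
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=100,
    fp16=True,                      # "Native AMP" mixed-precision training
    evaluation_strategy="epoch",    # assumed: the results table logs once per epoch
)
```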

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1     |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:---------:|:------:|:------:|
| 2.0671 | 0.97 | 13 | 2.0660 | 0.125 | 0.2709 | 0.125 | 0.1135 |
| 2.0576 | 1.95 | 26 | 2.0563 | 0.1562 | 0.2932 | 0.1562 | 0.1402 |
| 2.044 | 3.0 | 40 | 2.0439 | 0.1875 | 0.2554 | 0.1875 | 0.1827 |
| 2.0209 | 3.98 | 53 | 2.0309 | 0.2062 | 0.2405 | 0.2062 | 0.1961 |
| 1.9938 | 4.95 | 66 | 2.0176 | 0.2188 | 0.2410 | 0.2188 | 0.2062 |
| 1.9894 | 6.0 | 80 | 1.9960 | 0.2625 | 0.2700 | 0.2625 | 0.2438 |
| 1.9667 | 6.97 | 93 | 1.9743 | 0.3125 | 0.3089 | 0.3125 | 0.2901 |
| 1.9158 | 7.95 | 106 | 1.9421 | 0.3063 | 0.2557 | 0.3063 | 0.2687 |
| 1.8834 | 9.0 | 120 | 1.9042 | 0.3375 | 0.4019 | 0.3375 | 0.2888 |
| 1.8461 | 9.97 | 133 | 1.8521 | 0.3625 | 0.4132 | 0.3625 | 0.3021 |
| 1.7917 | 10.95 | 146 | 1.8023 | 0.3688 | 0.4144 | 0.3688 | 0.3056 |
| 1.7685 | 12.0 | 160 | 1.7552 | 0.375 | 0.4062 | 0.375 | 0.2978 |
| 1.7072 | 12.97 | 173 | 1.7071 | 0.3875 | 0.4266 | 0.3875 | 0.3164 |
| 1.6926 | 13.95 | 186 | 1.6742 | 0.375 | 0.4056 | 0.375 | 0.2996 |
| 1.6084 | 15.0 | 200 | 1.6476 | 0.3937 | 0.4411 | 0.3937 | 0.3358 |
| 1.6264 | 15.97 | 213 | 1.6231 | 0.3812 | 0.4357 | 0.3812 | 0.3311 |
| 1.5531 | 16.95 | 226 | 1.6019 | 0.4125 | 0.4676 | 0.4125 | 0.3626 |
| 1.5804 | 18.0 | 240 | 1.5773 | 0.3937 | 0.4442 | 0.3937 | 0.3428 |
| 1.54 | 18.98 | 253 | 1.5606 | 0.4 | 0.4565 | 0.4 | 0.3527 |
| 1.5461 | 19.95 | 266 | 1.5464 | 0.4437 | 0.5084 | 0.4437 | 0.4028 |
| 1.4841 | 21.0 | 280 | 1.5323 | 0.4313 | 0.4950 | 0.4313 | 0.3881 |
| 1.4765 | 21.98 | 293 | 1.5121 | 0.4313 | 0.4884 | 0.4313 | 0.3822 |
| 1.4838 | 22.95 | 306 | 1.4978 | 0.4375 | 0.5138 | 0.4375 | 0.4012 |
| 1.4487 | 24.0 | 320 | 1.4791 | 0.4437 | 0.5059 | 0.4437 | 0.4001 |
| 1.4272 | 24.98 | 333 | 1.4617 | 0.4562 | 0.5304 | 0.4562 | 0.4180 |
| 1.3886 | 25.95 | 346 | 1.4488 | 0.4625 | 0.5418 | 0.4625 | 0.4303 |
| 1.4529 | 27.0 | 360 | 1.4436 | 0.45 | 0.5147 | 0.45 | 0.4035 |
| 1.3894 | 27.98 | 373 | 1.4267 | 0.4688 | 0.5488 | 0.4688 | 0.4355 |
| 1.3848 | 28.95 | 386 | 1.4153 | 0.4625 | 0.5337 | 0.4625 | 0.4264 |
| 1.3561 | 30.0 | 400 | 1.3993 | 0.4875 | 0.5521 | 0.4875 | 0.4554 |
| 1.3184 | 30.98 | 413 | 1.3852 | 0.4813 | 0.5526 | 0.4813 | 0.4470 |
| 1.282 | 31.95 | 426 | 1.3703 | 0.4813 | 0.5480 | 0.4813 | 0.4449 |
| 1.2988 | 33.0 | 440 | 1.3674 | 0.4688 | 0.5541 | 0.4688 | 0.4395 |
| 1.2507 | 33.98 | 453 | 1.3594 | 0.4688 | 0.5347 | 0.4688 | 0.4307 |
| 1.2446 | 34.95 | 466 | 1.3519 | 0.4813 | 0.5616 | 0.4813 | 0.4514 |
| 1.2877 | 36.0 | 480 | 1.3547 | 0.4875 | 0.5599 | 0.4875 | 0.4605 |
| 1.2237 | 36.98 | 493 | 1.3342 | 0.5 | 0.5744 | 0.5 | 0.4654 |
| 1.2416 | 37.95 | 506 | 1.3214 | 0.4813 | 0.5693 | 0.4813 | 0.4551 |
| 1.1786 | 39.0 | 520 | 1.3122 | 0.4875 | 0.5674 | 0.4875 | 0.4586 |
| 1.193 | 39.98 | 533 | 1.2989 | 0.5 | 0.5755 | 0.5 | 0.4774 |
| 1.148 | 40.95 | 546 | 1.2962 | 0.5125 | 0.5811 | 0.5125 | 0.4755 |
| 1.1904 | 42.0 | 560 | 1.2860 | 0.5188 | 0.5863 | 0.5188 | 0.4928 |
| 1.1311 | 42.98 | 573 | 1.2893 | 0.5312 | 0.5936 | 0.5312 | 0.5117 |
| 1.1396 | 43.95 | 586 | 1.2860 | 0.4938 | 0.5633 | 0.4938 | 0.4698 |
| 1.1235 | 45.0 | 600 | 1.2802 | 0.5 | 0.5725 | 0.5 | 0.4758 |
| 1.1638 | 45.98 | 613 | 1.2596 | 0.525 | 0.5909 | 0.525 | 0.5058 |
| 1.0777 | 46.95 | 626 | 1.2668 | 0.5188 | 0.5796 | 0.5188 | 0.4861 |
| 1.1136 | 48.0 | 640 | 1.2520 | 0.55 | 0.6100 | 0.55 | 0.5291 |
| 1.047 | 48.98 | 653 | 1.2437 | 0.5375 | 0.5963 | 0.5375 | 0.5279 |
| 1.1101 | 49.95 | 666 | 1.2527 | 0.55 | 0.6195 | 0.55 | 0.5279 |
| 1.0412 | 51.0 | 680 | 1.2455 | 0.525 | 0.5927 | 0.525 | 0.5156 |
| 1.041 | 51.98 | 693 | 1.2245 | 0.55 | 0.6073 | 0.55 | 0.5353 |
| 0.9906 | 52.95 | 706 | 1.2307 | 0.575 | 0.6420 | 0.575 | 0.5600 |
| 0.9863 | 54.0 | 720 | 1.2307 | 0.5563 | 0.6150 | 0.5563 | 0.5362 |
| 0.943 | 54.98 | 733 | 1.2270 | 0.55 | 0.6152 | 0.55 | 0.5302 |
| 0.9557 | 55.95 | 746 | 1.2063 | 0.5312 | 0.5964 | 0.5312 | 0.5239 |
| 0.9518 | 57.0 | 760 | 1.2122 | 0.55 | 0.6232 | 0.55 | 0.5433 |
| 0.9545 | 57.98 | 773 | 1.1955 | 0.575 | 0.6144 | 0.575 | 0.5563 |
| 0.9195 | 58.95 | 786 | 1.2139 | 0.5563 | 0.6052 | 0.5563 | 0.5459 |
| 0.9267 | 60.0 | 800 | 1.1907 | 0.5687 | 0.6052 | 0.5687 | 0.5595 |
| 0.9384 | 60.98 | 813 | 1.1899 | 0.575 | 0.6449 | 0.575 | 0.5650 |
| 0.8727 | 61.95 | 826 | 1.1854 | 0.5813 | 0.6312 | 0.5813 | 0.5651 |
| 0.8541 | 63.0 | 840 | 1.1957 | 0.575 | 0.6407 | 0.575 | 0.5632 |
| 0.8899 | 63.98 | 853 | 1.1604 | 0.575 | 0.6196 | 0.575 | 0.5694 |
| 0.9036 | 64.95 | 866 | 1.1859 | 0.5563 | 0.6310 | 0.5563 | 0.5306 |
| 0.8177 | 66.0 | 880 | 1.1498 | 0.6125 | 0.6316 | 0.6125 | 0.6116 |
| 0.7854 | 66.97 | 893 | 1.1842 | 0.5687 | 0.6142 | 0.5687 | 0.5582 |
| 0.8054 | 67.95 | 906 | 1.1695 | 0.5938 | 0.6275 | 0.5938 | 0.5830 |
| 0.8582 | 69.0 | 920 | 1.1882 | 0.5687 | 0.6057 | 0.5687 | 0.5495 |
| 0.7603 | 69.97 | 933 | 1.2067 | 0.55 | 0.6025 | 0.55 | 0.5348 |
| 0.763 | 70.95 | 946 | 1.1690 | 0.5625 | 0.6036 | 0.5625 | 0.5439 |
| 0.8261 | 72.0 | 960 | 1.1616 | 0.6062 | 0.6306 | 0.6062 | 0.6016 |
| 0.884 | 72.97 | 973 | 1.1952 | 0.5625 | 0.6082 | 0.5625 | 0.5436 |
| 0.7843 | 73.95 | 986 | 1.1583 | 0.5687 | 0.5953 | 0.5687 | 0.5633 |
| 0.801 | 75.0 | 1000 | 1.1547 | 0.575 | 0.6013 | 0.575 | 0.5745 |
| 0.7454 | 75.97 | 1013 | 1.1372 | 0.5875 | 0.6193 | 0.5875 | 0.5761 |
| 0.7325 | 76.95 | 1026 | 1.1696 | 0.5938 | 0.6351 | 0.5938 | 0.5919 |
| 0.7931 | 78.0 | 1040 | 1.1511 | 0.6062 | 0.6342 | 0.6062 | 0.6053 |
| 0.7487 | 78.97 | 1053 | 1.1655 | 0.5625 | 0.5898 | 0.5625 | 0.5496 |
| 0.7262 | 79.95 | 1066 | 1.1394 | 0.6125 | 0.6295 | 0.6125 | 0.6048 |
| 0.7669 | 81.0 | 1080 | 1.1748 | 0.575 | 0.5966 | 0.575 | 0.5697 |
| 0.7028 | 81.97 | 1093 | 1.1418 | 0.5875 | 0.6178 | 0.5875 | 0.5885 |
| 0.7749 | 82.95 | 1106 | 1.1736 | 0.55 | 0.5446 | 0.55 | 0.5255 |
| 0.7233 | 84.0 | 1120 | 1.1645 | 0.5813 | 0.5973 | 0.5813 | 0.5699 |
| 0.5915 | 84.97 | 1133 | 1.1376 | 0.5875 | 0.6167 | 0.5875 | 0.5867 |
| 0.6985 | 85.95 | 1146 | 1.1665 | 0.5687 | 0.5868 | 0.5687 | 0.5533 |
| 0.6572 | 87.0 | 1160 | 1.1341 | 0.6 | 0.6245 | 0.6 | 0.5963 |
| 0.6317 | 87.97 | 1173 | 1.1327 | 0.6125 | 0.6288 | 0.6125 | 0.6026 |
| 0.6546 | 88.95 | 1186 | 1.1668 | 0.5687 | 0.5797 | 0.5687 | 0.5528 |
| 0.5801 | 90.0 | 1200 | 1.1521 | 0.5875 | 0.6161 | 0.5875 | 0.5818 |
| 0.6958 | 90.97 | 1213 | 1.1401 | 0.5875 | 0.6083 | 0.5875 | 0.5774 |
| 0.5856 | 91.95 | 1226 | 1.1379 | 0.5875 | 0.5888 | 0.5875 | 0.5760 |
| 0.6281 | 93.0 | 1240 | 1.1379 | 0.6125 | 0.6429 | 0.6125 | 0.6123 |
| 0.6518 | 93.97 | 1253 | 1.1619 | 0.6312 | 0.6547 | 0.6312 | 0.6247 |
| 0.6055 | 94.95 | 1266 | 1.1700 | 0.575 | 0.5962 | 0.575 | 0.5673 |
| 0.6181 | 96.0 | 1280 | 1.1550 | 0.5938 | 0.6281 | 0.5938 | 0.5970 |
| 0.6601 | 96.97 | 1293 | 1.1334 | 0.6375 | 0.6498 | 0.6375 | 0.6341 |
| 0.6112 | 97.5 | 1300 | 1.1007 | 0.6188 | 0.6341 | 0.6188 | 0.6207 |
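
In the table, the Accuracy and Recall columns are identical at every step, which is consistent with weighted-average recall (weighted recall equals plain accuracy). A `compute_metrics` hook under that assumption might look like the sketch below; the original script may have computed the metrics differently:

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    # Assumed reconstruction: weighted-average precision/recall/F1 plus accuracy.
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="weighted", zero_division=0
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "precision": precision,
        "recall": recall,
        "f1": f1,
    }
```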

### Framework versions

- Transformers 4.35.2
- PyTorch 2.1.0+cu121
- Datasets 2.17.0
- Tokenizers 0.15.1
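
To recreate a compatible environment, these versions can be pinned at install time (standard PyPI package names assumed; the CUDA 12.1 build of torch may require the PyTorch wheel index):

```bash
pip install transformers==4.35.2 torch==2.1.0 datasets==2.17.0 tokenizers==0.15.1
```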