---
license: apache-2.0
base_model: google/vit-base-patch16-224-in21k
tags:
  - generated_from_trainer
datasets:
  - imagefolder
metrics:
  - accuracy
  - precision
  - recall
  - f1
model-index:
  - name: visual-emotion-recognition
    results:
      - task:
          name: Image Classification
          type: image-classification
        dataset:
          name: imagefolder
          type: imagefolder
          config: default
          split: train
          args: default
        metrics:
          - name: Accuracy
            type: accuracy
            value: 0.6171875
          - name: Precision
            type: precision
            value: 0.6123019520308124
          - name: Recall
            type: recall
            value: 0.6171875
          - name: F1
            type: f1
            value: 0.6099565615619817
---

# visual-emotion-recognition

This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the imagefolder dataset. It achieves the following results on the evaluation set:

- Loss: 1.2563
- Accuracy: 0.6172
- Precision: 0.6123
- Recall: 0.6172
- F1: 0.6100
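
A minimal inference sketch using the Transformers `pipeline` API is shown below. The checkpoint identifier and image path are placeholders, since this card does not state the repository path, and the emotion label names depend on the (undocumented) training data.

```python
from transformers import pipeline

# Placeholder checkpoint id: substitute this model's actual repository path.
classifier = pipeline("image-classification", model="<user>/visual-emotion-recognition")

# Classify a local image (the file name is a placeholder).
for pred in classifier("example.jpg"):
    print(f"{pred['label']}: {pred['score']:.4f}")
```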

## Model description

This is an image-classification model for visual emotion recognition, obtained by fine-tuning `google/vit-base-patch16-224-in21k`, a Vision Transformer (ViT-Base, 16x16 patches, 224x224 input) pre-trained on ImageNet-21k, on an `imagefolder`-formatted emotion dataset.
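
As a rough illustration of how such a fine-tune is set up, the base checkpoint can be loaded with a fresh classification head as sketched below; `num_labels` is an assumption, since the card does not state how many emotion classes were used.

```python
from transformers import AutoImageProcessor, AutoModelForImageClassification

checkpoint = "google/vit-base-patch16-224-in21k"  # base model named on this card
processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForImageClassification.from_pretrained(
    checkpoint,
    num_labels=8,  # assumption: replace with the actual number of emotion classes
)
```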

## Intended uses & limitations

The model is intended for classifying the emotion conveyed by an image. Given its evaluation accuracy of roughly 62%, it is best treated as a baseline for experimentation or further fine-tuning rather than a production-ready classifier; the emotion label set and the conditions under which the training images were collected are not documented here.

## Training and evaluation data

The model was trained and evaluated on a local `imagefolder` dataset. The card does not document the dataset's contents, emotion classes, or split sizes.
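
For context, `imagefolder` is the Datasets loader that builds a labeled dataset from class-named subdirectories; a minimal loading sketch with a placeholder path:

```python
from datasets import load_dataset

# Each subdirectory of data_dir (e.g. happy/, sad/, ...) becomes one class label.
# The path below is a placeholder, not the card's actual data location.
dataset = load_dataset("imagefolder", data_dir="path/to/emotion_images")
print(dataset["train"].features["label"].names)  # discovered class names
```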

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):

- learning_rate: 1e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 3
- total_train_batch_size: 96
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
- mixed_precision_training: Native AMP
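
Expressed as Transformers `TrainingArguments` (argument names as of Transformers 4.35; `output_dir` is a placeholder), these settings correspond roughly to:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="visual-emotion-recognition",  # placeholder
    learning_rate=1e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    gradient_accumulation_steps=3,  # effective train batch size: 32 * 3 = 96
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=100,
    fp16=True,  # "Native AMP" mixed-precision training
)
```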

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1     |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:---------:|:------:|:------:|
| 2.0811 | 0.94 | 5 | 2.0911 | 0.0859 | 0.0534 | 0.0859 | 0.0658 |
| 2.0668 | 1.88 | 10 | 2.0830 | 0.1016 | 0.0654 | 0.1016 | 0.0758 |
| 2.057 | 3.0 | 16 | 2.0733 | 0.1328 | 0.1119 | 0.1328 | 0.1066 |
| 2.0445 | 3.94 | 21 | 2.0643 | 0.1328 | 0.0965 | 0.1328 | 0.1000 |
| 2.0198 | 4.88 | 26 | 2.0537 | 0.1797 | 0.1911 | 0.1797 | 0.1604 |
| 2.008 | 6.0 | 32 | 2.0387 | 0.1797 | 0.1669 | 0.1797 | 0.1513 |
| 1.9937 | 6.94 | 37 | 2.0241 | 0.1875 | 0.1773 | 0.1875 | 0.1595 |
| 1.9711 | 7.88 | 42 | 2.0078 | 0.2031 | 0.1939 | 0.2031 | 0.1737 |
| 1.9468 | 9.0 | 48 | 1.9872 | 0.2578 | 0.2619 | 0.2578 | 0.2231 |
| 1.9184 | 9.94 | 53 | 1.9663 | 0.2969 | 0.3203 | 0.2969 | 0.2609 |
| 1.9042 | 10.88 | 58 | 1.9428 | 0.3047 | 0.3410 | 0.3047 | 0.2711 |
| 1.8673 | 12.0 | 64 | 1.9127 | 0.3047 | 0.3731 | 0.3047 | 0.2730 |
| 1.8449 | 12.94 | 69 | 1.8858 | 0.3203 | 0.4648 | 0.3203 | 0.2835 |
| 1.8019 | 13.88 | 74 | 1.8572 | 0.3203 | 0.4856 | 0.3203 | 0.2924 |
| 1.7438 | 15.0 | 80 | 1.8182 | 0.3203 | 0.4643 | 0.3203 | 0.3016 |
| 1.7037 | 15.94 | 85 | 1.7909 | 0.3438 | 0.4862 | 0.3438 | 0.3339 |
| 1.6787 | 16.88 | 90 | 1.7651 | 0.3438 | 0.4510 | 0.3438 | 0.3339 |
| 1.6514 | 18.0 | 96 | 1.7360 | 0.3672 | 0.4630 | 0.3672 | 0.3641 |
| 1.6322 | 18.94 | 101 | 1.7153 | 0.3828 | 0.4710 | 0.3828 | 0.3783 |
| 1.5861 | 19.88 | 106 | 1.6980 | 0.4062 | 0.5040 | 0.4062 | 0.3963 |
| 1.5871 | 21.0 | 112 | 1.6797 | 0.4219 | 0.4768 | 0.4219 | 0.4134 |
| 1.5709 | 21.94 | 117 | 1.6635 | 0.4062 | 0.4665 | 0.4062 | 0.4038 |
| 1.5296 | 22.88 | 122 | 1.6470 | 0.4297 | 0.4772 | 0.4297 | 0.4213 |
| 1.5168 | 24.0 | 128 | 1.6318 | 0.4297 | 0.4712 | 0.4297 | 0.4234 |
| 1.5105 | 24.94 | 133 | 1.6174 | 0.4609 | 0.4858 | 0.4609 | 0.4478 |
| 1.485 | 25.88 | 138 | 1.6024 | 0.4766 | 0.5290 | 0.4766 | 0.4717 |
| 1.4565 | 27.0 | 144 | 1.5929 | 0.4609 | 0.4800 | 0.4609 | 0.4517 |
| 1.4273 | 27.94 | 149 | 1.5803 | 0.4688 | 0.4800 | 0.4688 | 0.4581 |
| 1.4375 | 28.88 | 154 | 1.5650 | 0.5234 | 0.5527 | 0.5234 | 0.5134 |
| 1.3806 | 30.0 | 160 | 1.5563 | 0.4688 | 0.5052 | 0.4688 | 0.4651 |
| 1.3686 | 30.94 | 165 | 1.5443 | 0.5 | 0.5381 | 0.5 | 0.4969 |
| 1.3636 | 31.88 | 170 | 1.5273 | 0.5234 | 0.5459 | 0.5234 | 0.5152 |
| 1.3295 | 33.0 | 176 | 1.5175 | 0.5234 | 0.5444 | 0.5234 | 0.5160 |
| 1.3426 | 33.94 | 181 | 1.5115 | 0.5078 | 0.5179 | 0.5078 | 0.5030 |
| 1.2963 | 34.88 | 186 | 1.4918 | 0.5234 | 0.5399 | 0.5234 | 0.5133 |
| 1.2917 | 36.0 | 192 | 1.4832 | 0.5391 | 0.5436 | 0.5391 | 0.5294 |
| 1.2733 | 36.94 | 197 | 1.4718 | 0.5547 | 0.5730 | 0.5547 | 0.5475 |
| 1.2398 | 37.88 | 202 | 1.4556 | 0.5703 | 0.5996 | 0.5703 | 0.5642 |
| 1.2472 | 39.0 | 208 | 1.4575 | 0.5625 | 0.5820 | 0.5625 | 0.5600 |
| 1.2286 | 39.94 | 213 | 1.4426 | 0.5781 | 0.6024 | 0.5781 | 0.5728 |
| 1.1882 | 40.88 | 218 | 1.4277 | 0.5625 | 0.5787 | 0.5625 | 0.5532 |
| 1.1833 | 42.0 | 224 | 1.4209 | 0.5625 | 0.5857 | 0.5625 | 0.5579 |
| 1.1592 | 42.94 | 229 | 1.4171 | 0.5781 | 0.6089 | 0.5781 | 0.5766 |
| 1.1386 | 43.88 | 234 | 1.4046 | 0.5859 | 0.6053 | 0.5859 | 0.5790 |
| 1.118 | 45.0 | 240 | 1.3985 | 0.5547 | 0.5772 | 0.5547 | 0.5507 |
| 1.1151 | 45.94 | 245 | 1.3996 | 0.5703 | 0.6026 | 0.5703 | 0.5701 |
| 1.0848 | 46.88 | 250 | 1.3782 | 0.5703 | 0.5885 | 0.5703 | 0.5667 |
| 1.0729 | 48.0 | 256 | 1.3891 | 0.5703 | 0.5809 | 0.5703 | 0.5641 |
| 1.0702 | 48.94 | 261 | 1.3749 | 0.5625 | 0.5861 | 0.5625 | 0.5586 |
| 1.0408 | 49.88 | 266 | 1.3725 | 0.5625 | 0.5732 | 0.5625 | 0.5561 |
| 1.0274 | 51.0 | 272 | 1.3644 | 0.5547 | 0.5572 | 0.5547 | 0.5461 |
| 1.0321 | 51.94 | 277 | 1.3651 | 0.5625 | 0.5841 | 0.5625 | 0.5587 |
| 0.9872 | 52.88 | 282 | 1.3617 | 0.5547 | 0.5670 | 0.5547 | 0.5480 |
| 0.9991 | 54.0 | 288 | 1.3496 | 0.5859 | 0.5902 | 0.5859 | 0.5774 |
| 0.9891 | 54.94 | 293 | 1.3619 | 0.5781 | 0.5990 | 0.5781 | 0.5770 |
| 0.9654 | 55.88 | 298 | 1.3322 | 0.5625 | 0.5830 | 0.5625 | 0.5609 |
| 0.9489 | 57.0 | 304 | 1.3338 | 0.5781 | 0.5968 | 0.5781 | 0.5762 |
| 0.9346 | 57.94 | 309 | 1.3332 | 0.5781 | 0.6057 | 0.5781 | 0.5796 |
| 0.8965 | 58.88 | 314 | 1.3239 | 0.5781 | 0.6057 | 0.5781 | 0.5796 |
| 0.8809 | 60.0 | 320 | 1.3269 | 0.5938 | 0.6005 | 0.5938 | 0.5885 |
| 0.8928 | 60.94 | 325 | 1.3168 | 0.5703 | 0.5873 | 0.5703 | 0.5687 |
| 0.8662 | 61.88 | 330 | 1.3241 | 0.5625 | 0.5889 | 0.5625 | 0.5641 |
| 0.8496 | 63.0 | 336 | 1.3062 | 0.5703 | 0.5832 | 0.5703 | 0.5648 |
| 0.8485 | 63.94 | 341 | 1.2968 | 0.5859 | 0.5776 | 0.5859 | 0.5734 |
| 0.8425 | 64.88 | 346 | 1.3093 | 0.5781 | 0.5775 | 0.5781 | 0.5683 |
| 0.8175 | 66.0 | 352 | 1.2888 | 0.5859 | 0.6029 | 0.5859 | 0.5851 |
| 0.7942 | 66.94 | 357 | 1.3084 | 0.5781 | 0.5764 | 0.5781 | 0.5674 |
| 0.7865 | 67.88 | 362 | 1.3040 | 0.5938 | 0.6029 | 0.5938 | 0.5897 |
| 0.7376 | 69.0 | 368 | 1.2982 | 0.5781 | 0.5968 | 0.5781 | 0.5773 |
| 0.7838 | 69.94 | 373 | 1.2960 | 0.5703 | 0.5851 | 0.5703 | 0.5676 |
| 0.7779 | 70.88 | 378 | 1.2876 | 0.6016 | 0.5996 | 0.6016 | 0.5925 |
| 0.7259 | 72.0 | 384 | 1.2898 | 0.5781 | 0.5805 | 0.5781 | 0.5716 |
| 0.7242 | 72.94 | 389 | 1.2891 | 0.5859 | 0.6073 | 0.5859 | 0.5869 |
| 0.7185 | 73.88 | 394 | 1.2800 | 0.6094 | 0.6131 | 0.6094 | 0.6048 |
| 0.7366 | 75.0 | 400 | 1.2762 | 0.5781 | 0.5807 | 0.5781 | 0.5721 |
| 0.7194 | 75.94 | 405 | 1.2847 | 0.5938 | 0.6019 | 0.5938 | 0.5898 |
| 0.6699 | 76.88 | 410 | 1.2563 | 0.6172 | 0.6123 | 0.6172 | 0.6100 |
| 0.6958 | 78.0 | 416 | 1.2937 | 0.5703 | 0.5764 | 0.5703 | 0.5609 |
| 0.6673 | 78.94 | 421 | 1.2626 | 0.6094 | 0.6008 | 0.6094 | 0.5998 |
| 0.6443 | 79.88 | 426 | 1.2561 | 0.5781 | 0.5820 | 0.5781 | 0.5734 |
| 0.642 | 81.0 | 432 | 1.2654 | 0.5938 | 0.6009 | 0.5938 | 0.5910 |
| 0.6536 | 81.94 | 437 | 1.2604 | 0.5781 | 0.5938 | 0.5781 | 0.5773 |
| 0.5973 | 82.88 | 442 | 1.2783 | 0.5938 | 0.6081 | 0.5938 | 0.5927 |
| 0.6074 | 84.0 | 448 | 1.2709 | 0.5938 | 0.6041 | 0.5938 | 0.5865 |
| 0.6419 | 84.94 | 453 | 1.2820 | 0.5781 | 0.5815 | 0.5781 | 0.5680 |
| 0.611 | 85.88 | 458 | 1.2447 | 0.5625 | 0.5678 | 0.5625 | 0.5601 |
| 0.606 | 87.0 | 464 | 1.3020 | 0.5781 | 0.5889 | 0.5781 | 0.5711 |
| 0.5996 | 87.94 | 469 | 1.2690 | 0.5859 | 0.6016 | 0.5859 | 0.5862 |
| 0.5962 | 88.88 | 474 | 1.2713 | 0.5781 | 0.5787 | 0.5781 | 0.5699 |
| 0.5423 | 90.0 | 480 | 1.2856 | 0.5703 | 0.5803 | 0.5703 | 0.5688 |
| 0.5693 | 90.94 | 485 | 1.2512 | 0.5703 | 0.5886 | 0.5703 | 0.5724 |
| 0.5426 | 91.88 | 490 | 1.2654 | 0.5859 | 0.5881 | 0.5859 | 0.5808 |
| 0.5676 | 93.0 | 496 | 1.2829 | 0.5703 | 0.5818 | 0.5703 | 0.5702 |
| 0.5275 | 93.75 | 500 | 1.2630 | 0.5391 | 0.5541 | 0.5391 | 0.5428 |
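
Recall equals accuracy in every row above, which is characteristic of weighted-average metrics over all classes. The card does not include the metric code, but a `compute_metrics` function for `Trainer` that produces this pattern would look roughly like the sketch below.

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    # Weighted averaging weights each class by its support; under this scheme
    # weighted recall is mathematically identical to overall accuracy.
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="weighted", zero_division=0
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "precision": precision,
        "recall": recall,
        "f1": f1,
    }
```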

### Framework versions

- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.17.0
- Tokenizers 0.15.1