
emotion_detection_cctv

This model is a fine-tuned version of microsoft/conditional-detr-resnet-50 for object detection; the fine-tuning dataset is not specified in this card. It achieves the following results on the evaluation set:

  • Loss: 0.1520
  • mAP: 0.9392
  • mAP@50: 0.9998
  • mAP@75: 0.9948
  • mAP (small): -1.0
  • mAP (medium): -1.0
  • mAP (large): 0.9392
  • mAR@1: 0.7807
  • mAR@10: 0.9616
  • mAR@100: 0.9616
  • mAR (small): -1.0
  • mAR (medium): -1.0
  • mAR (large): 0.9616
  • mAP (Nf): 0.9236
  • mAR@100 (Nf): 0.9523
  • mAP (F): 0.9547
  • mAR@100 (F): 0.9708

A value of -1.0 means the metric was undefined for that size bucket (typically because no objects of that size were present in the evaluation set).
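
A minimal inference sketch follows, assuming the standard Transformers object-detection API; the image path is illustrative, and post-processing options such as the score threshold are assumptions rather than settings taken from this card.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

# Repository id as listed on this card; the image path is illustrative.
repo_id = "machinelearningzuu/emotion_detection_cctv"
processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForObjectDetection.from_pretrained(repo_id)

image = Image.open("cctv_frame.jpg").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Convert raw logits/boxes into thresholded detections in (x0, y0, x1, y1) pixels.
# The 0.5 threshold is an assumption, not a value from this card.
target_sizes = torch.tensor([image.size[::-1]])
results = processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=target_sizes
)[0]

for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
    print(model.config.id2label[label.item()], round(score.item(), 3), box.tolist())
```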

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 2
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: cosine
  • num_epochs: 100
  • mixed_precision_training: Native AMP
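
These settings map onto Transformers TrainingArguments roughly as in the sketch below; the output directory and any arguments not listed above are assumptions, since the original training script is not included in this card.

```python
from transformers import TrainingArguments

# Hedged reconstruction of the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="emotion_detection_cctv",  # assumed name, not taken from this card
    learning_rate=5e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="cosine",
    num_train_epochs=100,
    fp16=True,  # mixed-precision training ("Native AMP" above)
)
```

The Adam settings listed above (betas=(0.9, 0.999), epsilon=1e-08) are the Trainer defaults, so they are not repeated in the sketch.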

Training results

Each row below lists, in order: Training Loss ("No log" where the training loss had not yet been logged), Epoch, Step, Validation Loss, mAP, mAP@50, mAP@75, mAP (small), mAP (medium), mAP (large), mAR@1, mAR@10, mAR@100, mAR (small), mAR (medium), mAR (large), mAP (Nf), mAR@100 (Nf), mAP (F), mAR@100 (F).
No log 1.0 90 1.8304 0.0733 0.1801 0.038 -1.0 0.0 0.0738 0.117 0.4402 0.5369 -1.0 0.0 0.5399 0.1099 0.6468 0.0366 0.427
No log 2.0 180 1.3214 0.1882 0.3323 0.2001 -1.0 -1.0 0.1882 0.2849 0.6302 0.6873 -1.0 -1.0 0.6873 0.3155 0.7319 0.0609 0.6427
No log 3.0 270 1.1104 0.3338 0.5621 0.3832 -1.0 -1.0 0.3338 0.3734 0.6759 0.7053 -1.0 -1.0 0.7053 0.5266 0.7375 0.141 0.673
No log 4.0 360 1.0762 0.3481 0.6067 0.3563 -1.0 -1.0 0.3481 0.3946 0.6117 0.6466 -1.0 -1.0 0.6466 0.5588 0.6875 0.1374 0.6056
No log 5.0 450 0.9652 0.3807 0.6111 0.443 -1.0 0.0 0.3815 0.3975 0.6352 0.6545 -1.0 0.0 0.6578 0.6101 0.7259 0.1513 0.5831
4.5043 6.0 540 0.9340 0.463 0.7541 0.5336 -1.0 0.0 0.4652 0.4403 0.675 0.6853 -1.0 0.0 0.689 0.5897 0.7167 0.3364 0.6539
4.5043 7.0 630 0.9507 0.4406 0.7489 0.4748 0.0 -1.0 0.442 0.411 0.6556 0.6616 0.0 -1.0 0.6633 0.568 0.7042 0.3132 0.6191
4.5043 8.0 720 0.7813 0.5193 0.8038 0.6507 -1.0 -1.0 0.5194 0.4466 0.7254 0.7274 -1.0 -1.0 0.7274 0.6451 0.7324 0.3936 0.7225
4.5043 9.0 810 0.7608 0.5436 0.8306 0.6426 -1.0 -1.0 0.5436 0.4949 0.7161 0.7166 -1.0 -1.0 0.7166 0.6642 0.7366 0.423 0.6966
4.5043 10.0 900 0.7774 0.5606 0.858 0.6495 -1.0 0.0 0.5619 0.4995 0.7321 0.735 -1.0 0.0 0.7367 0.6378 0.7329 0.4833 0.7371
4.5043 11.0 990 0.7060 0.5533 0.8493 0.6853 -1.0 -1.0 0.5533 0.5187 0.7171 0.7174 -1.0 -1.0 0.7174 0.6789 0.7583 0.4278 0.6764
0.8542 12.0 1080 0.7088 0.5999 0.9146 0.7448 -1.0 0.0 0.6031 0.5423 0.7027 0.7043 -1.0 0.0 0.7081 0.6665 0.7435 0.5334 0.6652
0.8542 13.0 1170 0.6888 0.6113 0.9251 0.7798 -1.0 0.0 0.6128 0.5378 0.7202 0.7209 -1.0 0.0 0.7227 0.6805 0.7486 0.5422 0.6933
0.8542 14.0 1260 0.6578 0.6267 0.931 0.7653 -1.0 -1.0 0.627 0.5623 0.7258 0.7272 -1.0 -1.0 0.7272 0.6656 0.7375 0.5878 0.7169
0.8542 15.0 1350 0.6039 0.6602 0.954 0.8066 -1.0 -1.0 0.6604 0.5807 0.7465 0.7474 -1.0 -1.0 0.7474 0.71 0.769 0.6104 0.7258
0.8542 16.0 1440 0.5776 0.6653 0.977 0.8311 -1.0 -1.0 0.6653 0.5896 0.7402 0.7407 -1.0 -1.0 0.7407 0.7063 0.7623 0.6244 0.7191
0.6835 17.0 1530 0.5971 0.6577 0.9538 0.8235 -1.0 -1.0 0.6577 0.5731 0.7449 0.7478 -1.0 -1.0 0.7478 0.6995 0.7619 0.6158 0.7337
0.6835 18.0 1620 0.6152 0.6566 0.9211 0.8114 -1.0 -1.0 0.6566 0.5949 0.7604 0.7615 -1.0 -1.0 0.7615 0.6964 0.7657 0.6167 0.7573
0.6835 19.0 1710 0.6240 0.6666 0.9503 0.8598 -1.0 0.0 0.6811 0.5829 0.7345 0.7363 -1.0 0.0 0.7532 0.6962 0.7569 0.637 0.7157
0.6835 20.0 1800 0.5781 0.6854 0.9693 0.8822 -1.0 -1.0 0.6856 0.6025 0.7605 0.7605 -1.0 -1.0 0.7605 0.704 0.7625 0.6668 0.7584
0.6835 21.0 1890 0.5097 0.7255 0.9829 0.9169 -1.0 0.075 0.7321 0.6342 0.786 0.7862 -1.0 0.15 0.7918 0.733 0.787 0.718 0.7854
0.6835 22.0 1980 0.5694 0.68 0.963 0.8484 -1.0 -1.0 0.68 0.6075 0.7613 0.762 -1.0 -1.0 0.762 0.7047 0.7611 0.6554 0.7629
0.5958 23.0 2070 0.5036 0.7132 0.9856 0.8779 -1.0 0.0 0.7152 0.6227 0.778 0.7802 -1.0 0.0 0.782 0.7326 0.7884 0.6939 0.7719
0.5958 24.0 2160 0.5662 0.6767 0.9519 0.8501 -1.0 0.0 0.6787 0.6025 0.7635 0.7635 -1.0 0.0 0.7653 0.7101 0.7708 0.6434 0.7562
0.5958 25.0 2250 0.5620 0.7029 0.9719 0.9007 -1.0 -1.0 0.703 0.6149 0.7708 0.7713 -1.0 -1.0 0.7713 0.7112 0.7796 0.6946 0.7629
0.5958 26.0 2340 0.5236 0.7173 0.9638 0.9165 -1.0 0.016 0.7187 0.6328 0.7883 0.7886 -1.0 0.4 0.7895 0.7285 0.7907 0.706 0.7864
0.5958 27.0 2430 0.4734 0.7433 0.9886 0.94 -1.0 -1.0 0.7434 0.6431 0.8014 0.8021 -1.0 -1.0 0.8021 0.7472 0.8042 0.7395 0.8
0.5447 28.0 2520 0.4826 0.7364 0.9868 0.942 -1.0 -1.0 0.7364 0.6417 0.7931 0.7941 -1.0 -1.0 0.7941 0.7555 0.8083 0.7172 0.7798
0.5447 29.0 2610 0.5002 0.7229 0.9827 0.9309 -1.0 0.0 0.7279 0.6386 0.7812 0.7812 -1.0 0.0 0.7856 0.7501 0.7995 0.6957 0.7629
0.5447 30.0 2700 0.4752 0.7529 0.9831 0.937 -1.0 -1.0 0.7529 0.6405 0.8089 0.8091 -1.0 -1.0 0.8091 0.7639 0.8093 0.7419 0.809
0.5447 31.0 2790 0.4524 0.7208 0.9756 0.9124 -1.0 -1.0 0.7209 0.6312 0.7916 0.7933 -1.0 -1.0 0.7933 0.7589 0.8157 0.6827 0.7708
0.5447 32.0 2880 0.4278 0.752 0.9906 0.9451 -1.0 -1.0 0.752 0.6536 0.8061 0.8066 -1.0 -1.0 0.8066 0.7781 0.8255 0.7259 0.7876
0.5447 33.0 2970 0.4699 0.7509 0.9883 0.9441 -1.0 -1.0 0.751 0.651 0.8109 0.8114 -1.0 -1.0 0.8114 0.7505 0.8014 0.7514 0.8213
0.5069 34.0 3060 0.5125 0.7178 0.9529 0.9109 -1.0 0.0 0.7277 0.6415 0.7884 0.793 -1.0 0.0 0.8041 0.7346 0.7917 0.7011 0.7943
0.5069 35.0 3150 0.5031 0.7323 0.9687 0.912 -1.0 0.0 0.7407 0.643 0.7889 0.7906 -1.0 0.0 0.7994 0.764 0.8148 0.7005 0.7663
0.5069 36.0 3240 0.4753 0.7423 0.9856 0.9226 -1.0 0.0 0.7445 0.6481 0.7989 0.7989 -1.0 0.0 0.8007 0.7594 0.8056 0.7252 0.7921
0.5069 37.0 3330 0.4264 0.7617 0.9804 0.9631 -1.0 0.0 0.767 0.6575 0.815 0.815 -1.0 0.0 0.8195 0.79 0.8333 0.7335 0.7966
0.5069 38.0 3420 0.3995 0.7762 0.9927 0.9628 -1.0 -1.0 0.7762 0.6747 0.8279 0.8279 -1.0 -1.0 0.8279 0.7764 0.8233 0.776 0.8326
0.4669 39.0 3510 0.4286 0.7475 0.9772 0.9464 -1.0 0.0 0.7553 0.6506 0.8057 0.8057 -1.0 0.0 0.812 0.7851 0.8293 0.71 0.782
0.4669 40.0 3600 0.3864 0.7726 0.9889 0.9608 -1.0 0.0 0.7774 0.6696 0.8235 0.8235 -1.0 0.0 0.8281 0.7991 0.8505 0.7461 0.7966
0.4669 41.0 3690 0.3316 0.8136 0.9983 0.9729 -1.0 -1.0 0.8136 0.6932 0.8572 0.8572 -1.0 -1.0 0.8572 0.8231 0.8639 0.8041 0.8506
0.4669 42.0 3780 0.3657 0.7848 0.9762 0.9601 -1.0 0.0 0.7936 0.6833 0.8424 0.8424 -1.0 0.0 0.8518 0.8105 0.86 0.759 0.8247
0.4669 43.0 3870 0.3636 0.7887 0.9952 0.9636 -1.0 -1.0 0.7888 0.6778 0.8408 0.8408 -1.0 -1.0 0.8408 0.7958 0.8491 0.7816 0.8326
0.4669 44.0 3960 0.3760 0.7813 0.9871 0.9434 -1.0 -1.0 0.7813 0.6789 0.8321 0.8327 -1.0 -1.0 0.8327 0.8005 0.844 0.7621 0.8213
0.4227 45.0 4050 0.3942 0.7916 0.9925 0.9407 -1.0 0.02 0.7937 0.6846 0.8419 0.8421 -1.0 0.1 0.8438 0.7844 0.8315 0.7988 0.8528
0.4227 46.0 4140 0.3440 0.8051 0.9979 0.9767 -1.0 -1.0 0.8051 0.6948 0.8578 0.8578 -1.0 -1.0 0.8578 0.8053 0.8537 0.8049 0.8618
0.4227 47.0 4230 0.3353 0.8129 0.9836 0.9605 -1.0 0.0 0.8172 0.6903 0.8543 0.8543 -1.0 0.0 0.8591 0.8318 0.8727 0.794 0.836
0.4227 48.0 4320 0.3638 0.8048 0.9915 0.9616 -1.0 0.0 0.8069 0.6896 0.8514 0.8514 -1.0 0.0 0.8534 0.8003 0.8444 0.8092 0.8584
0.4227 49.0 4410 0.3310 0.8218 0.9982 0.9699 -1.0 -1.0 0.8218 0.6972 0.865 0.8662 -1.0 -1.0 0.8662 0.8209 0.8694 0.8226 0.8629
0.3807 50.0 4500 0.3393 0.8108 0.9796 0.9528 -1.0 -1.0 0.8108 0.6952 0.8515 0.8515 -1.0 -1.0 0.8515 0.8227 0.8648 0.799 0.8382
0.3807 51.0 4590 0.3205 0.8225 0.9941 0.9719 -1.0 -1.0 0.8225 0.7047 0.8631 0.8631 -1.0 -1.0 0.8631 0.822 0.8644 0.8229 0.8618
0.3807 52.0 4680 0.3399 0.8078 0.9886 0.9539 -1.0 -1.0 0.8078 0.6925 0.8535 0.8535 -1.0 -1.0 0.8535 0.8261 0.8722 0.7894 0.8348
0.3807 53.0 4770 0.3614 0.8174 0.968 0.9492 -1.0 0.0333 0.8223 0.6938 0.855 0.855 -1.0 0.1 0.8614 0.8171 0.856 0.8177 0.8539
0.3807 54.0 4860 0.3446 0.8135 0.9906 0.9588 -1.0 0.0 0.816 0.6978 0.8651 0.8651 -1.0 0.0 0.8671 0.8183 0.8639 0.8086 0.8663
0.3807 55.0 4950 0.3518 0.8203 0.9867 0.9516 -1.0 -1.0 0.8203 0.6975 0.8615 0.8615 -1.0 -1.0 0.8615 0.8355 0.8736 0.8051 0.8494
0.366 56.0 5040 0.2746 0.8359 0.9998 0.9705 -1.0 -1.0 0.8359 0.7175 0.882 0.882 -1.0 -1.0 0.882 0.8422 0.8875 0.8296 0.8764
0.366 57.0 5130 0.2882 0.8351 0.9847 0.9698 -1.0 0.0 0.8392 0.707 0.8763 0.8763 -1.0 0.0 0.8813 0.847 0.8875 0.8232 0.8652
0.366 58.0 5220 0.3049 0.8414 0.9945 0.9679 -1.0 0.15 0.8426 0.7083 0.8776 0.8776 -1.0 0.3 0.879 0.8481 0.8833 0.8348 0.8719
0.366 59.0 5310 0.3217 0.821 0.9892 0.9743 -1.0 -1.0 0.821 0.6989 0.8643 0.8643 -1.0 -1.0 0.8643 0.8278 0.8713 0.8141 0.8573
0.366 60.0 5400 0.2654 0.8564 0.9996 0.9797 -1.0 -1.0 0.8564 0.7286 0.8941 0.8941 -1.0 -1.0 0.8941 0.8585 0.8972 0.8544 0.891
0.366 61.0 5490 0.2880 0.8637 0.9998 0.9787 -1.0 0.1 0.8662 0.7285 0.9 0.9 -1.0 0.1 0.9018 0.8521 0.8898 0.8754 0.9101
0.3186 62.0 5580 0.2817 0.8638 0.9955 0.9703 -1.0 -1.0 0.8638 0.7322 0.9086 0.9086 -1.0 -1.0 0.9086 0.8572 0.8958 0.8704 0.9213
0.3186 63.0 5670 0.2560 0.868 0.9951 0.9807 -1.0 -1.0 0.868 0.7366 0.9099 0.9099 -1.0 -1.0 0.9099 0.8672 0.9074 0.8688 0.9124
0.3186 64.0 5760 0.2785 0.8516 0.9831 0.9709 -1.0 0.0 0.8539 0.7251 0.897 0.897 -1.0 0.0 0.8991 0.854 0.894 0.8491 0.9
0.3186 65.0 5850 0.2623 0.8618 0.9891 0.9501 -1.0 -1.0 0.8618 0.7315 0.9035 0.9035 -1.0 -1.0 0.9035 0.8642 0.9069 0.8594 0.9
0.3186 66.0 5940 0.2568 0.8695 0.9895 0.9607 -1.0 -1.0 0.8695 0.7368 0.9079 0.9079 -1.0 -1.0 0.9079 0.8728 0.9102 0.8663 0.9056
0.2959 67.0 6030 0.2528 0.8736 0.9982 0.9643 -1.0 -1.0 0.8736 0.7344 0.9082 0.9091 -1.0 -1.0 0.9091 0.8793 0.9148 0.868 0.9034
0.2959 68.0 6120 0.2517 0.8736 0.9983 0.976 -1.0 -1.0 0.8736 0.7375 0.9102 0.9102 -1.0 -1.0 0.9102 0.8713 0.9093 0.8758 0.9112
0.2959 69.0 6210 0.2818 0.8636 0.992 0.9578 -1.0 0.2515 0.8717 0.7353 0.9003 0.9003 -1.0 0.25 0.9078 0.8607 0.8972 0.8664 0.9034
0.2959 70.0 6300 0.2222 0.8915 0.9998 0.98 -1.0 -1.0 0.8915 0.753 0.9261 0.9261 -1.0 -1.0 0.9261 0.8826 0.9208 0.9004 0.9315
0.2959 71.0 6390 0.2597 0.8699 0.9868 0.9517 0.0 0.25 0.8733 0.7353 0.9094 0.9094 0.0 0.5 0.9124 0.8745 0.9097 0.8653 0.909
0.2959 72.0 6480 0.2019 0.9087 0.9992 0.9992 -1.0 -1.0 0.9087 0.7603 0.9382 0.9382 -1.0 -1.0 0.9382 0.9001 0.9315 0.9172 0.9449
0.2565 73.0 6570 0.2035 0.8993 0.9997 0.9927 -1.0 -1.0 0.8993 0.7602 0.9331 0.9331 -1.0 -1.0 0.9331 0.8908 0.9269 0.9078 0.9393
0.2565 74.0 6660 0.2539 0.8807 0.9987 0.959 -1.0 -1.0 0.8808 0.7367 0.9145 0.9145 -1.0 -1.0 0.9145 0.8803 0.9144 0.8812 0.9146
0.2565 75.0 6750 0.1972 0.8966 1.0 0.9729 -1.0 -1.0 0.8966 0.7539 0.9299 0.9299 -1.0 -1.0 0.9299 0.9008 0.9306 0.8925 0.9292
0.2565 76.0 6840 0.2068 0.9105 0.9995 0.9936 -1.0 -1.0 0.9105 0.7648 0.9411 0.9411 -1.0 -1.0 0.9411 0.9042 0.9361 0.9169 0.9461
0.2565 77.0 6930 0.2286 0.8966 0.9996 0.964 -1.0 -1.0 0.8966 0.7543 0.9271 0.9271 -1.0 -1.0 0.9271 0.8868 0.9227 0.9064 0.9315
0.246 78.0 7020 0.2135 0.902 0.9899 0.9561 -1.0 0.0 0.9083 0.7514 0.928 0.928 -1.0 0.0 0.9332 0.901 0.9324 0.903 0.9236
0.246 79.0 7110 0.2176 0.9115 0.9894 0.9795 -1.0 0.0 0.9196 0.7649 0.9424 0.9424 -1.0 0.0 0.9478 0.8957 0.9319 0.9273 0.9528
0.246 80.0 7200 0.1864 0.9307 1.0 0.9827 -1.0 -1.0 0.9307 0.7734 0.9528 0.9528 -1.0 -1.0 0.9528 0.9082 0.937 0.9532 0.9685
0.246 81.0 7290 0.1990 0.9179 0.9927 0.9802 -1.0 0.0 0.9222 0.7646 0.9468 0.9468 -1.0 0.0 0.9511 0.897 0.9329 0.9389 0.9607
0.246 82.0 7380 0.1954 0.9117 0.9882 0.9734 -1.0 0.0375 0.9212 0.7632 0.943 0.943 -1.0 0.15 0.9489 0.8915 0.931 0.932 0.9551
0.246 83.0 7470 0.1765 0.9221 0.9979 0.9831 -1.0 -1.0 0.9221 0.7683 0.9483 0.9486 -1.0 -1.0 0.9486 0.9169 0.9477 0.9272 0.9494
0.2126 84.0 7560 0.2004 0.9092 0.995 0.9729 -1.0 0.0 0.9131 0.7576 0.9378 0.9378 -1.0 0.0 0.9399 0.9078 0.9384 0.9106 0.9371
0.2126 85.0 7650 0.1721 0.9352 0.9963 0.9864 -1.0 -1.0 0.9352 0.7757 0.959 0.959 -1.0 -1.0 0.959 0.9229 0.9486 0.9475 0.9693
0.2126 86.0 7740 0.1835 0.9187 0.9994 0.9699 -1.0 -1.0 0.9187 0.7621 0.9452 0.9452 -1.0 -1.0 0.9452 0.9072 0.9398 0.9303 0.9506
0.2126 87.0 7830 0.1892 0.9232 0.9947 0.9738 -1.0 0.0 0.9257 0.7726 0.9483 0.9483 -1.0 0.0 0.9505 0.9071 0.937 0.9393 0.9596
0.2126 88.0 7920 0.1753 0.9186 0.9901 0.9851 -1.0 -1.0 0.9186 0.7667 0.9482 0.9482 -1.0 -1.0 0.9482 0.912 0.9435 0.9253 0.9528
0.2316 89.0 8010 0.1709 0.9268 0.9998 0.9793 -1.0 -1.0 0.9268 0.7707 0.9528 0.9528 -1.0 -1.0 0.9528 0.916 0.9472 0.9375 0.9584
0.2316 90.0 8100 0.1818 0.9206 0.9987 0.9694 -1.0 -1.0 0.9206 0.7615 0.9446 0.9446 -1.0 -1.0 0.9446 0.9197 0.9477 0.9215 0.9416
0.2316 91.0 8190 0.1847 0.9161 0.9845 0.9737 -1.0 0.0 0.923 0.7618 0.9413 0.9413 -1.0 0.0 0.9488 0.9123 0.9398 0.9198 0.9427
0.2316 92.0 8280 0.1540 0.9446 0.9992 0.9971 -1.0 -1.0 0.9446 0.7812 0.9623 0.9623 -1.0 -1.0 0.9623 0.9377 0.9583 0.9516 0.9663
0.2316 93.0 8370 0.1645 0.9343 0.9995 0.9739 -1.0 -1.0 0.9343 0.7773 0.9571 0.9571 -1.0 -1.0 0.9571 0.9146 0.9444 0.954 0.9697
0.2316 94.0 8460 0.1992 0.9212 0.9881 0.962 -1.0 0.6 0.9231 0.7686 0.9473 0.9473 -1.0 0.6 0.9493 0.9004 0.9316 0.9419 0.9629
0.2112 95.0 8550 0.1736 0.9206 0.9858 0.9745 -1.0 0.0112 0.9319 0.7639 0.9449 0.9449 -1.0 0.1 0.9545 0.9246 0.9505 0.9165 0.9393
0.2112 96.0 8640 0.1558 0.9318 0.9997 0.9793 -1.0 -1.0 0.9318 0.7795 0.9571 0.9571 -1.0 -1.0 0.9571 0.9191 0.9491 0.9445 0.9652
0.2112 97.0 8730 0.1535 0.9296 0.9901 0.976 -1.0 0.05 0.9398 0.7746 0.9573 0.9573 -1.0 0.05 0.9647 0.9253 0.9528 0.9339 0.9618
0.2112 98.0 8820 0.2102 0.9071 0.9828 0.9617 0.0 0.0 0.9146 0.7557 0.9351 0.9351 0.0 0.0 0.9426 0.9034 0.9332 0.9107 0.9371
0.2112 99.0 8910 0.1861 0.9205 0.9969 0.9626 -1.0 -1.0 0.9205 0.7618 0.943 0.943 -1.0 -1.0 0.943 0.9134 0.9421 0.9275 0.9438
0.2125 100.0 9000 0.1520 0.9392 0.9998 0.9948 -1.0 -1.0 0.9392 0.7807 0.9616 0.9616 -1.0 -1.0 0.9616 0.9236 0.9523 0.9547 0.9708
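
The metric names in this card (mAP, mAP@50, per-size values, per-class Nf/F values, and -1.0 for undefined buckets) match the output of torchmetrics' MeanAveragePrecision, which is commonly used when fine-tuning Transformers detection models; the sketch below only illustrates how such numbers are typically produced and is not the card's original evaluation code.

```python
import torch
from torchmetrics.detection import MeanAveragePrecision

# class_metrics=True also yields per-class AP/AR, analogous to the Nf/F columns above.
metric = MeanAveragePrecision(iou_type="bbox", class_metrics=True)

# Boxes are in (x0, y0, x1, y1) pixel coordinates; values here are made up.
preds = [{
    "boxes": torch.tensor([[10.0, 20.0, 110.0, 220.0]]),
    "scores": torch.tensor([0.92]),
    "labels": torch.tensor([0]),
}]
targets = [{
    "boxes": torch.tensor([[12.0, 18.0, 108.0, 225.0]]),
    "labels": torch.tensor([0]),
}]

metric.update(preds, targets)
results = metric.compute()
print(results["map"], results["map_50"], results["mar_100"])  # -1.0 where undefined
```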

Framework versions

  • Transformers 4.42.3
  • PyTorch 2.3.1+cu121
  • Datasets 2.20.0
  • Tokenizers 0.19.1
