
queue_detection_cctv

This model is a fine-tuned version of microsoft/conditional-detr-resnet-50 on an unspecified dataset. It achieves the following results on the evaluation set (a minimal inference sketch follows the metrics list):

  • Loss: 0.1291
  • Map: 0.9532
  • Map 50: 0.9901
  • Map 75: 0.9845
  • Map Small: -1.0
  • Map Medium: 0.3203
  • Map Large: 0.9578
  • Mar 1: 0.5044
  • Mar 10: 0.9715
  • Mar 100: 0.972
  • Mar Small: -1.0
  • Mar Medium: 0.3538
  • Mar Large: 0.9747
  • Map Cashier: 0.9618
  • Mar 100 Cashier: 0.9775
  • Map Cx: 0.9447
  • Mar 100 Cx: 0.9664
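
How to use

The snippet below is a minimal inference sketch using the standard Transformers object-detection API. It assumes the checkpoint id machinelearningzuu/queue_detection_cctv and uses an arbitrary placeholder image URL; the two detection classes reported above are Cashier and Cx. Treat it as an illustration rather than the author's exact usage.

```python
# Minimal inference sketch; the checkpoint id and example image URL are assumptions, not a verified script.
import requests
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

checkpoint = "machinelearningzuu/queue_detection_cctv"
processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForObjectDetection.from_pretrained(checkpoint)

# Any CCTV frame works; this URL is only a placeholder example image.
url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)

inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Convert raw outputs to thresholded detections in (x_min, y_min, x_max, y_max) pixel coordinates.
target_sizes = torch.tensor([image.size[::-1]])  # (height, width)
detections = processor.post_process_object_detection(outputs, threshold=0.5, target_sizes=target_sizes)[0]

for score, label, box in zip(detections["scores"], detections["labels"], detections["boxes"]):
    print(f"{model.config.id2label[label.item()]}: {score:.2f} at {[round(v, 1) for v in box.tolist()]}")
```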

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch that mirrors them follows the list):

  • learning_rate: 5e-05
  • train_batch_size: 2
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: cosine
  • num_epochs: 100
  • mixed_precision_training: Native AMP
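
The original training script is not included in this card; the sketch below shows Transformers TrainingArguments that mirror the values listed above. The output directory is a placeholder, and the Adam betas and epsilon are the library defaults, matching the listing.

```python
# Sketch of TrainingArguments mirroring the listed hyperparameters; not the author's original script.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="queue_detection_cctv",  # placeholder output directory
    learning_rate=5e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="cosine",
    num_train_epochs=100,
    fp16=True,  # "Native AMP" mixed-precision training
    # adam_beta1=0.9, adam_beta2=0.999, adam_epsilon=1e-8 are the defaults, matching the listing above.
)
```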

Training results

The table below reports per-epoch validation metrics; a sketch of how such COCO-style metrics can be computed follows the table.

Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Cashier | Mar 100 Cashier | Map Cx | Mar 100 Cx
No log 1.0 218 1.3927 0.1975 0.3459 0.1995 -1.0 0.0 0.1988 0.2409 0.5283 0.7011 -1.0 0.0 0.7055 0.2115 0.8043 0.1834 0.5979
No log 2.0 436 0.9964 0.5247 0.8011 0.591 -1.0 0.0079 0.5292 0.3316 0.6966 0.7387 -1.0 0.0071 0.7453 0.5772 0.8086 0.4723 0.6688
2.7418 3.0 654 0.8535 0.6031 0.9058 0.6954 -1.0 0.0349 0.6069 0.3603 0.7079 0.733 -1.0 0.2 0.7362 0.6576 0.769 0.5485 0.6969
2.7418 4.0 872 0.7406 0.6499 0.9356 0.752 -1.0 0.0479 0.6543 0.3756 0.7387 0.7586 -1.0 0.0923 0.7634 0.7052 0.7953 0.5947 0.7219
0.8155 5.0 1090 0.6721 0.6731 0.9516 0.8113 -1.0 0.0249 0.6773 0.3819 0.7501 0.7654 -1.0 0.0455 0.7701 0.7451 0.8203 0.601 0.7105
0.8155 6.0 1308 0.5804 0.7244 0.9632 0.8738 0.0 0.0712 0.7288 0.4038 0.7882 0.8023 0.0 0.1731 0.8066 0.7818 0.8419 0.6671 0.7627
0.6668 7.0 1526 0.5430 0.7484 0.9667 0.9041 -1.0 0.076 0.7527 0.417 0.8027 0.813 -1.0 0.2205 0.8171 0.8068 0.8602 0.69 0.7658
0.6668 8.0 1744 0.5524 0.7361 0.9691 0.8958 -1.0 0.0273 0.7416 0.4045 0.7839 0.7933 -1.0 0.1286 0.7981 0.7845 0.8274 0.6877 0.7592
0.6668 9.0 1962 0.5359 0.7415 0.9737 0.901 -1.0 0.0845 0.7462 0.4112 0.7999 0.8044 -1.0 0.1462 0.8088 0.7844 0.8376 0.6986 0.7713
0.5735 10.0 2180 0.5154 0.7497 0.9744 0.907 0.0 0.0368 0.7538 0.414 0.8042 0.8093 0.0 0.1333 0.813 0.8085 0.86 0.6909 0.7586
0.5735 11.0 2398 0.4543 0.7824 0.9754 0.9337 0.0 0.0709 0.7908 0.4307 0.8323 0.8368 0.0 0.1794 0.8449 0.8312 0.8765 0.7336 0.7972
0.5189 12.0 2616 0.4802 0.7679 0.9769 0.9274 0.0 0.1201 0.7724 0.426 0.8197 0.825 0.0 0.1917 0.8291 0.7985 0.85 0.7374 0.8
0.5189 13.0 2834 0.4306 0.7906 0.9825 0.9332 -1.0 0.0708 0.7941 0.435 0.8394 0.8448 -1.0 0.23 0.8474 0.8474 0.889 0.7339 0.8006
0.4874 14.0 3052 0.4660 0.7649 0.9818 0.9264 -1.0 0.0504 0.7713 0.4219 0.8155 0.8222 -1.0 0.0875 0.8288 0.805 0.8527 0.7248 0.7917
0.4874 15.0 3270 0.4392 0.7867 0.9773 0.9278 0.0 0.0256 0.7961 0.4372 0.8336 0.8385 0.0 0.1028 0.8466 0.8243 0.8725 0.7492 0.8045
0.4874 16.0 3488 0.4178 0.8018 0.9847 0.9355 -1.0 0.2037 0.8061 0.4387 0.8493 0.8551 -1.0 0.3714 0.8589 0.8394 0.8881 0.7641 0.822
0.4646 17.0 3706 0.3859 0.8138 0.9838 0.9502 -1.0 0.1217 0.8189 0.4459 0.8584 0.863 -1.0 0.2038 0.8669 0.8508 0.8956 0.7769 0.8303
0.4646 18.0 3924 0.4041 0.7987 0.9822 0.9457 -1.0 0.097 0.8032 0.4378 0.8486 0.8518 -1.0 0.1611 0.8551 0.8323 0.881 0.7652 0.8226
0.4317 19.0 4142 0.4013 0.8086 0.9838 0.9442 -1.0 0.1816 0.814 0.4412 0.8513 0.8557 -1.0 0.2571 0.8605 0.8522 0.8919 0.765 0.8195
0.4317 20.0 4360 0.3869 0.8123 0.9823 0.9388 -1.0 0.1597 0.8163 0.4475 0.8579 0.8617 -1.0 0.2042 0.8653 0.8542 0.896 0.7705 0.8274
0.4215 21.0 4578 0.3721 0.816 0.9864 0.9536 -1.0 0.1206 0.8198 0.4478 0.8598 0.863 -1.0 0.2727 0.8655 0.8607 0.9003 0.7713 0.8258
0.4215 22.0 4796 0.3777 0.8245 0.9806 0.9507 0.0 0.1034 0.8324 0.4537 0.8621 0.8649 0.0 0.2118 0.8724 0.8651 0.9012 0.7839 0.8287
0.3925 23.0 5014 0.3387 0.8411 0.9872 0.9577 -1.0 0.1184 0.845 0.4593 0.8775 0.8799 -1.0 0.2429 0.8835 0.8813 0.9153 0.8008 0.8444
0.3925 24.0 5232 0.3234 0.842 0.9887 0.9671 -1.0 0.1229 0.8463 0.4604 0.8794 0.8812 -1.0 0.1864 0.885 0.8736 0.909 0.8104 0.8534
0.3925 25.0 5450 0.3463 0.8356 0.9869 0.9556 -1.0 0.0775 0.8411 0.4552 0.8769 0.8793 -1.0 0.1929 0.8838 0.8788 0.913 0.7925 0.8456
0.3676 26.0 5668 0.3170 0.846 0.988 0.9666 0.0 0.1172 0.8515 0.4603 0.886 0.8872 0.0 0.285 0.8907 0.8831 0.9182 0.8089 0.8562
0.3676 27.0 5886 0.3552 0.8246 0.9832 0.9545 -1.0 0.13 0.8285 0.4535 0.8704 0.8745 -1.0 0.2367 0.8785 0.8559 0.9005 0.7932 0.8484
0.3669 28.0 6104 0.3342 0.8427 0.9876 0.9665 -1.0 0.1369 0.8468 0.4585 0.8813 0.8843 -1.0 0.2625 0.8874 0.8587 0.898 0.8267 0.8707
0.3669 29.0 6322 0.3033 0.854 0.9892 0.9687 -1.0 0.1795 0.8572 0.4663 0.8954 0.8968 -1.0 0.3 0.8991 0.8813 0.9193 0.8268 0.8744
0.349 30.0 6540 0.3099 0.8515 0.9863 0.9676 -1.0 0.1251 0.8571 0.4666 0.8917 0.8936 -1.0 0.2 0.8978 0.8868 0.9261 0.8162 0.8611
0.349 31.0 6758 0.3247 0.842 0.9884 0.963 0.0 0.1145 0.8491 0.4607 0.8828 0.8854 0.0 0.1462 0.8916 0.8704 0.9104 0.8137 0.8605
0.349 32.0 6976 0.2943 0.8529 0.9887 0.9651 -1.0 0.1639 0.8587 0.4683 0.8916 0.8949 -1.0 0.225 0.8997 0.89 0.9246 0.8158 0.8653
0.3378 33.0 7194 0.2923 0.8605 0.989 0.9695 -1.0 0.1212 0.8657 0.4687 0.8985 0.9006 -1.0 0.2136 0.9042 0.8893 0.9257 0.8317 0.8756
0.3378 34.0 7412 0.2878 0.8616 0.9895 0.9673 -1.0 0.1464 0.8665 0.4712 0.897 0.899 -1.0 0.2 0.9036 0.8907 0.9246 0.8325 0.8734
0.3206 35.0 7630 0.3342 0.837 0.9866 0.9674 -1.0 0.1634 0.8423 0.4584 0.8772 0.8802 -1.0 0.2611 0.8844 0.8684 0.906 0.8057 0.8544
0.3206 36.0 7848 0.2796 0.8713 0.989 0.9716 -1.0 0.1054 0.8759 0.4699 0.9066 0.9084 -1.0 0.15 0.9128 0.9052 0.9373 0.8373 0.8795
0.3152 37.0 8066 0.2894 0.8667 0.987 0.9746 0.0 0.1359 0.8743 0.4716 0.9022 0.9037 0.0 0.1667 0.9109 0.8966 0.9309 0.8367 0.8765
0.3152 38.0 8284 0.2641 0.8744 0.9894 0.9722 -1.0 0.1413 0.8793 0.4727 0.9132 0.9148 -1.0 0.2333 0.9178 0.8909 0.9305 0.858 0.8992
0.3082 39.0 8502 0.2834 0.8703 0.9873 0.9702 -1.0 0.132 0.8764 0.473 0.9082 0.9128 -1.0 0.2633 0.9168 0.8988 0.9347 0.8417 0.891
0.3082 40.0 8720 0.2774 0.8655 0.9897 0.9738 -1.0 0.2021 0.8711 0.4694 0.9025 0.9043 -1.0 0.275 0.9081 0.8971 0.9314 0.8339 0.8772
0.3082 41.0 8938 0.2935 0.8598 0.988 0.9699 -1.0 0.0999 0.8666 0.4688 0.8961 0.8976 -1.0 0.15 0.9037 0.8889 0.9255 0.8308 0.8697
0.3078 42.0 9156 0.2746 0.868 0.9895 0.9777 -1.0 0.2159 0.8738 0.4712 0.9021 0.9032 -1.0 0.275 0.9079 0.9016 0.933 0.8343 0.8734
0.3078 43.0 9374 0.2662 0.8731 0.9897 0.9798 -1.0 0.1849 0.8794 0.4752 0.9083 0.9091 -1.0 0.2 0.9136 0.888 0.9206 0.8582 0.8975
0.2898 44.0 9592 0.2564 0.8824 0.9868 0.9732 -1.0 0.1263 0.8871 0.4775 0.9148 0.9165 -1.0 0.15 0.9211 0.9076 0.9377 0.8571 0.8954
0.2898 45.0 9810 0.2813 0.8753 0.9876 0.977 0.0 0.1325 0.8817 0.4714 0.911 0.9123 0.0 0.2167 0.9179 0.9042 0.9381 0.8464 0.8865
0.2758 46.0 10028 0.2633 0.8786 0.9872 0.9719 0.0 0.1841 0.8854 0.4758 0.9164 0.9177 0.0 0.2615 0.9218 0.9012 0.9374 0.856 0.898
0.2758 47.0 10246 0.2479 0.8795 0.9895 0.9765 0.0 0.2066 0.8849 0.4765 0.9146 0.9171 0.0 0.275 0.9207 0.9114 0.9448 0.8476 0.8893
0.2758 48.0 10464 0.2373 0.8894 0.9897 0.9799 -1.0 0.1994 0.8939 0.4795 0.9253 0.926 -1.0 0.2545 0.9293 0.9076 0.9431 0.8713 0.909
0.2708 49.0 10682 0.2538 0.8846 0.9893 0.9793 0.0 0.2669 0.8903 0.4799 0.9213 0.9224 0.0 0.315 0.9284 0.9052 0.9383 0.8641 0.9065
0.2708 50.0 10900 0.2445 0.8919 0.9896 0.9745 -1.0 0.2193 0.8972 0.4765 0.9228 0.925 -1.0 0.3969 0.9294 0.9239 0.9511 0.8599 0.8989
0.2595 51.0 11118 0.2110 0.9037 0.99 0.9845 -1.0 0.2267 0.9093 0.4882 0.9339 0.9346 -1.0 0.25 0.9374 0.9299 0.9574 0.8776 0.9117
0.2595 52.0 11336 0.2374 0.897 0.99 0.9792 -1.0 0.2066 0.9029 0.48 0.9267 0.9285 -1.0 0.3179 0.9335 0.9257 0.9531 0.8684 0.9039
0.2378 53.0 11554 0.2517 0.8826 0.9894 0.9716 -1.0 0.1494 0.8901 0.4782 0.9162 0.9188 -1.0 0.2475 0.9242 0.9152 0.9455 0.8501 0.892
0.2378 54.0 11772 0.2260 0.8971 0.9899 0.9771 -1.0 0.1848 0.9029 0.4825 0.9304 0.9315 -1.0 0.2077 0.936 0.9255 0.9544 0.8687 0.9087
0.2378 55.0 11990 0.2144 0.9118 0.9899 0.9844 -1.0 0.2843 0.9158 0.4875 0.9417 0.9435 -1.0 0.3333 0.9456 0.9351 0.9608 0.8885 0.9263
0.2494 56.0 12208 0.2028 0.9107 0.9897 0.9814 0.0 0.1831 0.9168 0.4906 0.9395 0.9414 0.0 0.22 0.9466 0.935 0.9585 0.8864 0.9243
0.2494 57.0 12426 0.2341 0.8897 0.9897 0.9812 -1.0 0.1783 0.8932 0.4822 0.9242 0.926 -1.0 0.2154 0.9303 0.9168 0.948 0.8625 0.9039
0.2228 58.0 12644 0.2075 0.9084 0.9899 0.9792 -1.0 0.1741 0.9142 0.4899 0.9375 0.9379 -1.0 0.2308 0.9421 0.932 0.9581 0.8849 0.9177
0.2228 59.0 12862 0.2059 0.9096 0.9896 0.9803 0.0 0.2969 0.9138 0.4893 0.9375 0.9395 0.0 0.31 0.9431 0.9311 0.957 0.8881 0.9219
0.2218 60.0 13080 0.2028 0.9136 0.9899 0.984 -1.0 0.2316 0.9164 0.4875 0.9408 0.9416 -1.0 0.295 0.9442 0.9433 0.9654 0.884 0.9177
0.2218 61.0 13298 0.2013 0.911 0.99 0.9786 -1.0 0.253 0.9158 0.4904 0.9388 0.94 -1.0 0.3 0.9435 0.9325 0.9572 0.8895 0.9228
0.2238 62.0 13516 0.2033 0.9134 0.9899 0.9825 0.0 0.2228 0.9199 0.4896 0.9426 0.9438 0.0 0.2667 0.9484 0.9367 0.9624 0.8902 0.9252
0.2238 63.0 13734 0.1893 0.9216 0.99 0.9836 -1.0 0.1905 0.9271 0.4942 0.9509 0.9512 -1.0 0.235 0.9546 0.9403 0.9664 0.9029 0.9361
0.2238 64.0 13952 0.1893 0.9267 0.9898 0.9835 0.0 0.2342 0.9317 0.4957 0.9524 0.9536 0.0 0.2583 0.9585 0.9491 0.971 0.9043 0.9363
0.2131 65.0 14170 0.1769 0.9322 0.9901 0.9847 -1.0 0.2413 0.9349 0.4982 0.9554 0.9559 -1.0 0.2864 0.959 0.9463 0.9673 0.9181 0.9445
0.2131 66.0 14388 0.1848 0.9312 0.9898 0.9842 0.0 0.2901 0.9358 0.4973 0.9545 0.9551 0.0 0.425 0.9591 0.9517 0.9709 0.9107 0.9394
0.2038 67.0 14606 0.1809 0.9277 0.9899 0.9815 0.0 0.2354 0.9329 0.4951 0.9524 0.9539 0.0 0.2846 0.9586 0.9441 0.9668 0.9112 0.9411
0.2038 68.0 14824 0.1831 0.9178 0.9899 0.98 0.0 0.1728 0.9256 0.4922 0.9472 0.9483 0.0 0.23 0.9538 0.9396 0.9646 0.896 0.9319
0.1995 69.0 15042 0.1631 0.934 0.9901 0.9861 -1.0 0.2804 0.9405 0.4982 0.9574 0.9583 -1.0 0.325 0.9615 0.954 0.9729 0.914 0.9438
0.1995 70.0 15260 0.1685 0.9293 0.9899 0.9846 -1.0 0.2397 0.935 0.4964 0.9546 0.9553 -1.0 0.2714 0.9593 0.948 0.9698 0.9105 0.9408
0.1995 71.0 15478 0.1629 0.9371 0.9901 0.9842 -1.0 0.2541 0.942 0.498 0.9603 0.9609 -1.0 0.4964 0.965 0.954 0.9741 0.9202 0.9477
0.1877 72.0 15696 0.1606 0.944 0.9901 0.9846 -1.0 0.277 0.9469 0.4988 0.9636 0.9642 -1.0 0.3038 0.9676 0.96 0.9758 0.9281 0.9527
0.1877 73.0 15914 0.1532 0.9389 0.99 0.9806 0.0 0.2592 0.9446 0.5009 0.961 0.962 0.0 0.3133 0.9662 0.9564 0.9749 0.9214 0.9492
0.1912 74.0 16132 0.1434 0.9488 0.995 0.9934 -1.0 0.5552 0.9507 0.5033 0.9673 0.9675 -1.0 0.7182 0.969 0.9639 0.9786 0.9336 0.9563
0.1912 75.0 16350 0.1726 0.9309 0.9901 0.9832 -1.0 0.216 0.9344 0.4964 0.9568 0.9578 -1.0 0.2611 0.9607 0.9539 0.9747 0.9079 0.941
0.1859 76.0 16568 0.1587 0.9378 0.9901 0.9847 -1.0 0.1684 0.944 0.4994 0.9601 0.9607 -1.0 0.2382 0.9662 0.952 0.9715 0.9237 0.9499
0.1859 77.0 16786 0.1378 0.9509 0.9901 0.9845 -1.0 0.2089 0.959 0.5047 0.9688 0.9691 -1.0 0.2353 0.9748 0.9666 0.9823 0.9352 0.9559
0.1747 78.0 17004 0.1416 0.9478 0.9901 0.985 0.0 0.3334 0.9521 0.5039 0.9685 0.9692 0.0 0.35 0.9719 0.9617 0.9799 0.9338 0.9586
0.1747 79.0 17222 0.1615 0.9376 0.9949 0.9873 -1.0 0.5057 0.9406 0.5003 0.9599 0.9607 -1.0 0.5688 0.9644 0.9583 0.9746 0.917 0.9469
0.1747 80.0 17440 0.1482 0.9427 0.99 0.9823 -1.0 0.1933 0.9499 0.5025 0.9639 0.9642 -1.0 0.2321 0.9689 0.9566 0.9762 0.9289 0.9521
0.1707 81.0 17658 0.1379 0.9518 0.9901 0.9894 -1.0 0.2838 0.956 0.504 0.97 0.9702 -1.0 0.3 0.9742 0.965 0.9787 0.9386 0.9618
0.1707 82.0 17876 0.1384 0.9478 0.9901 0.9846 -1.0 0.2518 0.9545 0.504 0.9687 0.9691 -1.0 0.2643 0.9734 0.9612 0.9787 0.9344 0.9595
0.1658 83.0 18094 0.1379 0.9532 0.9901 0.9845 -1.0 0.2543 0.9567 0.5043 0.9707 0.9714 -1.0 0.2708 0.975 0.9655 0.981 0.9408 0.9617
0.1658 84.0 18312 0.1325 0.9544 0.9901 0.9845 0.0 0.256 0.9597 0.5047 0.9712 0.972 0.0 0.3036 0.9762 0.9672 0.9811 0.9417 0.9628
0.1532 85.0 18530 0.1558 0.9452 0.99 0.9845 -1.0 0.2469 0.9495 0.5009 0.9648 0.9657 -1.0 0.2769 0.9695 0.9584 0.9749 0.932 0.9565
0.1532 86.0 18748 0.1228 0.9538 0.9901 0.9841 -1.0 0.3437 0.9585 0.5056 0.972 0.9726 -1.0 0.3727 0.9747 0.9642 0.9806 0.9434 0.9646
0.1532 87.0 18966 0.1317 0.9587 0.9901 0.9844 0.0 0.4141 0.965 0.5064 0.9738 0.974 0.0 0.4517 0.9791 0.9676 0.9815 0.9498 0.9664
0.1574 88.0 19184 0.1318 0.9508 0.9901 0.9845 0.0 0.2545 0.9581 0.5059 0.9705 0.9706 0.0 0.2962 0.9747 0.9594 0.9778 0.9422 0.9633
0.1574 89.0 19402 0.1424 0.9513 0.9899 0.984 -1.0 0.2362 0.9547 0.5034 0.9691 0.9695 -1.0 0.2875 0.9729 0.9636 0.9786 0.939 0.9603
0.1537 90.0 19620 0.1240 0.9565 0.9901 0.9896 -1.0 0.5053 0.9592 0.5066 0.9747 0.9752 -1.0 0.55 0.9771 0.9669 0.9823 0.9461 0.9681
0.1537 91.0 19838 0.1382 0.947 0.9901 0.9835 0.0 0.5316 0.9504 0.5018 0.9681 0.9683 0.0 0.555 0.9712 0.9622 0.9775 0.9319 0.9592
0.1547 92.0 20056 0.1276 0.9565 0.9901 0.983 -1.0 0.3161 0.9618 0.5058 0.9742 0.9743 -1.0 0.3458 0.977 0.9668 0.9818 0.9462 0.9669
0.1547 93.0 20274 0.1329 0.9539 0.99 0.9836 -1.0 0.2997 0.9593 0.5053 0.9718 0.9728 -1.0 0.3318 0.9754 0.9679 0.982 0.9398 0.9635
0.1547 94.0 20492 0.1348 0.9571 0.99 0.9846 -1.0 0.3267 0.9615 0.5039 0.9732 0.9737 -1.0 0.3625 0.9761 0.9678 0.9823 0.9463 0.9652
0.1513 95.0 20710 0.1251 0.9546 0.9901 0.9844 0.0 0.2549 0.9626 0.5049 0.9728 0.9731 0.0 0.2625 0.9775 0.965 0.981 0.9442 0.9652
0.1513 96.0 20928 0.1264 0.9594 0.9901 0.9899 0.0 0.327 0.9631 0.5068 0.9755 0.9763 0.0 0.3409 0.9794 0.9696 0.9842 0.9492 0.9683
0.1635 97.0 21146 0.1306 0.9515 0.9901 0.9843 -1.0 0.2685 0.9561 0.5041 0.9696 0.9703 -1.0 0.2857 0.9742 0.9626 0.9795 0.9404 0.9611
0.1635 98.0 21364 0.1410 0.9481 0.9899 0.9788 0.0 0.4025 0.9542 0.5031 0.9662 0.9678 0.0 0.4458 0.9722 0.9621 0.9789 0.9341 0.9567
0.1505 99.0 21582 0.1253 0.9571 0.9901 0.984 -1.0 0.3105 0.962 0.5066 0.9737 0.974 -1.0 0.3375 0.9777 0.9702 0.9832 0.944 0.9648
0.1505 100.0 21800 0.1291 0.9532 0.9901 0.9845 -1.0 0.3203 0.9578 0.5044 0.9715 0.972 -1.0 0.3538 0.9747 0.9618 0.9775 0.9447 0.9664
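
The card does not state how these metrics were produced, but the column names follow the COCO-style detection metrics reported by tools such as torchmetrics' MeanAveragePrecision (mAP at IoU 0.50:0.95, at 0.50 and 0.75, size-bucketed mAP/mAR, mAR at 1/10/100 detections, and per-class values for Cashier and Cx); the -1.0 entries indicate that no ground-truth boxes fall in that size bucket. The snippet below is a hedged sketch of computing the same family of metrics; the box coordinates and class ids are illustrative assumptions.

```python
# Illustrative sketch of COCO-style detection metrics with torchmetrics;
# not necessarily how the values in the table above were produced.
import torch
from torchmetrics.detection import MeanAveragePrecision

metric = MeanAveragePrecision(box_format="xyxy", class_metrics=True)

# One image's worth of predictions and ground truth (class ids 0/1 for Cashier/Cx are assumptions).
preds = [{
    "boxes": torch.tensor([[10.0, 20.0, 110.0, 220.0]]),
    "scores": torch.tensor([0.92]),
    "labels": torch.tensor([0]),
}]
targets = [{
    "boxes": torch.tensor([[12.0, 18.0, 108.0, 225.0]]),
    "labels": torch.tensor([0]),
}]

metric.update(preds, targets)
results = metric.compute()
print(results["map"], results["map_50"], results["map_75"], results["mar_100"])
```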

Framework versions

  • Transformers 4.42.3
  • Pytorch 2.3.1+cu121
  • Datasets 2.20.0
  • Tokenizers 0.19.1

Model size

  • 43.5M parameters (F32, Safetensors)