
chickens-60-epoch-1000-images-aug

This model is a fine-tuned version of facebook/detr-resnet-50 on an object detection dataset with three classes (chicken, duck, plant); the dataset itself is not otherwise documented here. It achieves the following results on the evaluation set (a short note on how these COCO-style metrics can be computed follows the list):

  • Loss: 0.2851
  • Map: 0.8003
  • Map 50: 0.9645
  • Map 75: 0.9196
  • Map Small: 0.2246
  • Map Medium: 0.7999
  • Map Large: 0.8772
  • Mar 1: 0.3086
  • Mar 10: 0.8346
  • Mar 100: 0.8384
  • Mar Small: 0.3614
  • Mar Medium: 0.8496
  • Mar Large: 0.918
  • Map Chicken: 0.8072
  • Mar 100 Chicken: 0.8413
  • Map Duck: 0.7689
  • Mar 100 Duck: 0.799
  • Map Plant: 0.8248
  • Mar 100 Plant: 0.8749
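
For reference, Map/Mar are COCO-style mean average precision and mean average recall: Map 50 and Map 75 are AP at IoU thresholds of 0.50 and 0.75, Mar 1/10/100 are recall limited to 1/10/100 detections per image, and the Small/Medium/Large splits bucket objects by box area. Below is a minimal sketch of computing such metrics with torchmetrics; the library choice and the example boxes are illustrative assumptions, not something stated in this card.

```python
import torch
from torchmetrics.detection import MeanAveragePrecision

# Hypothetical predictions and ground truth for a single image,
# boxes given as (xmin, ymin, xmax, ymax) pixel coordinates.
preds = [{
    "boxes": torch.tensor([[10.0, 20.0, 110.0, 220.0]]),
    "scores": torch.tensor([0.92]),
    "labels": torch.tensor([0]),  # e.g. 0 = chicken
}]
targets = [{
    "boxes": torch.tensor([[12.0, 18.0, 108.0, 215.0]]),
    "labels": torch.tensor([0]),
}]

# class_metrics=True also reports per-class values, analogous to the
# "Map Chicken" / "Mar 100 Duck" entries above.
metric = MeanAveragePrecision(box_format="xyxy", iou_type="bbox", class_metrics=True)
metric.update(preds, targets)
results = metric.compute()
print(results["map"], results["map_50"], results["map_75"], results["mar_100"])
```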

Model description

More information needed

Intended uses & limitations

More information needed
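
As a DETR-style object detector, the model predicts bounding boxes with one of the three labels reported above (chicken, duck, plant). A minimal inference sketch follows, assuming the checkpoint is published on the Hugging Face Hub as joe611/chickens-60-epoch-1000-images-aug and that example.jpg is a placeholder input image.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

checkpoint = "joe611/chickens-60-epoch-1000-images-aug"  # assumed Hub repo id
processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForObjectDetection.from_pretrained(checkpoint)
model.eval()

image = Image.open("example.jpg")  # placeholder image path
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Turn raw logits and normalized boxes into absolute (xmin, ymin, xmax, ymax) detections.
target_sizes = torch.tensor([image.size[::-1]])  # (height, width)
detections = processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=target_sizes
)[0]
for score, label, box in zip(detections["scores"], detections["labels"], detections["boxes"]):
    print(model.config.id2label[label.item()], round(score.item(), 3), [round(v, 1) for v in box.tolist()])
```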

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 1e-05
  • train_batch_size: 2
  • eval_batch_size: 8
  • seed: 42
  • optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: cosine
  • num_epochs: 60
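
These values map directly onto TrainingArguments in transformers. The sketch below reproduces them under that assumption; output_dir is a placeholder, and the card's train_batch_size is treated as a per-device batch size.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="chickens-60-epoch-1000-images-aug",  # placeholder
    learning_rate=1e-5,
    per_device_train_batch_size=2,   # card's train_batch_size, assumed per-device
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    num_train_epochs=60,
)
```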

Training results

| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Chicken | Mar 100 Chicken | Map Duck | Mar 100 Duck | Map Plant | Mar 100 Plant |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 1.4811 | 1.0 | 500 | 1.3713 | 0.0772 | 0.1135 | 0.0875 | 0.004 | 0.0389 | 0.3534 | 0.0527 | 0.1857 | 0.2513 | 0.0875 | 0.2276 | 0.7937 | 0.0069 | 0.0062 | 0.0 | 0.0 | 0.2247 | 0.7476 |
| 1.093 | 2.0 | 1000 | 1.1411 | 0.1941 | 0.2637 | 0.221 | 0.0042 | 0.134 | 0.6715 | 0.0688 | 0.2331 | 0.2586 | 0.1167 | 0.23 | 0.8259 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5822 | 0.7758 |
| 1.06 | 3.0 | 1500 | 1.5299 | 0.1865 | 0.2571 | 0.2063 | 0.0172 | 0.1497 | 0.6155 | 0.0695 | 0.2154 | 0.2187 | 0.0583 | 0.199 | 0.6971 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5595 | 0.6562 |
| 0.866 | 4.0 | 2000 | 1.0132 | 0.2298 | 0.307 | 0.2534 | 0.0251 | 0.1936 | 0.7269 | 0.0897 | 0.2671 | 0.2739 | 0.0688 | 0.2449 | 0.8172 | 0.0329 | 0.0613 | 0.0 | 0.0 | 0.6564 | 0.7602 |
| 0.8301 | 5.0 | 2500 | 0.9037 | 0.2895 | 0.3972 | 0.3375 | 0.0486 | 0.25 | 0.7397 | 0.1158 | 0.3495 | 0.3568 | 0.1063 | 0.3238 | 0.8305 | 0.2025 | 0.3062 | 0.0 | 0.0 | 0.6662 | 0.7643 |
| 0.8386 | 6.0 | 3000 | 0.9659 | 0.3229 | 0.4625 | 0.3953 | 0.012 | 0.2944 | 0.7146 | 0.1161 | 0.3938 | 0.3977 | 0.0333 | 0.3785 | 0.7837 | 0.3161 | 0.4693 | 0.0 | 0.0 | 0.6527 | 0.7236 |
| 0.9838 | 7.0 | 3500 | 0.7706 | 0.3831 | 0.5318 | 0.4649 | 0.0314 | 0.3525 | 0.7573 | 0.1347 | 0.4783 | 0.4829 | 0.0854 | 0.4636 | 0.8234 | 0.4609 | 0.6889 | 0.0 | 0.0 | 0.6886 | 0.7599 |
| 0.8051 | 8.0 | 4000 | 0.7424 | 0.3909 | 0.5361 | 0.4744 | 0.0298 | 0.362 | 0.7506 | 0.1396 | 0.489 | 0.4921 | 0.1063 | 0.4762 | 0.8222 | 0.4864 | 0.712 | 0.0 | 0.0 | 0.6864 | 0.7643 |
| 0.7114 | 9.0 | 4500 | 0.6860 | 0.425 | 0.5595 | 0.5062 | 0.0656 | 0.4023 | 0.7822 | 0.1429 | 0.5123 | 0.5168 | 0.1104 | 0.5014 | 0.849 | 0.5552 | 0.7609 | 0.0 | 0.0 | 0.7197 | 0.7896 |
| 0.8088 | 10.0 | 5000 | 0.6922 | 0.4107 | 0.5706 | 0.5008 | 0.0425 | 0.3953 | 0.7653 | 0.1375 | 0.4929 | 0.4981 | 0.1333 | 0.4833 | 0.8272 | 0.527 | 0.7218 | 0.0 | 0.0 | 0.705 | 0.7726 |
| 0.7049 | 11.0 | 5500 | 0.6989 | 0.4204 | 0.5653 | 0.5121 | 0.0859 | 0.4022 | 0.7752 | 0.1439 | 0.4952 | 0.499 | 0.1458 | 0.4863 | 0.8293 | 0.5406 | 0.7182 | 0.0 | 0.0 | 0.7205 | 0.7787 |
| 0.7244 | 12.0 | 6000 | 0.6311 | 0.4276 | 0.584 | 0.5016 | 0.0749 | 0.4104 | 0.7945 | 0.1445 | 0.4989 | 0.5035 | 0.1542 | 0.4876 | 0.8515 | 0.5447 | 0.7133 | 0.0 | 0.0 | 0.7382 | 0.7971 |
| 0.683 | 13.0 | 6500 | 0.6244 | 0.4371 | 0.5962 | 0.5288 | 0.0877 | 0.4149 | 0.7935 | 0.1447 | 0.5002 | 0.5031 | 0.1312 | 0.486 | 0.8427 | 0.5784 | 0.7244 | 0.0 | 0.0 | 0.7329 | 0.7847 |
| 0.6541 | 14.0 | 7000 | 0.5543 | 0.4719 | 0.6191 | 0.5555 | 0.0712 | 0.4525 | 0.8231 | 0.1494 | 0.5195 | 0.5249 | 0.1896 | 0.5025 | 0.8715 | 0.6567 | 0.7644 | 0.0 | 0.0 | 0.759 | 0.8104 |
| 0.6219 | 15.0 | 7500 | 0.5368 | 0.4754 | 0.6197 | 0.5553 | 0.0764 | 0.4584 | 0.825 | 0.1528 | 0.5216 | 0.5265 | 0.1688 | 0.5038 | 0.8711 | 0.6691 | 0.7724 | 0.0 | 0.0 | 0.7571 | 0.8072 |
| 0.5842 | 16.0 | 8000 | 0.5325 | 0.4778 | 0.6269 | 0.5668 | 0.1147 | 0.4558 | 0.8015 | 0.1501 | 0.5178 | 0.5218 | 0.1604 | 0.5071 | 0.8556 | 0.6922 | 0.7636 | 0.0 | 0.0 | 0.7412 | 0.8017 |
| 0.5704 | 17.0 | 8500 | 0.5437 | 0.5192 | 0.6982 | 0.6149 | 0.0616 | 0.5014 | 0.8084 | 0.1798 | 0.558 | 0.5618 | 0.1521 | 0.5445 | 0.8644 | 0.6772 | 0.7449 | 0.1347 | 0.1412 | 0.7456 | 0.7994 |
| 0.5683 | 18.0 | 9000 | 0.5068 | 0.6324 | 0.8451 | 0.7659 | 0.0963 | 0.6253 | 0.8208 | 0.225 | 0.6739 | 0.6793 | 0.175 | 0.6808 | 0.8628 | 0.6996 | 0.7573 | 0.4404 | 0.4753 | 0.7573 | 0.8052 |
| 0.6402 | 19.0 | 9500 | 0.4682 | 0.6741 | 0.8823 | 0.8298 | 0.1357 | 0.6748 | 0.8274 | 0.2516 | 0.7135 | 0.7185 | 0.2104 | 0.7246 | 0.8728 | 0.7195 | 0.7698 | 0.5335 | 0.567 | 0.7691 | 0.8187 |
| 0.5664 | 20.0 | 10000 | 0.4793 | 0.6841 | 0.9057 | 0.8277 | 0.135 | 0.6878 | 0.8164 | 0.2585 | 0.7299 | 0.7341 | 0.2396 | 0.7463 | 0.8649 | 0.7325 | 0.7853 | 0.5558 | 0.5979 | 0.7638 | 0.819 |
| 0.4411 | 21.0 | 10500 | 0.4448 | 0.7042 | 0.932 | 0.8592 | 0.1098 | 0.7039 | 0.8287 | 0.2789 | 0.7527 | 0.7568 | 0.1718 | 0.7703 | 0.8749 | 0.7128 | 0.7658 | 0.6338 | 0.6845 | 0.766 | 0.8202 |
| 0.6106 | 22.0 | 11000 | 0.4142 | 0.7307 | 0.9307 | 0.8797 | 0.0773 | 0.735 | 0.8381 | 0.2841 | 0.7736 | 0.7783 | 0.2062 | 0.7946 | 0.8866 | 0.7379 | 0.7853 | 0.6726 | 0.7134 | 0.7817 | 0.836 |
| 0.5243 | 23.0 | 11500 | 0.4353 | 0.7183 | 0.9406 | 0.86 | 0.0901 | 0.7236 | 0.8416 | 0.2827 | 0.7615 | 0.767 | 0.1973 | 0.779 | 0.8879 | 0.7338 | 0.7827 | 0.6385 | 0.6845 | 0.7827 | 0.8337 |
| 0.5184 | 24.0 | 12000 | 0.4077 | 0.7097 | 0.9464 | 0.854 | 0.1197 | 0.7156 | 0.8335 | 0.2741 | 0.757 | 0.7607 | 0.2553 | 0.7738 | 0.8828 | 0.7126 | 0.7667 | 0.6338 | 0.6784 | 0.7828 | 0.8372 |
| 0.4849 | 25.0 | 12500 | 0.4043 | 0.7096 | 0.949 | 0.8366 | 0.1234 | 0.7084 | 0.8412 | 0.2739 | 0.7538 | 0.7611 | 0.258 | 0.7703 | 0.8891 | 0.7483 | 0.7902 | 0.596 | 0.6546 | 0.7843 | 0.8383 |
| 0.5022 | 26.0 | 13000 | 0.3884 | 0.7394 | 0.9528 | 0.8847 | 0.1472 | 0.7337 | 0.8473 | 0.2918 | 0.7816 | 0.7876 | 0.2549 | 0.7888 | 0.8971 | 0.7466 | 0.7871 | 0.688 | 0.7402 | 0.7838 | 0.8354 |
| 0.521 | 27.0 | 13500 | 0.4197 | 0.7177 | 0.9434 | 0.8715 | 0.132 | 0.7168 | 0.8353 | 0.2879 | 0.7639 | 0.7697 | 0.2623 | 0.7777 | 0.8799 | 0.7073 | 0.7649 | 0.6685 | 0.7165 | 0.7771 | 0.8277 |
| 0.5433 | 28.0 | 14000 | 0.3886 | 0.7454 | 0.9508 | 0.8823 | 0.2083 | 0.7406 | 0.8448 | 0.292 | 0.7833 | 0.789 | 0.3064 | 0.7952 | 0.8845 | 0.7573 | 0.8004 | 0.6941 | 0.733 | 0.785 | 0.8334 |
| 0.3889 | 29.0 | 14500 | 0.3713 | 0.7492 | 0.9553 | 0.8998 | 0.2224 | 0.7492 | 0.8468 | 0.2891 | 0.7873 | 0.7936 | 0.3112 | 0.8026 | 0.8921 | 0.7677 | 0.8053 | 0.6849 | 0.7299 | 0.7951 | 0.8455 |
| 0.5103 | 30.0 | 15000 | 0.3556 | 0.7584 | 0.9576 | 0.9014 | 0.219 | 0.7509 | 0.8517 | 0.2939 | 0.7954 | 0.8015 | 0.3089 | 0.8078 | 0.8979 | 0.7654 | 0.8022 | 0.7165 | 0.7557 | 0.7934 | 0.8467 |
| 0.4458 | 31.0 | 15500 | 0.3681 | 0.7355 | 0.9518 | 0.8831 | 0.1416 | 0.7311 | 0.8569 | 0.292 | 0.7773 | 0.7814 | 0.2358 | 0.7892 | 0.9008 | 0.734 | 0.7742 | 0.6771 | 0.7247 | 0.7955 | 0.8452 |
| 0.4369 | 32.0 | 16000 | 0.3523 | 0.7495 | 0.9499 | 0.8877 | 0.142 | 0.7447 | 0.8514 | 0.2935 | 0.7877 | 0.7923 | 0.2589 | 0.8026 | 0.895 | 0.765 | 0.8027 | 0.6903 | 0.7309 | 0.7932 | 0.8432 |
| 0.447 | 33.0 | 16500 | 0.3665 | 0.7448 | 0.954 | 0.8912 | 0.1577 | 0.7412 | 0.8505 | 0.2922 | 0.7844 | 0.7879 | 0.2742 | 0.8005 | 0.8929 | 0.7453 | 0.7813 | 0.6933 | 0.7361 | 0.796 | 0.8464 |
| 0.4692 | 34.0 | 17000 | 0.3455 | 0.7589 | 0.954 | 0.899 | 0.1729 | 0.7459 | 0.863 | 0.2949 | 0.798 | 0.8038 | 0.3123 | 0.8068 | 0.9033 | 0.7709 | 0.8093 | 0.7058 | 0.7546 | 0.7999 | 0.8476 |
| 0.4272 | 35.0 | 17500 | 0.3381 | 0.767 | 0.9568 | 0.903 | 0.1734 | 0.7623 | 0.852 | 0.2948 | 0.802 | 0.8061 | 0.2907 | 0.8136 | 0.8962 | 0.7842 | 0.8204 | 0.7246 | 0.7546 | 0.7922 | 0.8432 |
| 0.4021 | 36.0 | 18000 | 0.3323 | 0.7686 | 0.9551 | 0.8938 | 0.1892 | 0.7621 | 0.8616 | 0.2969 | 0.8025 | 0.8067 | 0.3049 | 0.8122 | 0.9025 | 0.7776 | 0.8133 | 0.7245 | 0.7577 | 0.8038 | 0.849 |
| 0.4582 | 37.0 | 18500 | 0.3263 | 0.7732 | 0.9547 | 0.9023 | 0.1477 | 0.7748 | 0.8742 | 0.3015 | 0.8092 | 0.8132 | 0.2634 | 0.8251 | 0.9163 | 0.7729 | 0.812 | 0.7298 | 0.7619 | 0.817 | 0.8657 |
| 0.3992 | 38.0 | 19000 | 0.3207 | 0.7799 | 0.956 | 0.9064 | 0.1767 | 0.7823 | 0.873 | 0.3014 | 0.8168 | 0.8209 | 0.3028 | 0.8312 | 0.9172 | 0.7833 | 0.8218 | 0.7384 | 0.7732 | 0.818 | 0.8677 |
| 0.4286 | 39.0 | 19500 | 0.3194 | 0.7717 | 0.9567 | 0.8986 | 0.1626 | 0.7677 | 0.8779 | 0.3033 | 0.8101 | 0.8139 | 0.2835 | 0.82 | 0.918 | 0.7752 | 0.8204 | 0.7223 | 0.7598 | 0.8175 | 0.8614 |
| 0.4488 | 40.0 | 20000 | 0.3184 | 0.7718 | 0.9566 | 0.9047 | 0.1921 | 0.7702 | 0.8809 | 0.2999 | 0.8092 | 0.8141 | 0.3002 | 0.8238 | 0.9192 | 0.7776 | 0.8187 | 0.7156 | 0.7546 | 0.8223 | 0.8689 |
| 0.3763 | 41.0 | 20500 | 0.3055 | 0.7876 | 0.956 | 0.9186 | 0.1841 | 0.7824 | 0.8844 | 0.3033 | 0.8207 | 0.8254 | 0.3061 | 0.8339 | 0.9234 | 0.7973 | 0.8356 | 0.7394 | 0.7691 | 0.8262 | 0.8715 |
| 0.5658 | 42.0 | 21000 | 0.3014 | 0.791 | 0.9594 | 0.9167 | 0.2095 | 0.7911 | 0.8786 | 0.3032 | 0.8244 | 0.8299 | 0.3403 | 0.8403 | 0.9213 | 0.805 | 0.8404 | 0.7421 | 0.7742 | 0.8259 | 0.8749 |
| 0.4322 | 43.0 | 21500 | 0.2974 | 0.7974 | 0.9595 | 0.9169 | 0.1879 | 0.7927 | 0.8951 | 0.3078 | 0.8296 | 0.834 | 0.3085 | 0.8411 | 0.931 | 0.8005 | 0.8369 | 0.7586 | 0.7876 | 0.8331 | 0.8775 |
| 0.7057 | 44.0 | 22000 | 0.3092 | 0.7822 | 0.9563 | 0.9171 | 0.1985 | 0.7813 | 0.8688 | 0.3003 | 0.8165 | 0.821 | 0.3663 | 0.8292 | 0.9117 | 0.7941 | 0.8307 | 0.7348 | 0.766 | 0.8177 | 0.8663 |
| 0.4096 | 45.0 | 22500 | 0.2991 | 0.7899 | 0.9614 | 0.9121 | 0.2212 | 0.7852 | 0.8747 | 0.3031 | 0.8233 | 0.8286 | 0.3578 | 0.8351 | 0.9142 | 0.8016 | 0.8413 | 0.7502 | 0.7794 | 0.8179 | 0.8651 |
| 0.4854 | 46.0 | 23000 | 0.3003 | 0.7815 | 0.9595 | 0.9068 | 0.2042 | 0.7791 | 0.8747 | 0.3016 | 0.8164 | 0.8197 | 0.3258 | 0.8255 | 0.9163 | 0.7816 | 0.8231 | 0.746 | 0.7722 | 0.8169 | 0.8637 |
| 0.4257 | 47.0 | 23500 | 0.2951 | 0.792 | 0.9625 | 0.9172 | 0.2075 | 0.7855 | 0.8802 | 0.3067 | 0.8262 | 0.8309 | 0.3468 | 0.836 | 0.9197 | 0.7961 | 0.8338 | 0.7572 | 0.7907 | 0.8226 | 0.8683 |
| 0.4033 | 48.0 | 24000 | 0.2883 | 0.7988 | 0.9632 | 0.9194 | 0.2266 | 0.7984 | 0.8765 | 0.3082 | 0.8343 | 0.8382 | 0.3616 | 0.8477 | 0.9176 | 0.8069 | 0.8458 | 0.7649 | 0.7969 | 0.8246 | 0.872 |
| 0.4932 | 49.0 | 24500 | 0.3022 | 0.7844 | 0.9617 | 0.9101 | 0.2231 | 0.7765 | 0.8762 | 0.3007 | 0.8216 | 0.8252 | 0.3396 | 0.8308 | 0.9176 | 0.7882 | 0.8293 | 0.7472 | 0.7794 | 0.8177 | 0.8669 |
| 0.3758 | 50.0 | 25000 | 0.2959 | 0.7921 | 0.9609 | 0.9203 | 0.2432 | 0.7853 | 0.8779 | 0.3066 | 0.8273 | 0.8314 | 0.3655 | 0.8365 | 0.9197 | 0.7932 | 0.832 | 0.7619 | 0.7918 | 0.8212 | 0.8703 |
| 0.4397 | 51.0 | 25500 | 0.2871 | 0.7983 | 0.9609 | 0.9128 | 0.2145 | 0.7966 | 0.8832 | 0.3086 | 0.833 | 0.8374 | 0.3409 | 0.8478 | 0.9251 | 0.802 | 0.8369 | 0.7648 | 0.7969 | 0.828 | 0.8784 |
| 0.3917 | 52.0 | 26000 | 0.2907 | 0.7955 | 0.9645 | 0.9161 | 0.2316 | 0.7911 | 0.8796 | 0.308 | 0.8314 | 0.8352 | 0.375 | 0.8428 | 0.9192 | 0.7975 | 0.8356 | 0.7654 | 0.7969 | 0.8234 | 0.8732 |
| 0.3362 | 53.0 | 26500 | 0.2885 | 0.7989 | 0.9644 | 0.92 | 0.2324 | 0.7958 | 0.8789 | 0.3075 | 0.8338 | 0.8379 | 0.3769 | 0.8465 | 0.9201 | 0.8012 | 0.8382 | 0.7703 | 0.8 | 0.8253 | 0.8755 |
| 0.4004 | 54.0 | 27000 | 0.2869 | 0.7973 | 0.9644 | 0.9201 | 0.228 | 0.7957 | 0.8813 | 0.3069 | 0.8328 | 0.8368 | 0.3822 | 0.8456 | 0.9218 | 0.801 | 0.8373 | 0.7636 | 0.7948 | 0.8273 | 0.8781 |
| 0.406 | 55.0 | 27500 | 0.2871 | 0.8004 | 0.9645 | 0.9194 | 0.2283 | 0.7986 | 0.8788 | 0.3084 | 0.8343 | 0.8384 | 0.3822 | 0.8476 | 0.9205 | 0.8069 | 0.8404 | 0.7679 | 0.7979 | 0.8265 | 0.8769 |
| 0.3876 | 56.0 | 28000 | 0.2882 | 0.7985 | 0.9641 | 0.9197 | 0.2257 | 0.7974 | 0.8772 | 0.3084 | 0.834 | 0.838 | 0.3676 | 0.8474 | 0.918 | 0.8072 | 0.8436 | 0.7646 | 0.7969 | 0.8237 | 0.8735 |
| 0.3939 | 57.0 | 28500 | 0.2845 | 0.8024 | 0.9645 | 0.9195 | 0.2291 | 0.8014 | 0.8782 | 0.3093 | 0.8367 | 0.8405 | 0.3697 | 0.8511 | 0.9197 | 0.8102 | 0.8431 | 0.7709 | 0.8021 | 0.8262 | 0.8764 |
| 0.4218 | 58.0 | 29000 | 0.2852 | 0.8 | 0.9646 | 0.9196 | 0.2254 | 0.7993 | 0.8774 | 0.3085 | 0.8346 | 0.8384 | 0.3634 | 0.8488 | 0.9188 | 0.8067 | 0.8413 | 0.7689 | 0.799 | 0.8245 | 0.8749 |
| 0.4046 | 59.0 | 29500 | 0.2851 | 0.8008 | 0.9645 | 0.9196 | 0.2283 | 0.8002 | 0.878 | 0.3087 | 0.835 | 0.8388 | 0.3655 | 0.8497 | 0.9188 | 0.8079 | 0.8418 | 0.7689 | 0.799 | 0.8256 | 0.8758 |
| 0.4504 | 60.0 | 30000 | 0.2851 | 0.8003 | 0.9645 | 0.9196 | 0.2246 | 0.7999 | 0.8772 | 0.3086 | 0.8346 | 0.8384 | 0.3614 | 0.8496 | 0.918 | 0.8072 | 0.8413 | 0.7689 | 0.799 | 0.8248 | 0.8749 |

Framework versions

  • Transformers 4.46.1
  • Pytorch 2.5.0+cu121
  • Datasets 2.19.2
  • Tokenizers 0.20.1
