
law-game-evidence-replacement-finetune

This model is a fine-tuned version of PekingU/rtdetr_r50vd_coco_o365 on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 3.5533
  • Map: 0.9339
  • Map 50: 0.9616
  • Map 75: 0.9575
  • Map Small: 0.5574
  • Map Medium: 0.9423
  • Map Large: 0.9699
  • Mar 1: 0.6597
  • Mar 10: 0.9522
  • Mar 100: 0.9722
  • Mar Small: 0.7411
  • Mar Medium: 0.9806
  • Mar Large: 0.9908
  • Map Evidence: -1.0 (a value of -1.0 here typically means the evaluation split contained no ground-truth instances of this class)
  • Mar 100 Evidence: -1.0
  • Map Ambulance: 0.9802
  • Mar 100 Ambulance: 0.9899
  • Map Artificial Target: 0.9245
  • Mar 100 Artificial Target: 0.9611
  • Map Cartridge: 0.9759
  • Mar 100 Cartridge: 0.9937
  • Map Gun: 0.9225
  • Mar 100 Gun: 0.9542
  • Map Knife: 0.8562
  • Mar 100 Knife: 0.9404
  • Map Police: 0.9495
  • Mar 100 Police: 0.999
  • Map Traffic Cone: 0.9285
  • Mar 100 Traffic Cone: 0.9673
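
The metric names above follow the COCO detection convention: "Map" is mean average precision averaged over IoU thresholds 0.50:0.95, "Map 50" and "Map 75" are mAP at fixed IoU thresholds of 0.50 and 0.75, "Mar 1/10/100" are mean recall given at most 1, 10, or 100 detections per image, and the per-class entries (e.g. "Map Gun") are class-wise values. Below is a minimal sketch of how such metrics can be computed with `torchmetrics`; this is an assumption for illustration, as the card does not state which evaluation code produced the numbers.

```python
import torch
from torchmetrics.detection import MeanAveragePrecision

# COCO-style mAP/mAR over IoU thresholds 0.50:0.95, with a per-class breakdown.
metric = MeanAveragePrecision(box_format="xyxy", class_metrics=True)

# Hypothetical single-image example: predictions and ground truth are passed
# as lists of dicts, one dict per image.
preds = [{
    "boxes": torch.tensor([[10.0, 20.0, 110.0, 220.0]]),
    "scores": torch.tensor([0.92]),
    "labels": torch.tensor([3]),  # illustrative class id
}]
targets = [{
    "boxes": torch.tensor([[12.0, 18.0, 108.0, 215.0]]),
    "labels": torch.tensor([3]),
}]

metric.update(preds, targets)
# Result keys include map, map_50, map_75, map_small/medium/large,
# mar_1, mar_10, mar_100, and per-class values when class_metrics=True.
print(metric.compute())
```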

Model description

More information needed

Intended uses & limitations

More information needed
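
Although usage is not documented, the checkpoint can presumably be loaded like any Transformers object-detection model. The snippet below is a minimal, untested sketch; the hub id matches this repository, while the image path and detection threshold are illustrative choices.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

checkpoint = "anastasispk/law-game-evidence-replacement-finetune"

processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForObjectDetection.from_pretrained(checkpoint)

image = Image.open("example.jpg")  # hypothetical input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Convert raw logits/boxes into (score, label, box) triples at a chosen threshold.
results = processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=[image.size[::-1]]
)[0]

for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
    print(f"{model.config.id2label[label.item()]}: {score:.2f} at {box.tolist()}")
```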

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 16
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 300
  • num_epochs: 25
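
The training script itself is not included. A rough reconstruction of the corresponding `TrainingArguments`, assuming the standard `Trainer` API was used (an assumption; only the hyperparameters above are taken from the card), could look like this:

```python
from transformers import TrainingArguments

# Hedged reconstruction of the reported hyperparameters. output_dir and the
# evaluation strategy are illustrative, not taken from the card.
training_args = TrainingArguments(
    output_dir="law-game-evidence-replacement-finetune",
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=300,
    num_train_epochs=25,
    eval_strategy="epoch",         # the results table reports one validation pass per epoch
    remove_unused_columns=False,   # typically required for object-detection collators
)
```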

Training results

| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Evidence | Mar 100 Evidence | Map Ambulance | Mar 100 Ambulance | Map Artificial Target | Mar 100 Artificial Target | Map Cartridge | Mar 100 Cartridge | Map Gun | Mar 100 Gun | Map Knife | Mar 100 Knife | Map Police | Mar 100 Police | Map Traffic Cone | Mar 100 Traffic Cone |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| No log | 1.0 | 183 | 17.1925 | 0.553 | 0.61 | 0.584 | 0.1918 | 0.3235 | 0.6555 | 0.5467 | 0.8763 | 0.8964 | 0.3142 | 0.8206 | 0.9705 | -1.0 | -1.0 | 0.9057 | 0.9848 | 0.5233 | 0.7299 | 0.9125 | 0.9647 | 0.1841 | 0.9194 | 0.6003 | 0.8687 | 0.518 | 0.9286 | 0.2268 | 0.8789 |
| No log | 2.0 | 366 | 7.1301 | 0.7763 | 0.8536 | 0.8146 | 0.2855 | 0.6006 | 0.875 | 0.6198 | 0.9116 | 0.9359 | 0.62 | 0.876 | 0.9781 | -1.0 | -1.0 | 0.9418 | 0.9707 | 0.7052 | 0.8648 | 0.9529 | 0.9733 | 0.5436 | 0.9667 | 0.7831 | 0.9172 | 0.8516 | 0.9398 | 0.656 | 0.9191 |
| 37.3669 | 3.0 | 549 | 5.7075 | 0.848 | 0.9115 | 0.8936 | 0.3256 | 0.7543 | 0.9317 | 0.6317 | 0.9274 | 0.9486 | 0.6783 | 0.9289 | 0.9849 | -1.0 | -1.0 | 0.9687 | 0.9879 | 0.7575 | 0.8761 | 0.9619 | 0.9822 | 0.8187 | 0.9653 | 0.8076 | 0.9172 | 0.9181 | 0.9827 | 0.7032 | 0.9287 |
| 37.3669 | 4.0 | 732 | 5.8395 | 0.8232 | 0.8809 | 0.8653 | 0.3221 | 0.7104 | 0.8994 | 0.642 | 0.9362 | 0.9536 | 0.688 | 0.9333 | 0.9884 | -1.0 | -1.0 | 0.9718 | 0.9899 | 0.8061 | 0.8878 | 0.9676 | 0.9854 | 0.8731 | 0.9778 | 0.7678 | 0.9162 | 0.6454 | 0.9867 | 0.7303 | 0.9317 |
| 37.3669 | 5.0 | 915 | 5.2081 | 0.8722 | 0.924 | 0.9156 | 0.3818 | 0.7789 | 0.951 | 0.6457 | 0.9406 | 0.9593 | 0.6963 | 0.9663 | 0.9887 | -1.0 | -1.0 | 0.976 | 0.9899 | 0.8077 | 0.9071 | 0.973 | 0.9869 | 0.7967 | 0.9611 | 0.8391 | 0.9313 | 0.8822 | 0.9908 | 0.8309 | 0.9482 |
| 4.4127 | 6.0 | 1098 | 5.4515 | 0.8848 | 0.9339 | 0.9262 | 0.5118 | 0.8295 | 0.9572 | 0.6538 | 0.9446 | 0.9624 | 0.6997 | 0.9621 | 0.9903 | -1.0 | -1.0 | 0.9686 | 0.9889 | 0.7937 | 0.9057 | 0.9784 | 0.9886 | 0.8982 | 0.9722 | 0.8491 | 0.9434 | 0.8521 | 0.9888 | 0.8534 | 0.9495 |
| 4.4127 | 7.0 | 1281 | 4.9756 | 0.9019 | 0.9468 | 0.9396 | 0.5037 | 0.8805 | 0.9631 | 0.6476 | 0.9443 | 0.9666 | 0.703 | 0.9692 | 0.9932 | -1.0 | -1.0 | 0.9754 | 0.9889 | 0.821 | 0.9129 | 0.9753 | 0.9907 | 0.9017 | 0.9833 | 0.8054 | 0.9414 | 0.9414 | 0.9949 | 0.8933 | 0.9541 |
| 4.4127 | 8.0 | 1464 | 4.4998 | 0.9119 | 0.9554 | 0.9432 | 0.5098 | 0.9047 | 0.9619 | 0.6482 | 0.9436 | 0.9641 | 0.7091 | 0.9786 | 0.9876 | -1.0 | -1.0 | 0.9726 | 0.9899 | 0.869 | 0.9248 | 0.9738 | 0.9902 | 0.8528 | 0.9556 | 0.8667 | 0.9384 | 0.9631 | 0.9939 | 0.8853 | 0.9561 |
| 3.3287 | 9.0 | 1647 | 4.5378 | 0.9107 | 0.9472 | 0.9424 | 0.5347 | 0.91 | 0.9625 | 0.6551 | 0.9498 | 0.9694 | 0.7243 | 0.984 | 0.9908 | -1.0 | -1.0 | 0.9802 | 0.9899 | 0.8691 | 0.9281 | 0.9785 | 0.9939 | 0.9128 | 0.9736 | 0.8646 | 0.9424 | 0.8889 | 0.9929 | 0.8805 | 0.965 |
| 3.3287 | 10.0 | 1830 | 5.0033 | 0.8831 | 0.9264 | 0.9202 | 0.5206 | 0.8887 | 0.9369 | 0.6497 | 0.9456 | 0.9661 | 0.7104 | 0.9714 | 0.9895 | -1.0 | -1.0 | 0.9404 | 0.9899 | 0.8686 | 0.929 | 0.9741 | 0.9917 | 0.92 | 0.9722 | 0.8131 | 0.9323 | 0.7768 | 0.9908 | 0.889 | 0.9568 |
| 2.8465 | 11.0 | 2013 | 4.1896 | 0.9183 | 0.9522 | 0.9491 | 0.4507 | 0.8926 | 0.9704 | 0.6595 | 0.9497 | 0.9676 | 0.7033 | 0.9677 | 0.9884 | -1.0 | -1.0 | 0.9786 | 0.9899 | 0.8902 | 0.9333 | 0.9745 | 0.993 | 0.9182 | 0.9583 | 0.8633 | 0.9424 | 0.9004 | 0.9929 | 0.9031 | 0.963 |
| 2.8465 | 12.0 | 2196 | 4.3806 | 0.9118 | 0.9486 | 0.9445 | 0.5313 | 0.8959 | 0.9574 | 0.6545 | 0.9487 | 0.9701 | 0.688 | 0.9741 | 0.9929 | -1.0 | -1.0 | 0.9791 | 0.9899 | 0.8856 | 0.935 | 0.9736 | 0.9924 | 0.9151 | 0.975 | 0.8429 | 0.9384 | 0.8852 | 0.998 | 0.9008 | 0.9617 |
| 2.8465 | 13.0 | 2379 | 4.3575 | 0.9131 | 0.9471 | 0.9419 | 0.5419 | 0.9126 | 0.9643 | 0.6576 | 0.9531 | 0.9717 | 0.7239 | 0.9875 | 0.9909 | -1.0 | -1.0 | 0.9677 | 0.9899 | 0.8731 | 0.9358 | 0.9774 | 0.9951 | 0.9226 | 0.9708 | 0.8794 | 0.9545 | 0.8601 | 0.9939 | 0.9114 | 0.9617 |
| 2.5085 | 14.0 | 2562 | 4.0609 | 0.9277 | 0.9619 | 0.9539 | 0.5802 | 0.9195 | 0.9659 | 0.6566 | 0.9518 | 0.9703 | 0.7168 | 0.9791 | 0.9913 | -1.0 | -1.0 | 0.9697 | 0.9899 | 0.902 | 0.9451 | 0.9819 | 0.9958 | 0.9273 | 0.9667 | 0.8392 | 0.9374 | 0.954 | 0.9939 | 0.9199 | 0.9634 |
| 2.5085 | 15.0 | 2745 | 4.2034 | 0.9284 | 0.961 | 0.9559 | 0.541 | 0.9483 | 0.9743 | 0.6606 | 0.9502 | 0.9695 | 0.7169 | 0.9792 | 0.9902 | -1.0 | -1.0 | 0.979 | 0.9899 | 0.9022 | 0.9469 | 0.9781 | 0.9947 | 0.9251 | 0.9611 | 0.8495 | 0.9404 | 0.9481 | 0.9918 | 0.9167 | 0.9614 |
| 2.5085 | 16.0 | 2928 | 4.1849 | 0.9283 | 0.9599 | 0.9559 | 0.5591 | 0.9323 | 0.9644 | 0.6575 | 0.9493 | 0.9697 | 0.7267 | 0.9795 | 0.988 | -1.0 | -1.0 | 0.9716 | 0.9899 | 0.9033 | 0.948 | 0.9754 | 0.9949 | 0.9108 | 0.9583 | 0.8469 | 0.9404 | 0.9675 | 0.9929 | 0.9222 | 0.9634 |
| 2.2183 | 17.0 | 3111 | 4.0696 | 0.9222 | 0.9556 | 0.9503 | 0.5517 | 0.9288 | 0.9634 | 0.6572 | 0.9523 | 0.9726 | 0.7348 | 0.9863 | 0.992 | -1.0 | -1.0 | 0.9707 | 0.9899 | 0.9052 | 0.9496 | 0.9784 | 0.9949 | 0.9309 | 0.9694 | 0.8074 | 0.9465 | 0.9525 | 0.9959 | 0.91 | 0.962 |
| 2.2183 | 18.0 | 3294 | 4.3283 | 0.9126 | 0.9461 | 0.9414 | 0.5422 | 0.9246 | 0.9498 | 0.6564 | 0.9502 | 0.9698 | 0.7138 | 0.9805 | 0.9896 | -1.0 | -1.0 | 0.9723 | 0.9899 | 0.901 | 0.9483 | 0.9815 | 0.9952 | 0.9204 | 0.9528 | 0.7964 | 0.9394 | 0.9043 | 0.9969 | 0.9125 | 0.966 |
| 2.2183 | 19.0 | 3477 | 3.7839 | 0.9209 | 0.9518 | 0.9477 | 0.5608 | 0.9475 | 0.9583 | 0.6562 | 0.9512 | 0.9701 | 0.7414 | 0.9806 | 0.9885 | -1.0 | -1.0 | 0.9566 | 0.9899 | 0.9131 | 0.9531 | 0.9779 | 0.9949 | 0.9132 | 0.9486 | 0.833 | 0.9404 | 0.9407 | 0.9959 | 0.9117 | 0.9677 |
| 2.009 | 20.0 | 3660 | 3.7275 | 0.9287 | 0.958 | 0.9542 | 0.5558 | 0.9078 | 0.9681 | 0.6586 | 0.951 | 0.9709 | 0.7422 | 0.979 | 0.9892 | -1.0 | -1.0 | 0.9802 | 0.9899 | 0.9239 | 0.96 | 0.9761 | 0.9944 | 0.9222 | 0.9514 | 0.8345 | 0.9364 | 0.9389 | 0.998 | 0.9248 | 0.966 |
| 2.009 | 21.0 | 3843 | 3.8496 | 0.93 | 0.9592 | 0.9554 | 0.5552 | 0.9187 | 0.9664 | 0.6581 | 0.9508 | 0.9708 | 0.7373 | 0.9788 | 0.9904 | -1.0 | -1.0 | 0.9802 | 0.9899 | 0.9148 | 0.9565 | 0.9778 | 0.9949 | 0.9227 | 0.9556 | 0.853 | 0.9364 | 0.9443 | 1.0 | 0.9167 | 0.9624 |
| 1.8494 | 22.0 | 4026 | 3.6452 | 0.9309 | 0.9592 | 0.9551 | 0.5561 | 0.929 | 0.9664 | 0.6595 | 0.9495 | 0.9709 | 0.723 | 0.9801 | 0.9902 | -1.0 | -1.0 | 0.9802 | 0.9899 | 0.9176 | 0.9593 | 0.9764 | 0.9935 | 0.9212 | 0.95 | 0.8459 | 0.9374 | 0.9471 | 0.999 | 0.928 | 0.9673 |
| 1.8494 | 23.0 | 4209 | 3.6352 | 0.9299 | 0.9587 | 0.9546 | 0.5524 | 0.9155 | 0.9681 | 0.659 | 0.9509 | 0.9708 | 0.7217 | 0.98 | 0.9902 | -1.0 | -1.0 | 0.9802 | 0.9899 | 0.9175 | 0.9589 | 0.9756 | 0.9935 | 0.9217 | 0.9514 | 0.8458 | 0.9364 | 0.9448 | 0.999 | 0.9236 | 0.9667 |
| 1.8494 | 24.0 | 4392 | 3.6526 | 0.9298 | 0.9577 | 0.9535 | 0.5572 | 0.9119 | 0.9666 | 0.6593 | 0.9518 | 0.972 | 0.725 | 0.9794 | 0.9911 | -1.0 | -1.0 | 0.9802 | 0.9899 | 0.9229 | 0.9609 | 0.976 | 0.9934 | 0.9217 | 0.9542 | 0.8493 | 0.9394 | 0.936 | 1.0 | 0.9226 | 0.9663 |
| 1.7025 | 25.0 | 4575 | 3.5533 | 0.9339 | 0.9616 | 0.9575 | 0.5574 | 0.9423 | 0.9699 | 0.6597 | 0.9522 | 0.9722 | 0.7411 | 0.9806 | 0.9908 | -1.0 | -1.0 | 0.9802 | 0.9899 | 0.9245 | 0.9611 | 0.9759 | 0.9937 | 0.9225 | 0.9542 | 0.8562 | 0.9404 | 0.9495 | 0.999 | 0.9285 | 0.9673 |

Framework versions

  • Transformers 4.44.0.dev0
  • Pytorch 2.3.1+cu121
  • Tokenizers 0.19.1