# rtdetr2
This model is a fine-tuned version of PekingU/rtdetr_r50vd_coco_o365 on an unknown dataset. It achieves the following results on the evaluation set (an inference sketch follows the metrics list below):
- Loss: 11.8738
- Map: 0.2766
- Map 50: 0.439
- Map 75: 0.2829
- Map Small: 0.1319
- Map Medium: 0.375
- Map Large: 0.4519
- Mar 1: 0.2272
- Mar 10: 0.438
- Mar 100: 0.4559
- Mar Small: 0.2268
- Mar Medium: 0.551
- Mar Large: 0.6909
- Map Person: 0.7159
- Mar 100 Person: 0.8345
- Map Ear: 0.3388
- Mar 100 Ear: 0.4368
- Map Earmuffs: 0.2418
- Mar 100 Earmuffs: 0.5135
- Map Face: 0.4365
- Mar 100 Face: 0.6137
- Map Face-guard: 0.1839
- Mar 100 Face-guard: 0.5625
- Map Face-mask-medical: 0.1733
- Mar 100 Face-mask-medical: 0.2842
- Map Foot: 0.0448
- Mar 100 Foot: 0.2647
- Map Tools: 0.0879
- Mar 100 Tools: 0.284
- Map Glasses: 0.224
- Mar 100 Glasses: 0.3644
- Map Gloves: 0.2994
- Mar 100 Gloves: 0.4226
- Map Helmet: 0.2581
- Mar 100 Helmet: 0.3714
- Map Hands: 0.5167
- Mar 100 Hands: 0.6166
- Map Head: 0.5923
- Mar 100 Head: 0.6739
- Map Medical-suit: 0.1135
- Mar 100 Medical-suit: 0.5556
- Map Shoes: 0.3707
- Mar 100 Shoes: 0.483
- Map Safety-suit: 0.0097
- Mar 100 Safety-suit: 0.315
- Map Safety-vest: 0.0956
- Mar 100 Safety-vest: 0.1532
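This card does not include a usage snippet, so the sketch below shows one plausible way to run inference with the fine-tuned checkpoint via 🤗 Transformers. The repository id (`your-username/rtdetr2`), the image path, and the score threshold are assumptions, not values taken from this card.

```python
# Minimal inference sketch (repo id, image path, and threshold are placeholders).
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

checkpoint = "your-username/rtdetr2"  # hypothetical repo id for this fine-tuned model
processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForObjectDetection.from_pretrained(checkpoint)

image = Image.open("example.jpg")  # placeholder image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Convert raw logits/boxes into thresholded detections in (x_min, y_min, x_max, y_max).
results = processor.post_process_object_detection(
    outputs, target_sizes=[image.size[::-1]], threshold=0.3
)[0]

for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
    print(f"{model.config.id2label[label.item()]}: {score:.2f} at {box.tolist()}")
```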
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (see the configuration sketch after this list):
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 300
- num_epochs: 10
- mixed_precision_training: Native AMP
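The training script itself is not included in this card; the sketch below shows how the listed hyperparameters might map onto 🤗 `TrainingArguments`. The output directory and any dataset/collator wiring are assumptions.

```python
# Sketch mapping the listed hyperparameters onto TrainingArguments
# (output_dir is a placeholder; AdamW betas/eps shown are the adamw_torch defaults).
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="rtdetr2",            # placeholder output directory
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",             # AdamW with betas=(0.9, 0.999), eps=1e-08
    lr_scheduler_type="linear",
    warmup_steps=300,
    num_train_epochs=10,
    fp16=True,                       # Native AMP mixed precision
)
```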
### Training results
Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Person | Mar 100 Person | Map Ear | Mar 100 Ear | Map Earmuffs | Mar 100 Earmuffs | Map Face | Mar 100 Face | Map Face-guard | Mar 100 Face-guard | Map Face-mask-medical | Mar 100 Face-mask-medical | Map Foot | Mar 100 Foot | Map Tools | Mar 100 Tools | Map Glasses | Mar 100 Glasses | Map Gloves | Mar 100 Gloves | Map Helmet | Mar 100 Helmet | Map Hands | Mar 100 Hands | Map Head | Mar 100 Head | Map Medical-suit | Mar 100 Medical-suit | Map Shoes | Mar 100 Shoes | Map Safety-suit | Mar 100 Safety-suit | Map Safety-vest | Mar 100 Safety-vest |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
No log | 1.0 | 459 | 12.5722 | 0.1349 | 0.2143 | 0.143 | 0.0728 | 0.1775 | 0.1605 | 0.1436 | 0.2943 | 0.3139 | 0.1692 | 0.4142 | 0.5087 | 0.4573 | 0.8371 | 0.1857 | 0.4252 | 0.0002 | 0.1162 | 0.3926 | 0.613 | 0.001 | 0.2625 | 0.0572 | 0.1719 | 0.0004 | 0.102 | 0.0248 | 0.1706 | 0.0501 | 0.361 | 0.0807 | 0.1743 | 0.059 | 0.2911 | 0.2425 | 0.521 | 0.4662 | 0.6855 | 0.0 | 0.0222 | 0.2745 | 0.4592 | 0.0016 | 0.095 | 0.0002 | 0.0274 |
26.3979 | 2.0 | 918 | 11.9482 | 0.208 | 0.3216 | 0.2213 | 0.1016 | 0.2792 | 0.3172 | 0.1846 | 0.363 | 0.3799 | 0.2024 | 0.4917 | 0.6211 | 0.633 | 0.8297 | 0.2283 | 0.335 | 0.0016 | 0.3622 | 0.5135 | 0.6272 | 0.0562 | 0.4875 | 0.1298 | 0.2737 | 0.0051 | 0.2137 | 0.0393 | 0.2209 | 0.1385 | 0.3452 | 0.2407 | 0.3783 | 0.1832 | 0.3482 | 0.4565 | 0.6032 | 0.5955 | 0.694 | 0.0002 | 0.0444 | 0.3128 | 0.4551 | 0.0009 | 0.17 | 0.0009 | 0.0694 |
18.1729 | 3.0 | 1377 | 11.9715 | 0.2184 | 0.3418 | 0.2313 | 0.1126 | 0.3017 | 0.347 | 0.1909 | 0.3883 | 0.4101 | 0.2354 | 0.5085 | 0.6091 | 0.6705 | 0.8279 | 0.3031 | 0.4063 | 0.0117 | 0.3919 | 0.4866 | 0.6111 | 0.0355 | 0.5375 | 0.1147 | 0.2825 | 0.0111 | 0.2784 | 0.0514 | 0.2397 | 0.2188 | 0.4102 | 0.2171 | 0.3757 | 0.1851 | 0.3554 | 0.4664 | 0.5985 | 0.5942 | 0.6886 | 0.0014 | 0.2167 | 0.3408 | 0.4824 | 0.001 | 0.14 | 0.0037 | 0.129 |
16.7823 | 4.0 | 1836 | 11.5629 | 0.2362 | 0.3642 | 0.2521 | 0.1213 | 0.3333 | 0.3732 | 0.2058 | 0.3925 | 0.4068 | 0.2236 | 0.5135 | 0.6145 | 0.6948 | 0.8366 | 0.3441 | 0.4262 | 0.0691 | 0.4216 | 0.5345 | 0.6544 | 0.0409 | 0.4625 | 0.1528 | 0.3053 | 0.0125 | 0.2647 | 0.0651 | 0.2936 | 0.1319 | 0.3079 | 0.2895 | 0.4088 | 0.2312 | 0.3634 | 0.4867 | 0.5981 | 0.6117 | 0.6891 | 0.0104 | 0.2056 | 0.3356 | 0.4696 | 0.001 | 0.115 | 0.0039 | 0.0935 |
15.8096 | 5.0 | 2295 | 11.8381 | 0.2593 | 0.4074 | 0.2739 | 0.1281 | 0.3589 | 0.4355 | 0.2155 | 0.4214 | 0.4365 | 0.2201 | 0.5394 | 0.6675 | 0.6888 | 0.8275 | 0.3686 | 0.4572 | 0.1986 | 0.4378 | 0.5029 | 0.6389 | 0.1532 | 0.475 | 0.1791 | 0.3421 | 0.0229 | 0.2137 | 0.0583 | 0.2428 | 0.2389 | 0.4034 | 0.2943 | 0.4434 | 0.2326 | 0.3607 | 0.486 | 0.601 | 0.5887 | 0.6697 | 0.0124 | 0.4278 | 0.3458 | 0.4595 | 0.0035 | 0.25 | 0.0332 | 0.1694 |
14.7312 | 6.0 | 2754 | 11.8431 | 0.2451 | 0.3926 | 0.2562 | 0.1243 | 0.3434 | 0.3958 | 0.2131 | 0.412 | 0.4263 | 0.2187 | 0.5118 | 0.6653 | 0.6987 | 0.8307 | 0.3293 | 0.4215 | 0.1879 | 0.4162 | 0.4487 | 0.6174 | 0.0248 | 0.4875 | 0.1453 | 0.3263 | 0.0302 | 0.249 | 0.0799 | 0.2737 | 0.2061 | 0.3531 | 0.2911 | 0.4035 | 0.2357 | 0.333 | 0.5077 | 0.6144 | 0.5511 | 0.6344 | 0.0402 | 0.5222 | 0.3495 | 0.4789 | 0.0018 | 0.15 | 0.0394 | 0.1355 |
14.5207 | 7.0 | 3213 | 11.9739 | 0.2614 | 0.4212 | 0.2679 | 0.1246 | 0.3646 | 0.4079 | 0.2235 | 0.418 | 0.4289 | 0.2175 | 0.5351 | 0.617 | 0.6899 | 0.8326 | 0.3247 | 0.419 | 0.2156 | 0.4541 | 0.4293 | 0.6056 | 0.1801 | 0.4875 | 0.1518 | 0.2807 | 0.0507 | 0.2294 | 0.084 | 0.2649 | 0.2206 | 0.3678 | 0.2696 | 0.3903 | 0.2628 | 0.3625 | 0.5094 | 0.612 | 0.5925 | 0.6774 | 0.021 | 0.4056 | 0.3423 | 0.4637 | 0.0132 | 0.3 | 0.0863 | 0.1387 |
14.0764 | 8.0 | 3672 | 11.9295 | 0.2646 | 0.4292 | 0.2715 | 0.1278 | 0.3629 | 0.4405 | 0.2236 | 0.4189 | 0.4308 | 0.2232 | 0.5261 | 0.668 | 0.7004 | 0.8286 | 0.3267 | 0.4195 | 0.2031 | 0.4459 | 0.4416 | 0.6123 | 0.188 | 0.45 | 0.1675 | 0.3053 | 0.0274 | 0.2373 | 0.0765 | 0.2644 | 0.2267 | 0.3921 | 0.321 | 0.446 | 0.245 | 0.367 | 0.5036 | 0.6054 | 0.5888 | 0.6741 | 0.0382 | 0.3722 | 0.3382 | 0.4527 | 0.019 | 0.305 | 0.0858 | 0.1452 |
13.772 | 9.0 | 4131 | 11.8863 | 0.2678 | 0.4293 | 0.2743 | 0.1306 | 0.3694 | 0.4456 | 0.2276 | 0.4405 | 0.4524 | 0.2331 | 0.5595 | 0.6747 | 0.7065 | 0.8322 | 0.3349 | 0.432 | 0.2274 | 0.5568 | 0.4314 | 0.6113 | 0.1785 | 0.5625 | 0.1666 | 0.3228 | 0.0354 | 0.2608 | 0.0893 | 0.2812 | 0.2225 | 0.3842 | 0.2942 | 0.4235 | 0.2533 | 0.367 | 0.5127 | 0.6124 | 0.5853 | 0.6719 | 0.043 | 0.4611 | 0.3693 | 0.4833 | 0.0097 | 0.245 | 0.092 | 0.1839 |
13.5604 | 10.0 | 4590 | 11.8738 | 0.2766 | 0.439 | 0.2829 | 0.1319 | 0.375 | 0.4519 | 0.2272 | 0.438 | 0.4559 | 0.2268 | 0.551 | 0.6909 | 0.7159 | 0.8345 | 0.3388 | 0.4368 | 0.2418 | 0.5135 | 0.4365 | 0.6137 | 0.1839 | 0.5625 | 0.1733 | 0.2842 | 0.0448 | 0.2647 | 0.0879 | 0.284 | 0.224 | 0.3644 | 0.2994 | 0.4226 | 0.2581 | 0.3714 | 0.5167 | 0.6166 | 0.5923 | 0.6739 | 0.1135 | 0.5556 | 0.3707 | 0.483 | 0.0097 | 0.315 | 0.0956 | 0.1532 |
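The Map / Mar 100 columns above are standard COCO-style detection metrics with a per-class breakdown. A minimal sketch of how such values can be computed with `torchmetrics` follows; this is one plausible way to reproduce the columns, not necessarily the exact evaluation script used here, and the prediction/target values are dummy placeholders.

```python
# COCO-style mAP / mAR-100 with a per-class breakdown via torchmetrics (sketch only).
import torch
from torchmetrics.detection.mean_ap import MeanAveragePrecision

metric = MeanAveragePrecision(box_format="xyxy", class_metrics=True)

# Dummy prediction/target for a single image (placeholders, not real data).
preds = [{
    "boxes": torch.tensor([[10.0, 10.0, 50.0, 80.0]]),
    "scores": torch.tensor([0.9]),
    "labels": torch.tensor([0]),   # e.g. class 0 = Person
}]
targets = [{
    "boxes": torch.tensor([[12.0, 8.0, 52.0, 78.0]]),
    "labels": torch.tensor([0]),
}]

metric.update(preds, targets)
result = metric.compute()
print(result["map"], result["map_50"], result["map_75"], result["mar_100"])
print(result["map_per_class"], result["mar_100_per_class"])  # per-class columns
```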
### Framework versions
- Transformers 4.46.0.dev0
- Pytorch 2.4.1+cu121
- Datasets 3.0.1
- Tokenizers 0.20.1