# fb-detr-table_detection_v1.0
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the imagefolder dataset. It achieves the following results on the evaluation set:

- Loss: 0.2380
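
As a minimal sketch, the checkpoint can be loaded through the standard `transformers` object-detection API. The repo id, image path, and confidence threshold below are placeholders, not values stated in this card:

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

# Placeholder repo id; substitute the actual Hub path of this checkpoint.
checkpoint = "fb-detr-table_detection_v1.0"

processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForObjectDetection.from_pretrained(checkpoint)

image = Image.open("page.png").convert("RGB")  # any document page image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Convert raw logits/boxes into (score, label, box) triples above a threshold.
target_sizes = torch.tensor([image.size[::-1]])  # (height, width)
results = processor.post_process_object_detection(
    outputs, threshold=0.7, target_sizes=target_sizes
)[0]

for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
    print(f"{model.config.id2label[label.item()]}: {score:.2f} at {box.tolist()}")
```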
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (see the configuration sketch after this list):
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 300
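
As a rough guide, these settings map onto `transformers.TrainingArguments` as sketched below. The `output_dir` is a placeholder, and the steps-based evaluation cadence is an assumption inferred from the results table, not stated in this card:

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="fb-detr-table_detection_v1.0",  # placeholder
    learning_rate=1e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,  # effective train batch size: 8 * 2 = 16
    adam_beta1=0.9,                 # Adam settings as listed (also the defaults)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=300,
    evaluation_strategy="steps",    # assumption: matches the 20-step eval cadence below
    eval_steps=20,
)
```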
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
2.1131 | 1.21 | 20 | 1.3387 |
1.7233 | 2.42 | 40 | 1.1735 |
1.4974 | 3.64 | 60 | 1.0333 |
1.4395 | 4.85 | 80 | 1.0741 |
1.2497 | 6.06 | 100 | 0.7493 |
1.0696 | 7.27 | 120 | 0.6951 |
1.2718 | 8.48 | 140 | 0.7663 |
1.3003 | 9.7 | 160 | 0.9187 |
1.1703 | 10.91 | 180 | 0.6581 |
1.1463 | 12.12 | 200 | 0.6728 |
1.1198 | 13.33 | 220 | 0.6519 |
1.1313 | 14.55 | 240 | 0.6019 |
0.8707 | 15.76 | 260 | 0.5460 |
0.9215 | 16.97 | 280 | 0.5729 |
0.8017 | 18.18 | 300 | 0.5418 |
0.7221 | 19.39 | 320 | 0.5402 |
0.6872 | 20.61 | 340 | 0.5618 |
0.729 | 21.82 | 360 | 0.5744 |
0.7702 | 23.03 | 380 | 0.5305 |
0.7845 | 24.24 | 400 | 0.5043 |
0.7473 | 25.45 | 420 | 0.4903 |
0.7031 | 26.67 | 440 | 0.4830 |
0.6726 | 27.88 | 460 | 0.4640 |
0.6327 | 29.09 | 480 | 0.4662 |
0.6806 | 30.3 | 500 | 0.4619 |
0.6626 | 31.52 | 520 | 0.5005 |
0.6622 | 32.73 | 540 | 0.4601 |
0.7345 | 33.94 | 560 | 0.5567 |
0.7202 | 35.15 | 580 | 0.4721 |
0.6754 | 36.36 | 600 | 0.4950 |
0.608 | 37.58 | 620 | 0.4949 |
0.6812 | 38.79 | 640 | 0.4893 |
0.6648 | 40.0 | 660 | 0.5383 |
0.5884 | 41.21 | 680 | 0.4344 |
0.5823 | 42.42 | 700 | 0.4617 |
0.6158 | 43.64 | 720 | 0.4269 |
0.5702 | 44.85 | 740 | 0.4209 |
0.6794 | 46.06 | 760 | 0.4438 |
0.6795 | 47.27 | 780 | 0.4777 |
0.661 | 48.48 | 800 | 0.4214 |
0.6217 | 49.7 | 820 | 0.4380 |
0.6664 | 50.91 | 840 | 0.4573 |
0.5767 | 52.12 | 860 | 0.4435 |
0.5596 | 53.33 | 880 | 0.4772 |
0.5907 | 54.55 | 900 | 0.4336 |
0.56 | 55.76 | 920 | 0.4219 |
0.566 | 56.97 | 940 | 0.4606 |
0.5551 | 58.18 | 960 | 0.4153 |
0.5454 | 59.39 | 980 | 0.4567 |
0.5452 | 60.61 | 1000 | 0.4702 |
0.6073 | 61.82 | 1020 | 0.4247 |
0.5517 | 63.03 | 1040 | 0.4300 |
0.5351 | 64.24 | 1060 | 0.4356 |
0.532 | 65.45 | 1080 | 0.3722 |
0.5638 | 66.67 | 1100 | 0.3627 |
0.5537 | 67.88 | 1120 | 0.4079 |
0.5007 | 69.09 | 1140 | 0.3965 |
0.5202 | 70.3 | 1160 | 0.3760 |
0.5156 | 71.52 | 1180 | 0.4091 |
0.5396 | 72.73 | 1200 | 0.3823 |
0.5092 | 73.94 | 1220 | 0.3866 |
0.4667 | 75.15 | 1240 | 0.3713 |
0.4725 | 76.36 | 1260 | 0.3536 |
0.4835 | 77.58 | 1280 | 0.3421 |
0.4999 | 78.79 | 1300 | 0.3294 |
0.4983 | 80.0 | 1320 | 0.3866 |
0.4917 | 81.21 | 1340 | 0.3061 |
0.502 | 82.42 | 1360 | 0.3908 |
0.5435 | 83.64 | 1380 | 0.3587 |
0.4925 | 84.85 | 1400 | 0.3662 |
0.469 | 86.06 | 1420 | 0.3547 |
0.4184 | 87.27 | 1440 | 0.3229 |
0.4232 | 88.48 | 1460 | 0.3478 |
0.3962 | 89.7 | 1480 | 0.3286 |
0.4217 | 90.91 | 1500 | 0.3668 |
0.427 | 92.12 | 1520 | 0.3554 |
0.4433 | 93.33 | 1540 | 0.3214 |
0.4304 | 94.55 | 1560 | 0.3243 |
0.4353 | 95.76 | 1580 | 0.2909 |
0.4153 | 96.97 | 1600 | 0.3032 |
0.3819 | 98.18 | 1620 | 0.2858 |
0.3911 | 99.39 | 1640 | 0.2721 |
0.3513 | 100.61 | 1660 | 0.2763 |
0.3266 | 101.82 | 1680 | 0.2538 |
0.3222 | 103.03 | 1700 | 0.2543 |
0.3326 | 104.24 | 1720 | 0.2548 |
0.3219 | 105.45 | 1740 | 0.2737 |
0.3313 | 106.67 | 1760 | 0.2381 |
0.3557 | 107.88 | 1780 | 0.2728 |
0.3312 | 109.09 | 1800 | 0.2784 |
0.3206 | 110.3 | 1820 | 0.2462 |
0.3015 | 111.52 | 1840 | 0.2587 |
0.2903 | 112.73 | 1860 | 0.2411 |
0.2853 | 113.94 | 1880 | 0.2533 |
0.2917 | 115.15 | 1900 | 0.2662 |
0.2802 | 116.36 | 1920 | 0.2491 |
0.2774 | 117.58 | 1940 | 0.2523 |
0.2848 | 118.79 | 1960 | 0.2426 |
0.2813 | 120.0 | 1980 | 0.2339 |
0.2752 | 121.21 | 2000 | 0.2444 |
0.2804 | 122.42 | 2020 | 0.2231 |
0.2456 | 123.64 | 2040 | 0.2174 |
0.2689 | 124.85 | 2060 | 0.2136 |
0.252 | 126.06 | 2080 | 0.2257 |
0.2498 | 127.27 | 2100 | 0.2311 |
0.2404 | 128.48 | 2120 | 0.2260 |
0.2608 | 129.7 | 2140 | 0.2256 |
0.2332 | 130.91 | 2160 | 0.2135 |
0.2345 | 132.12 | 2180 | 0.2229 |
0.2558 | 133.33 | 2200 | 0.2022 |
0.2228 | 134.55 | 2220 | 0.2115 |
0.2269 | 135.76 | 2240 | 0.2069 |
0.2264 | 136.97 | 2260 | 0.2124 |
0.2151 | 138.18 | 2280 | 0.2117 |
0.2375 | 139.39 | 2300 | 0.1976 |
0.2231 | 140.61 | 2320 | 0.2047 |
0.2157 | 141.82 | 2340 | 0.2107 |
0.2307 | 143.03 | 2360 | 0.1989 |
0.2097 | 144.24 | 2380 | 0.2077 |
0.2134 | 145.45 | 2400 | 0.2234 |
0.1975 | 146.67 | 2420 | 0.2179 |
0.2087 | 147.88 | 2440 | 0.2019 |
0.2029 | 149.09 | 2460 | 0.2041 |
0.2038 | 150.3 | 2480 | 0.2036 |
0.2202 | 151.52 | 2500 | 0.1984 |
0.203 | 152.73 | 2520 | 0.1943 |
0.2201 | 153.94 | 2540 | 0.2064 |
0.1868 | 155.15 | 2560 | 0.2126 |
0.2185 | 156.36 | 2580 | 0.2131 |
0.1917 | 157.58 | 2600 | 0.2031 |
0.1898 | 158.79 | 2620 | 0.2009 |
0.1923 | 160.0 | 2640 | 0.2170 |
0.1865 | 161.21 | 2660 | 0.2068 |
0.1971 | 162.42 | 2680 | 0.2053 |
0.1942 | 163.64 | 2700 | 0.2011 |
0.1902 | 164.85 | 2720 | 0.1993 |
0.1817 | 166.06 | 2740 | 0.1952 |
0.1837 | 167.27 | 2760 | 0.2222 |
0.1835 | 168.48 | 2780 | 0.2173 |
0.1923 | 169.7 | 2800 | 0.2072 |
0.1798 | 170.91 | 2820 | 0.2069 |
0.1815 | 172.12 | 2840 | 0.2078 |
0.1724 | 173.33 | 2860 | 0.2183 |
0.1924 | 174.55 | 2880 | 0.2005 |
0.1922 | 175.76 | 2900 | 0.2069 |
0.1709 | 176.97 | 2920 | 0.2212 |
0.1766 | 178.18 | 2940 | 0.1978 |
0.1728 | 179.39 | 2960 | 0.2029 |
0.1757 | 180.61 | 2980 | 0.2030 |
0.1665 | 181.82 | 3000 | 0.2219 |
0.1694 | 183.03 | 3020 | 0.2205 |
0.1786 | 184.24 | 3040 | 0.2020 |
0.1749 | 185.45 | 3060 | 0.2007 |
0.1739 | 186.67 | 3080 | 0.2046 |
0.1723 | 187.88 | 3100 | 0.1986 |
0.1669 | 189.09 | 3120 | 0.2041 |
0.1658 | 190.3 | 3140 | 0.2179 |
0.1701 | 191.52 | 3160 | 0.2159 |
0.1691 | 192.73 | 3180 | 0.2099 |
0.1739 | 193.94 | 3200 | 0.1996 |
0.1729 | 195.15 | 3220 | 0.2126 |
0.1636 | 196.36 | 3240 | 0.2080 |
0.1612 | 197.58 | 3260 | 0.2154 |
0.1653 | 198.79 | 3280 | 0.2031 |
0.1629 | 200.0 | 3300 | 0.2206 |
0.1565 | 201.21 | 3320 | 0.2223 |
0.1632 | 202.42 | 3340 | 0.2122 |
0.1689 | 203.64 | 3360 | 0.1986 |
0.1682 | 204.85 | 3380 | 0.2092 |
0.1671 | 206.06 | 3400 | 0.2309 |
0.175 | 207.27 | 3420 | 0.2129 |
0.1607 | 208.48 | 3440 | 0.2393 |
0.165 | 209.7 | 3460 | 0.2125 |
0.1593 | 210.91 | 3480 | 0.2304 |
0.1594 | 212.12 | 3500 | 0.2325 |
0.1471 | 213.33 | 3520 | 0.2341 |
0.1598 | 214.55 | 3540 | 0.2175 |
0.1542 | 215.76 | 3560 | 0.2162 |
0.1602 | 216.97 | 3580 | 0.2277 |
0.1577 | 218.18 | 3600 | 0.2117 |
0.1625 | 219.39 | 3620 | 0.2118 |
0.1517 | 220.61 | 3640 | 0.2252 |
0.1545 | 221.82 | 3660 | 0.2129 |
0.152 | 223.03 | 3680 | 0.2216 |
0.161 | 224.24 | 3700 | 0.2169 |
0.1509 | 225.45 | 3720 | 0.2225 |
0.1502 | 226.67 | 3740 | 0.2339 |
0.1542 | 227.88 | 3760 | 0.2199 |
0.145 | 229.09 | 3780 | 0.2270 |
0.1499 | 230.3 | 3800 | 0.2189 |
0.1506 | 231.52 | 3820 | 0.2227 |
0.1556 | 232.73 | 3840 | 0.2260 |
0.1454 | 233.94 | 3860 | 0.2213 |
0.1472 | 235.15 | 3880 | 0.2159 |
0.1437 | 236.36 | 3900 | 0.2256 |
0.1448 | 237.58 | 3920 | 0.2278 |
0.1536 | 238.79 | 3940 | 0.2288 |
0.1446 | 240.0 | 3960 | 0.2400 |
0.1593 | 241.21 | 3980 | 0.2284 |
0.1463 | 242.42 | 4000 | 0.2258 |
0.1472 | 243.64 | 4020 | 0.2263 |
0.1455 | 244.85 | 4040 | 0.2285 |
0.1442 | 246.06 | 4060 | 0.2250 |
0.1499 | 247.27 | 4080 | 0.2318 |
0.1485 | 248.48 | 4100 | 0.2238 |
0.1545 | 249.7 | 4120 | 0.2257 |
0.1296 | 250.91 | 4140 | 0.2396 |
0.1425 | 252.12 | 4160 | 0.2377 |
0.1441 | 253.33 | 4180 | 0.2390 |
0.1343 | 254.55 | 4200 | 0.2389 |
0.1445 | 255.76 | 4220 | 0.2244 |
0.1445 | 256.97 | 4240 | 0.2299 |
0.1429 | 258.18 | 4260 | 0.2209 |
0.1479 | 259.39 | 4280 | 0.2221 |
0.1429 | 260.61 | 4300 | 0.2372 |
0.1452 | 261.82 | 4320 | 0.2357 |
0.1501 | 263.03 | 4340 | 0.2370 |
0.1404 | 264.24 | 4360 | 0.2311 |
0.1314 | 265.45 | 4380 | 0.2454 |
0.1498 | 266.67 | 4400 | 0.2243 |
0.1418 | 267.88 | 4420 | 0.2243 |
0.1453 | 269.09 | 4440 | 0.2258 |
0.1378 | 270.3 | 4460 | 0.2300 |
0.1442 | 271.52 | 4480 | 0.2269 |
0.1463 | 272.73 | 4500 | 0.2249 |
0.1352 | 273.94 | 4520 | 0.2262 |
0.1419 | 275.15 | 4540 | 0.2333 |
0.1326 | 276.36 | 4560 | 0.2358 |
0.1373 | 277.58 | 4580 | 0.2256 |
0.1317 | 278.79 | 4600 | 0.2295 |
0.1367 | 280.0 | 4620 | 0.2371 |
0.1346 | 281.21 | 4640 | 0.2352 |
0.1357 | 282.42 | 4660 | 0.2300 |
0.1372 | 283.64 | 4680 | 0.2414 |
0.1298 | 284.85 | 4700 | 0.2417 |
0.1368 | 286.06 | 4720 | 0.2269 |
0.1447 | 287.27 | 4740 | 0.2312 |
0.1394 | 288.48 | 4760 | 0.2339 |
0.1258 | 289.7 | 4780 | 0.2399 |
0.1427 | 290.91 | 4800 | 0.2380 |
### Framework versions
- Transformers 4.30.2
- Pytorch 2.0.1+cu118
- Datasets 2.13.0
- Tokenizers 0.11.0