
kelp-from-scratch-segformer-b1-lr-0.0001-cleaned

This model is a SegFormer-B1 semantic-segmentation model trained from scratch on the samitizerxu/kelp_data_rgbagg_swin_nir_int_cleaned dataset. It achieves the following results on the evaluation set (a sketch of the kelp-IoU computation follows the list):

  • IoU (kelp): 0.1757
  • Loss: 0.7377
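
For reference, the kelp IoU is the intersection-over-union between the predicted and ground-truth kelp masks. Below is a minimal sketch of the computation, assuming integer label maps where kelp is class 1; the card does not state the actual label mapping, so treat `kelp_id` as an assumption.

```python
import numpy as np

def kelp_iou(pred: np.ndarray, target: np.ndarray, kelp_id: int = 1) -> float:
    """IoU for the kelp class over integer label maps of equal shape.

    `kelp_id=1` is an assumption; the card does not document the label ids.
    """
    pred_kelp = pred == kelp_id
    target_kelp = target == kelp_id
    union = np.logical_or(pred_kelp, target_kelp).sum()
    if union == 0:
        return float("nan")  # no kelp in either prediction or target
    intersection = np.logical_and(pred_kelp, target_kelp).sum()
    return float(intersection) / float(union)
```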

Model description

More information needed

Intended uses & limitations

More information needed
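
The card does not yet document usage, but the model follows the standard SegformerForSemanticSegmentation interface, so inference presumably looks like the sketch below. The repo id and the 3-channel preprocessing are assumptions: the dataset name (rgb + agg + nir) suggests the real inputs may carry more than three bands, so verify the expected channels before relying on this.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, SegformerForSemanticSegmentation

# Assumed repo id (the dataset lives under the same user); verify before use.
repo = "samitizerxu/kelp-from-scratch-segformer-b1-lr-0.0001-cleaned"
processor = AutoImageProcessor.from_pretrained(repo)
model = SegformerForSemanticSegmentation.from_pretrained(repo)

image = Image.open("tile.png").convert("RGB")  # placeholder input tile
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # (1, num_labels, H/4, W/4)

# SegFormer predicts at 1/4 resolution; upsample, then take per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
mask = upsampled.argmax(dim=1)[0]  # integer label map, same size as the image
```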

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a code sketch reconstructing this configuration follows the list):

  • learning_rate: 0.0001
  • train_batch_size: 22
  • eval_batch_size: 22
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: cosine
  • num_epochs: 50
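
A minimal reconstruction of this setup with the Transformers Trainer, under stated assumptions: the base checkpoint ("nvidia/mit-b1") and the two-class background/kelp label set are inferred from the model name, and the dataset objects are elided.

```python
from transformers import (
    SegformerForSemanticSegmentation,
    Trainer,
    TrainingArguments,
)

# Assumed: a SegFormer-B1 backbone with a background/kelp label set
# (neither is stated explicitly in this card).
model = SegformerForSemanticSegmentation.from_pretrained(
    "nvidia/mit-b1",
    num_labels=2,
    id2label={0: "background", 1: "kelp"},
    label2id={"background": 0, "kelp": 1},
)

args = TrainingArguments(
    output_dir="kelp-from-scratch-segformer-b1-lr-0.0001-cleaned",
    learning_rate=1e-4,
    per_device_train_batch_size=22,
    per_device_eval_batch_size=22,
    seed=42,
    lr_scheduler_type="cosine",  # optimizer defaults match the Adam settings above
    num_train_epochs=50,
    evaluation_strategy="steps",
    eval_steps=30,  # matches the 30-step evaluation cadence in the table below
)

# trainer = Trainer(model=model, args=args,
#                   train_dataset=train_ds, eval_dataset=eval_ds)  # datasets elided
# trainer.train()
```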

Training results

Training Loss | Epoch | Step | IoU (kelp) | Validation Loss
0.9943 0.15 30 0.0093 0.9841
0.9987 0.3 60 0.0091 0.9797
0.9912 0.46 90 0.0093 0.9822
0.9809 0.61 120 0.0089 0.9788
0.9882 0.76 150 0.0096 0.9791
0.9804 0.91 180 0.0070 0.9799
0.9987 1.07 210 0.0104 0.9275
0.997 1.22 240 0.0086 0.9696
0.8985 1.37 270 0.0086 0.9513
0.9921 1.52 300 0.0234 0.8703
0.9656 1.68 330 0.0115 0.9610
0.9229 1.83 360 0.0235 0.8871
0.9999 1.98 390 0.0117 0.9416
0.8409 2.13 420 0.0172 0.9212
0.9533 2.28 450 0.0322 0.8209
0.9815 2.44 480 0.0167 0.9070
0.9567 2.59 510 0.0314 0.8028
0.9747 2.74 540 0.0300 0.8489
0.983 2.89 570 0.0176 0.9069
0.963 3.05 600 0.0312 0.8878
0.9616 3.2 630 0.0249 0.8482
0.9395 3.35 660 0.0418 0.7781
0.8966 3.5 690 0.0155 0.8981
0.9939 3.65 720 0.0219 0.8737
0.9895 3.81 750 0.0107 0.9480
0.9434 3.96 780 0.0266 0.8413
0.9813 4.11 810 0.0323 0.8320
0.9051 4.26 840 0.0379 0.8097
0.9629 4.42 870 0.0447 0.7619
0.9067 4.57 900 0.0330 0.8137
0.9245 4.72 930 0.0253 0.8327
1.0 4.87 960 0.0273 0.8226
0.9869 5.03 990 0.0182 0.8993
0.9236 5.18 1020 0.0390 0.7772
0.9487 5.33 1050 0.0433 0.7845
0.9085 5.48 1080 0.0181 0.9285
0.9518 5.63 1110 0.0514 0.7551
0.9768 5.79 1140 0.0344 0.8475
0.8779 5.94 1170 0.0123 0.9069
0.9556 6.09 1200 0.0183 0.9066
0.8875 6.24 1230 0.1934 0.7787
0.9932 6.4 1260 0.0543 0.8902
0.9991 6.55 1290 0.0467 0.7678
0.9654 6.7 1320 0.0420 0.8434
0.9794 6.85 1350 0.0190 0.9163
0.9659 7.01 1380 0.0345 0.9441
0.8959 7.16 1410 0.0255 0.8717
0.9775 7.31 1440 0.0296 0.9072
0.9406 7.46 1470 0.0331 0.8282
0.9702 7.61 1500 0.0283 0.8532
0.9828 7.77 1530 0.0164 0.8719
0.9511 7.92 1560 0.0248 0.8392
0.9046 8.07 1590 0.0116 0.9260
0.9508 8.22 1620 0.0243 0.8499
0.9535 8.38 1650 0.0185 0.8567
0.9586 8.53 1680 0.0176 0.8867
0.947 8.68 1710 0.0296 0.7973
0.9404 8.83 1740 0.0137 0.8879
1.0 8.98 1770 0.0227 0.8902
0.9618 9.14 1800 0.0419 0.8119
0.8463 9.29 1830 0.0500 0.8065
0.9683 9.44 1860 0.0136 0.9266
0.9087 9.59 1890 0.0357 0.8041
0.946 9.75 1920 0.0426 0.8023
0.9723 9.9 1950 0.0470 0.7823
0.9487 10.05 1980 0.0218 0.8771
0.9483 10.2 2010 0.0265 0.8317
0.9678 10.36 2040 0.0512 0.7447
0.9909 10.51 2070 0.0266 0.8505
0.9688 10.66 2100 0.0315 0.8457
0.9617 10.81 2130 0.0388 0.7916
0.9105 10.96 2160 0.0641 0.7939
0.9447 11.12 2190 0.0338 0.8046
0.9127 11.27 2220 0.0883 0.7771
0.9151 11.42 2250 0.0285 0.8568
0.9339 11.57 2280 0.0773 0.7554
1.0 11.73 2310 0.0452 0.7623
0.9428 11.88 2340 0.0147 0.9230
1.0 12.03 2370 0.0266 0.8265
0.9432 12.18 2400 0.0284 0.8732
0.9436 12.34 2430 0.0398 0.7938
0.9772 12.49 2460 0.0345 0.8073
0.9552 12.64 2490 0.0125 0.9084
1.0 12.79 2520 0.0255 0.8099
0.953 12.94 2550 0.0694 0.7384
0.9225 13.1 2580 0.0286 0.8104
0.9119 13.25 2610 0.0312 0.8538
0.9726 13.4 2640 0.0511 0.7505
0.9674 13.55 2670 0.0504 0.7473
0.972 13.71 2700 0.0496 0.8009
0.9238 13.86 2730 0.0179 0.8479
0.9535 14.01 2760 0.1306 0.7997
0.9509 14.16 2790 0.0254 0.8065
0.8756 14.31 2820 0.0584 0.7479
0.9335 14.47 2850 0.0297 0.7988
0.956 14.62 2880 0.0311 0.8155
0.9544 14.77 2910 0.0526 0.8159
0.8577 14.92 2940 0.0770 0.7906
0.965 15.08 2970 0.0709 0.7446
0.987 15.23 3000 0.0479 0.7767
0.9692 15.38 3030 0.0895 0.8900
0.9602 15.53 3060 0.1262 0.8408
0.9586 15.69 3090 0.0643 0.7370
0.8871 15.84 3120 0.1753 0.7848
0.9259 15.99 3150 0.1110 0.7571
0.9463 16.14 3180 0.0651 0.7799
0.9489 16.29 3210 0.0627 0.7776
0.9814 16.45 3240 0.0476 0.7595
0.926 16.6 3270 0.0588 0.7354
0.8921 16.75 3300 0.0608 0.7637
0.9722 16.9 3330 0.0404 0.7916
0.9535 17.06 3360 0.0520 0.8781
0.9442 17.21 3390 0.0326 0.8189
0.945 17.36 3420 0.1141 0.8753
0.9799 17.51 3450 0.0678 0.7472
0.8504 17.66 3480 0.1633 0.8211
1.0 17.82 3510 0.0479 0.7849
0.9681 17.97 3540 0.0585 0.7741
0.9492 18.12 3570 0.0656 0.7374
0.9481 18.27 3600 0.0817 0.7382
0.9405 18.43 3630 0.0805 0.9278
0.8967 18.58 3660 0.0457 0.7692
0.9215 18.73 3690 0.0615 0.8308
0.9722 18.88 3720 0.1454 0.8367
0.9352 19.04 3750 0.1014 0.7641
0.9581 19.19 3780 0.1549 0.8425
0.9438 19.34 3810 0.0689 0.7524
0.976 19.49 3840 0.1321 0.8181
0.9248 19.64 3870 0.1782 0.8164
0.9114 19.8 3900 0.1553 0.7879
0.8975 19.95 3930 0.1875 0.7522
0.9696 20.1 3960 0.1521 0.8031
0.9217 20.25 3990 0.0667 0.7436
0.9375 20.41 4020 0.0902 0.9042
0.886 20.56 4050 0.0672 0.7541
0.9647 20.71 4080 0.1952 0.7983
0.9029 20.86 4110 0.0600 0.8339
0.9865 21.02 4140 0.0353 0.8191
0.9348 21.17 4170 0.0683 0.9285
0.965 21.32 4200 0.1153 0.7778
0.9006 21.47 4230 0.2049 0.7928
0.9726 21.62 4260 0.0687 0.7431
0.8811 21.78 4290 0.1643 0.7903
0.9622 21.93 4320 0.1069 0.7641
0.9267 22.08 4350 0.0647 0.7764
0.9729 22.23 4380 0.0770 0.7323
0.951 22.39 4410 0.1069 0.8905
0.976 22.54 4440 0.1024 0.7324
0.9763 22.69 4470 0.0679 0.8286
0.912 22.84 4500 0.1492 0.7784
0.8856 22.99 4530 0.1400 0.7411
0.9663 23.15 4560 0.1588 0.8195
0.9577 23.3 4590 0.0532 0.7803
0.9898 23.45 4620 0.1014 0.7892
0.9079 23.6 4650 0.0457 0.7695
0.9014 23.76 4680 0.1119 0.7742
0.959 23.91 4710 0.0781 0.7461
0.9762 24.06 4740 0.0852 0.8429
0.952 24.21 4770 0.0978 0.7348
0.9606 24.37 4800 0.0966 0.7263
0.93 24.52 4830 0.0707 0.7334
0.9514 24.67 4860 0.2207 0.7526
0.9639 24.82 4890 0.0877 0.7545
0.8319 24.97 4920 0.0751 0.7816
0.959 25.13 4950 0.0457 0.7726
0.9875 25.28 4980 0.0877 0.9052
0.9567 25.43 5010 0.0875 0.7308
0.8535 25.58 5040 0.1697 0.8189
0.903 25.74 5070 0.2176 0.7322
0.9654 25.89 5100 0.2082 0.7325
0.9139 26.04 5130 0.0856 0.7604
0.9684 26.19 5160 0.1764 0.8378
0.9869 26.35 5190 0.0372 0.8459
0.9325 26.5 5220 0.2127 0.7494
0.9396 26.65 5250 0.2123 0.7630
0.9522 26.8 5280 0.1121 0.7878
0.9404 26.95 5310 0.0783 0.7300
0.8336 27.11 5340 0.0862 0.8091
0.9827 27.26 5370 0.1633 0.7761
0.9743 27.41 5400 0.1033 0.7903
0.8255 27.56 5430 0.1535 0.7349
0.9828 27.72 5460 0.0835 0.7236
0.9607 27.87 5490 0.1012 0.7503
0.9659 28.02 5520 0.1087 0.7412
0.9467 28.17 5550 0.0687 0.7867
0.9261 28.32 5580 0.1773 0.8152
1.0 28.48 5610 0.0922 0.7728
0.9543 28.63 5640 0.2284 0.7482
0.9198 28.78 5670 0.2101 0.7313
0.9667 28.93 5700 0.1985 0.7698
0.8591 29.09 5730 0.0994 0.7528
0.9697 29.24 5760 0.1437 0.7865
0.9313 29.39 5790 0.1197 0.7443
0.9457 29.54 5820 0.1529 0.8172
0.9283 29.7 5850 0.1204 0.7310
0.8794 29.85 5880 0.2253 0.7703
0.9999 30.0 5910 0.0922 0.7463
1.0 30.15 5940 0.0763 0.7472
0.9674 30.3 5970 0.0678 0.7574
0.9543 30.46 6000 0.1619 0.7388
0.96 30.61 6030 0.1436 0.8416
0.9778 30.76 6060 0.0994 0.7353
0.9436 30.91 6090 0.1649 0.7740
0.9054 31.07 6120 0.1537 0.7387
0.967 31.22 6150 0.1574 0.7569
0.9174 31.37 6180 0.1378 0.7870
0.9667 31.52 6210 0.1505 0.7650
0.9848 31.68 6240 0.1231 0.7584
0.9514 31.83 6270 0.1188 0.7533
0.9179 31.98 6300 0.2073 0.7696
0.9733 32.13 6330 0.0941 0.7400
0.9177 32.28 6360 0.1431 0.8260
0.9338 32.44 6390 0.1259 0.7474
0.9704 32.59 6420 0.2298 0.7447
0.9133 32.74 6450 0.1500 0.7347
0.9121 32.89 6480 0.1538 0.7280
0.9649 33.05 6510 0.1617 0.7385
0.9113 33.2 6540 0.1399 0.7408
0.998 33.35 6570 0.1663 0.7621
0.9567 33.5 6600 0.1559 0.7560
0.9421 33.65 6630 0.1966 0.7766
0.9441 33.81 6660 0.1558 0.7314
0.934 33.96 6690 0.1846 0.7564
0.9874 34.11 6720 0.2541 0.7462
0.8515 34.26 6750 0.2071 0.7591
0.9204 34.42 6780 0.1673 0.7342
0.9358 34.57 6810 0.1883 0.7930
0.8267 34.72 6840 0.2290 0.7462
0.8998 34.87 6870 0.2199 0.7532
0.9496 35.03 6900 0.1121 0.7522
0.9854 35.18 6930 0.1238 0.7288
0.9971 35.33 6960 0.1982 0.7527
0.9621 35.48 6990 0.1837 0.7460
0.9626 35.63 7020 0.1268 0.7404
0.9037 35.79 7050 0.1184 0.7267
0.908 35.94 7080 0.1914 0.7388
0.996 36.09 7110 0.2036 0.7363
0.9635 36.24 7140 0.1858 0.7450
0.9446 36.4 7170 0.1363 0.7285
0.9808 36.55 7200 0.1578 0.7666
0.9212 36.7 7230 0.2064 0.7660
0.8472 36.85 7260 0.1804 0.7241
0.9328 37.01 7290 0.1143 0.7270
0.9276 37.16 7320 0.2104 0.7725
0.9599 37.31 7350 0.2237 0.7334
0.9058 37.46 7380 0.1586 0.7304
0.8654 37.61 7410 0.1439 0.7490
0.9653 37.77 7440 0.1785 0.7817
0.9201 37.92 7470 0.1178 0.7317
0.9545 38.07 7500 0.1523 0.7752
0.9484 38.22 7530 0.1208 0.7194
0.8723 38.38 7560 0.2017 0.7564
0.9555 38.53 7590 0.1065 0.7323
0.9654 38.68 7620 0.1721 0.7586
0.9044 38.83 7650 0.1482 0.7538
0.9745 38.98 7680 0.1507 0.7523
0.991 39.14 7710 0.1344 0.7389
0.9504 39.29 7740 0.1108 0.7170
0.9948 39.44 7770 0.1555 0.7555
0.9458 39.59 7800 0.1324 0.7640
0.9725 39.75 7830 0.1792 0.7599
0.9747 39.9 7860 0.1785 0.7485
0.9779 40.05 7890 0.1751 0.7391
0.9325 40.2 7920 0.2171 0.7406
0.8857 40.36 7950 0.1687 0.7203
0.9229 40.51 7980 0.2092 0.7256
0.9177 40.66 8010 0.1453 0.7217
0.9315 40.81 8040 0.1878 0.7415
0.9942 40.96 8070 0.1602 0.7443
0.9101 41.12 8100 0.1596 0.7546
0.9029 41.27 8130 0.1510 0.7346
0.994 41.42 8160 0.1474 0.7336
0.9862 41.57 8190 0.1274 0.7234
0.9136 41.73 8220 0.1425 0.7433
0.9723 41.88 8250 0.1138 0.7286
0.937 42.03 8280 0.1345 0.7425
0.9773 42.18 8310 0.1405 0.7342
0.9655 42.34 8340 0.1193 0.7290
0.9165 42.49 8370 0.1306 0.7318
0.9409 42.64 8400 0.1504 0.7364
0.976 42.79 8430 0.2013 0.7437
1.0 42.94 8460 0.1821 0.7342
0.967 43.1 8490 0.1685 0.7384
0.9877 43.25 8520 0.1471 0.7409
0.9736 43.4 8550 0.1682 0.7372
1.0 43.55 8580 0.1467 0.7332
0.8718 43.71 8610 0.1380 0.7329
0.997 43.86 8640 0.1314 0.7350
1.0 44.01 8670 0.1372 0.7361
1.0 44.16 8700 0.1442 0.7400
0.8811 44.31 8730 0.1603 0.7432
1.0 44.47 8760 0.1651 0.7373
0.9233 44.62 8790 0.2112 0.7484
0.9555 44.77 8820 0.1837 0.7375
0.8655 44.92 8850 0.1394 0.7348
0.9908 45.08 8880 0.1355 0.7375
0.8959 45.23 8910 0.1391 0.7354
0.9595 45.38 8940 0.1437 0.7325
0.9383 45.53 8970 0.1448 0.7382
0.9417 45.69 9000 0.1793 0.7486
0.9317 45.84 9030 0.1720 0.7391
0.9744 45.99 9060 0.1552 0.7388
0.9443 46.14 9090 0.1486 0.7345
0.9325 46.29 9120 0.1391 0.7383
0.9421 46.45 9150 0.1539 0.7393
0.9451 46.6 9180 0.1436 0.7328
0.9538 46.75 9210 0.1419 0.7342
0.955 46.9 9240 0.1581 0.7433
0.9611 47.06 9270 0.1652 0.7407
0.9296 47.21 9300 0.1716 0.7377
0.9413 47.36 9330 0.1567 0.7374
0.9372 47.51 9360 0.1511 0.7376
0.9524 47.66 9390 0.1586 0.7369
0.9681 47.82 9420 0.1411 0.7370
0.9295 47.97 9450 0.1506 0.7394
0.9983 48.12 9480 0.1503 0.7318
0.8795 48.27 9510 0.1363 0.7276
0.9147 48.43 9540 0.1502 0.7331
0.9063 48.58 9570 0.1556 0.7400
0.9501 48.73 9600 0.1616 0.7346
1.0 48.88 9630 0.1546 0.7307
0.9565 49.04 9660 0.1458 0.7335
0.9266 49.19 9690 0.1525 0.7336
0.935 49.34 9720 0.1516 0.7333
0.8765 49.49 9750 0.1364 0.7313
0.9403 49.64 9780 0.1501 0.7282
0.91 49.8 9810 0.1577 0.7422
0.9521 49.95 9840 0.1757 0.7377

Framework versions

  • Transformers 4.37.1
  • PyTorch 2.1.2
  • Datasets 2.16.1
  • Tokenizers 0.15.1

Model size

  • 13.7M parameters (F32, Safetensors)