---
license: other
base_model: nvidia/mit-b1
tags:
- vision
- image-segmentation
- generated_from_trainer
model-index:
- name: segformer-b1-finetuned-Hiking
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# segformer-b1-finetuned-Hiking
This model is a fine-tuned version of [nvidia/mit-b1](https://huggingface.co/nvidia/mit-b1) on the twdent/Hiking dataset.
It achieves the following results on the evaluation set (see the sketch after this list for one way such metrics can be recomputed):
- Loss: 0.1129
- Mean Iou: 0.6261
- Mean Accuracy: 0.9707
- Overall Accuracy: 0.9700
- Accuracy Unlabeled: nan
- Accuracy Traversable: 0.9730
- Accuracy Non-traversable: 0.9684
- Iou Unlabeled: 0.0
- Iou Traversable: 0.9226
- Iou Non-traversable: 0.9557
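
The per-class IoU and accuracy values above follow the layout produced by the `mean_iou` metric from the `evaluate` library. As a hedged sketch (the exact evaluation script is not part of this card, and the `ignore_index` value is an assumption), metrics of this form can be recomputed from predicted and reference segmentation maps like so:

```python
import numpy as np
import evaluate

# Load the mean IoU metric from the Hugging Face `evaluate` library.
metric = evaluate.load("mean_iou")

# `predictions` and `references` are lists of 2D integer arrays (H x W),
# with class ids 0 = unlabeled, 1 = traversable, 2 = non-traversable
# (the three classes reported above).
predictions = [np.random.randint(0, 3, size=(512, 512))]  # placeholder prediction
references = [np.random.randint(0, 3, size=(512, 512))]   # placeholder ground truth

results = metric.compute(
    predictions=predictions,
    references=references,
    num_labels=3,
    ignore_index=255,   # assumption: pixels marked 255 are ignored
    reduce_labels=False,
)
print(results["mean_iou"], results["mean_accuracy"], results["overall_accuracy"])
print(results["per_category_iou"], results["per_category_accuracy"])
```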
## Model description
SegFormer model with an MiT-b1 encoder, fine-tuned for semantic segmentation of hiking-trail imagery into traversable and non-traversable regions (plus an unlabeled class), as reflected in the evaluation labels above.
## Intended uses & limitations
The model is intended for segmenting hiking/trail scenes into traversable and non-traversable terrain. Limitations are not documented here; behaviour on imagery that differs from the twdent/Hiking data is unknown.
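
A minimal inference sketch is shown below. It assumes the checkpoint is published on the Hub as `twdent/segformer-b1-finetuned-Hiking` (the repo id is an assumption based on the model and dataset names) and that the preprocessing configuration was saved alongside the weights:

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, SegformerForSemanticSegmentation

repo_id = "twdent/segformer-b1-finetuned-Hiking"  # assumed Hub repo id
processor = AutoImageProcessor.from_pretrained(repo_id)
model = SegformerForSemanticSegmentation.from_pretrained(repo_id)
model.eval()

image = Image.open("trail.jpg").convert("RGB")  # any hiking-scene photo
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape (1, num_labels, H/4, W/4)

# Upsample logits to the input resolution and take the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
segmentation = upsampled.argmax(dim=1)[0]  # (H, W) map of class ids
```

The mapping from class index to label name (unlabeled / traversable / non-traversable) should be read from `model.config.id2label` rather than assumed.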
## Training and evaluation data
The model was fine-tuned and evaluated on the twdent/Hiking dataset. Split sizes and annotation details are not documented in this card; a minimal loading sketch is shown below.
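
A sketch of loading the data, assuming the dataset is hosted on the Hub as `twdent/Hiking` (the column and split names are not confirmed by this card):

```python
from datasets import load_dataset

# Assumption: the dataset is available on the Hub under this id.
ds = load_dataset("twdent/Hiking")
print(ds)  # shows the available splits and column names

# Column names (e.g. an RGB image and an integer segmentation mask) are not
# documented here; inspect one example once the schema is known, e.g.:
# example = ds["train"][0]
```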
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch mirroring them follows the list):
- learning_rate: 6e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
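
A hedged sketch of how these settings map onto `transformers.TrainingArguments` (the actual training script is not part of this card; the output path is a placeholder):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="segformer-b1-finetuned-Hiking",  # placeholder output path
    learning_rate=6e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    num_train_epochs=50,
    lr_scheduler_type="linear",
    # Adam settings matching the card: betas=(0.9, 0.999), epsilon=1e-08
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```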
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Traversable | Accuracy Non-traversable | Iou Unlabeled | Iou Traversable | Iou Non-traversable |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:--------------------:|:------------------------:|:-------------:|:---------------:|:-------------------:|
| 0.5526 | 1.33 | 20 | 0.7232 | 0.5814 | 0.9404 | 0.9333 | nan | 0.9630 | 0.9177 | 0.0 | 0.8433 | 0.9009 |
| 0.3768 | 2.67 | 40 | 0.3616 | 0.5929 | 0.9504 | 0.9453 | nan | 0.9666 | 0.9341 | 0.0 | 0.8605 | 0.9183 |
| 0.3271 | 4.0 | 60 | 0.2683 | 0.6080 | 0.9606 | 0.9571 | nan | 0.9718 | 0.9494 | 0.0 | 0.8883 | 0.9358 |
| 0.2377 | 5.33 | 80 | 0.2754 | 0.5869 | 0.9256 | 0.9429 | nan | 0.8706 | 0.9807 | 0.0 | 0.8415 | 0.9191 |
| 0.2242 | 6.67 | 100 | 0.2299 | 0.6027 | 0.9447 | 0.9545 | nan | 0.9134 | 0.9760 | 0.0 | 0.8739 | 0.9341 |
| 0.2458 | 8.0 | 120 | 0.1939 | 0.6207 | 0.9604 | 0.9667 | nan | 0.9402 | 0.9805 | 0.0 | 0.9107 | 0.9513 |
| 0.1541 | 9.33 | 140 | 0.1988 | 0.6121 | 0.9654 | 0.9599 | nan | 0.9831 | 0.9477 | 0.0 | 0.8967 | 0.9395 |
| 0.1448 | 10.67 | 160 | 0.1722 | 0.6202 | 0.9677 | 0.9662 | nan | 0.9725 | 0.9629 | 0.0 | 0.9112 | 0.9494 |
| 0.1533 | 12.0 | 180 | 0.2112 | 0.5951 | 0.9570 | 0.9454 | nan | 0.9941 | 0.9199 | 0.0 | 0.8676 | 0.9176 |
| 0.107 | 13.33 | 200 | 0.1658 | 0.6139 | 0.9626 | 0.9614 | nan | 0.9665 | 0.9587 | 0.0 | 0.8992 | 0.9426 |
| 0.109 | 14.67 | 220 | 0.1342 | 0.6267 | 0.9714 | 0.9712 | nan | 0.9724 | 0.9705 | 0.0 | 0.9231 | 0.9569 |
| 0.1092 | 16.0 | 240 | 0.1448 | 0.6173 | 0.9690 | 0.9636 | nan | 0.9860 | 0.9519 | 0.0 | 0.9065 | 0.9453 |
| 0.0971 | 17.33 | 260 | 0.1282 | 0.6216 | 0.9691 | 0.9673 | nan | 0.9747 | 0.9635 | 0.0 | 0.9136 | 0.9512 |
| 0.1448 | 18.67 | 280 | 0.1504 | 0.6155 | 0.9661 | 0.9619 | nan | 0.9795 | 0.9526 | 0.0 | 0.9032 | 0.9434 |
| 0.0797 | 20.0 | 300 | 0.1312 | 0.6209 | 0.9669 | 0.9666 | nan | 0.9680 | 0.9659 | 0.0 | 0.9124 | 0.9503 |
| 0.0766 | 21.33 | 320 | 0.1164 | 0.6251 | 0.9667 | 0.9696 | nan | 0.9574 | 0.9760 | 0.0 | 0.9198 | 0.9555 |
| 0.0822 | 22.67 | 340 | 0.1365 | 0.6171 | 0.9638 | 0.9639 | nan | 0.9635 | 0.9641 | 0.0 | 0.9050 | 0.9464 |
| 0.075 | 24.0 | 360 | 0.1401 | 0.6160 | 0.9679 | 0.9621 | nan | 0.9862 | 0.9495 | 0.0 | 0.9046 | 0.9433 |
| 0.0684 | 25.33 | 380 | 0.1317 | 0.6190 | 0.9687 | 0.9645 | nan | 0.9822 | 0.9552 | 0.0 | 0.9099 | 0.9472 |
| 0.0767 | 26.67 | 400 | 0.1293 | 0.6195 | 0.9699 | 0.9651 | nan | 0.9851 | 0.9547 | 0.0 | 0.9107 | 0.9478 |
| 0.0576 | 28.0 | 420 | 0.1195 | 0.6236 | 0.9701 | 0.9679 | nan | 0.9771 | 0.9631 | 0.0 | 0.9180 | 0.9529 |
| 0.0596 | 29.33 | 440 | 0.1179 | 0.6248 | 0.9717 | 0.9692 | nan | 0.9794 | 0.9639 | 0.0 | 0.9204 | 0.9541 |
| 0.0564 | 30.67 | 460 | 0.1110 | 0.6264 | 0.9719 | 0.9701 | nan | 0.9777 | 0.9661 | 0.0 | 0.9233 | 0.9559 |
| 0.0496 | 32.0 | 480 | 0.1063 | 0.6284 | 0.9726 | 0.9714 | nan | 0.9761 | 0.9690 | 0.0 | 0.9271 | 0.9581 |
| 0.0722 | 33.33 | 500 | 0.1073 | 0.6272 | 0.9712 | 0.9711 | nan | 0.9716 | 0.9708 | 0.0 | 0.9244 | 0.9573 |
| 0.0465 | 34.67 | 520 | 0.1228 | 0.6220 | 0.9692 | 0.9669 | nan | 0.9763 | 0.9620 | 0.0 | 0.9150 | 0.9510 |
| 0.0655 | 36.0 | 540 | 0.1142 | 0.6245 | 0.9704 | 0.9689 | nan | 0.9752 | 0.9656 | 0.0 | 0.9196 | 0.9540 |
| 0.0516 | 37.33 | 560 | 0.1197 | 0.6238 | 0.9687 | 0.9684 | nan | 0.9696 | 0.9677 | 0.0 | 0.9181 | 0.9533 |
| 0.0774 | 38.67 | 580 | 0.1114 | 0.6266 | 0.9706 | 0.9704 | nan | 0.9712 | 0.9700 | 0.0 | 0.9234 | 0.9565 |
| 0.0572 | 40.0 | 600 | 0.1124 | 0.6261 | 0.9707 | 0.9700 | nan | 0.9730 | 0.9684 | 0.0 | 0.9227 | 0.9558 |
| 0.0554 | 41.33 | 620 | 0.1116 | 0.6273 | 0.9718 | 0.9709 | nan | 0.9747 | 0.9688 | 0.0 | 0.9248 | 0.9570 |
| 0.0438 | 42.67 | 640 | 0.1192 | 0.6259 | 0.9707 | 0.9694 | nan | 0.9746 | 0.9667 | 0.0 | 0.9225 | 0.9551 |
| 0.0486 | 44.0 | 660 | 0.1186 | 0.6248 | 0.9709 | 0.9689 | nan | 0.9775 | 0.9643 | 0.0 | 0.9203 | 0.9540 |
| 0.0582 | 45.33 | 680 | 0.1194 | 0.6250 | 0.9705 | 0.9691 | nan | 0.9751 | 0.9660 | 0.0 | 0.9209 | 0.9542 |
| 0.0643 | 46.67 | 700 | 0.1157 | 0.6252 | 0.9706 | 0.9692 | nan | 0.9750 | 0.9662 | 0.0 | 0.9209 | 0.9546 |
| 0.057 | 48.0 | 720 | 0.1181 | 0.6251 | 0.9708 | 0.9691 | nan | 0.9763 | 0.9654 | 0.0 | 0.9210 | 0.9543 |
| 0.0468 | 49.33 | 740 | 0.1129 | 0.6261 | 0.9707 | 0.9700 | nan | 0.9730 | 0.9684 | 0.0 | 0.9226 | 0.9557 |
### Framework versions
- Transformers 4.35.0.dev0
- Pytorch 2.1.0+cu118
- Datasets 2.14.5
- Tokenizers 0.14.0