# segformer-b0-finetuned-net-4Sep
This model is a fine-tuned version of PushkarA07/segformer-b0-finetuned-net-4Sep on the PushkarA07/batch2-tiles_W5 dataset. It achieves the following results on the evaluation set (a minimal usage sketch follows these metrics):
- Loss: 0.0035
- Mean Iou: 0.8702
- Mean Accuracy: 0.9096
- Overall Accuracy: 0.9987
- Accuracy Abnormality: 0.8196
- Iou Abnormality: 0.7417
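
The snippet below is a minimal inference sketch, not part of the original card: it assumes the checkpoint is loaded directly from this repository, that the input is an RGB tile image, and that `tile.png` is a placeholder file name.

```python
import torch
from PIL import Image
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation

checkpoint = "PushkarA07/segformer-b0-finetuned-net-4Sep"
processor = SegformerImageProcessor.from_pretrained(checkpoint)
model = SegformerForSemanticSegmentation.from_pretrained(checkpoint)
model.eval()

image = Image.open("tile.png").convert("RGB")  # placeholder input tile
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # (1, num_labels, H/4, W/4)

# Upsample the logits to the input resolution and take the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred_mask = upsampled.argmax(dim=1)[0]  # (H, W) integer label map
```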
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (see the configuration sketch after this list):
- learning_rate: 1e-06
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch fused) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 10
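
A sketch of the corresponding `TrainingArguments`, assuming the standard `transformers.Trainer` setup; only the values listed above come from this card, `output_dir` is a placeholder, and dataset, image-processor, and metric wiring are omitted.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="segformer-b0-finetuned-net-4Sep",  # placeholder output directory
    learning_rate=1e-06,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch_fused",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    num_train_epochs=10,
)
# training_args would then be passed to transformers.Trainer together with the
# model, the train/eval datasets, and a mean-IoU compute_metrics function.
```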
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Abnormality | Iou Abnormality |
|---|---|---|---|---|---|---|---|---|
| 0.0025 | 0.4348 | 10 | 0.0035 | 0.8709 | 0.9118 | 0.9987 | 0.8241 | 0.7432 |
| 0.0044 | 0.8696 | 20 | 0.0035 | 0.8703 | 0.9105 | 0.9987 | 0.8215 | 0.7420 |
| 0.0128 | 1.3043 | 30 | 0.0035 | 0.8701 | 0.9090 | 0.9987 | 0.8185 | 0.7415 |
| 0.0056 | 1.7391 | 40 | 0.0035 | 0.8700 | 0.9087 | 0.9987 | 0.8180 | 0.7413 |
| 0.0083 | 2.1739 | 50 | 0.0035 | 0.8702 | 0.9100 | 0.9987 | 0.8205 | 0.7418 |
| 0.0051 | 2.6087 | 60 | 0.0035 | 0.8703 | 0.9101 | 0.9987 | 0.8207 | 0.7420 |
| 0.0075 | 3.0435 | 70 | 0.0036 | 0.8704 | 0.9111 | 0.9987 | 0.8227 | 0.7422 |
| 0.0078 | 3.4783 | 80 | 0.0035 | 0.8704 | 0.9105 | 0.9987 | 0.8214 | 0.7421 |
| 0.0033 | 3.9130 | 90 | 0.0035 | 0.8706 | 0.9118 | 0.9987 | 0.8241 | 0.7426 |
| 0.0062 | 4.3478 | 100 | 0.0035 | 0.8702 | 0.9111 | 0.9987 | 0.8227 | 0.7418 |
| 0.003 | 4.7826 | 110 | 0.0036 | 0.8699 | 0.9088 | 0.9987 | 0.8181 | 0.7412 |
| 0.0114 | 5.2174 | 120 | 0.0035 | 0.8699 | 0.9087 | 0.9987 | 0.8178 | 0.7412 |
| 0.014 | 5.6522 | 130 | 0.0036 | 0.8699 | 0.9086 | 0.9987 | 0.8177 | 0.7412 |
| 0.0118 | 6.0870 | 140 | 0.0035 | 0.8700 | 0.9086 | 0.9987 | 0.8176 | 0.7413 |
| 0.0058 | 6.5217 | 150 | 0.0036 | 0.8701 | 0.9098 | 0.9987 | 0.8201 | 0.7416 |
| 0.0109 | 6.9565 | 160 | 0.0035 | 0.8701 | 0.9096 | 0.9987 | 0.8197 | 0.7416 |
| 0.0044 | 7.3913 | 170 | 0.0036 | 0.8708 | 0.9126 | 0.9987 | 0.8257 | 0.7429 |
| 0.0036 | 7.8261 | 180 | 0.0035 | 0.8704 | 0.9104 | 0.9987 | 0.8213 | 0.7422 |
| 0.0143 | 8.2609 | 190 | 0.0035 | 0.8701 | 0.9089 | 0.9987 | 0.8183 | 0.7415 |
| 0.0083 | 8.6957 | 200 | 0.0035 | 0.8705 | 0.9110 | 0.9987 | 0.8224 | 0.7424 |
| 0.0039 | 9.1304 | 210 | 0.0035 | 0.8704 | 0.9108 | 0.9987 | 0.8220 | 0.7421 |
| 0.0053 | 9.5652 | 220 | 0.0036 | 0.8704 | 0.9108 | 0.9987 | 0.8222 | 0.7421 |
| 0.0045 | 10.0 | 230 | 0.0035 | 0.8702 | 0.9096 | 0.9987 | 0.8196 | 0.7417 |
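
The columns above match the output of the `evaluate` library's "mean_iou" metric. The sketch below shows how such numbers can be computed; it assumes a two-label setup (background vs. abnormality) and `ignore_index=255`, neither of which is stated in this card, and the arrays are placeholders.

```python
import numpy as np
import evaluate

mean_iou = evaluate.load("mean_iou")

# predictions / references: lists of (H, W) integer label maps.
pred = np.zeros((64, 64), dtype=np.int64)          # placeholder prediction
ref = np.zeros((64, 64), dtype=np.int64)           # placeholder ground truth
ref[16:32, 16:32] = 1                              # a small "abnormality" region

results = mean_iou.compute(
    predictions=[pred],
    references=[ref],
    num_labels=2,
    ignore_index=255,
    reduce_labels=False,
)

print(results["mean_iou"], results["mean_accuracy"], results["overall_accuracy"])
# "Accuracy Abnormality" / "Iou Abnormality" correspond to the per-category
# entries for label 1: results["per_category_accuracy"][1], results["per_category_iou"][1].
```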
### Framework versions
- Transformers 4.57.1
- Pytorch 2.8.0+cu126
- Datasets 4.2.0
- Tokenizers 0.22.1