segformer-b0-finetuned-pothole_0910_512

This model is a fine-tuned version of nvidia/mit-b0 on the alphaca/orthophoto_pothole_etc_0911 dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0060
  • Mean Iou: 0.3613
  • Mean Accuracy: 0.7225
  • Overall Accuracy: 0.7225
  • Accuracy Unlabeled: nan
  • Accuracy Pothole: 0.7225
  • Iou Unlabeled: 0.0
  • Iou Pothole: 0.7225
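The metrics above describe a two-class segmentation task: class 0 (unlabeled) and class 1 (pothole). Below is a minimal inference sketch. To keep it runnable offline it builds a randomly initialized SegFormer-B0-shaped model; for real predictions, load this card's checkpoint with `from_pretrained` instead (the 512×512 input size is an assumption taken from the model name):

```python
import torch
from transformers import SegformerConfig, SegformerForSemanticSegmentation

# For real use, load the fine-tuned weights from the hub instead:
#   model = SegformerForSemanticSegmentation.from_pretrained(
#       "alphaca/segformer-b0-finetuned-pothole_0910_512")
# Here a randomly initialized B0-shaped model keeps the sketch offline.
config = SegformerConfig(num_labels=2)  # 0 = unlabeled, 1 = pothole
model = SegformerForSemanticSegmentation(config).eval()

pixel_values = torch.rand(1, 3, 512, 512)  # stand-in for a preprocessed 512x512 tile
with torch.no_grad():
    logits = model(pixel_values=pixel_values).logits  # (1, 2, 128, 128)

# SegFormer predicts at 1/4 of the input resolution; upsample the logits
# back to full size before taking the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=(512, 512), mode="bilinear", align_corners=False)
mask = upsampled.argmax(dim=1)[0]  # (512, 512) tensor of class ids
```

For a real image, run it through the checkpoint's `SegformerImageProcessor` to get `pixel_values` instead of the random tensor above.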

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0001
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 300
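With a linear scheduler and no warmup, the learning rate decays from 1e-4 to zero over the run's 5400 optimizer steps (300 epochs at 18 steps per epoch, inferred from the Step column in the results table). A small sketch of that decay; the helper below mirrors the shape of transformers' `get_linear_schedule_with_warmup` but is not that function:

```python
def linear_lr(step, total_steps=5400, base_lr=1e-4, warmup_steps=0):
    """Linearly decay the learning rate to zero after an optional warmup."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)

# 300 epochs at 18 optimizer steps per epoch = 5400 total steps.
print(linear_lr(0))      # base_lr (1e-4) at the start
print(linear_lr(2700))   # half of base_lr midway through
print(linear_lr(5400))   # 0.0 at the end
```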

Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Pothole | Iou Unlabeled | Iou Pothole |
|---|---|---|---|---|---|---|---|---|---|---|
| 0.1419 | 5.5556 | 100 | 0.1363 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 |
| 0.0483 | 11.1111 | 200 | 0.0476 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 |
| 0.0264 | 16.6667 | 300 | 0.0396 | 0.0002 | 0.0005 | 0.0005 | nan | 0.0005 | 0.0 | 0.0005 |
| 0.025 | 22.2222 | 400 | 0.0196 | 0.0201 | 0.0401 | 0.0401 | nan | 0.0401 | 0.0 | 0.0401 |
| 0.0136 | 27.7778 | 500 | 0.0202 | 0.0886 | 0.1771 | 0.1771 | nan | 0.1771 | 0.0 | 0.1771 |
| 0.0118 | 33.3333 | 600 | 0.0201 | 0.1940 | 0.3880 | 0.3880 | nan | 0.3880 | 0.0 | 0.3880 |
| 0.0087 | 38.8889 | 700 | 0.0132 | 0.2738 | 0.5476 | 0.5476 | nan | 0.5476 | 0.0 | 0.5476 |
| 0.0115 | 44.4444 | 800 | 0.0116 | 0.2922 | 0.5843 | 0.5843 | nan | 0.5843 | 0.0 | 0.5843 |
| 0.0061 | 50.0 | 900 | 0.0099 | 0.2666 | 0.5331 | 0.5331 | nan | 0.5331 | 0.0 | 0.5331 |
| 0.0049 | 55.5556 | 1000 | 0.0094 | 0.3046 | 0.6091 | 0.6091 | nan | 0.6091 | 0.0 | 0.6091 |
| 0.0038 | 61.1111 | 1100 | 0.0087 | 0.3494 | 0.6989 | 0.6989 | nan | 0.6989 | 0.0 | 0.6989 |
| 0.0056 | 66.6667 | 1200 | 0.0073 | 0.3532 | 0.7063 | 0.7063 | nan | 0.7063 | 0.0 | 0.7063 |
| 0.0045 | 72.2222 | 1300 | 0.0083 | 0.3509 | 0.7019 | 0.7019 | nan | 0.7019 | 0.0 | 0.7019 |
| 0.0064 | 77.7778 | 1400 | 0.0070 | 0.3438 | 0.6875 | 0.6875 | nan | 0.6875 | 0.0 | 0.6875 |
| 0.0067 | 83.3333 | 1500 | 0.0068 | 0.3266 | 0.6533 | 0.6533 | nan | 0.6533 | 0.0 | 0.6533 |
| 0.0045 | 88.8889 | 1600 | 0.0066 | 0.3497 | 0.6995 | 0.6995 | nan | 0.6995 | 0.0 | 0.6995 |
| 0.004 | 94.4444 | 1700 | 0.0065 | 0.3421 | 0.6841 | 0.6841 | nan | 0.6841 | 0.0 | 0.6841 |
| 0.0031 | 100.0 | 1800 | 0.0071 | 0.3749 | 0.7498 | 0.7498 | nan | 0.7498 | 0.0 | 0.7498 |
| 0.0052 | 105.5556 | 1900 | 0.0062 | 0.3708 | 0.7416 | 0.7416 | nan | 0.7416 | 0.0 | 0.7416 |
| 0.0025 | 111.1111 | 2000 | 0.0060 | 0.3602 | 0.7205 | 0.7205 | nan | 0.7205 | 0.0 | 0.7205 |
| 0.003 | 116.6667 | 2100 | 0.0060 | 0.3669 | 0.7338 | 0.7338 | nan | 0.7338 | 0.0 | 0.7338 |
| 0.0032 | 122.2222 | 2200 | 0.0059 | 0.3756 | 0.7512 | 0.7512 | nan | 0.7512 | 0.0 | 0.7512 |
| 0.0023 | 127.7778 | 2300 | 0.0058 | 0.3579 | 0.7158 | 0.7158 | nan | 0.7158 | 0.0 | 0.7158 |
| 0.0035 | 133.3333 | 2400 | 0.0058 | 0.3609 | 0.7218 | 0.7218 | nan | 0.7218 | 0.0 | 0.7218 |
| 0.0017 | 138.8889 | 2500 | 0.0056 | 0.3464 | 0.6928 | 0.6928 | nan | 0.6928 | 0.0 | 0.6928 |
| 0.0042 | 144.4444 | 2600 | 0.0060 | 0.3532 | 0.7064 | 0.7064 | nan | 0.7064 | 0.0 | 0.7064 |
| 0.0052 | 150.0 | 2700 | 0.0057 | 0.3606 | 0.7212 | 0.7212 | nan | 0.7212 | 0.0 | 0.7212 |
| 0.0033 | 155.5556 | 2800 | 0.0057 | 0.3516 | 0.7032 | 0.7032 | nan | 0.7032 | 0.0 | 0.7032 |
| 0.0018 | 161.1111 | 2900 | 0.0057 | 0.3614 | 0.7228 | 0.7228 | nan | 0.7228 | 0.0 | 0.7228 |
| 0.003 | 166.6667 | 3000 | 0.0060 | 0.3784 | 0.7567 | 0.7567 | nan | 0.7567 | 0.0 | 0.7567 |
| 0.003 | 172.2222 | 3100 | 0.0057 | 0.3635 | 0.7270 | 0.7270 | nan | 0.7270 | 0.0 | 0.7270 |
| 0.0027 | 177.7778 | 3200 | 0.0055 | 0.3653 | 0.7307 | 0.7307 | nan | 0.7307 | 0.0 | 0.7307 |
| 0.0014 | 183.3333 | 3300 | 0.0057 | 0.3537 | 0.7075 | 0.7075 | nan | 0.7075 | 0.0 | 0.7075 |
| 0.0017 | 188.8889 | 3400 | 0.0056 | 0.3695 | 0.7389 | 0.7389 | nan | 0.7389 | 0.0 | 0.7389 |
| 0.0015 | 194.4444 | 3500 | 0.0057 | 0.3736 | 0.7473 | 0.7473 | nan | 0.7473 | 0.0 | 0.7473 |
| 0.0029 | 200.0 | 3600 | 0.0057 | 0.3561 | 0.7122 | 0.7122 | nan | 0.7122 | 0.0 | 0.7122 |
| 0.0035 | 205.5556 | 3700 | 0.0057 | 0.3558 | 0.7115 | 0.7115 | nan | 0.7115 | 0.0 | 0.7115 |
| 0.003 | 211.1111 | 3800 | 0.0058 | 0.3479 | 0.6958 | 0.6958 | nan | 0.6958 | 0.0 | 0.6958 |
| 0.0024 | 216.6667 | 3900 | 0.0057 | 0.3694 | 0.7388 | 0.7388 | nan | 0.7388 | 0.0 | 0.7388 |
| 0.0027 | 222.2222 | 4000 | 0.0057 | 0.3620 | 0.7240 | 0.7240 | nan | 0.7240 | 0.0 | 0.7240 |
| 0.0019 | 227.7778 | 4100 | 0.0058 | 0.3653 | 0.7306 | 0.7306 | nan | 0.7306 | 0.0 | 0.7306 |
| 0.0015 | 233.3333 | 4200 | 0.0057 | 0.3691 | 0.7381 | 0.7381 | nan | 0.7381 | 0.0 | 0.7381 |
| 0.0008 | 238.8889 | 4300 | 0.0059 | 0.3573 | 0.7146 | 0.7146 | nan | 0.7146 | 0.0 | 0.7146 |
| 0.002 | 244.4444 | 4400 | 0.0061 | 0.3475 | 0.6950 | 0.6950 | nan | 0.6950 | 0.0 | 0.6950 |
| 0.0013 | 250.0 | 4500 | 0.0060 | 0.3583 | 0.7167 | 0.7167 | nan | 0.7167 | 0.0 | 0.7167 |
| 0.0021 | 255.5556 | 4600 | 0.0060 | 0.3515 | 0.7031 | 0.7031 | nan | 0.7031 | 0.0 | 0.7031 |
| 0.0025 | 261.1111 | 4700 | 0.0061 | 0.3550 | 0.7100 | 0.7100 | nan | 0.7100 | 0.0 | 0.7100 |
| 0.0032 | 266.6667 | 4800 | 0.0060 | 0.3588 | 0.7176 | 0.7176 | nan | 0.7176 | 0.0 | 0.7176 |
| 0.0021 | 272.2222 | 4900 | 0.0059 | 0.3635 | 0.7270 | 0.7270 | nan | 0.7270 | 0.0 | 0.7270 |
| 0.0008 | 277.7778 | 5000 | 0.0060 | 0.3659 | 0.7319 | 0.7319 | nan | 0.7319 | 0.0 | 0.7319 |
| 0.0013 | 283.3333 | 5100 | 0.0059 | 0.3662 | 0.7325 | 0.7325 | nan | 0.7325 | 0.0 | 0.7325 |
| 0.0011 | 288.8889 | 5200 | 0.0060 | 0.3589 | 0.7179 | 0.7179 | nan | 0.7179 | 0.0 | 0.7179 |
| 0.0021 | 294.4444 | 5300 | 0.0060 | 0.3590 | 0.7179 | 0.7179 | nan | 0.7179 | 0.0 | 0.7179 |
| 0.0023 | 300.0 | 5400 | 0.0060 | 0.3613 | 0.7225 | 0.7225 | nan | 0.7225 | 0.0 | 0.7225 |
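Throughout training the unlabeled class has nan accuracy and 0.0 IoU, so Mean Iou sits at roughly half of Iou Pothole: it appears to be the unweighted mean of the two per-class IoUs ((0.0 + 0.7225) / 2 ≈ 0.3613), while Mean Accuracy averages only over classes present in the labels. A sketch of that arithmetic, assuming a standard confusion-matrix-based mean-IoU computation; the matrix below is a toy reconstruction, not the actual evaluation counts:

```python
import numpy as np

def seg_metrics(conf):
    """Per-class accuracy and IoU from a (num_classes x num_classes)
    confusion matrix, rows = ground truth, cols = prediction."""
    conf = conf.astype(float)
    tp = np.diag(conf)
    gt = conf.sum(axis=1)    # pixels per ground-truth class
    pred = conf.sum(axis=0)  # pixels per predicted class
    with np.errstate(invalid="ignore", divide="ignore"):
        acc = tp / gt                 # nan when a class never appears in the labels
        iou = tp / (gt + pred - tp)   # intersection over union
    return acc, iou

# Toy counts mimicking this card: class 0 (unlabeled) absent from the
# ground truth, class 1 (pothole) recalled on 72.25% of its pixels.
conf = np.array([[0, 0],
                 [2775, 7225]])
acc, iou = seg_metrics(conf)
print(np.nanmean(acc))  # mean accuracy over labeled classes -> 0.7225
print(np.nanmean(iou))  # mean IoU: (0.0 + 0.7225) / 2 = 0.36125
```

Note that the unlabeled IoU comes out 0.0 rather than nan (its denominator includes false-positive predictions), so it drags the mean IoU down even though the class never occurs in the ground truth.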

Framework versions

  • Transformers 4.42.3
  • Pytorch 2.3.1+cu121
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 3.72M parameters (Safetensors, F32)
