upernet-convnext-tiny-segments-GFB

This model is a fine-tuned version of openmmlab/upernet-convnext-tiny on the segments/GFB dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7882
  • Mean IoU: 0.7125
  • Mean Accuracy: 0.8017
  • Overall Accuracy: 0.9250
  • Accuracy Unlabeled: 0.9689
  • Accuracy Gbm: 0.8241
  • Accuracy Podo: 0.7558
  • Accuracy Endo: 0.6579
  • IoU Unlabeled: 0.9223
  • IoU Gbm: 0.7200
  • IoU Podo: 0.6457
  • IoU Endo: 0.5619
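For orientation, here is a minimal inference sketch using the Transformers UperNet classes. The repository id matches this model; the input file example.png is a hypothetical placeholder:

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, UperNetForSemanticSegmentation

repo = "luoyun75579/upernet-convnext-tiny-segments-GFB"
processor = AutoImageProcessor.from_pretrained(repo)
model = UperNetForSemanticSegmentation.from_pretrained(repo)
model.eval()

image = Image.open("example.png").convert("RGB")  # placeholder input image
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Upsample the logits back to the input resolution and take the
# per-pixel argmax to get a (height, width) map of class indices.
seg_map = processor.post_process_semantic_segmentation(
    outputs, target_sizes=[image.size[::-1]]
)[0]
```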

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0005
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: AdamW (torch fused, OptimizerNames.ADAMW_TORCH_FUSED) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: cosine
  • lr_scheduler_warmup_steps: 10
  • num_epochs: 50
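For reference, a sketch of how the list above maps onto Transformers TrainingArguments; the output directory is a placeholder, and the model/dataset/Trainer wiring is omitted:

```python
from transformers import TrainingArguments

# Values mirror the hyperparameter list above; "finetune-output" is a placeholder.
training_args = TrainingArguments(
    output_dir="finetune-output",
    learning_rate=5e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch_fused",   # OptimizerNames.ADAMW_TORCH_FUSED
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    warmup_steps=10,
    num_train_epochs=50,
)
```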

Training results

| Training Loss | Epoch | Step | Validation Loss | Mean IoU | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Gbm | Accuracy Podo | Accuracy Endo | IoU Unlabeled | IoU Gbm | IoU Podo | IoU Endo |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| 0.3378 | 1.0989 | 100 | 0.3889 | 0.6226 | 0.7114 | 0.9018 | 0.9706 | 0.6929 | 0.7052 | 0.4767 | 0.9011 | 0.6333 | 0.5622 | 0.3936 |
| 0.1929 | 2.1978 | 200 | 0.3725 | 0.6386 | 0.7142 | 0.9080 | 0.9753 | 0.7590 | 0.6493 | 0.4734 | 0.9058 | 0.6620 | 0.5665 | 0.4202 |
| 0.211 | 3.2967 | 300 | 0.3448 | 0.6794 | 0.7742 | 0.9157 | 0.9732 | 0.7392 | 0.7140 | 0.6704 | 0.9153 | 0.6771 | 0.5943 | 0.5310 |
| 0.1732 | 4.3956 | 400 | 0.3507 | 0.6862 | 0.8303 | 0.9114 | 0.9322 | 0.9151 | 0.8061 | 0.6677 | 0.9085 | 0.6941 | 0.6330 | 0.5094 |
| 0.1567 | 5.4945 | 500 | 0.3476 | 0.7074 | 0.8276 | 0.9220 | 0.9573 | 0.8203 | 0.8055 | 0.7274 | 0.9211 | 0.7174 | 0.6420 | 0.5491 |
| 0.148 | 6.5934 | 600 | 0.3502 | 0.7144 | 0.8109 | 0.9260 | 0.9657 | 0.8467 | 0.7628 | 0.6683 | 0.9245 | 0.7288 | 0.6449 | 0.5595 |
| 0.1983 | 7.6923 | 700 | 0.3575 | 0.7204 | 0.8191 | 0.9270 | 0.9651 | 0.8479 | 0.7686 | 0.6948 | 0.9248 | 0.7348 | 0.6498 | 0.5723 |
| 0.1115 | 8.7912 | 800 | 0.4055 | 0.7153 | 0.8242 | 0.9248 | 0.9591 | 0.8677 | 0.7683 | 0.7017 | 0.9233 | 0.7252 | 0.6469 | 0.5656 |
| 0.0935 | 9.8901 | 900 | 0.3961 | 0.7268 | 0.8366 | 0.9275 | 0.9613 | 0.8498 | 0.7884 | 0.7471 | 0.9255 | 0.7353 | 0.6576 | 0.5887 |
| 0.1255 | 10.9890 | 1000 | 0.4276 | 0.7251 | 0.8230 | 0.9280 | 0.9660 | 0.8403 | 0.7779 | 0.7078 | 0.9256 | 0.7352 | 0.6567 | 0.5830 |
| 0.1118 | 12.0879 | 1100 | 0.4431 | 0.7234 | 0.8176 | 0.9282 | 0.9653 | 0.8538 | 0.7817 | 0.6697 | 0.9259 | 0.7353 | 0.6600 | 0.5722 |
| 0.0811 | 13.1868 | 1200 | 0.4577 | 0.7245 | 0.8213 | 0.9281 | 0.9647 | 0.8568 | 0.7774 | 0.6863 | 0.9258 | 0.7373 | 0.6566 | 0.5783 |
| 0.0783 | 14.2857 | 1300 | 0.5041 | 0.7215 | 0.8105 | 0.9278 | 0.9688 | 0.8259 | 0.7848 | 0.6624 | 0.9250 | 0.7308 | 0.6591 | 0.5713 |
| 0.0984 | 15.3846 | 1400 | 0.5062 | 0.7261 | 0.8201 | 0.9288 | 0.9668 | 0.8508 | 0.7722 | 0.6909 | 0.9266 | 0.7359 | 0.6590 | 0.5828 |
| 0.0755 | 16.4835 | 1500 | 0.5262 | 0.7257 | 0.8279 | 0.9276 | 0.9628 | 0.8513 | 0.7876 | 0.7098 | 0.9253 | 0.7320 | 0.6609 | 0.5848 |
| 0.0824 | 17.5824 | 1600 | 0.5320 | 0.7196 | 0.8075 | 0.9276 | 0.9680 | 0.8450 | 0.7707 | 0.6461 | 0.9250 | 0.7322 | 0.6556 | 0.5659 |
| 0.0657 | 18.6813 | 1700 | 0.5438 | 0.7216 | 0.8137 | 0.9277 | 0.9682 | 0.8346 | 0.7714 | 0.6806 | 0.9253 | 0.7305 | 0.6563 | 0.5744 |
| 0.0808 | 19.7802 | 1800 | 0.5700 | 0.7205 | 0.8109 | 0.9273 | 0.9688 | 0.8262 | 0.7749 | 0.6736 | 0.9245 | 0.7297 | 0.6554 | 0.5725 |
| 0.0602 | 20.8791 | 1900 | 0.5682 | 0.7237 | 0.8168 | 0.9278 | 0.9673 | 0.8404 | 0.7699 | 0.6894 | 0.9251 | 0.7325 | 0.6563 | 0.5809 |
| 0.0621 | 21.9780 | 2000 | 0.5880 | 0.7230 | 0.8190 | 0.9273 | 0.9668 | 0.8393 | 0.7647 | 0.7053 | 0.9248 | 0.7297 | 0.6538 | 0.5838 |
| 0.0691 | 23.0769 | 2100 | 0.6024 | 0.7189 | 0.8108 | 0.9267 | 0.9690 | 0.8110 | 0.7842 | 0.6790 | 0.9243 | 0.7248 | 0.6538 | 0.5725 |
| 0.0628 | 24.1758 | 2200 | 0.6413 | 0.7197 | 0.8077 | 0.9272 | 0.9688 | 0.8354 | 0.7672 | 0.6595 | 0.9243 | 0.7294 | 0.6547 | 0.5705 |
| 0.072 | 25.2747 | 2300 | 0.6427 | 0.7162 | 0.8039 | 0.9264 | 0.9694 | 0.8201 | 0.7741 | 0.6521 | 0.9235 | 0.7255 | 0.6531 | 0.5626 |
| 0.0699 | 26.3736 | 2400 | 0.6491 | 0.7185 | 0.8066 | 0.9270 | 0.9691 | 0.8330 | 0.7650 | 0.6595 | 0.9243 | 0.7280 | 0.6527 | 0.5691 |
| 0.0666 | 27.4725 | 2500 | 0.6501 | 0.7213 | 0.8158 | 0.9268 | 0.9669 | 0.8289 | 0.7778 | 0.6897 | 0.9241 | 0.7276 | 0.6548 | 0.5788 |
| 0.0512 | 28.5714 | 2600 | 0.6661 | 0.7182 | 0.8072 | 0.9269 | 0.9685 | 0.8448 | 0.7528 | 0.6628 | 0.9244 | 0.7297 | 0.6496 | 0.5693 |
| 0.0681 | 29.6703 | 2700 | 0.6934 | 0.7184 | 0.8091 | 0.9267 | 0.9677 | 0.8404 | 0.7619 | 0.6664 | 0.9240 | 0.7284 | 0.6519 | 0.5694 |
| 0.0518 | 30.7692 | 2800 | 0.7065 | 0.7183 | 0.8088 | 0.9265 | 0.9692 | 0.8238 | 0.7631 | 0.6789 | 0.9237 | 0.7250 | 0.6514 | 0.5730 |
| 0.0624 | 31.8681 | 2900 | 0.7146 | 0.7141 | 0.8005 | 0.9258 | 0.9705 | 0.8284 | 0.7439 | 0.6590 | 0.9230 | 0.7230 | 0.6446 | 0.5658 |
| 0.0518 | 32.9670 | 3000 | 0.6993 | 0.7171 | 0.8096 | 0.9261 | 0.9675 | 0.8351 | 0.7605 | 0.6754 | 0.9234 | 0.7252 | 0.6497 | 0.5703 |
| 0.0658 | 34.0659 | 3100 | 0.7306 | 0.7162 | 0.8066 | 0.9261 | 0.9686 | 0.8282 | 0.7623 | 0.6674 | 0.9234 | 0.7243 | 0.6502 | 0.5669 |
| 0.0554 | 35.1648 | 3200 | 0.7374 | 0.7144 | 0.8021 | 0.9257 | 0.9696 | 0.8256 | 0.7557 | 0.6577 | 0.9229 | 0.7226 | 0.6475 | 0.5645 |
| 0.0623 | 36.2637 | 3300 | 0.7468 | 0.7151 | 0.8054 | 0.9256 | 0.9688 | 0.8258 | 0.7566 | 0.6705 | 0.9229 | 0.7220 | 0.6475 | 0.5678 |
| 0.057 | 37.3626 | 3400 | 0.7597 | 0.7138 | 0.8034 | 0.9253 | 0.9692 | 0.8206 | 0.7584 | 0.6652 | 0.9225 | 0.7207 | 0.6472 | 0.5647 |
| 0.0644 | 38.4615 | 3500 | 0.7481 | 0.7146 | 0.8049 | 0.9256 | 0.9681 | 0.8303 | 0.7598 | 0.6614 | 0.9228 | 0.7226 | 0.6482 | 0.5647 |
| 0.0593 | 39.5604 | 3600 | 0.7704 | 0.7142 | 0.8040 | 0.9254 | 0.9683 | 0.8289 | 0.7597 | 0.6591 | 0.9227 | 0.7221 | 0.6478 | 0.5640 |
| 0.0519 | 40.6593 | 3700 | 0.7748 | 0.7132 | 0.8020 | 0.9254 | 0.9689 | 0.8275 | 0.7565 | 0.6550 | 0.9226 | 0.7218 | 0.6468 | 0.5618 |
| 0.045 | 41.7582 | 3800 | 0.7814 | 0.7133 | 0.8030 | 0.9252 | 0.9687 | 0.8256 | 0.7577 | 0.6601 | 0.9225 | 0.7212 | 0.6467 | 0.5630 |
| 0.045 | 42.8571 | 3900 | 0.7831 | 0.7125 | 0.8007 | 0.9252 | 0.9693 | 0.8237 | 0.7553 | 0.6546 | 0.9224 | 0.7202 | 0.6460 | 0.5614 |
| 0.0523 | 43.9560 | 4000 | 0.7910 | 0.7125 | 0.8014 | 0.9251 | 0.9690 | 0.8241 | 0.7563 | 0.6564 | 0.9223 | 0.7202 | 0.6460 | 0.5616 |
| 0.0642 | 45.0549 | 4100 | 0.7819 | 0.7129 | 0.8019 | 0.9252 | 0.9689 | 0.8253 | 0.7560 | 0.6573 | 0.9224 | 0.7206 | 0.6462 | 0.5623 |
| 0.0535 | 46.1538 | 4200 | 0.7885 | 0.7125 | 0.8013 | 0.9251 | 0.9690 | 0.8236 | 0.7570 | 0.6554 | 0.9223 | 0.7202 | 0.6461 | 0.5614 |
| 0.0474 | 47.2527 | 4300 | 0.7904 | 0.7123 | 0.8013 | 0.9250 | 0.9690 | 0.8231 | 0.7564 | 0.6566 | 0.9222 | 0.7198 | 0.6457 | 0.5614 |
| 0.0494 | 48.3516 | 4400 | 0.7834 | 0.7124 | 0.8015 | 0.9250 | 0.9689 | 0.8239 | 0.7561 | 0.6570 | 0.9223 | 0.7199 | 0.6458 | 0.5615 |
| 0.0663 | 49.4505 | 4500 | 0.7882 | 0.7125 | 0.8017 | 0.9250 | 0.9689 | 0.8241 | 0.7558 | 0.6579 | 0.9223 | 0.7200 | 0.6457 | 0.5619 |
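The metrics above are the standard semantic-segmentation IoU/accuracy family. A minimal sketch of computing them with the evaluate library's mean_iou metric follows; the label arrays are illustrative placeholders, and the four-label assumption mirrors the classes reported in this card:

```python
import numpy as np
import evaluate

mean_iou = evaluate.load("mean_iou")

# Placeholder predicted and reference label maps; real use would pass the
# argmaxed model output and the ground-truth mask for each validation image.
pred = np.zeros((4, 4), dtype=np.int64)
ref = np.zeros((4, 4), dtype=np.int64)
ref[0, 0] = 1

results = mean_iou.compute(
    predictions=[pred],
    references=[ref],
    num_labels=4,       # assumed: unlabeled, Gbm, Podo, Endo
    ignore_index=255,
    reduce_labels=False,
)
print(results["mean_iou"], results["overall_accuracy"], results["per_category_iou"])
```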

Framework versions

  • Transformers 4.57.1
  • Pytorch 2.9.1+cu130
  • Datasets 4.4.1
  • Tokenizers 0.22.1