license: apache-2.0
tags: vision, depth-estimation, generated_from_trainer
model-index name: glpn-kitti-finetuned-diode

glpn-kitti-finetuned-diode

This model is a fine-tuned version of vinvino02/glpn-kitti on the diode-subset dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5845
  • RMSE: 0.6175
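
For convenience, here is a minimal inference sketch using the GLPN classes in Transformers. It is a generic usage example rather than the exact evaluation code of this repo, and the image path example.jpg is a placeholder.

```python
import torch
from PIL import Image
from transformers import GLPNFeatureExtractor, GLPNForDepthEstimation

repo_id = "sayakpaul/glpn-kitti-finetuned-diode"
feature_extractor = GLPNFeatureExtractor.from_pretrained(repo_id)
model = GLPNForDepthEstimation.from_pretrained(repo_id)
model.eval()

# "example.jpg" is a placeholder for any RGB image.
image = Image.open("example.jpg").convert("RGB")
inputs = feature_extractor(images=image, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Tensor of shape (batch, height, width) holding the predicted depth map.
predicted_depth = outputs.predicted_depth
```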

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch of the equivalent TrainingArguments configuration follows the list):

  • learning_rate: 1e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 10
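
As a rough sketch, and assuming the standard Trainer API, these settings map onto a TrainingArguments configuration along the following lines (output_dir is illustrative; the Adam betas and epsilon are the library defaults):

```python
from transformers import TrainingArguments

# Illustrative mapping of the hyperparameters listed above; output_dir is a
# placeholder, not necessarily the directory used for this run.
training_args = TrainingArguments(
    output_dir="glpn-kitti-finetuned-diode",
    learning_rate=1e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    adam_beta1=0.9,       # Adam betas/epsilon match the defaults stated above
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=10,
)
```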

Training results

| Training Loss | Epoch | Step | Validation Loss | RMSE   |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| No log        | 1.0   | 10   | 0.8001          | 0.8455 |
| 0.8187        | 2.0   | 20   | 0.7558          | 0.7907 |
| 0.8187        | 3.0   | 30   | 0.7391          | 0.7379 |
| 0.7618        | 4.0   | 40   | 0.6937          | 0.6895 |
| 0.7618        | 5.0   | 50   | 0.6954          | nan    |
| 0.6917        | 6.0   | 60   | 0.6834          | nan    |
| 0.6917        | 7.0   | 70   | 0.6719          | nan    |
| 0.6625        | 8.0   | 80   | 0.6634          | nan    |
| 0.6625        | 9.0   | 90   | 0.6592          | nan    |
| 0.6553        | 10.0  | 100  | 0.6579          | nan    |
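
The RMSE column is the root-mean-squared error of the predicted depth against the ground-truth depth on the evaluation set. A minimal sketch of such a metric is below; the valid-pixel mask convention is an assumption for illustration, not taken from this repo's training script.

```python
import torch

def depth_rmse(pred: torch.Tensor, target: torch.Tensor, valid_mask: torch.Tensor) -> torch.Tensor:
    """RMSE over pixels where the ground-truth depth is valid (valid_mask is boolean)."""
    diff = pred[valid_mask] - target[valid_mask]
    return torch.sqrt(torch.mean(diff ** 2))
```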

Framework versions

  • Transformers 4.24.0
  • PyTorch 1.12.1+cu113
  • Tokenizers 0.13.2