---
library_name: transformers
license: apache-2.0
base_model: google/vit-base-patch16-224-in21k
tags:
  - generated_from_trainer
datasets:
  - webdataset
metrics:
  - accuracy
  - f1
  - precision
  - recall
model-index:
  - name: vit-base-patch16-224-in21k-v2025-1-31
    results:
      - task:
          name: Image Classification
          type: image-classification
        dataset:
          name: webdataset
          type: webdataset
          config: default
          split: train
          args: default
        metrics:
          - name: Accuracy
            type: accuracy
            value: 0.8972972972972973
          - name: F1
            type: f1
            value: 0.7667958656330749
          - name: Precision
            type: precision
            value: 0.7866136514247847
          - name: Recall
            type: recall
            value: 0.7479521109010712
---

# vit-base-patch16-224-in21k-v2025-1-31

This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the webdataset dataset. It achieves the following results on the evaluation set (a short inference example follows the list):

- Loss: 0.3391
- Accuracy: 0.8973
- F1: 0.7668
- Precision: 0.7866
- Recall: 0.7480
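
For quick use, here is a minimal inference sketch, assuming the checkpoint is pulled from this repo on the Hub (nemik/frost-vision-v2-google__vit-base-patch16-224-in21k-v2025-1-31); the image path is a placeholder:

```python
from PIL import Image
from transformers import pipeline

# Load the fine-tuned checkpoint from the Hub; the repo id is this model's
# Hub page, while "frost.jpg" is a placeholder input path.
classifier = pipeline(
    "image-classification",
    model="nemik/frost-vision-v2-google__vit-base-patch16-224-in21k-v2025-1-31",
)

image = Image.open("frost.jpg").convert("RGB")
for prediction in classifier(image):
    print(f"{prediction['label']}: {prediction['score']:.4f}")
```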

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

- learning_rate: 0.0002
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 30
- mixed_precision_training: Native AMP
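
These values map directly onto `TrainingArguments`; the sketch below reconstructs them (output_dir and anything not listed above, such as logging/saving cadence and dataset wiring, are assumptions):

```python
from transformers import TrainingArguments

# Reconstruction of the listed hyperparameters; arguments not shown on the
# card (output_dir, save/eval strategy, dataset wiring) are assumptions.
training_args = TrainingArguments(
    output_dir="vit-base-patch16-224-in21k-v2025-1-31",
    learning_rate=2e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=30,
    fp16=True,  # "Native AMP" mixed-precision training
)
```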

### Training results

| Training Loss | Epoch   | Step | Validation Loss | Accuracy | F1     | Precision | Recall |
|:-------------:|:-------:|:----:|:---------------:|:--------:|:------:|:---------:|:------:|
| 0.4871        | 0.5682  | 100  | 0.4866          | 0.7903   | 0.1400 | 0.9449    | 0.0756 |
| 0.4151        | 1.1364  | 200  | 0.4007          | 0.8361   | 0.4540 | 0.9159    | 0.3018 |
| 0.3517        | 1.7045  | 300  | 0.3460          | 0.8671   | 0.6481 | 0.8060    | 0.5419 |
| 0.3337        | 2.2727  | 400  | 0.3202          | 0.8777   | 0.7034 | 0.7768    | 0.6427 |
| 0.3128        | 2.8409  | 500  | 0.2995          | 0.8774   | 0.6943 | 0.7940    | 0.6169 |
| 0.3199        | 3.4091  | 600  | 0.2980          | 0.8771   | 0.6960 | 0.7880    | 0.6232 |
| 0.3094        | 3.9773  | 700  | 0.3051          | 0.8764   | 0.7031 | 0.7679    | 0.6484 |
| 0.3068        | 4.5455  | 800  | 0.2753          | 0.8900   | 0.7409 | 0.7915    | 0.6963 |
| 0.3003        | 5.1136  | 900  | 0.2699          | 0.8890   | 0.7351 | 0.7973    | 0.6818 |
| 0.3012        | 5.6818  | 1000 | 0.2860          | 0.8799   | 0.7256 | 0.7495    | 0.7032 |
| 0.267         | 6.25    | 1100 | 0.2848          | 0.8832   | 0.7216 | 0.7812    | 0.6704 |
| 0.2364        | 6.8182  | 1200 | 0.2608          | 0.8896   | 0.7399 | 0.7903    | 0.6957 |
| 0.2401        | 7.3864  | 1300 | 0.2695          | 0.8885   | 0.7406 | 0.7798    | 0.7051 |
| 0.219         | 7.9545  | 1400 | 0.2599          | 0.8909   | 0.7413 | 0.7975    | 0.6925 |
| 0.1985        | 8.5227  | 1500 | 0.2668          | 0.8898   | 0.7421 | 0.7863    | 0.7026 |
| 0.1986        | 9.0909  | 1600 | 0.2762          | 0.8851   | 0.7316 | 0.7737    | 0.6938 |
| 0.1988        | 9.6591  | 1700 | 0.2765          | 0.8862   | 0.7404 | 0.7632    | 0.7190 |
| 0.167         | 10.2273 | 1800 | 0.2630          | 0.8940   | 0.7594 | 0.7788    | 0.7410 |
| 0.207         | 10.7955 | 1900 | 0.2637          | 0.8923   | 0.7557 | 0.7745    | 0.7379 |
| 0.1811        | 11.3636 | 2000 | 0.2568          | 0.8946   | 0.7609 | 0.7798    | 0.7429 |
| 0.171         | 11.9318 | 2100 | 0.2607          | 0.8935   | 0.7527 | 0.7906    | 0.7183 |
| 0.1571        | 12.5    | 2200 | 0.2552          | 0.8972   | 0.7708 | 0.7755    | 0.7662 |
| 0.1234        | 13.0682 | 2300 | 0.2676          | 0.8993   | 0.7694 | 0.7964    | 0.7442 |
| 0.1299        | 13.6364 | 2400 | 0.2683          | 0.8970   | 0.7655 | 0.7875    | 0.7448 |
| 0.1335        | 14.2045 | 2500 | 0.2823          | 0.8949   | 0.7559 | 0.7944    | 0.7209 |
| 0.1235        | 14.7727 | 2600 | 0.2753          | 0.8976   | 0.7671 | 0.7880    | 0.7473 |
| 0.1163        | 15.3409 | 2700 | 0.2884          | 0.8962   | 0.7644 | 0.7836    | 0.7461 |
| 0.1111        | 15.9091 | 2800 | 0.2770          | 0.8973   | 0.7675 | 0.7847    | 0.7511 |
| 0.1128        | 16.4773 | 2900 | 0.2773          | 0.8987   | 0.7722 | 0.7843    | 0.7606 |
| 0.0982        | 17.0455 | 3000 | 0.2754          | 0.8993   | 0.7716 | 0.7905    | 0.7536 |
| 0.1115        | 17.6136 | 3100 | 0.2956          | 0.8972   | 0.7640 | 0.7927    | 0.7372 |
| 0.07          | 18.1818 | 3200 | 0.2961          | 0.8977   | 0.7683 | 0.7863    | 0.7511 |
| 0.0993        | 18.75   | 3300 | 0.3041          | 0.8959   | 0.7639 | 0.7826    | 0.7461 |
| 0.0779        | 19.3182 | 3400 | 0.3012          | 0.9      | 0.7745 | 0.7889    | 0.7606 |
| 0.0691        | 19.8864 | 3500 | 0.3075          | 0.8964   | 0.7674 | 0.7784    | 0.7568 |
| 0.063         | 20.4545 | 3600 | 0.3271          | 0.8912   | 0.7509 | 0.7770    | 0.7265 |
| 0.0668        | 21.0227 | 3700 | 0.3229          | 0.8952   | 0.7649 | 0.7745    | 0.7555 |
| 0.0573        | 21.5909 | 3800 | 0.3236          | 0.8960   | 0.7626 | 0.7869    | 0.7398 |
| 0.0668        | 22.1591 | 3900 | 0.3251          | 0.8972   | 0.7629 | 0.7955    | 0.7328 |
| 0.062         | 22.7273 | 4000 | 0.3221          | 0.8987   | 0.7702 | 0.7895    | 0.7517 |
| 0.0647        | 23.2955 | 4100 | 0.3179          | 0.8959   | 0.7663 | 0.7767    | 0.7561 |
| 0.0417        | 23.8636 | 4200 | 0.3323          | 0.8969   | 0.7662 | 0.7847    | 0.7486 |
| 0.0623        | 24.4318 | 4300 | 0.3396          | 0.8945   | 0.7602 | 0.7804    | 0.7410 |
| 0.0361        | 25.0    | 4400 | 0.3418          | 0.8959   | 0.7623 | 0.7863    | 0.7398 |
| 0.0334        | 25.5682 | 4500 | 0.3404          | 0.8984   | 0.7703 | 0.7870    | 0.7543 |
| 0.0326        | 26.1364 | 4600 | 0.3376          | 0.8967   | 0.7676 | 0.7801    | 0.7555 |
| 0.052         | 26.7045 | 4700 | 0.3395          | 0.8972   | 0.7679 | 0.7827    | 0.7536 |
| 0.0341        | 27.2727 | 4800 | 0.3440          | 0.8953   | 0.7638 | 0.7783    | 0.7498 |
| 0.0459        | 27.8409 | 4900 | 0.3406          | 0.8980   | 0.7689 | 0.7869    | 0.7517 |
| 0.0392        | 28.4091 | 5000 | 0.3389          | 0.8977   | 0.7680 | 0.7870    | 0.7498 |
| 0.0407        | 28.9773 | 5100 | 0.3410          | 0.8976   | 0.7677 | 0.7865    | 0.7498 |
| 0.0445        | 29.5455 | 5200 | 0.3395          | 0.8969   | 0.7661 | 0.7851    | 0.7480 |
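
The accuracy, F1, precision, and recall columns are produced by a `compute_metrics` callback passed to the `Trainer`. The card doesn't record how they were computed; the sketch below is one plausible implementation, and the `average="binary"` choice is an assumption (it would be consistent with F1 sitting well below accuracy, as it does here):

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    # Sketch of a Trainer compute_metrics callback; average="binary" is an
    # assumption, not something the card confirms.
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, predictions, average="binary", zero_division=0
    )
    return {
        "accuracy": accuracy_score(labels, predictions),
        "f1": f1,
        "precision": precision,
        "recall": recall,
    }
```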

### Framework versions

- Transformers 4.47.1
- PyTorch 2.5.1+cu124
- Datasets 3.2.0
- Tokenizers 0.21.0
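
When reproducing results, a quick check against the pins above can flag environment drift (a convenience sketch, not part of the original card):

```python
import datasets
import tokenizers
import torch
import transformers

# Versions recorded on this card; warn if the local environment differs.
expected = {
    "transformers": "4.47.1",
    "torch": "2.5.1+cu124",
    "datasets": "3.2.0",
    "tokenizers": "0.21.0",
}
found = {
    "transformers": transformers.__version__,
    "torch": torch.__version__,
    "datasets": datasets.__version__,
    "tokenizers": tokenizers.__version__,
}
for name, pinned in expected.items():
    if found[name] != pinned:
        print(f"warning: {name} {found[name]} differs from pinned {pinned}")
```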