# plant-seedlings-freeze-0-6-aug-3
This model is a fine-tuned version of [google/vit-large-patch16-224-in21k](https://huggingface.co/google/vit-large-patch16-224-in21k) on the imagefolder dataset. It achieves the following results on the evaluation set:
- Loss: 0.1722
- Accuracy: 0.9425
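
For reference, here is a minimal inference sketch using the `transformers` image-classification pipeline. The repository id below is a placeholder for wherever this checkpoint is hosted, and `seedling.jpg` is an illustrative input:

```python
from transformers import pipeline
from PIL import Image

# Placeholder repo id -- substitute the actual Hub path of this checkpoint.
classifier = pipeline(
    "image-classification",
    model="your-username/plant-seedlings-freeze-0-6-aug-3",
)

image = Image.open("seedling.jpg")  # illustrative input image
predictions = classifier(image)
print(predictions)  # list of {"label": ..., "score": ...} dicts, best first
```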
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a configuration sketch follows the list):
- learning_rate: 0.0002
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 11
- mixed_precision_training: Native AMP
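
The exact training script is not included with this card, so the following is a minimal sketch of how the settings above map onto `TrainingArguments`/`Trainer` for a standard image-classification setup. The `freeze-0-6` in the model name suggests the first seven encoder layers were frozen, and the freezing loop below encodes that assumption; the `data_dir` path, the presence of train/validation splits, the number of classes, and the `aug-3` augmentation pipeline (not reproduced here) are likewise assumptions:

```python
import numpy as np
import torch
import evaluate
from datasets import load_dataset
from transformers import (
    AutoImageProcessor,
    AutoModelForImageClassification,
    Trainer,
    TrainingArguments,
)

base_checkpoint = "google/vit-large-patch16-224-in21k"
processor = AutoImageProcessor.from_pretrained(base_checkpoint)

# Placeholder path; the card only says "the imagefolder dataset".
# Assumes the folder provides train and validation splits.
ds = load_dataset("imagefolder", data_dir="path/to/plant-seedlings")
labels = ds["train"].features["label"].names

def transform(batch):
    # Resize and normalize each image to the ViT input format.
    batch["pixel_values"] = [
        processor(img.convert("RGB"), return_tensors="pt")["pixel_values"][0]
        for img in batch["image"]
    ]
    return batch

ds = ds.with_transform(transform)

def collate_fn(examples):
    return {
        "pixel_values": torch.stack([ex["pixel_values"] for ex in examples]),
        "labels": torch.tensor([ex["label"] for ex in examples]),
    }

accuracy = evaluate.load("accuracy")

def compute_metrics(eval_pred):
    logits, references = eval_pred
    return accuracy.compute(
        predictions=np.argmax(logits, axis=-1), references=references
    )

model = AutoModelForImageClassification.from_pretrained(
    base_checkpoint, num_labels=len(labels)
)

# Assumption from the model name ("freeze-0-6"): freeze ViT encoder layers 0-6.
for i, layer in enumerate(model.vit.encoder.layer):
    if i <= 6:
        for param in layer.parameters():
            param.requires_grad = False

training_args = TrainingArguments(
    output_dir="plant-seedlings-freeze-0-6-aug-3",
    learning_rate=2e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=11,
    fp16=True,                    # Native AMP
    evaluation_strategy="steps",
    eval_steps=100,               # matches the 100-step eval cadence in the results table
    remove_unused_columns=False,  # keep the "image" column for the on-the-fly transform
)

trainer = Trainer(
    model=model,
    args=training_args,
    data_collator=collate_fn,
    train_dataset=ds["train"],
    eval_dataset=ds["validation"],
    compute_metrics=compute_metrics,
)
trainer.train()
```

No explicit optimizer is passed because `Trainer`'s default (AdamW with betas=(0.9, 0.999) and epsilon=1e-08) already matches the values listed above.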
### Training results

Training Loss | Epoch | Step | Validation Loss | Accuracy |
---|---|---|---|---|
0.799 | 0.2 | 100 | 0.5965 | 0.7976 |
0.7161 | 0.39 | 200 | 0.7085 | 0.7633 |
0.7159 | 0.59 | 300 | 0.5736 | 0.8094 |
0.5998 | 0.79 | 400 | 0.5116 | 0.8202 |
0.4901 | 0.98 | 500 | 0.4973 | 0.8468 |
0.4669 | 1.18 | 600 | 0.5753 | 0.8021 |
0.418 | 1.38 | 700 | 0.5035 | 0.8310 |
0.2675 | 1.57 | 800 | 0.4583 | 0.8502 |
0.3409 | 1.77 | 900 | 0.4194 | 0.8600 |
0.5934 | 1.96 | 1000 | 0.4950 | 0.8320 |
0.3727 | 2.16 | 1100 | 0.5086 | 0.8222 |
0.2635 | 2.36 | 1200 | 0.4315 | 0.8551 |
0.3595 | 2.55 | 1300 | 0.3716 | 0.8728 |
0.4091 | 2.75 | 1400 | 0.3692 | 0.8787 |
0.3579 | 2.95 | 1500 | 0.3331 | 0.8865 |
0.3034 | 3.14 | 1600 | 0.3121 | 0.8895 |
0.2943 | 3.34 | 1700 | 0.3551 | 0.8816 |
0.3346 | 3.54 | 1800 | 0.4268 | 0.8625 |
0.3261 | 3.73 | 1900 | 0.3222 | 0.8880 |
0.3287 | 3.93 | 2000 | 0.3072 | 0.8964 |
0.2753 | 4.13 | 2100 | 0.3209 | 0.8910 |
0.1975 | 4.32 | 2200 | 0.3564 | 0.8757 |
0.2291 | 4.52 | 2300 | 0.3057 | 0.9003 |
0.232 | 4.72 | 2400 | 0.3124 | 0.8929 |
0.2834 | 4.91 | 2500 | 0.2631 | 0.9165 |
0.3484 | 5.11 | 2600 | 0.2987 | 0.9008 |
0.2019 | 5.3 | 2700 | 0.2976 | 0.8998 |
0.2179 | 5.5 | 2800 | 0.2596 | 0.9082 |
0.2068 | 5.7 | 2900 | 0.2852 | 0.9121 |
0.2791 | 5.89 | 3000 | 0.2523 | 0.9145 |
0.1888 | 6.09 | 3100 | 0.2554 | 0.9145 |
0.1909 | 6.29 | 3200 | 0.2623 | 0.9170 |
0.1677 | 6.48 | 3300 | 0.2885 | 0.9091 |
0.1832 | 6.68 | 3400 | 0.2345 | 0.9190 |
0.1088 | 6.88 | 3500 | 0.2448 | 0.9214 |
0.2065 | 7.07 | 3600 | 0.2341 | 0.9170 |
0.2561 | 7.27 | 3700 | 0.2253 | 0.9209 |
0.1902 | 7.47 | 3800 | 0.2196 | 0.9244 |
0.236 | 7.66 | 3900 | 0.2217 | 0.9293 |
0.2483 | 7.86 | 4000 | 0.2314 | 0.9150 |
0.1761 | 8.06 | 4100 | 0.2327 | 0.9268 |
0.2349 | 8.25 | 4200 | 0.2408 | 0.9258 |
0.0809 | 8.45 | 4300 | 0.1858 | 0.9361 |
0.1723 | 8.64 | 4400 | 0.2643 | 0.9204 |
0.1762 | 8.84 | 4500 | 0.2194 | 0.9278 |
0.1438 | 9.04 | 4600 | 0.1897 | 0.9357 |
0.0805 | 9.23 | 4700 | 0.2169 | 0.9322 |
0.1513 | 9.43 | 4800 | 0.1635 | 0.9460 |
0.1356 | 9.63 | 4900 | 0.1940 | 0.9347 |
0.0737 | 9.82 | 5000 | 0.2014 | 0.9342 |
0.0908 | 10.02 | 5100 | 0.1707 | 0.9460 |
0.1082 | 10.22 | 5200 | 0.1620 | 0.9494 |
0.0773 | 10.41 | 5300 | 0.2009 | 0.9332 |
0.1614 | 10.61 | 5400 | 0.1735 | 0.9425 |
0.0983 | 10.81 | 5500 | 0.1722 | 0.9425 |
### Framework versions

- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.11.0
- Tokenizers 0.13.3