---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- f1
base_model: facebook/convnext-base-224-22k
model-index:
- name: flyswot_iiif
  results: []
---

# flyswot_iiif

This model is a fine-tuned version of [facebook/convnext-base-224-22k](https://huggingface.co/facebook/convnext-base-224-22k) on an unspecified dataset. It achieves the following results on the evaluation set (a usage sketch follows the list):

- Loss: 6.1280
- F1: 0.0034
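
As a minimal usage sketch, assuming the checkpoint is hosted at `davanstrien/flyswot_iiif` on the Hub (a hypothetical repo id) and that the standard `transformers` image-classification API applies; the label set is not documented in this card:

```python
from PIL import Image
from transformers import AutoFeatureExtractor, AutoModelForImageClassification

checkpoint = "davanstrien/flyswot_iiif"  # hypothetical repo id; adjust as needed
extractor = AutoFeatureExtractor.from_pretrained(checkpoint)
model = AutoModelForImageClassification.from_pretrained(checkpoint)

image = Image.open("page.jpg").convert("RGB")  # hypothetical input image
inputs = extractor(images=image, return_tensors="pt")
logits = model(**inputs).logits

# Map the highest-scoring logit back to its label name.
predicted = logits.argmax(-1).item()
print(model.config.id2label[predicted])
```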

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the illustrative `TrainingArguments` sketch after the list):

- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 666
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 8
- mixed_precision_training: Native AMP
- label_smoothing_factor: 0.1
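
A sketch of how these settings could map onto `transformers.TrainingArguments`; this is an illustration, not the original training script (which is not included in this card). The Adam betas and epsilon above are the `Trainer` defaults, and the 500-step evaluation cadence is inferred from the results table below:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="flyswot_iiif",  # hypothetical output directory
    learning_rate=2e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=666,
    lr_scheduler_type="linear",
    num_train_epochs=8,
    fp16=True,  # Native AMP mixed-precision training
    label_smoothing_factor=0.1,
    evaluation_strategy="steps",
    eval_steps=500,  # matches the evaluation cadence in the results table
)
```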

### Training results

| Training Loss | Epoch | Step  | Validation Loss | F1     |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 8.5184        | 0.26  | 500   | 7.9280          | 0.0005 |
| 7.7409        | 0.52  | 1000  | 7.5824          | 0.0007 |
| 7.4649        | 0.78  | 1500  | 7.3841          | 0.0010 |
| 7.3285        | 1.04  | 2000  | 7.2652          | 0.0012 |
| 7.1404        | 1.3   | 2500  | 7.1559          | 0.0014 |
| 7.0322        | 1.56  | 3000  | 7.0551          | 0.0016 |
| 6.9197        | 1.82  | 3500  | 6.9449          | 0.0019 |
| 6.7822        | 2.09  | 4000  | 6.8773          | 0.0018 |
| 6.6506        | 2.35  | 4500  | 6.7980          | 0.0020 |
| 6.5811        | 2.61  | 5000  | 6.7382          | 0.0022 |
| 6.538         | 2.87  | 5500  | 6.6582          | 0.0022 |
| 6.4136        | 3.13  | 6000  | 6.6013          | 0.0024 |
| 6.3325        | 3.39  | 6500  | 6.5369          | 0.0024 |
| 6.2566        | 3.65  | 7000  | 6.4875          | 0.0025 |
| 6.2285        | 3.91  | 7500  | 6.4342          | 0.0027 |
| 6.1281        | 4.17  | 8000  | 6.4066          | 0.0027 |
| 6.0762        | 4.43  | 8500  | 6.3674          | 0.0027 |
| 6.0309        | 4.69  | 9000  | 6.3336          | 0.0027 |
| 6.0123        | 4.95  | 9500  | 6.2932          | 0.0030 |
| 5.9089        | 5.21  | 10000 | 6.2835          | 0.0029 |
| 5.8901        | 5.47  | 10500 | 6.2481          | 0.0030 |
| 5.86          | 5.74  | 11000 | 6.2295          | 0.0030 |
| 5.8586        | 6.0   | 11500 | 6.2068          | 0.0033 |
| 5.7768        | 6.26  | 12000 | 6.1937          | 0.0031 |
| 5.7591        | 6.52  | 12500 | 6.1916          | 0.0032 |
| 5.7443        | 6.78  | 13000 | 6.1579          | 0.0033 |
| 5.7125        | 7.04  | 13500 | 6.1478          | 0.0033 |
| 5.6751        | 7.3   | 14000 | 6.1379          | 0.0035 |
| 5.6648        | 7.56  | 14500 | 6.1304          | 0.0035 |
| 5.6644        | 7.82  | 15000 | 6.1280          | 0.0034 |
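
The F1 values were presumably produced by a `compute_metrics` callback passed to the `Trainer`; a hedged sketch using the `datasets` metric API from the 1.18 series (the averaging mode is an assumption, since the card does not record it):

```python
import numpy as np
from datasets import load_metric

f1_metric = load_metric("f1")

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    # "macro" averaging is an assumption; the card does not state the mode.
    return f1_metric.compute(predictions=predictions, references=labels, average="macro")
```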

### Framework versions

- Transformers 4.17.0.dev0
- Pytorch 1.10.0+cu111
- Datasets 1.18.3
- Tokenizers 0.11.6