Layout-finetuned-fr-model-107instances107-150epochs-5e-05lr-GPU

This model is a fine-tuned version of microsoft/layoutxlm-base on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0000
  • Accuracy: 1.0
  • Learning rate (final): 1e-05

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 4
  • eval_batch_size: 8
  • seed: 42
  • optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: reduce_lr_on_plateau
  • lr_scheduler_warmup_ratio: 0.06
  • num_epochs: 150

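The learning-rate column in the results below drops from 5e-05 to 1e-05 around epoch 78, consistent with the `reduce_lr_on_plateau` scheduler cutting the rate once validation loss stopped improving. A minimal pure-Python sketch of that scheduling logic follows; the factor of 0.2 and patience of 10 are illustrative assumptions chosen to match the logged 5e-05 → 1e-05 drop, not values read from the training config (PyTorch's `ReduceLROnPlateau` defaults to a factor of 0.1):

```python
def plateau_schedule(losses, lr=5e-5, factor=0.2, patience=10, threshold=1e-4):
    """Sketch of reduce-on-plateau: multiply lr by `factor` once the
    monitored loss fails to improve for more than `patience` evaluations.
    Uses an absolute improvement threshold for simplicity; the real
    torch.optim.lr_scheduler.ReduceLROnPlateau also supports relative modes.
    """
    best = float("inf")
    bad_evals = 0
    history = []
    for loss in losses:
        if loss < best - threshold:   # meaningful improvement: reset counter
            best = loss
            bad_evals = 0
        else:                         # plateau: count, and cut lr past patience
            bad_evals += 1
            if bad_evals > patience:
                lr *= factor
                bad_evals = 0
        history.append(lr)
    return history

# A loss curve that improves, then flatlines: the rate is cut exactly once.
lrs = plateau_schedule([0.15, 0.05, 0.01] + [0.01] * 12)
```

With a flat tail of twelve evaluations and a patience of ten, the rate ends at 5e-05 × 0.2 = 1e-05, matching the 1e-05 tail of the results table.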
Training results

| Training Loss | Epoch    | Step | Validation Loss | Accuracy | Learning Rate |
|---------------|----------|------|-----------------|----------|---------------|
| 0.0236        | 3.7037   | 100  | 0.0071          | 0.9953   | 5e-05         |
| 0.0547        | 7.4074   | 200  | 0.1518          | 0.9813   | 5e-05         |
| 0.1261        | 11.1111  | 300  | 0.0840          | 0.9860   | 5e-05         |
| 0.1562        | 14.8148  | 400  | 0.1556          | 0.9860   | 5e-05         |
| 0.0973        | 18.5185  | 500  | 0.0426          | 0.9907   | 5e-05         |
| 0.0273        | 22.2222  | 600  | 0.0005          | 1.0      | 5e-05         |
| 0.0709        | 25.9259  | 700  | 0.0258          | 0.9860   | 5e-05         |
| 0.072         | 29.6296  | 800  | 0.0071          | 0.9953   | 5e-05         |
| 0.079         | 33.3333  | 900  | 0.0000          | 1.0      | 5e-05         |
| 0.0147        | 37.0370  | 1000 | 0.0001          | 1.0      | 5e-05         |
| 0.0182        | 40.7407  | 1100 | 0.0000          | 1.0      | 5e-05         |
| 0.0363        | 44.4444  | 1200 | 0.0000          | 1.0      | 5e-05         |
| 0.0116        | 48.1481  | 1300 | 0.0000          | 1.0      | 5e-05         |
| 0.0101        | 51.8519  | 1400 | 0.0001          | 1.0      | 5e-05         |
| 0.0454        | 55.5556  | 1500 | 0.0280          | 0.9953   | 5e-05         |
| 0.1           | 59.2593  | 1600 | 0.0036          | 1.0      | 5e-05         |
| 0.0875        | 62.9630  | 1700 | 0.0726          | 0.9907   | 5e-05         |
| 0.165         | 66.6667  | 1800 | 0.0081          | 0.9953   | 5e-05         |
| 0.0136        | 70.3704  | 1900 | 0.0636          | 0.9907   | 5e-05         |
| 0.0541        | 74.0741  | 2000 | 0.0036          | 1.0      | 5e-05         |
| 0.0202        | 77.7778  | 2100 | 0.0000          | 1.0      | 1e-05         |
| 0.0           | 81.4815  | 2200 | 0.0000          | 1.0      | 1e-05         |
| 0.0           | 85.1852  | 2300 | 0.0000          | 1.0      | 1e-05         |
| 0.0           | 88.8889  | 2400 | 0.0000          | 1.0      | 1e-05         |
| 0.0           | 92.5926  | 2500 | 0.0000          | 1.0      | 1e-05         |
| 0.0           | 96.2963  | 2600 | 0.0000          | 1.0      | 1e-05         |
| 0.0           | 100.0    | 2700 | 0.0000          | 1.0      | 1e-05         |
| 0.0002        | 103.7037 | 2800 | 0.0000          | 1.0      | 1e-05         |
| 0.0001        | 107.4074 | 2900 | 0.0000          | 1.0      | 1e-05         |
| 0.0           | 111.1111 | 3000 | 0.0000          | 1.0      | 1e-05         |
| 0.0           | 114.8148 | 3100 | 0.0000          | 1.0      | 1e-05         |
| 0.0           | 118.5185 | 3200 | 0.0000          | 1.0      | 1e-05         |
| 0.0           | 122.2222 | 3300 | 0.0000          | 1.0      | 1e-05         |
| 0.0           | 125.9259 | 3400 | 0.0000          | 1.0      | 1e-05         |
| 0.0           | 129.6296 | 3500 | 0.0000          | 1.0      | 1e-05         |
| 0.0           | 133.3333 | 3600 | 0.0000          | 1.0      | 1e-05         |
| 0.0           | 137.0370 | 3700 | 0.0000          | 1.0      | 1e-05         |
| 0.0           | 140.7407 | 3800 | 0.0000          | 1.0      | 1e-05         |
| 0.0           | 144.4444 | 3900 | 0.0000          | 1.0      | 1e-05         |
| 0.0           | 148.1481 | 4000 | 0.0000          | 1.0      | 1e-05         |

Framework versions

  • Transformers 4.48.1
  • Pytorch 2.3.1.post300
  • Datasets 3.2.0
  • Tokenizers 0.21.0