Layout-finetuned-fr-model-25instances25-100epochs-5e-05lr-GPU

This model is a fine-tuned version of microsoft/layoutxlm-base on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0000
  • Accuracy: 1.0
  • Learning Rate: 5e-05

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 4
  • eval_batch_size: 8
  • seed: 42
  • optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: reduce_lr_on_plateau
  • lr_scheduler_warmup_ratio: 0.06
  • num_epochs: 100
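The hyperparameters above pin down the training schedule. Assuming the "25instances" in the model name means 25 training examples (an inference from the name, not stated in the card), a minimal sketch of the arithmetic:

```python
import math

# Assumption: "25instances" in the model name = 25 training examples.
num_examples = 25
train_batch_size = 4
num_epochs = 100
warmup_ratio = 0.06

# With the default drop_last=False, the partial final batch still counts
# as an optimizer step, so steps per epoch is the ceiling of the division.
steps_per_epoch = math.ceil(num_examples / train_batch_size)  # 7
total_steps = steps_per_epoch * num_epochs                    # 700
warmup_steps = int(total_steps * warmup_ratio)                # 42

print(steps_per_epoch, total_steps, warmup_steps)
```

This is consistent with the evaluation log below: step 10 corresponds to epoch 10/7 ≈ 1.4286, and training ends at step 700 after 100 epochs.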

Training results

| Training Loss | Epoch   | Step | Validation Loss | Accuracy | Learning Rate |
|:-------------:|:-------:|:----:|:---------------:|:--------:|:-------------:|
| 5.3733        | 1.4286  | 10   | 4.1888          | 0.42     | 5e-05         |
| 3.7229        | 2.8571  | 20   | 2.2889          | 0.56     | 5e-05         |
| 2.06          | 4.2857  | 30   | 0.8765          | 0.74     | 5e-05         |
| 1.0064        | 5.7143  | 40   | 0.6363          | 0.76     | 5e-05         |
| 1.0577        | 7.1429  | 50   | 0.4538          | 0.88     | 5e-05         |
| 0.47          | 8.5714  | 60   | 0.3682          | 0.86     | 5e-05         |
| 0.6724        | 10.0    | 70   | 0.1981          | 0.9      | 5e-05         |
| 0.5865        | 11.4286 | 80   | 0.1792          | 0.9      | 5e-05         |
| 0.3017        | 12.8571 | 90   | 0.0972          | 0.96     | 5e-05         |
| 0.1234        | 14.2857 | 100  | 0.0420          | 0.98     | 5e-05         |
| 0.1442        | 15.7143 | 110  | 0.0221          | 1.0      | 5e-05         |
| 0.8793        | 17.1429 | 120  | 0.0764          | 0.98     | 5e-05         |
| 0.0422        | 18.5714 | 130  | 0.1742          | 0.96     | 5e-05         |
| 0.2005        | 20.0    | 140  | 0.0933          | 0.96     | 5e-05         |
| 0.0889        | 21.4286 | 150  | 0.0024          | 1.0      | 5e-05         |
| 0.007         | 22.8571 | 160  | 0.0014          | 1.0      | 5e-05         |
| 0.135         | 24.2857 | 170  | 0.0007          | 1.0      | 5e-05         |
| 0.0018        | 25.7143 | 180  | 0.0008          | 1.0      | 5e-05         |
| 0.0026        | 27.1429 | 190  | 0.0003          | 1.0      | 5e-05         |
| 0.0019        | 28.5714 | 200  | 0.0006          | 1.0      | 5e-05         |
| 0.0037        | 30.0    | 210  | 0.0003          | 1.0      | 5e-05         |
| 0.0008        | 31.4286 | 220  | 0.0029          | 1.0      | 5e-05         |
| 0.0128        | 32.8571 | 230  | 0.0002          | 1.0      | 5e-05         |
| 0.0695        | 34.2857 | 240  | 0.0001          | 1.0      | 5e-05         |
| 0.0003        | 35.7143 | 250  | 0.0002          | 1.0      | 5e-05         |
| 0.0011        | 37.1429 | 260  | 0.0001          | 1.0      | 5e-05         |
| 0.0002        | 38.5714 | 270  | 0.0001          | 1.0      | 5e-05         |
| 0.0002        | 40.0    | 280  | 0.0001          | 1.0      | 5e-05         |
| 0.0002        | 41.4286 | 290  | 0.0000          | 1.0      | 5e-05         |
| 0.0001        | 42.8571 | 300  | 0.0000          | 1.0      | 5e-05         |
| 0.0001        | 44.2857 | 310  | 0.0000          | 1.0      | 5e-05         |
| 0.0001        | 45.7143 | 320  | 0.0000          | 1.0      | 5e-05         |
| 0.0001        | 47.1429 | 330  | 0.0000          | 1.0      | 5e-05         |
| 0.0001        | 48.5714 | 340  | 0.0000          | 1.0      | 5e-05         |
| 0.0001        | 50.0    | 350  | 0.0000          | 1.0      | 5e-05         |
| 0.0001        | 51.4286 | 360  | 0.0000          | 1.0      | 5e-05         |
| 0.0001        | 52.8571 | 370  | 0.0000          | 1.0      | 5e-05         |
| 0.0001        | 54.2857 | 380  | 0.0000          | 1.0      | 5e-05         |
| 0.0001        | 55.7143 | 390  | 0.0000          | 1.0      | 5e-05         |
| 0.0001        | 57.1429 | 400  | 0.0000          | 1.0      | 5e-05         |
| 0.0001        | 58.5714 | 410  | 0.0000          | 1.0      | 5e-05         |
| 0.0001        | 60.0    | 420  | 0.0000          | 1.0      | 5e-05         |
| 0.0001        | 61.4286 | 430  | 0.0000          | 1.0      | 5e-05         |
| 0.0001        | 62.8571 | 440  | 0.0000          | 1.0      | 5e-05         |
| 0.0001        | 64.2857 | 450  | 0.0000          | 1.0      | 5e-05         |
| 0.0001        | 65.7143 | 460  | 0.0000          | 1.0      | 5e-05         |
| 0.0001        | 67.1429 | 470  | 0.0000          | 1.0      | 5e-05         |
| 0.0           | 68.5714 | 480  | 0.0000          | 1.0      | 5e-05         |
| 0.0001        | 70.0    | 490  | 0.0000          | 1.0      | 5e-05         |
| 0.0001        | 71.4286 | 500  | 0.0000          | 1.0      | 5e-05         |
| 0.0001        | 72.8571 | 510  | 0.0000          | 1.0      | 5e-05         |
| 0.0           | 74.2857 | 520  | 0.0000          | 1.0      | 5e-05         |
| 0.0001        | 75.7143 | 530  | 0.0000          | 1.0      | 5e-05         |
| 0.0001        | 77.1429 | 540  | 0.0000          | 1.0      | 5e-05         |
| 0.0001        | 78.5714 | 550  | 0.0000          | 1.0      | 5e-05         |
| 0.0001        | 80.0    | 560  | 0.0000          | 1.0      | 5e-05         |
| 0.0001        | 81.4286 | 570  | 0.0000          | 1.0      | 5e-05         |
| 0.0           | 82.8571 | 580  | 0.0000          | 1.0      | 5e-05         |
| 0.0           | 84.2857 | 590  | 0.0000          | 1.0      | 5e-05         |
| 0.0           | 85.7143 | 600  | 0.0000          | 1.0      | 5e-05         |
| 0.0           | 87.1429 | 610  | 0.0000          | 1.0      | 5e-05         |
| 0.0           | 88.5714 | 620  | 0.0000          | 1.0      | 5e-05         |
| 0.0           | 90.0    | 630  | 0.0000          | 1.0      | 5e-05         |
| 0.0           | 91.4286 | 640  | 0.0000          | 1.0      | 5e-05         |
| 0.0           | 92.8571 | 650  | 0.0000          | 1.0      | 5e-05         |
| 0.0           | 94.2857 | 660  | 0.0000          | 1.0      | 5e-05         |
| 0.0           | 95.7143 | 670  | 0.0000          | 1.0      | 5e-05         |
| 0.0001        | 97.1429 | 680  | 0.0000          | 1.0      | 5e-05         |
| 0.0           | 98.5714 | 690  | 0.0000          | 1.0      | 5e-05         |
| 0.0           | 100.0   | 700  | 0.0000          | 1.0      | 5e-05         |
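The log above can be scanned programmatically. A minimal sketch over the first fifteen rows (values copied from the table; the interpretation is an inference, not a logged metric):

```python
# (epoch, step, val_loss, accuracy) for the first rows of the eval log above.
log = [
    (1.4286, 10, 4.1888, 0.42),
    (2.8571, 20, 2.2889, 0.56),
    (4.2857, 30, 0.8765, 0.74),
    (5.7143, 40, 0.6363, 0.76),
    (7.1429, 50, 0.4538, 0.88),
    (8.5714, 60, 0.3682, 0.86),
    (10.0, 70, 0.1981, 0.90),
    (11.4286, 80, 0.1792, 0.90),
    (12.8571, 90, 0.0972, 0.96),
    (14.2857, 100, 0.0420, 0.98),
    (15.7143, 110, 0.0221, 1.00),
    (17.1429, 120, 0.0764, 0.98),
    (18.5714, 130, 0.1742, 0.96),
    (20.0, 140, 0.0933, 0.96),
    (21.4286, 150, 0.0024, 1.00),
]

# First evaluation step at which accuracy reaches 1.0.
first_perfect = next(step for _, step, _, acc in log if acc == 1.0)

# Last step (within this slice) where accuracy dips below 1.0 again.
last_dip = max(step for _, step, _, acc in log if acc < 1.0)

print(first_perfect, last_dip)  # 110 140
```

Accuracy first hits 1.0 at step 110 (epoch ≈ 15.7), dips back to 0.96–0.98 through step 140, and from step 150 onward stays at 1.0 while validation loss collapses to 0.0000. On a training set this small, that pattern is consistent with memorization, so the perfect final metrics should be read with caution.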

Framework versions

  • Transformers 4.48.1
  • Pytorch 2.3.1.post300
  • Datasets 3.2.0
  • Tokenizers 0.21.0