# bert-finetuned-ner4-new

This model is a fine-tuned version of bert-base-cased on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 0.1962
- Precision: 0.6905
- Recall: 0.8666
- F1: 0.7686
- Accuracy: 0.9402
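The reported F1 is the harmonic mean of the entity-level precision and recall above, which can be checked directly:

```python
precision = 0.6905
recall = 0.8666

# F1 is the harmonic mean of precision and recall.
f1 = 2 * precision * recall / (precision + recall)
print(round(f1, 4))  # → 0.7686
```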
## Model description

More information needed
## Intended uses & limitations

More information needed
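The card gives no usage details, but a minimal inference sketch with the Transformers `pipeline` API might look like the following. The repo id `Nathali99/bert-finetuned-ner4-new` is taken from the Hugging Face Hub page, and `aggregation_strategy="simple"` is an assumption; running this requires network access to download the weights.

```python
from transformers import pipeline

# Minimal NER inference sketch; aggregation_strategy="simple" groups
# subword predictions into whole-entity spans (an assumption, not
# documented in the card).
ner = pipeline(
    "token-classification",
    model="Nathali99/bert-finetuned-ner4-new",
    aggregation_strategy="simple",
)

print(ner("My name is Wolfgang and I live in Berlin."))
```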
## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 20
- mixed_precision_training: Native AMP
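The hyperparameters above can be expressed as a `TrainingArguments` configuration. This is a hypothetical reconstruction, not the author's actual training script; `output_dir` is an assumption.

```python
from transformers import TrainingArguments

# Reconstruction of the listed hyperparameters; output_dir is assumed.
args = TrainingArguments(
    output_dir="bert-finetuned-ner4-new",
    learning_rate=1e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=20,
    fp16=True,  # Native AMP mixed-precision training
)
```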
### Training results

| Training Loss | Epoch | Step  | Validation Loss | Precision | Recall | F1     | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:---------:|:------:|:------:|:--------:|
| 0.852         | 1.0   | 2844  | 0.5164          | 0.3140    | 0.3681 | 0.3389 | 0.8628   |
| 0.467         | 2.0   | 5688  | 0.3406          | 0.3758    | 0.6604 | 0.4790 | 0.8733   |
| 0.3478        | 3.0   | 8532  | 0.2850          | 0.4874    | 0.7650 | 0.5955 | 0.8950   |
| 0.2887        | 4.0   | 11376 | 0.2695          | 0.5318    | 0.7932 | 0.6368 | 0.9030   |
| 0.2528        | 5.0   | 14220 | 0.2874          | 0.5381    | 0.8053 | 0.6451 | 0.9014   |
| 0.2294        | 6.0   | 17064 | 0.2320          | 0.5962    | 0.8273 | 0.6930 | 0.9198   |
| 0.2125        | 7.0   | 19908 | 0.2283          | 0.6103    | 0.8386 | 0.7065 | 0.9223   |
| 0.1997        | 8.0   | 22752 | 0.2070          | 0.6422    | 0.8384 | 0.7273 | 0.9302   |
| 0.1865        | 9.0   | 25596 | 0.2347          | 0.6287    | 0.8523 | 0.7236 | 0.9244   |
| 0.1788        | 10.0  | 28440 | 0.2379          | 0.6296    | 0.8547 | 0.7251 | 0.9248   |
| 0.1712        | 11.0  | 31284 | 0.2078          | 0.6580    | 0.8519 | 0.7425 | 0.9313   |
| 0.1632        | 12.0  | 34128 | 0.2111          | 0.6610    | 0.8585 | 0.7469 | 0.9331   |
| 0.1579        | 13.0  | 36972 | 0.2250          | 0.6515    | 0.8606 | 0.7416 | 0.9304   |
| 0.1531        | 14.0  | 39816 | 0.2027          | 0.6765    | 0.8615 | 0.7579 | 0.9375   |
| 0.1493        | 15.0  | 42660 | 0.2102          | 0.6766    | 0.8632 | 0.7586 | 0.9372   |
| 0.1451        | 16.0  | 45504 | 0.2098          | 0.6786    | 0.8653 | 0.7607 | 0.9376   |
| 0.143         | 17.0  | 48348 | 0.1962          | 0.6905    | 0.8666 | 0.7686 | 0.9402   |
| 0.1407        | 18.0  | 51192 | 0.2042          | 0.6900    | 0.8664 | 0.7682 | 0.9395   |
| 0.1385        | 19.0  | 54036 | 0.2058          | 0.6874    | 0.8685 | 0.7674 | 0.9388   |
| 0.1378        | 20.0  | 56880 | 0.2043          | 0.6890    | 0.8680 | 0.7682 | 0.9392   |
### Framework versions

- Transformers 4.47.1
- Pytorch 2.5.1+cu121
- Datasets 3.2.0
- Tokenizers 0.21.0