---
tags:
- generated_from_trainer
metrics:
- precision
- recall
- f1
- accuracy
model-index:
- name: tabert-500-naamapadam
  results: []
---

# tabert-500-naamapadam

This model is a fine-tuned version of [livinNector/tabert-500](https://huggingface.co/livinNector/tabert-500) on the naamapadam dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2821
- Precision: 0.7818
- Recall: 0.8089
- F1: 0.7951
- Accuracy: 0.9070

## Model description

More information needed

## Intended uses & limitations

More information needed
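
No usage guidance is documented yet; the sketch below shows how a token-classification checkpoint like this one is typically loaded with `transformers`. The repo id `livinNector/tabert-500-naamapadam` is an assumption inferred from the card title and the base model's namespace.

```python
from transformers import AutoModelForTokenClassification, AutoTokenizer, pipeline

# Assumed repo id, inferred from the card title and the base model's owner.
model_id = "livinNector/tabert-500-naamapadam"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForTokenClassification.from_pretrained(model_id)

# Merge sub-word predictions into word-level entity spans.
ner = pipeline(
    "token-classification",
    model=model,
    tokenizer=tokenizer,
    aggregation_strategy="simple",
)

print(ner("சென்னை தமிழ்நாட்டின் தலைநகரம்."))  # "Chennai is the capital of Tamil Nadu."
```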

## Training and evaluation data

More information needed
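
The training data is not documented here, but the model name suggests the AI4Bharat Naamapadam NER corpus. A hedged loading sketch, assuming the Tamil config of `ai4bharat/naamapadam` on the Hub:

```python
from datasets import load_dataset

# Assumption: the Tamil split of Naamapadam, inferred from the
# "tabert"/"naamapadam" naming; not confirmed by this card.
dataset = load_dataset("ai4bharat/naamapadam", "ta")

print(dataset)              # split names and sizes
print(dataset["train"][0])  # tokens paired with NER tag ids
```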

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 64
- eval_batch_size: 128
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
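
For reproduction, these values map onto `transformers.TrainingArguments` roughly as follows (a sketch; the output path is a placeholder, and any argument not listed above keeps its default):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="tabert-500-naamapadam",  # placeholder path
    learning_rate=5e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=128,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=2,
)
```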

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Precision | Recall | F1     | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:---------:|:------:|:------:|:--------:|
| 0.4684        | 0.05  | 400   | 0.3956          | 0.6972    | 0.6926 | 0.6949 | 0.8720   |
| 0.3901        | 0.1   | 800   | 0.3706          | 0.7099    | 0.7338 | 0.7216 | 0.8811   |
| 0.3658        | 0.15  | 1200  | 0.3551          | 0.7349    | 0.7388 | 0.7369 | 0.8854   |
| 0.3535        | 0.21  | 1600  | 0.3445          | 0.7333    | 0.7458 | 0.7395 | 0.8875   |
| 0.3512        | 0.26  | 2000  | 0.3353          | 0.7547    | 0.7408 | 0.7477 | 0.8917   |
| 0.3377        | 0.31  | 2400  | 0.3302          | 0.7417    | 0.7636 | 0.7525 | 0.8916   |
| 0.3297        | 0.36  | 2800  | 0.3279          | 0.7681    | 0.7330 | 0.7501 | 0.8931   |
| 0.3331        | 0.41  | 3200  | 0.3252          | 0.7448    | 0.7833 | 0.7636 | 0.8961   |
| 0.3247        | 0.46  | 3600  | 0.3210          | 0.7479    | 0.7847 | 0.7659 | 0.8960   |
| 0.3175        | 0.51  | 4000  | 0.3155          | 0.7684    | 0.7597 | 0.7640 | 0.8975   |
| 0.3142        | 0.57  | 4400  | 0.3113          | 0.7510    | 0.7833 | 0.7668 | 0.8977   |
| 0.315         | 0.62  | 4800  | 0.3131          | 0.7574    | 0.7830 | 0.7700 | 0.8969   |
| 0.3078        | 0.67  | 5200  | 0.3155          | 0.7569    | 0.7821 | 0.7693 | 0.8980   |
| 0.3101        | 0.72  | 5600  | 0.3117          | 0.7708    | 0.7730 | 0.7719 | 0.8990   |
| 0.3078        | 0.77  | 6000  | 0.3070          | 0.7665    | 0.7824 | 0.7744 | 0.8992   |
| 0.304         | 0.82  | 6400  | 0.3055          | 0.7680    | 0.7875 | 0.7776 | 0.8992   |
| 0.2954        | 0.87  | 6800  | 0.3019          | 0.7675    | 0.7929 | 0.7800 | 0.9002   |
| 0.2955        | 0.93  | 7200  | 0.3107          | 0.7804    | 0.7755 | 0.7779 | 0.9000   |
| 0.2979        | 0.98  | 7600  | 0.2992          | 0.7721    | 0.7931 | 0.7825 | 0.9021   |
| 0.2816        | 1.03  | 8000  | 0.3022          | 0.7695    | 0.7971 | 0.7831 | 0.9029   |
| 0.2768        | 1.08  | 8400  | 0.3043          | 0.7538    | 0.8045 | 0.7783 | 0.9003   |
| 0.2775        | 1.13  | 8800  | 0.2990          | 0.7687    | 0.8003 | 0.7842 | 0.9024   |
| 0.2704        | 1.18  | 9200  | 0.2948          | 0.7724    | 0.7987 | 0.7853 | 0.9023   |
| 0.2734        | 1.23  | 9600  | 0.2932          | 0.7764    | 0.7993 | 0.7877 | 0.9041   |
| 0.2746        | 1.29  | 10000 | 0.2918          | 0.7841    | 0.7949 | 0.7894 | 0.9046   |
| 0.2678        | 1.34  | 10400 | 0.2909          | 0.7775    | 0.8039 | 0.7905 | 0.9046   |
| 0.272         | 1.39  | 10800 | 0.2909          | 0.7786    | 0.7952 | 0.7868 | 0.9034   |
| 0.2636        | 1.44  | 11200 | 0.2900          | 0.7815    | 0.7959 | 0.7886 | 0.9044   |
| 0.2663        | 1.49  | 11600 | 0.2863          | 0.7747    | 0.8086 | 0.7913 | 0.9047   |
| 0.2617        | 1.54  | 12000 | 0.2876          | 0.7759    | 0.8042 | 0.7898 | 0.9051   |
| 0.2634        | 1.59  | 12400 | 0.2896          | 0.7677    | 0.8123 | 0.7894 | 0.9038   |
| 0.2651        | 1.65  | 12800 | 0.2871          | 0.7799    | 0.8024 | 0.7910 | 0.9058   |
| 0.2676        | 1.7   | 13200 | 0.2870          | 0.7863    | 0.8008 | 0.7935 | 0.9061   |
| 0.273         | 1.75  | 13600 | 0.2836          | 0.7804    | 0.8108 | 0.7953 | 0.9064   |
| 0.2611        | 1.8   | 14000 | 0.2821          | 0.7821    | 0.8052 | 0.7935 | 0.9064   |
| 0.2683        | 1.85  | 14400 | 0.2815          | 0.7791    | 0.8108 | 0.7946 | 0.9064   |
| 0.2624        | 1.9   | 14800 | 0.2818          | 0.7819    | 0.8090 | 0.7952 | 0.9071   |
| 0.2628        | 1.95  | 15200 | 0.2821          | 0.7818    | 0.8089 | 0.7951 | 0.9070   |
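
The precision, recall, and F1 figures above are entity-level scores; metrics of this kind are commonly produced by a `compute_metrics` callback built on `seqeval`. A minimal sketch of that computation (the BIO tags below are illustrative, not taken from this run):

```python
import evaluate

seqeval = evaluate.load("seqeval")

# Illustrative BIO-tagged predictions and references.
predictions = [["B-PER", "I-PER", "O", "B-LOC"]]
references = [["B-PER", "I-PER", "O", "B-ORG"]]

results = seqeval.compute(predictions=predictions, references=references)
print({k: results[k] for k in
       ("overall_precision", "overall_recall", "overall_f1", "overall_accuracy")})
```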


### Framework versions

- Transformers 4.29.2
- Pytorch 2.0.0
- Datasets 2.12.0
- Tokenizers 0.13.3