stefan-it committed on
Commit df7df6d · 1 Parent(s): 6c74434

Upload ./training.log with huggingface_hub

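The file was pushed with the huggingface_hub client library. A minimal sketch of how such an upload is typically done follows; the repository id is a placeholder and the token handling is an assumption, not details taken from this commit:

from huggingface_hub import HfApi

api = HfApi()  # authenticates via the token stored by `huggingface-cli login`
api.upload_file(
    path_or_fileobj="./training.log",              # local file to push
    path_in_repo="training.log",                   # destination path inside the repo
    repo_id="stefan-it/your-model-repo",           # placeholder repo id, not the actual repository
    commit_message="Upload ./training.log with huggingface_hub",
)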
Files changed (1)
  1. training.log +505 -0
training.log ADDED
@@ -0,0 +1,505 @@
+ 2023-10-24 10:48:14,850 ----------------------------------------------------------------------------------------------------
+ 2023-10-24 10:48:14,851 Model: "SequenceTagger(
+   (embeddings): TransformerWordEmbeddings(
+     (model): BertModel(
+       (embeddings): BertEmbeddings(
+         (word_embeddings): Embedding(64001, 768)
+         (position_embeddings): Embedding(512, 768)
+         (token_type_embeddings): Embedding(2, 768)
+         (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+         (dropout): Dropout(p=0.1, inplace=False)
+       )
+       (encoder): BertEncoder(
+         (layer): ModuleList(
+           (0-11): 12 x BertLayer(
+             (attention): BertAttention(
+               (self): BertSelfAttention(
+                 (query): Linear(in_features=768, out_features=768, bias=True)
+                 (key): Linear(in_features=768, out_features=768, bias=True)
+                 (value): Linear(in_features=768, out_features=768, bias=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+               (output): BertSelfOutput(
+                 (dense): Linear(in_features=768, out_features=768, bias=True)
+                 (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+             )
+             (intermediate): BertIntermediate(
+               (dense): Linear(in_features=768, out_features=3072, bias=True)
+               (intermediate_act_fn): GELUActivation()
+             )
+             (output): BertOutput(
+               (dense): Linear(in_features=3072, out_features=768, bias=True)
+               (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+               (dropout): Dropout(p=0.1, inplace=False)
+             )
+           )
+         )
+       )
+       (pooler): BertPooler(
+         (dense): Linear(in_features=768, out_features=768, bias=True)
+         (activation): Tanh()
+       )
+     )
+   )
+   (locked_dropout): LockedDropout(p=0.5)
+   (linear): Linear(in_features=768, out_features=21, bias=True)
+   (loss_function): CrossEntropyLoss()
+ )"
+ 2023-10-24 10:48:14,852 ----------------------------------------------------------------------------------------------------
+ 2023-10-24 10:48:14,852 MultiCorpus: 5901 train + 1287 dev + 1505 test sentences
+  - NER_HIPE_2022 Corpus: 5901 train + 1287 dev + 1505 test sentences - /home/ubuntu/.flair/datasets/ner_hipe_2022/v2.1/hipe2020/fr/with_doc_seperator
+ 2023-10-24 10:48:14,852 ----------------------------------------------------------------------------------------------------
+ 2023-10-24 10:48:14,852 Train: 5901 sentences
+ 2023-10-24 10:48:14,852 (train_with_dev=False, train_with_test=False)
+ 2023-10-24 10:48:14,852 ----------------------------------------------------------------------------------------------------
+ 2023-10-24 10:48:14,852 Training Params:
+ 2023-10-24 10:48:14,852 - learning_rate: "5e-05"
+ 2023-10-24 10:48:14,852 - mini_batch_size: "4"
+ 2023-10-24 10:48:14,852 - max_epochs: "10"
+ 2023-10-24 10:48:14,852 - shuffle: "True"
+ 2023-10-24 10:48:14,852 ----------------------------------------------------------------------------------------------------
+ 2023-10-24 10:48:14,852 Plugins:
+ 2023-10-24 10:48:14,852 - TensorboardLogger
+ 2023-10-24 10:48:14,852 - LinearScheduler | warmup_fraction: '0.1'
+ 2023-10-24 10:48:14,852 ----------------------------------------------------------------------------------------------------
+ 2023-10-24 10:48:14,852 Final evaluation on model from best epoch (best-model.pt)
+ 2023-10-24 10:48:14,853 - metric: "('micro avg', 'f1-score')"
+ 2023-10-24 10:48:14,853 ----------------------------------------------------------------------------------------------------
+ 2023-10-24 10:48:14,853 Computation:
+ 2023-10-24 10:48:14,853 - compute on device: cuda:0
+ 2023-10-24 10:48:14,853 - embedding storage: none
+ 2023-10-24 10:48:14,853 ----------------------------------------------------------------------------------------------------
+ 2023-10-24 10:48:14,853 Model training base path: "hmbench-hipe2020/fr-dbmdz/bert-base-historic-multilingual-64k-td-cased-bs4-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-3"
+ 2023-10-24 10:48:14,853 ----------------------------------------------------------------------------------------------------
+ 2023-10-24 10:48:14,853 ----------------------------------------------------------------------------------------------------
+ 2023-10-24 10:48:14,853 Logging anything other than scalars to TensorBoard is currently not supported.
+ 2023-10-24 10:48:24,098 epoch 1 - iter 147/1476 - loss 1.65586929 - time (sec): 9.24 - samples/sec: 1730.95 - lr: 0.000005 - momentum: 0.000000
+ 2023-10-24 10:48:33,391 epoch 1 - iter 294/1476 - loss 1.07991837 - time (sec): 18.54 - samples/sec: 1711.24 - lr: 0.000010 - momentum: 0.000000
+ 2023-10-24 10:48:42,499 epoch 1 - iter 441/1476 - loss 0.87315512 - time (sec): 27.65 - samples/sec: 1663.26 - lr: 0.000015 - momentum: 0.000000
+ 2023-10-24 10:48:52,399 epoch 1 - iter 588/1476 - loss 0.70884012 - time (sec): 37.55 - samples/sec: 1719.32 - lr: 0.000020 - momentum: 0.000000
+ 2023-10-24 10:49:02,834 epoch 1 - iter 735/1476 - loss 0.59023179 - time (sec): 47.98 - samples/sec: 1759.57 - lr: 0.000025 - momentum: 0.000000
+ 2023-10-24 10:49:12,304 epoch 1 - iter 882/1476 - loss 0.52815123 - time (sec): 57.45 - samples/sec: 1756.99 - lr: 0.000030 - momentum: 0.000000
+ 2023-10-24 10:49:21,617 epoch 1 - iter 1029/1476 - loss 0.48116891 - time (sec): 66.76 - samples/sec: 1748.66 - lr: 0.000035 - momentum: 0.000000
+ 2023-10-24 10:49:31,462 epoch 1 - iter 1176/1476 - loss 0.44141868 - time (sec): 76.61 - samples/sec: 1746.77 - lr: 0.000040 - momentum: 0.000000
+ 2023-10-24 10:49:40,743 epoch 1 - iter 1323/1476 - loss 0.41530887 - time (sec): 85.89 - samples/sec: 1742.31 - lr: 0.000045 - momentum: 0.000000
+ 2023-10-24 10:49:50,281 epoch 1 - iter 1470/1476 - loss 0.38978126 - time (sec): 95.43 - samples/sec: 1738.88 - lr: 0.000050 - momentum: 0.000000
+ 2023-10-24 10:49:50,628 ----------------------------------------------------------------------------------------------------
+ 2023-10-24 10:49:50,629 EPOCH 1 done: loss 0.3891 - lr: 0.000050
+ 2023-10-24 10:49:56,936 DEV : loss 0.13457921147346497 - f1-score (micro avg) 0.7234
+ 2023-10-24 10:49:56,958 saving best model
+ 2023-10-24 10:49:57,516 ----------------------------------------------------------------------------------------------------
+ 2023-10-24 10:50:07,080 epoch 2 - iter 147/1476 - loss 0.12059972 - time (sec): 9.56 - samples/sec: 1765.14 - lr: 0.000049 - momentum: 0.000000
+ 2023-10-24 10:50:16,285 epoch 2 - iter 294/1476 - loss 0.13797220 - time (sec): 18.77 - samples/sec: 1717.67 - lr: 0.000049 - momentum: 0.000000
+ 2023-10-24 10:50:25,463 epoch 2 - iter 441/1476 - loss 0.14905127 - time (sec): 27.95 - samples/sec: 1679.87 - lr: 0.000048 - momentum: 0.000000
+ 2023-10-24 10:50:35,219 epoch 2 - iter 588/1476 - loss 0.14142829 - time (sec): 37.70 - samples/sec: 1704.17 - lr: 0.000048 - momentum: 0.000000
+ 2023-10-24 10:50:44,511 epoch 2 - iter 735/1476 - loss 0.13962946 - time (sec): 46.99 - samples/sec: 1694.23 - lr: 0.000047 - momentum: 0.000000
+ 2023-10-24 10:50:54,152 epoch 2 - iter 882/1476 - loss 0.13959635 - time (sec): 56.63 - samples/sec: 1704.74 - lr: 0.000047 - momentum: 0.000000
+ 2023-10-24 10:51:03,232 epoch 2 - iter 1029/1476 - loss 0.14077252 - time (sec): 65.71 - samples/sec: 1691.49 - lr: 0.000046 - momentum: 0.000000
+ 2023-10-24 10:51:13,264 epoch 2 - iter 1176/1476 - loss 0.13763429 - time (sec): 75.75 - samples/sec: 1725.25 - lr: 0.000046 - momentum: 0.000000
+ 2023-10-24 10:51:23,146 epoch 2 - iter 1323/1476 - loss 0.13938057 - time (sec): 85.63 - samples/sec: 1726.60 - lr: 0.000045 - momentum: 0.000000
+ 2023-10-24 10:51:33,145 epoch 2 - iter 1470/1476 - loss 0.13778830 - time (sec): 95.63 - samples/sec: 1735.50 - lr: 0.000044 - momentum: 0.000000
+ 2023-10-24 10:51:33,492 ----------------------------------------------------------------------------------------------------
+ 2023-10-24 10:51:33,492 EPOCH 2 done: loss 0.1379 - lr: 0.000044
+ 2023-10-24 10:51:42,008 DEV : loss 0.14209164679050446 - f1-score (micro avg) 0.7784
+ 2023-10-24 10:51:42,029 saving best model
+ 2023-10-24 10:51:42,735 ----------------------------------------------------------------------------------------------------
+ 2023-10-24 10:51:52,069 epoch 3 - iter 147/1476 - loss 0.08832162 - time (sec): 9.33 - samples/sec: 1634.07 - lr: 0.000044 - momentum: 0.000000
+ 2023-10-24 10:52:02,054 epoch 3 - iter 294/1476 - loss 0.08724963 - time (sec): 19.32 - samples/sec: 1723.32 - lr: 0.000043 - momentum: 0.000000
+ 2023-10-24 10:52:11,502 epoch 3 - iter 441/1476 - loss 0.08766214 - time (sec): 28.77 - samples/sec: 1709.11 - lr: 0.000043 - momentum: 0.000000
+ 2023-10-24 10:52:21,316 epoch 3 - iter 588/1476 - loss 0.08283332 - time (sec): 38.58 - samples/sec: 1746.55 - lr: 0.000042 - momentum: 0.000000
+ 2023-10-24 10:52:30,589 epoch 3 - iter 735/1476 - loss 0.08143414 - time (sec): 47.85 - samples/sec: 1727.76 - lr: 0.000042 - momentum: 0.000000
+ 2023-10-24 10:52:40,282 epoch 3 - iter 882/1476 - loss 0.08342790 - time (sec): 57.55 - samples/sec: 1738.12 - lr: 0.000041 - momentum: 0.000000
+ 2023-10-24 10:52:49,830 epoch 3 - iter 1029/1476 - loss 0.08223349 - time (sec): 67.09 - samples/sec: 1733.72 - lr: 0.000041 - momentum: 0.000000
+ 2023-10-24 10:52:59,223 epoch 3 - iter 1176/1476 - loss 0.08468750 - time (sec): 76.49 - samples/sec: 1729.40 - lr: 0.000040 - momentum: 0.000000
+ 2023-10-24 10:53:09,193 epoch 3 - iter 1323/1476 - loss 0.09646163 - time (sec): 86.46 - samples/sec: 1745.87 - lr: 0.000039 - momentum: 0.000000
+ 2023-10-24 10:53:18,392 epoch 3 - iter 1470/1476 - loss 0.09593820 - time (sec): 95.66 - samples/sec: 1736.12 - lr: 0.000039 - momentum: 0.000000
+ 2023-10-24 10:53:18,728 ----------------------------------------------------------------------------------------------------
+ 2023-10-24 10:53:18,728 EPOCH 3 done: loss 0.0959 - lr: 0.000039
+ 2023-10-24 10:53:27,143 DEV : loss 0.2701607942581177 - f1-score (micro avg) 0.763
+ 2023-10-24 10:53:27,165 ----------------------------------------------------------------------------------------------------
+ 2023-10-24 10:53:36,810 epoch 4 - iter 147/1476 - loss 0.12417453 - time (sec): 9.64 - samples/sec: 1745.60 - lr: 0.000038 - momentum: 0.000000
+ 2023-10-24 10:53:46,534 epoch 4 - iter 294/1476 - loss 0.12118589 - time (sec): 19.37 - samples/sec: 1811.03 - lr: 0.000038 - momentum: 0.000000
+ 2023-10-24 10:53:56,194 epoch 4 - iter 441/1476 - loss 0.10852964 - time (sec): 29.03 - samples/sec: 1781.74 - lr: 0.000037 - momentum: 0.000000
+ 2023-10-24 10:54:05,524 epoch 4 - iter 588/1476 - loss 0.09519935 - time (sec): 38.36 - samples/sec: 1761.04 - lr: 0.000037 - momentum: 0.000000
+ 2023-10-24 10:54:15,280 epoch 4 - iter 735/1476 - loss 0.09097434 - time (sec): 48.11 - samples/sec: 1766.24 - lr: 0.000036 - momentum: 0.000000
+ 2023-10-24 10:54:24,715 epoch 4 - iter 882/1476 - loss 0.08614100 - time (sec): 57.55 - samples/sec: 1756.63 - lr: 0.000036 - momentum: 0.000000
+ 2023-10-24 10:54:34,696 epoch 4 - iter 1029/1476 - loss 0.09487861 - time (sec): 67.53 - samples/sec: 1762.79 - lr: 0.000035 - momentum: 0.000000
+ 2023-10-24 10:54:44,167 epoch 4 - iter 1176/1476 - loss 0.09394085 - time (sec): 77.00 - samples/sec: 1751.57 - lr: 0.000034 - momentum: 0.000000
+ 2023-10-24 10:54:53,633 epoch 4 - iter 1323/1476 - loss 0.09504047 - time (sec): 86.47 - samples/sec: 1744.04 - lr: 0.000034 - momentum: 0.000000
+ 2023-10-24 10:55:02,882 epoch 4 - iter 1470/1476 - loss 0.09448577 - time (sec): 95.72 - samples/sec: 1731.37 - lr: 0.000033 - momentum: 0.000000
+ 2023-10-24 10:55:03,250 ----------------------------------------------------------------------------------------------------
+ 2023-10-24 10:55:03,251 EPOCH 4 done: loss 0.0948 - lr: 0.000033
+ 2023-10-24 10:55:11,668 DEV : loss 0.27863532304763794 - f1-score (micro avg) 0.7293
+ 2023-10-24 10:55:11,689 ----------------------------------------------------------------------------------------------------
+ 2023-10-24 10:55:21,448 epoch 5 - iter 147/1476 - loss 0.07291799 - time (sec): 9.76 - samples/sec: 1737.92 - lr: 0.000033 - momentum: 0.000000
+ 2023-10-24 10:55:31,095 epoch 5 - iter 294/1476 - loss 0.11773689 - time (sec): 19.40 - samples/sec: 1770.59 - lr: 0.000032 - momentum: 0.000000
+ 2023-10-24 10:55:40,949 epoch 5 - iter 441/1476 - loss 0.09833702 - time (sec): 29.26 - samples/sec: 1776.56 - lr: 0.000032 - momentum: 0.000000
+ 2023-10-24 10:55:50,119 epoch 5 - iter 588/1476 - loss 0.08337112 - time (sec): 38.43 - samples/sec: 1746.88 - lr: 0.000031 - momentum: 0.000000
+ 2023-10-24 10:56:00,122 epoch 5 - iter 735/1476 - loss 0.08978486 - time (sec): 48.43 - samples/sec: 1745.69 - lr: 0.000031 - momentum: 0.000000
+ 2023-10-24 10:56:09,225 epoch 5 - iter 882/1476 - loss 0.08167065 - time (sec): 57.54 - samples/sec: 1723.59 - lr: 0.000030 - momentum: 0.000000
+ 2023-10-24 10:56:18,288 epoch 5 - iter 1029/1476 - loss 0.07871689 - time (sec): 66.60 - samples/sec: 1719.04 - lr: 0.000029 - momentum: 0.000000
+ 2023-10-24 10:56:27,625 epoch 5 - iter 1176/1476 - loss 0.07299931 - time (sec): 75.93 - samples/sec: 1704.49 - lr: 0.000029 - momentum: 0.000000
+ 2023-10-24 10:56:37,116 epoch 5 - iter 1323/1476 - loss 0.07223057 - time (sec): 85.43 - samples/sec: 1710.28 - lr: 0.000028 - momentum: 0.000000
+ 2023-10-24 10:56:47,474 epoch 5 - iter 1470/1476 - loss 0.07846900 - time (sec): 95.78 - samples/sec: 1733.07 - lr: 0.000028 - momentum: 0.000000
+ 2023-10-24 10:56:47,814 ----------------------------------------------------------------------------------------------------
+ 2023-10-24 10:56:47,815 EPOCH 5 done: loss 0.0784 - lr: 0.000028
+ 2023-10-24 10:56:56,241 DEV : loss 0.25809499621391296 - f1-score (micro avg) 0.7499
+ 2023-10-24 10:56:56,262 ----------------------------------------------------------------------------------------------------
+ 2023-10-24 10:57:06,035 epoch 6 - iter 147/1476 - loss 0.05313087 - time (sec): 9.77 - samples/sec: 1824.29 - lr: 0.000027 - momentum: 0.000000
+ 2023-10-24 10:57:15,603 epoch 6 - iter 294/1476 - loss 0.05439273 - time (sec): 19.34 - samples/sec: 1747.91 - lr: 0.000027 - momentum: 0.000000
+ 2023-10-24 10:57:25,156 epoch 6 - iter 441/1476 - loss 0.04903707 - time (sec): 28.89 - samples/sec: 1733.35 - lr: 0.000026 - momentum: 0.000000
+ 2023-10-24 10:57:34,744 epoch 6 - iter 588/1476 - loss 0.05877384 - time (sec): 38.48 - samples/sec: 1734.92 - lr: 0.000026 - momentum: 0.000000
+ 2023-10-24 10:57:44,061 epoch 6 - iter 735/1476 - loss 0.05051822 - time (sec): 47.80 - samples/sec: 1727.89 - lr: 0.000025 - momentum: 0.000000
+ 2023-10-24 10:57:53,790 epoch 6 - iter 882/1476 - loss 0.04679481 - time (sec): 57.53 - samples/sec: 1737.86 - lr: 0.000024 - momentum: 0.000000
+ 2023-10-24 10:58:03,065 epoch 6 - iter 1029/1476 - loss 0.04646404 - time (sec): 66.80 - samples/sec: 1721.46 - lr: 0.000024 - momentum: 0.000000
+ 2023-10-24 10:58:12,451 epoch 6 - iter 1176/1476 - loss 0.05002079 - time (sec): 76.19 - samples/sec: 1723.74 - lr: 0.000023 - momentum: 0.000000
+ 2023-10-24 10:58:22,558 epoch 6 - iter 1323/1476 - loss 0.06170524 - time (sec): 86.30 - samples/sec: 1736.56 - lr: 0.000023 - momentum: 0.000000
+ 2023-10-24 10:58:32,083 epoch 6 - iter 1470/1476 - loss 0.06160170 - time (sec): 95.82 - samples/sec: 1731.81 - lr: 0.000022 - momentum: 0.000000
+ 2023-10-24 10:58:32,426 ----------------------------------------------------------------------------------------------------
+ 2023-10-24 10:58:32,427 EPOCH 6 done: loss 0.0614 - lr: 0.000022
+ 2023-10-24 10:58:40,876 DEV : loss 0.2634078860282898 - f1-score (micro avg) 0.772
+ 2023-10-24 10:58:40,897 ----------------------------------------------------------------------------------------------------
+ 2023-10-24 10:58:50,465 epoch 7 - iter 147/1476 - loss 0.04507495 - time (sec): 9.57 - samples/sec: 1717.08 - lr: 0.000022 - momentum: 0.000000
+ 2023-10-24 10:58:59,978 epoch 7 - iter 294/1476 - loss 0.04197351 - time (sec): 19.08 - samples/sec: 1702.46 - lr: 0.000021 - momentum: 0.000000
+ 2023-10-24 10:59:09,686 epoch 7 - iter 441/1476 - loss 0.06505740 - time (sec): 28.79 - samples/sec: 1729.16 - lr: 0.000021 - momentum: 0.000000
+ 2023-10-24 10:59:18,963 epoch 7 - iter 588/1476 - loss 0.05418034 - time (sec): 38.07 - samples/sec: 1711.68 - lr: 0.000020 - momentum: 0.000000
+ 2023-10-24 10:59:28,157 epoch 7 - iter 735/1476 - loss 0.04622712 - time (sec): 47.26 - samples/sec: 1700.72 - lr: 0.000019 - momentum: 0.000000
+ 2023-10-24 10:59:38,265 epoch 7 - iter 882/1476 - loss 0.05893413 - time (sec): 57.37 - samples/sec: 1727.44 - lr: 0.000019 - momentum: 0.000000
+ 2023-10-24 10:59:47,775 epoch 7 - iter 1029/1476 - loss 0.05878999 - time (sec): 66.88 - samples/sec: 1727.73 - lr: 0.000018 - momentum: 0.000000
+ 2023-10-24 10:59:57,450 epoch 7 - iter 1176/1476 - loss 0.05964465 - time (sec): 76.55 - samples/sec: 1727.83 - lr: 0.000018 - momentum: 0.000000
+ 2023-10-24 11:00:07,100 epoch 7 - iter 1323/1476 - loss 0.05826532 - time (sec): 86.20 - samples/sec: 1732.09 - lr: 0.000017 - momentum: 0.000000
+ 2023-10-24 11:00:16,614 epoch 7 - iter 1470/1476 - loss 0.06281126 - time (sec): 95.72 - samples/sec: 1732.15 - lr: 0.000017 - momentum: 0.000000
+ 2023-10-24 11:00:16,990 ----------------------------------------------------------------------------------------------------
+ 2023-10-24 11:00:16,990 EPOCH 7 done: loss 0.0626 - lr: 0.000017
+ 2023-10-24 11:00:25,435 DEV : loss 0.27169960737228394 - f1-score (micro avg) 0.7652
+ 2023-10-24 11:00:25,457 ----------------------------------------------------------------------------------------------------
+ 2023-10-24 11:00:34,865 epoch 8 - iter 147/1476 - loss 0.04542037 - time (sec): 9.41 - samples/sec: 1696.00 - lr: 0.000016 - momentum: 0.000000
+ 2023-10-24 11:00:44,003 epoch 8 - iter 294/1476 - loss 0.02861702 - time (sec): 18.55 - samples/sec: 1657.60 - lr: 0.000016 - momentum: 0.000000
+ 2023-10-24 11:00:54,239 epoch 8 - iter 441/1476 - loss 0.06281701 - time (sec): 28.78 - samples/sec: 1758.67 - lr: 0.000015 - momentum: 0.000000
+ 2023-10-24 11:01:03,793 epoch 8 - iter 588/1476 - loss 0.05883833 - time (sec): 38.34 - samples/sec: 1759.21 - lr: 0.000014 - momentum: 0.000000
+ 2023-10-24 11:01:13,566 epoch 8 - iter 735/1476 - loss 0.05103042 - time (sec): 48.11 - samples/sec: 1753.77 - lr: 0.000014 - momentum: 0.000000
+ 2023-10-24 11:01:23,641 epoch 8 - iter 882/1476 - loss 0.05585751 - time (sec): 58.18 - samples/sec: 1761.22 - lr: 0.000013 - momentum: 0.000000
+ 2023-10-24 11:01:32,886 epoch 8 - iter 1029/1476 - loss 0.05420164 - time (sec): 67.43 - samples/sec: 1741.28 - lr: 0.000013 - momentum: 0.000000
+ 2023-10-24 11:01:42,145 epoch 8 - iter 1176/1476 - loss 0.05014942 - time (sec): 76.69 - samples/sec: 1733.69 - lr: 0.000012 - momentum: 0.000000
+ 2023-10-24 11:01:51,484 epoch 8 - iter 1323/1476 - loss 0.04821620 - time (sec): 86.03 - samples/sec: 1730.02 - lr: 0.000012 - momentum: 0.000000
+ 2023-10-24 11:02:01,128 epoch 8 - iter 1470/1476 - loss 0.04480348 - time (sec): 95.67 - samples/sec: 1732.32 - lr: 0.000011 - momentum: 0.000000
+ 2023-10-24 11:02:01,495 ----------------------------------------------------------------------------------------------------
+ 2023-10-24 11:02:01,495 EPOCH 8 done: loss 0.0447 - lr: 0.000011
+ 2023-10-24 11:02:09,941 DEV : loss 0.30274227261543274 - f1-score (micro avg) 0.7626
+ 2023-10-24 11:02:09,962 ----------------------------------------------------------------------------------------------------
+ 2023-10-24 11:02:19,368 epoch 9 - iter 147/1476 - loss 0.02455183 - time (sec): 9.40 - samples/sec: 1690.22 - lr: 0.000011 - momentum: 0.000000
+ 2023-10-24 11:02:29,184 epoch 9 - iter 294/1476 - loss 0.02908119 - time (sec): 19.22 - samples/sec: 1766.79 - lr: 0.000010 - momentum: 0.000000
+ 2023-10-24 11:02:38,447 epoch 9 - iter 441/1476 - loss 0.03060954 - time (sec): 28.48 - samples/sec: 1720.29 - lr: 0.000009 - momentum: 0.000000
+ 2023-10-24 11:02:47,999 epoch 9 - iter 588/1476 - loss 0.02637177 - time (sec): 38.04 - samples/sec: 1682.20 - lr: 0.000009 - momentum: 0.000000
+ 2023-10-24 11:02:57,224 epoch 9 - iter 735/1476 - loss 0.02532292 - time (sec): 47.26 - samples/sec: 1686.95 - lr: 0.000008 - momentum: 0.000000
+ 2023-10-24 11:03:06,638 epoch 9 - iter 882/1476 - loss 0.02620936 - time (sec): 56.67 - samples/sec: 1688.27 - lr: 0.000008 - momentum: 0.000000
+ 2023-10-24 11:03:16,146 epoch 9 - iter 1029/1476 - loss 0.02456485 - time (sec): 66.18 - samples/sec: 1700.19 - lr: 0.000007 - momentum: 0.000000
+ 2023-10-24 11:03:26,166 epoch 9 - iter 1176/1476 - loss 0.03675836 - time (sec): 76.20 - samples/sec: 1722.03 - lr: 0.000007 - momentum: 0.000000
+ 2023-10-24 11:03:36,323 epoch 9 - iter 1323/1476 - loss 0.04061899 - time (sec): 86.36 - samples/sec: 1730.99 - lr: 0.000006 - momentum: 0.000000
+ 2023-10-24 11:03:45,828 epoch 9 - iter 1470/1476 - loss 0.04002423 - time (sec): 95.86 - samples/sec: 1731.00 - lr: 0.000006 - momentum: 0.000000
+ 2023-10-24 11:03:46,171 ----------------------------------------------------------------------------------------------------
+ 2023-10-24 11:03:46,171 EPOCH 9 done: loss 0.0399 - lr: 0.000006
+ 2023-10-24 11:03:54,596 DEV : loss 0.2963683307170868 - f1-score (micro avg) 0.7703
+ 2023-10-24 11:03:54,618 ----------------------------------------------------------------------------------------------------
+ 2023-10-24 11:04:04,051 epoch 10 - iter 147/1476 - loss 0.02704804 - time (sec): 9.43 - samples/sec: 1717.54 - lr: 0.000005 - momentum: 0.000000
+ 2023-10-24 11:04:13,428 epoch 10 - iter 294/1476 - loss 0.02279645 - time (sec): 18.81 - samples/sec: 1701.66 - lr: 0.000004 - momentum: 0.000000
+ 2023-10-24 11:04:23,321 epoch 10 - iter 441/1476 - loss 0.02018937 - time (sec): 28.70 - samples/sec: 1740.66 - lr: 0.000004 - momentum: 0.000000
+ 2023-10-24 11:04:33,017 epoch 10 - iter 588/1476 - loss 0.02530081 - time (sec): 38.40 - samples/sec: 1760.38 - lr: 0.000003 - momentum: 0.000000
+ 2023-10-24 11:04:43,267 epoch 10 - iter 735/1476 - loss 0.04049497 - time (sec): 48.65 - samples/sec: 1775.63 - lr: 0.000003 - momentum: 0.000000
+ 2023-10-24 11:04:52,714 epoch 10 - iter 882/1476 - loss 0.04282402 - time (sec): 58.10 - samples/sec: 1761.20 - lr: 0.000002 - momentum: 0.000000
+ 2023-10-24 11:05:02,523 epoch 10 - iter 1029/1476 - loss 0.04621028 - time (sec): 67.90 - samples/sec: 1756.59 - lr: 0.000002 - momentum: 0.000000
+ 2023-10-24 11:05:11,678 epoch 10 - iter 1176/1476 - loss 0.04170952 - time (sec): 77.06 - samples/sec: 1742.51 - lr: 0.000001 - momentum: 0.000000
+ 2023-10-24 11:05:20,869 epoch 10 - iter 1323/1476 - loss 0.03916033 - time (sec): 86.25 - samples/sec: 1733.25 - lr: 0.000001 - momentum: 0.000000
+ 2023-10-24 11:05:30,215 epoch 10 - iter 1470/1476 - loss 0.03566834 - time (sec): 95.60 - samples/sec: 1735.21 - lr: 0.000000 - momentum: 0.000000
+ 2023-10-24 11:05:30,559 ----------------------------------------------------------------------------------------------------
+ 2023-10-24 11:05:30,559 EPOCH 10 done: loss 0.0356 - lr: 0.000000
+ 2023-10-24 11:05:39,016 DEV : loss 0.30029311776161194 - f1-score (micro avg) 0.7695
+ 2023-10-24 11:05:39,590 ----------------------------------------------------------------------------------------------------
+ 2023-10-24 11:05:39,590 Loading model from best epoch ...
+ 2023-10-24 11:05:41,453 SequenceTagger predicts: Dictionary with 21 tags: O, S-loc, B-loc, E-loc, I-loc, S-pers, B-pers, E-pers, I-pers, S-org, B-org, E-org, I-org, S-time, B-time, E-time, I-time, S-prod, B-prod, E-prod, I-prod
+ 2023-10-24 11:05:47,732
+ Results:
+ - F-score (micro) 0.7379
+ - F-score (macro) 0.6047
+ - Accuracy 0.6102
+
+ By class:
+               precision    recall  f1-score   support
+
+          loc     0.8307    0.8520    0.8412       858
+         pers     0.6764    0.6927    0.6845       537
+          org     0.4410    0.5379    0.4846       132
+         time     0.5147    0.6481    0.5738        54
+         prod     0.6667    0.3279    0.4396        61
+
+    micro avg     0.7276    0.7485    0.7379      1642
+    macro avg     0.6259    0.6117    0.6047      1642
+ weighted avg     0.7324    0.7485    0.7376      1642
+
+ 2023-10-24 11:05:47,732 ----------------------------------------------------------------------------------------------------
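For reference, the configuration reported in the log (hmBERT 64k historic multilingual embeddings, last layer only, first-subtoken pooling, no CRF, learning rate 5e-05, mini-batch size 4, 10 epochs, linear schedule with 0.1 warmup fraction) corresponds to a standard Flair fine-tuning run. The sketch below reconstructs such a run under those assumptions; it is not the authoritative training script, and arguments not stated in the log (e.g. hidden_size) are guesses.

from flair.datasets import NER_HIPE_2022
from flair.embeddings import TransformerWordEmbeddings
from flair.models import SequenceTagger
from flair.trainers import ModelTrainer

# Corpus named in the log: HIPE-2022 "hipe2020", French split.
corpus = NER_HIPE_2022(dataset_name="hipe2020", language="fr")
label_dict = corpus.make_label_dictionary(label_type="ner")

# Embeddings matching the base path in the log: hmBERT 64k, last layer, first-subtoken pooling.
embeddings = TransformerWordEmbeddings(
    model="dbmdz/bert-base-historic-multilingual-64k-td-cased",
    layers="-1",
    subtoken_pooling="first",
    fine_tune=True,
)

# No CRF and no RNN (crfFalse in the base path): a plain linear classifier on top of the embeddings.
tagger = SequenceTagger(
    hidden_size=256,  # assumed value, not stated in the log
    embeddings=embeddings,
    tag_dictionary=label_dict,
    tag_type="ner",
    use_crf=False,
    use_rnn=False,
    reproject_embeddings=False,
)

# fine_tune() uses AdamW with a linear warmup/decay schedule; the hyper-parameters
# below are the values reported under "Training Params" and "Plugins".
trainer = ModelTrainer(tagger, corpus)
trainer.fine_tune(
    "hmbench-hipe2020/fr-dbmdz/bert-base-historic-multilingual-64k-td-cased-bs4-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-3",
    learning_rate=5e-5,
    mini_batch_size=4,
    max_epochs=10,
    warmup_fraction=0.1,
)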