Upload ./training.log with huggingface_hub
training.log
ADDED
@@ -0,0 +1,508 @@
2023-10-23 19:51:00,417 ----------------------------------------------------------------------------------------------------
2023-10-23 19:51:00,418 Model: "SequenceTagger(
  (embeddings): TransformerWordEmbeddings(
    (model): BertModel(
      (embeddings): BertEmbeddings(
        (word_embeddings): Embedding(64001, 768)
        (position_embeddings): Embedding(512, 768)
        (token_type_embeddings): Embedding(2, 768)
        (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
        (dropout): Dropout(p=0.1, inplace=False)
      )
      (encoder): BertEncoder(
        (layer): ModuleList(
          (0-11): 12 x BertLayer(
            (attention): BertAttention(
              (self): BertSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): BertSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): BertIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): BertOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
        )
      )
      (pooler): BertPooler(
        (dense): Linear(in_features=768, out_features=768, bias=True)
        (activation): Tanh()
      )
    )
  )
  (locked_dropout): LockedDropout(p=0.5)
  (linear): Linear(in_features=768, out_features=25, bias=True)
  (loss_function): CrossEntropyLoss()
)"
2023-10-23 19:51:00,418 ----------------------------------------------------------------------------------------------------
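The dump above is a standard Flair fine-tuning setup: a BERT encoder feeding a LockedDropout layer and a bare Linear(768, 25) tag head with CrossEntropyLoss and no CRF. Below is a minimal sketch of how such a model could be assembled with Flair's public API; the embedding options are read off the training base path logged further down ("poolingfirst-layers-1-crfFalse"), so treat the keyword values as assumptions rather than the original training script.

```python
# Sketch only: reconstructs the components visible in the dump above.
# Model name and embedding options are inferred from the logged base path.
from flair.datasets import NER_HIPE_2022
from flair.embeddings import TransformerWordEmbeddings
from flair.models import SequenceTagger

# Corpus matching the "MultiCorpus: 966 train + 219 dev + 204 test" line below;
# the cached path ".../ajmc/fr/with_doc_seperator" suggests document separators.
corpus = NER_HIPE_2022(dataset_name="ajmc", language="fr",
                       add_document_separator=True)

# Raw span labels (pers, scope, work, loc, object, date); the tagger expands
# them to the 25-tag BIOES dictionary listed at the end of this log.
label_dict = corpus.make_label_dictionary(label_type="ner")

embeddings = TransformerWordEmbeddings(
    model="dbmdz/bert-base-historic-multilingual-64k-td-cased",
    layers="-1",               # last transformer layer only ("layers-1")
    subtoken_pooling="first",  # one vector per word: its first subtoken ("poolingfirst")
    fine_tune=True,            # backpropagate into BERT
)

tagger = SequenceTagger(
    hidden_size=256,           # unused: use_rnn=False leaves the bare linear head
    embeddings=embeddings,
    tag_dictionary=label_dict,
    tag_type="ner",
    use_crf=False,             # "crfFalse": plain CrossEntropyLoss, as in the dump
    use_rnn=False,
)
```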
2023-10-23 19:51:00,418 MultiCorpus: 966 train + 219 dev + 204 test sentences
- NER_HIPE_2022 Corpus: 966 train + 219 dev + 204 test sentences - /home/ubuntu/.flair/datasets/ner_hipe_2022/v2.1/ajmc/fr/with_doc_seperator
2023-10-23 19:51:00,418 ----------------------------------------------------------------------------------------------------
2023-10-23 19:51:00,418 Train: 966 sentences
2023-10-23 19:51:00,418 (train_with_dev=False, train_with_test=False)
2023-10-23 19:51:00,418 ----------------------------------------------------------------------------------------------------
2023-10-23 19:51:00,418 Training Params:
2023-10-23 19:51:00,418 - learning_rate: "3e-05"
2023-10-23 19:51:00,418 - mini_batch_size: "4"
2023-10-23 19:51:00,418 - max_epochs: "10"
2023-10-23 19:51:00,418 - shuffle: "True"
2023-10-23 19:51:00,418 ----------------------------------------------------------------------------------------------------
2023-10-23 19:51:00,418 Plugins:
2023-10-23 19:51:00,418 - TensorboardLogger
2023-10-23 19:51:00,418 - LinearScheduler | warmup_fraction: '0.1'
2023-10-23 19:51:00,418 ----------------------------------------------------------------------------------------------------
2023-10-23 19:51:00,418 Final evaluation on model from best epoch (best-model.pt)
2023-10-23 19:51:00,418 - metric: "('micro avg', 'f1-score')"
2023-10-23 19:51:00,419 ----------------------------------------------------------------------------------------------------
2023-10-23 19:51:00,419 Computation:
2023-10-23 19:51:00,419 - compute on device: cuda:0
2023-10-23 19:51:00,419 - embedding storage: none
2023-10-23 19:51:00,419 ----------------------------------------------------------------------------------------------------
2023-10-23 19:51:00,419 Model training base path: "hmbench-ajmc/fr-dbmdz/bert-base-historic-multilingual-64k-td-cased-bs4-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-5"
2023-10-23 19:51:00,419 ----------------------------------------------------------------------------------------------------
2023-10-23 19:51:00,419 ----------------------------------------------------------------------------------------------------
2023-10-23 19:51:00,419 Logging anything other than scalars to TensorBoard is currently not supported.
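Given the corpus and tagger from the previous sketch, the Training Params and Plugins sections map onto a single trainer call. This is again a hedged sketch: Flair's fine_tune() sets up the linear warmup/decay scheduler (warmup_fraction 0.1 by default), while the TensorBoard plugin wiring is assumed rather than quoted from the original script.

```python
# Continues the previous sketch (reuses `corpus` and `tagger` from above).
from flair.trainers import ModelTrainer

trainer = ModelTrainer(tagger, corpus)
trainer.fine_tune(
    # Base path exactly as logged above the training loop.
    "hmbench-ajmc/fr-dbmdz/bert-base-historic-multilingual-64k-td-cased"
    "-bs4-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-5",
    learning_rate=3e-5,   # "3e-05" in Training Params
    mini_batch_size=4,    # "4"
    max_epochs=10,        # "10"
    shuffle=True,         # default, shown explicitly to match the log
)
```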
2023-10-23 19:51:01,874 epoch 1 - iter 24/242 - loss 3.19223754 - time (sec): 1.45 - samples/sec: 1580.94 - lr: 0.000003 - momentum: 0.000000
2023-10-23 19:51:03,430 epoch 1 - iter 48/242 - loss 2.40588811 - time (sec): 3.01 - samples/sec: 1709.78 - lr: 0.000006 - momentum: 0.000000
2023-10-23 19:51:04,914 epoch 1 - iter 72/242 - loss 1.84035746 - time (sec): 4.49 - samples/sec: 1664.08 - lr: 0.000009 - momentum: 0.000000
2023-10-23 19:51:06,427 epoch 1 - iter 96/242 - loss 1.51170565 - time (sec): 6.01 - samples/sec: 1655.46 - lr: 0.000012 - momentum: 0.000000
2023-10-23 19:51:07,973 epoch 1 - iter 120/242 - loss 1.31512225 - time (sec): 7.55 - samples/sec: 1645.64 - lr: 0.000015 - momentum: 0.000000
2023-10-23 19:51:09,511 epoch 1 - iter 144/242 - loss 1.18607233 - time (sec): 9.09 - samples/sec: 1622.96 - lr: 0.000018 - momentum: 0.000000
2023-10-23 19:51:11,033 epoch 1 - iter 168/242 - loss 1.06918175 - time (sec): 10.61 - samples/sec: 1627.64 - lr: 0.000021 - momentum: 0.000000
2023-10-23 19:51:12,551 epoch 1 - iter 192/242 - loss 0.97368169 - time (sec): 12.13 - samples/sec: 1625.95 - lr: 0.000024 - momentum: 0.000000
2023-10-23 19:51:14,111 epoch 1 - iter 216/242 - loss 0.89392525 - time (sec): 13.69 - samples/sec: 1619.11 - lr: 0.000027 - momentum: 0.000000
2023-10-23 19:51:15,587 epoch 1 - iter 240/242 - loss 0.82808428 - time (sec): 15.17 - samples/sec: 1614.38 - lr: 0.000030 - momentum: 0.000000
2023-10-23 19:51:15,708 ----------------------------------------------------------------------------------------------------
2023-10-23 19:51:15,708 EPOCH 1 done: loss 0.8216 - lr: 0.000030
2023-10-23 19:51:16,520 DEV : loss 0.1904076784849167 - f1-score (micro avg) 0.625
2023-10-23 19:51:16,525 saving best model
2023-10-23 19:51:16,994 ----------------------------------------------------------------------------------------------------
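The lr column in epoch 1 makes the LinearScheduler concrete: with 242 batches per epoch over 10 epochs (2420 steps) and warmup_fraction 0.1, roughly the whole first epoch ramps the learning rate from 0 up to the 3e-05 peak, after which it decays linearly to 0 by the last step. A small illustration of that schedule (not Flair's actual implementation):

```python
# Linear warmup + linear decay, as implied by the lr values in this log.
def linear_schedule_lr(step: int, peak_lr: float = 3e-5,
                       total_steps: int = 2420,        # 242 iters x 10 epochs
                       warmup_fraction: float = 0.1) -> float:
    warmup_steps = int(total_steps * warmup_fraction)  # 242: all of epoch 1
    if step < warmup_steps:
        return peak_lr * step / warmup_steps           # ramp up: 0 -> 3e-5
    return peak_lr * (total_steps - step) / (total_steps - warmup_steps)

print(f"{linear_schedule_lr(24):.6f}")    # 0.000003, as at epoch 1, iter 24/242
print(f"{linear_schedule_lr(242):.6f}")   # 0.000030, the peak ending epoch 1
print(f"{linear_schedule_lr(2420):.6f}")  # 0.000000 at the end of epoch 10
```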
2023-10-23 19:51:18,513 epoch 2 - iter 24/242 - loss 0.22047277 - time (sec): 1.52 - samples/sec: 1646.87 - lr: 0.000030 - momentum: 0.000000
2023-10-23 19:51:20,045 epoch 2 - iter 48/242 - loss 0.20656689 - time (sec): 3.05 - samples/sec: 1619.39 - lr: 0.000029 - momentum: 0.000000
2023-10-23 19:51:21,553 epoch 2 - iter 72/242 - loss 0.19524453 - time (sec): 4.56 - samples/sec: 1643.69 - lr: 0.000029 - momentum: 0.000000
2023-10-23 19:51:23,051 epoch 2 - iter 96/242 - loss 0.19474375 - time (sec): 6.06 - samples/sec: 1632.18 - lr: 0.000029 - momentum: 0.000000
2023-10-23 19:51:24,592 epoch 2 - iter 120/242 - loss 0.18071586 - time (sec): 7.60 - samples/sec: 1623.07 - lr: 0.000028 - momentum: 0.000000
2023-10-23 19:51:26,112 epoch 2 - iter 144/242 - loss 0.16705783 - time (sec): 9.12 - samples/sec: 1613.35 - lr: 0.000028 - momentum: 0.000000
2023-10-23 19:51:27,640 epoch 2 - iter 168/242 - loss 0.16801971 - time (sec): 10.65 - samples/sec: 1621.31 - lr: 0.000028 - momentum: 0.000000
2023-10-23 19:51:29,168 epoch 2 - iter 192/242 - loss 0.16601404 - time (sec): 12.17 - samples/sec: 1617.57 - lr: 0.000027 - momentum: 0.000000
2023-10-23 19:51:30,652 epoch 2 - iter 216/242 - loss 0.16087158 - time (sec): 13.66 - samples/sec: 1618.42 - lr: 0.000027 - momentum: 0.000000
2023-10-23 19:51:32,151 epoch 2 - iter 240/242 - loss 0.16125830 - time (sec): 15.16 - samples/sec: 1618.95 - lr: 0.000027 - momentum: 0.000000
2023-10-23 19:51:32,278 ----------------------------------------------------------------------------------------------------
2023-10-23 19:51:32,278 EPOCH 2 done: loss 0.1619 - lr: 0.000027
2023-10-23 19:51:32,968 DEV : loss 0.12492977827787399 - f1-score (micro avg) 0.7722
2023-10-23 19:51:32,971 saving best model
2023-10-23 19:51:33,599 ----------------------------------------------------------------------------------------------------
2023-10-23 19:51:35,073 epoch 3 - iter 24/242 - loss 0.08228198 - time (sec): 1.47 - samples/sec: 1552.71 - lr: 0.000026 - momentum: 0.000000
2023-10-23 19:51:36,602 epoch 3 - iter 48/242 - loss 0.09788978 - time (sec): 3.00 - samples/sec: 1525.66 - lr: 0.000026 - momentum: 0.000000
2023-10-23 19:51:38,188 epoch 3 - iter 72/242 - loss 0.09220848 - time (sec): 4.59 - samples/sec: 1586.49 - lr: 0.000026 - momentum: 0.000000
2023-10-23 19:51:39,708 epoch 3 - iter 96/242 - loss 0.08666506 - time (sec): 6.11 - samples/sec: 1576.05 - lr: 0.000025 - momentum: 0.000000
2023-10-23 19:51:41,227 epoch 3 - iter 120/242 - loss 0.09418845 - time (sec): 7.63 - samples/sec: 1615.54 - lr: 0.000025 - momentum: 0.000000
2023-10-23 19:51:42,734 epoch 3 - iter 144/242 - loss 0.09427390 - time (sec): 9.13 - samples/sec: 1593.25 - lr: 0.000025 - momentum: 0.000000
2023-10-23 19:51:44,275 epoch 3 - iter 168/242 - loss 0.09486656 - time (sec): 10.68 - samples/sec: 1613.18 - lr: 0.000024 - momentum: 0.000000
2023-10-23 19:51:45,765 epoch 3 - iter 192/242 - loss 0.09311269 - time (sec): 12.17 - samples/sec: 1605.05 - lr: 0.000024 - momentum: 0.000000
2023-10-23 19:51:47,288 epoch 3 - iter 216/242 - loss 0.09060481 - time (sec): 13.69 - samples/sec: 1599.99 - lr: 0.000024 - momentum: 0.000000
2023-10-23 19:51:48,825 epoch 3 - iter 240/242 - loss 0.09054203 - time (sec): 15.23 - samples/sec: 1613.16 - lr: 0.000023 - momentum: 0.000000
2023-10-23 19:51:48,950 ----------------------------------------------------------------------------------------------------
2023-10-23 19:51:48,950 EPOCH 3 done: loss 0.0903 - lr: 0.000023
2023-10-23 19:51:49,642 DEV : loss 0.12668359279632568 - f1-score (micro avg) 0.8466
2023-10-23 19:51:49,646 saving best model
2023-10-23 19:51:50,265 ----------------------------------------------------------------------------------------------------
2023-10-23 19:51:51,739 epoch 4 - iter 24/242 - loss 0.05375865 - time (sec): 1.47 - samples/sec: 1585.68 - lr: 0.000023 - momentum: 0.000000
2023-10-23 19:51:53,289 epoch 4 - iter 48/242 - loss 0.07423702 - time (sec): 3.02 - samples/sec: 1610.56 - lr: 0.000023 - momentum: 0.000000
2023-10-23 19:51:54,783 epoch 4 - iter 72/242 - loss 0.06761474 - time (sec): 4.52 - samples/sec: 1589.93 - lr: 0.000022 - momentum: 0.000000
2023-10-23 19:51:56,299 epoch 4 - iter 96/242 - loss 0.06999735 - time (sec): 6.03 - samples/sec: 1585.83 - lr: 0.000022 - momentum: 0.000000
2023-10-23 19:51:57,818 epoch 4 - iter 120/242 - loss 0.06846459 - time (sec): 7.55 - samples/sec: 1590.12 - lr: 0.000022 - momentum: 0.000000
2023-10-23 19:51:59,309 epoch 4 - iter 144/242 - loss 0.06195957 - time (sec): 9.04 - samples/sec: 1564.54 - lr: 0.000021 - momentum: 0.000000
2023-10-23 19:52:00,821 epoch 4 - iter 168/242 - loss 0.06057882 - time (sec): 10.55 - samples/sec: 1556.71 - lr: 0.000021 - momentum: 0.000000
2023-10-23 19:52:02,395 epoch 4 - iter 192/242 - loss 0.06467063 - time (sec): 12.13 - samples/sec: 1589.37 - lr: 0.000021 - momentum: 0.000000
2023-10-23 19:52:03,966 epoch 4 - iter 216/242 - loss 0.06723331 - time (sec): 13.70 - samples/sec: 1606.84 - lr: 0.000020 - momentum: 0.000000
2023-10-23 19:52:05,508 epoch 4 - iter 240/242 - loss 0.06588115 - time (sec): 15.24 - samples/sec: 1612.51 - lr: 0.000020 - momentum: 0.000000
2023-10-23 19:52:05,632 ----------------------------------------------------------------------------------------------------
2023-10-23 19:52:05,632 EPOCH 4 done: loss 0.0661 - lr: 0.000020
2023-10-23 19:52:06,328 DEV : loss 0.14996084570884705 - f1-score (micro avg) 0.8433
2023-10-23 19:52:06,332 ----------------------------------------------------------------------------------------------------
2023-10-23 19:52:07,845 epoch 5 - iter 24/242 - loss 0.03040416 - time (sec): 1.51 - samples/sec: 1661.72 - lr: 0.000020 - momentum: 0.000000
2023-10-23 19:52:09,375 epoch 5 - iter 48/242 - loss 0.02794239 - time (sec): 3.04 - samples/sec: 1650.07 - lr: 0.000019 - momentum: 0.000000
2023-10-23 19:52:10,919 epoch 5 - iter 72/242 - loss 0.03380901 - time (sec): 4.59 - samples/sec: 1646.76 - lr: 0.000019 - momentum: 0.000000
2023-10-23 19:52:12,408 epoch 5 - iter 96/242 - loss 0.03631379 - time (sec): 6.08 - samples/sec: 1643.93 - lr: 0.000019 - momentum: 0.000000
2023-10-23 19:52:13,940 epoch 5 - iter 120/242 - loss 0.03974207 - time (sec): 7.61 - samples/sec: 1658.61 - lr: 0.000018 - momentum: 0.000000
2023-10-23 19:52:15,416 epoch 5 - iter 144/242 - loss 0.04254043 - time (sec): 9.08 - samples/sec: 1642.07 - lr: 0.000018 - momentum: 0.000000
2023-10-23 19:52:16,944 epoch 5 - iter 168/242 - loss 0.04360087 - time (sec): 10.61 - samples/sec: 1633.29 - lr: 0.000018 - momentum: 0.000000
2023-10-23 19:52:18,446 epoch 5 - iter 192/242 - loss 0.04259084 - time (sec): 12.11 - samples/sec: 1639.22 - lr: 0.000017 - momentum: 0.000000
2023-10-23 19:52:20,011 epoch 5 - iter 216/242 - loss 0.04883753 - time (sec): 13.68 - samples/sec: 1642.98 - lr: 0.000017 - momentum: 0.000000
2023-10-23 19:52:21,547 epoch 5 - iter 240/242 - loss 0.04851010 - time (sec): 15.21 - samples/sec: 1621.60 - lr: 0.000017 - momentum: 0.000000
2023-10-23 19:52:21,656 ----------------------------------------------------------------------------------------------------
2023-10-23 19:52:21,656 EPOCH 5 done: loss 0.0485 - lr: 0.000017
2023-10-23 19:52:22,354 DEV : loss 0.1554577797651291 - f1-score (micro avg) 0.8468
2023-10-23 19:52:22,358 saving best model
2023-10-23 19:52:22,980 ----------------------------------------------------------------------------------------------------
2023-10-23 19:52:24,496 epoch 6 - iter 24/242 - loss 0.03258603 - time (sec): 1.52 - samples/sec: 1736.41 - lr: 0.000016 - momentum: 0.000000
2023-10-23 19:52:26,041 epoch 6 - iter 48/242 - loss 0.03134851 - time (sec): 3.06 - samples/sec: 1627.53 - lr: 0.000016 - momentum: 0.000000
2023-10-23 19:52:27,593 epoch 6 - iter 72/242 - loss 0.02940348 - time (sec): 4.61 - samples/sec: 1646.75 - lr: 0.000016 - momentum: 0.000000
2023-10-23 19:52:29,102 epoch 6 - iter 96/242 - loss 0.03693120 - time (sec): 6.12 - samples/sec: 1658.87 - lr: 0.000015 - momentum: 0.000000
2023-10-23 19:52:30,618 epoch 6 - iter 120/242 - loss 0.03724375 - time (sec): 7.64 - samples/sec: 1627.48 - lr: 0.000015 - momentum: 0.000000
2023-10-23 19:52:32,150 epoch 6 - iter 144/242 - loss 0.03517163 - time (sec): 9.17 - samples/sec: 1611.73 - lr: 0.000015 - momentum: 0.000000
2023-10-23 19:52:33,649 epoch 6 - iter 168/242 - loss 0.03754916 - time (sec): 10.67 - samples/sec: 1615.39 - lr: 0.000014 - momentum: 0.000000
2023-10-23 19:52:35,155 epoch 6 - iter 192/242 - loss 0.03910231 - time (sec): 12.17 - samples/sec: 1608.28 - lr: 0.000014 - momentum: 0.000000
2023-10-23 19:52:36,685 epoch 6 - iter 216/242 - loss 0.03635218 - time (sec): 13.70 - samples/sec: 1602.92 - lr: 0.000014 - momentum: 0.000000
2023-10-23 19:52:38,190 epoch 6 - iter 240/242 - loss 0.03612874 - time (sec): 15.21 - samples/sec: 1615.01 - lr: 0.000013 - momentum: 0.000000
2023-10-23 19:52:38,313 ----------------------------------------------------------------------------------------------------
2023-10-23 19:52:38,313 EPOCH 6 done: loss 0.0359 - lr: 0.000013
2023-10-23 19:52:39,009 DEV : loss 0.178489550948143 - f1-score (micro avg) 0.8529
2023-10-23 19:52:39,013 saving best model
2023-10-23 19:52:39,713 ----------------------------------------------------------------------------------------------------
2023-10-23 19:52:41,201 epoch 7 - iter 24/242 - loss 0.02150878 - time (sec): 1.49 - samples/sec: 1605.08 - lr: 0.000013 - momentum: 0.000000
2023-10-23 19:52:42,695 epoch 7 - iter 48/242 - loss 0.01844868 - time (sec): 2.98 - samples/sec: 1543.27 - lr: 0.000013 - momentum: 0.000000
2023-10-23 19:52:44,236 epoch 7 - iter 72/242 - loss 0.02196696 - time (sec): 4.52 - samples/sec: 1538.15 - lr: 0.000012 - momentum: 0.000000
2023-10-23 19:52:45,738 epoch 7 - iter 96/242 - loss 0.01979708 - time (sec): 6.02 - samples/sec: 1537.86 - lr: 0.000012 - momentum: 0.000000
2023-10-23 19:52:47,212 epoch 7 - iter 120/242 - loss 0.02718489 - time (sec): 7.50 - samples/sec: 1540.47 - lr: 0.000012 - momentum: 0.000000
2023-10-23 19:52:48,774 epoch 7 - iter 144/242 - loss 0.02457534 - time (sec): 9.06 - samples/sec: 1574.81 - lr: 0.000011 - momentum: 0.000000
2023-10-23 19:52:50,354 epoch 7 - iter 168/242 - loss 0.02497261 - time (sec): 10.64 - samples/sec: 1598.71 - lr: 0.000011 - momentum: 0.000000
2023-10-23 19:52:51,860 epoch 7 - iter 192/242 - loss 0.02243602 - time (sec): 12.15 - samples/sec: 1599.45 - lr: 0.000011 - momentum: 0.000000
2023-10-23 19:52:53,426 epoch 7 - iter 216/242 - loss 0.02458389 - time (sec): 13.71 - samples/sec: 1609.59 - lr: 0.000010 - momentum: 0.000000
2023-10-23 19:52:54,959 epoch 7 - iter 240/242 - loss 0.02548208 - time (sec): 15.24 - samples/sec: 1615.89 - lr: 0.000010 - momentum: 0.000000
2023-10-23 19:52:55,074 ----------------------------------------------------------------------------------------------------
2023-10-23 19:52:55,075 EPOCH 7 done: loss 0.0265 - lr: 0.000010
2023-10-23 19:52:55,767 DEV : loss 0.1920190155506134 - f1-score (micro avg) 0.847
2023-10-23 19:52:55,770 ----------------------------------------------------------------------------------------------------
2023-10-23 19:52:57,301 epoch 8 - iter 24/242 - loss 0.02151881 - time (sec): 1.53 - samples/sec: 1623.18 - lr: 0.000010 - momentum: 0.000000
2023-10-23 19:52:58,781 epoch 8 - iter 48/242 - loss 0.02073414 - time (sec): 3.01 - samples/sec: 1623.83 - lr: 0.000009 - momentum: 0.000000
2023-10-23 19:53:00,304 epoch 8 - iter 72/242 - loss 0.01991517 - time (sec): 4.53 - samples/sec: 1684.27 - lr: 0.000009 - momentum: 0.000000
2023-10-23 19:53:01,830 epoch 8 - iter 96/242 - loss 0.01720566 - time (sec): 6.06 - samples/sec: 1665.22 - lr: 0.000009 - momentum: 0.000000
2023-10-23 19:53:03,308 epoch 8 - iter 120/242 - loss 0.01675706 - time (sec): 7.54 - samples/sec: 1640.96 - lr: 0.000008 - momentum: 0.000000
2023-10-23 19:53:04,816 epoch 8 - iter 144/242 - loss 0.01692203 - time (sec): 9.04 - samples/sec: 1624.27 - lr: 0.000008 - momentum: 0.000000
2023-10-23 19:53:06,370 epoch 8 - iter 168/242 - loss 0.01690889 - time (sec): 10.60 - samples/sec: 1620.66 - lr: 0.000008 - momentum: 0.000000
2023-10-23 19:53:07,930 epoch 8 - iter 192/242 - loss 0.01728285 - time (sec): 12.16 - samples/sec: 1635.44 - lr: 0.000007 - momentum: 0.000000
2023-10-23 19:53:09,442 epoch 8 - iter 216/242 - loss 0.01730991 - time (sec): 13.67 - samples/sec: 1630.31 - lr: 0.000007 - momentum: 0.000000
2023-10-23 19:53:10,974 epoch 8 - iter 240/242 - loss 0.01675284 - time (sec): 15.20 - samples/sec: 1621.37 - lr: 0.000007 - momentum: 0.000000
2023-10-23 19:53:11,086 ----------------------------------------------------------------------------------------------------
2023-10-23 19:53:11,087 EPOCH 8 done: loss 0.0167 - lr: 0.000007
2023-10-23 19:53:11,908 DEV : loss 0.18184901773929596 - f1-score (micro avg) 0.8501
2023-10-23 19:53:11,912 ----------------------------------------------------------------------------------------------------
2023-10-23 19:53:13,455 epoch 9 - iter 24/242 - loss 0.00672192 - time (sec): 1.54 - samples/sec: 1686.59 - lr: 0.000006 - momentum: 0.000000
2023-10-23 19:53:14,978 epoch 9 - iter 48/242 - loss 0.00495635 - time (sec): 3.07 - samples/sec: 1663.03 - lr: 0.000006 - momentum: 0.000000
2023-10-23 19:53:16,509 epoch 9 - iter 72/242 - loss 0.01198613 - time (sec): 4.60 - samples/sec: 1638.27 - lr: 0.000006 - momentum: 0.000000
2023-10-23 19:53:18,013 epoch 9 - iter 96/242 - loss 0.01151938 - time (sec): 6.10 - samples/sec: 1613.86 - lr: 0.000005 - momentum: 0.000000
2023-10-23 19:53:19,476 epoch 9 - iter 120/242 - loss 0.01046535 - time (sec): 7.56 - samples/sec: 1571.19 - lr: 0.000005 - momentum: 0.000000
2023-10-23 19:53:21,035 epoch 9 - iter 144/242 - loss 0.01032059 - time (sec): 9.12 - samples/sec: 1589.91 - lr: 0.000005 - momentum: 0.000000
2023-10-23 19:53:22,534 epoch 9 - iter 168/242 - loss 0.01270544 - time (sec): 10.62 - samples/sec: 1591.14 - lr: 0.000004 - momentum: 0.000000
2023-10-23 19:53:24,111 epoch 9 - iter 192/242 - loss 0.01126915 - time (sec): 12.20 - samples/sec: 1598.34 - lr: 0.000004 - momentum: 0.000000
2023-10-23 19:53:25,651 epoch 9 - iter 216/242 - loss 0.01188979 - time (sec): 13.74 - samples/sec: 1594.49 - lr: 0.000004 - momentum: 0.000000
2023-10-23 19:53:27,186 epoch 9 - iter 240/242 - loss 0.01147686 - time (sec): 15.27 - samples/sec: 1611.90 - lr: 0.000003 - momentum: 0.000000
2023-10-23 19:53:27,298 ----------------------------------------------------------------------------------------------------
2023-10-23 19:53:27,298 EPOCH 9 done: loss 0.0114 - lr: 0.000003
2023-10-23 19:53:27,996 DEV : loss 0.20280949771404266 - f1-score (micro avg) 0.8432
2023-10-23 19:53:28,000 ----------------------------------------------------------------------------------------------------
2023-10-23 19:53:29,539 epoch 10 - iter 24/242 - loss 0.01646384 - time (sec): 1.54 - samples/sec: 1579.90 - lr: 0.000003 - momentum: 0.000000
2023-10-23 19:53:31,030 epoch 10 - iter 48/242 - loss 0.01825067 - time (sec): 3.03 - samples/sec: 1500.23 - lr: 0.000003 - momentum: 0.000000
2023-10-23 19:53:32,543 epoch 10 - iter 72/242 - loss 0.01305417 - time (sec): 4.54 - samples/sec: 1545.99 - lr: 0.000002 - momentum: 0.000000
2023-10-23 19:53:34,136 epoch 10 - iter 96/242 - loss 0.01266924 - time (sec): 6.13 - samples/sec: 1567.26 - lr: 0.000002 - momentum: 0.000000
2023-10-23 19:53:35,629 epoch 10 - iter 120/242 - loss 0.01209649 - time (sec): 7.63 - samples/sec: 1574.74 - lr: 0.000002 - momentum: 0.000000
2023-10-23 19:53:37,156 epoch 10 - iter 144/242 - loss 0.01165806 - time (sec): 9.16 - samples/sec: 1586.89 - lr: 0.000001 - momentum: 0.000000
2023-10-23 19:53:38,724 epoch 10 - iter 168/242 - loss 0.01084567 - time (sec): 10.72 - samples/sec: 1607.40 - lr: 0.000001 - momentum: 0.000000
2023-10-23 19:53:40,273 epoch 10 - iter 192/242 - loss 0.00982812 - time (sec): 12.27 - samples/sec: 1602.62 - lr: 0.000001 - momentum: 0.000000
2023-10-23 19:53:41,814 epoch 10 - iter 216/242 - loss 0.00942024 - time (sec): 13.81 - samples/sec: 1619.41 - lr: 0.000000 - momentum: 0.000000
2023-10-23 19:53:43,325 epoch 10 - iter 240/242 - loss 0.00884086 - time (sec): 15.32 - samples/sec: 1608.24 - lr: 0.000000 - momentum: 0.000000
2023-10-23 19:53:43,436 ----------------------------------------------------------------------------------------------------
2023-10-23 19:53:43,436 EPOCH 10 done: loss 0.0088 - lr: 0.000000
2023-10-23 19:53:44,137 DEV : loss 0.2002663016319275 - f1-score (micro avg) 0.8483
2023-10-23 19:53:44,610 ----------------------------------------------------------------------------------------------------
2023-10-23 19:53:44,611 Loading model from best epoch ...
2023-10-23 19:53:46,074 SequenceTagger predicts: Dictionary with 25 tags: O, S-scope, B-scope, E-scope, I-scope, S-pers, B-pers, E-pers, I-pers, S-work, B-work, E-work, I-work, S-loc, B-loc, E-loc, I-loc, S-object, B-object, E-object, I-object, S-date, B-date, E-date, I-date
2023-10-23 19:53:46,809
Results:
- F-score (micro) 0.7866
- F-score (macro) 0.5592
- Accuracy 0.6705

By class:
              precision    recall  f1-score   support

        pers     0.8345    0.8705    0.8521       139
       scope     0.7832    0.8682    0.8235       129
        work     0.6129    0.7125    0.6590        80
         loc     0.7500    0.3333    0.4615         9
        date     0.0000    0.0000    0.0000         3

   micro avg     0.7610    0.8139    0.7866       360
   macro avg     0.5961    0.5569    0.5592       360
weighted avg     0.7578    0.8139    0.7821       360

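The aggregate rows follow directly from the per-class table: the macro average is the unweighted mean of the five class F1 scores (dragged down by date, which has only 3 gold spans and an F1 of 0), while the micro average pools all 360 spans. A quick arithmetic check:

```python
# Verifying the aggregate rows against the per-class scores above.
class_f1 = {"pers": 0.8521, "scope": 0.8235, "work": 0.6590,
            "loc": 0.4615, "date": 0.0000}
print(round(sum(class_f1.values()) / len(class_f1), 4))  # 0.5592 = macro avg

# Micro F1 is the harmonic mean of the pooled precision and recall.
p, r = 0.7610, 0.8139
print(round(2 * p * r / (p + r), 4))  # 0.7866 = micro avg
```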
2023-10-23 19:53:46,809 ----------------------------------------------------------------------------------------------------
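Once training finishes, the checkpoint referenced in the log (best-model.pt under the base path) can be loaded for tagging. A minimal inference sketch using standard Flair API; the French sentence is a made-up example in the spirit of the AJMC classical-commentary corpus, not taken from the data:

```python
from flair.data import Sentence
from flair.models import SequenceTagger

# Checkpoint path assembled from the logged base path plus "best-model.pt".
tagger = SequenceTagger.load(
    "hmbench-ajmc/fr-dbmdz/bert-base-historic-multilingual-64k-td-cased"
    "-bs4-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-5/best-model.pt"
)

# Hypothetical example sentence.
sentence = Sentence("Voir le commentaire de Wecklein sur l'Ajax de Sophocle.")
tagger.predict(sentence)

for span in sentence.get_spans("ner"):
    label = span.get_label("ner")
    print(span.text, label.value, round(label.score, 2))
```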