---
license: mit
tags:
- generated_from_trainer
base_model: FacebookAI/roberta-base
metrics:
- accuracy
- precision
- recall
model-index:
- name: case-analysis-roberta-base
  results: []
---


# case-analysis-roberta-base

This model is a fine-tuned version of [FacebookAI/roberta-base](https://huggingface.co/FacebookAI/roberta-base) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.6506
- Accuracy: 0.7973
- Precision: 0.7965
- Recall: 0.7973
- Precision Macro: 0.6320
- Recall Macro: 0.6238
- Macro Fpr: 0.0958
- Weighted Fpr: 0.0781
- Weighted Specificity: 0.8648
- Macro Specificity: 0.9155
- Weighted Sensitivity: 0.7973
- Macro Sensitivity: 0.6238
- F1 Micro: 0.7973
- F1 Macro: 0.6277
- F1 Weighted: 0.7968
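
For quick inference, the checkpoint can be loaded with the standard `transformers` text-classification pipeline. A minimal sketch, assuming the fine-tuned model has been saved or published under the name `case-analysis-roberta-base` (substitute the actual Hub repo id or local path):

```python
from transformers import pipeline

# Hypothetical path/repo id; replace with the actual location of this
# fine-tuned checkpoint.
classifier = pipeline("text-classification", model="case-analysis-roberta-base")

# Illustrative input only; the card does not describe the training data.
print(classifier("The appellate court reversed the trial court's ruling."))
# -> [{'label': '...', 'score': ...}]
```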

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 30
- mixed_precision_training: Native AMP
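
These settings map directly onto `transformers.TrainingArguments`. A minimal sketch under the pinned Transformers 4.40.2 API; `output_dir` and the per-epoch evaluation cadence are assumptions, and the optimizer settings above match the `Trainer` defaults:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="case-analysis-roberta-base",  # assumed output directory
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=30,
    fp16=True,                    # "Native AMP" mixed-precision training
    evaluation_strategy="epoch",  # the results table logs metrics once per epoch
)
# Adam betas=(0.9, 0.999) and epsilon=1e-8 are the Trainer's optimizer
# defaults, so they need no explicit arguments here.
```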

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | Precision Macro | Recall Macro | Macro Fpr | Weighted Fpr | Weighted Specificity | Macro Specificity | Weighted Sensitivity | Macro Sensitivity | F1 Micro | F1 Macro | F1 Weighted |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:---------:|:------:|:---------------:|:------------:|:---------:|:------------:|:--------------------:|:-----------------:|:--------------------:|:-----------------:|:--------:|:--------:|:-----------:|
| No log        | 1.0   | 224  | 0.8771          | 0.7528   | 0.7120    | 0.7528 | 0.5404          | 0.5402       | 0.1314    | 0.0987       | 0.7806               | 0.8833            | 0.7528               | 0.5402            | 0.7528   | 0.5389   | 0.7301      |
| No log        | 2.0   | 448  | 0.7936          | 0.7728   | 0.7420    | 0.7728 | 0.5529          | 0.5937       | 0.1080    | 0.0892       | 0.8458               | 0.9046            | 0.7728               | 0.5937            | 0.7728   | 0.5712   | 0.7555      |
| 0.8855        | 3.0   | 672  | 0.8127          | 0.7305   | 0.7321    | 0.7305 | 0.5336          | 0.5600       | 0.1288    | 0.1095       | 0.8209               | 0.8879            | 0.7305               | 0.5600            | 0.7305   | 0.5313   | 0.7208      |
| 0.8855        | 4.0   | 896  | 1.0186          | 0.7795   | 0.7503    | 0.7795 | 0.5722          | 0.5654       | 0.1184    | 0.0862       | 0.8004               | 0.8950            | 0.7795               | 0.5654            | 0.7795   | 0.5605   | 0.7561      |
| 0.5551        | 5.0   | 1120 | 0.7591          | 0.8085   | 0.7674    | 0.8085 | 0.5833          | 0.5963       | 0.0988    | 0.0732       | 0.8375               | 0.9115            | 0.8085               | 0.5963            | 0.8085   | 0.5892   | 0.7867      |
| 0.5551        | 6.0   | 1344 | 0.9522          | 0.8174   | 0.7816    | 0.8174 | 0.6117          | 0.5988       | 0.0967    | 0.0693       | 0.8297               | 0.9118            | 0.8174               | 0.5988            | 0.8174   | 0.6030   | 0.7967      |
| 0.386         | 7.0   | 1568 | 1.0569          | 0.7706   | 0.7610    | 0.7706 | 0.5710          | 0.5858       | 0.1089    | 0.0903       | 0.8522               | 0.9057            | 0.7706               | 0.5858            | 0.7706   | 0.5782   | 0.7656      |
| 0.386         | 8.0   | 1792 | 1.1957          | 0.7572   | 0.7918    | 0.7572 | 0.6175          | 0.6264       | 0.1052    | 0.0965       | 0.8905               | 0.9119            | 0.7572               | 0.6264            | 0.7572   | 0.6162   | 0.7715      |
| 0.2709        | 9.0   | 2016 | 1.2092          | 0.7728   | 0.7897    | 0.7728 | 0.6331          | 0.6301       | 0.1021    | 0.0892       | 0.8751               | 0.9120            | 0.7728               | 0.6301            | 0.7728   | 0.6264   | 0.7773      |
| 0.2709        | 10.0  | 2240 | 1.3830          | 0.7706   | 0.7782    | 0.7706 | 0.6112          | 0.6073       | 0.1094    | 0.0903       | 0.8464               | 0.9043            | 0.7706               | 0.6073            | 0.7706   | 0.6072   | 0.7728      |
| 0.2709        | 11.0  | 2464 | 1.4518          | 0.7817   | 0.7944    | 0.7817 | 0.6157          | 0.6059       | 0.1030    | 0.0851       | 0.8606               | 0.9106            | 0.7817               | 0.6059            | 0.7817   | 0.6077   | 0.7856      |
| 0.1837        | 12.0  | 2688 | 1.5283          | 0.7684   | 0.7840    | 0.7684 | 0.6143          | 0.6003       | 0.1058    | 0.0913       | 0.8701               | 0.9096            | 0.7684               | 0.6003            | 0.7684   | 0.6022   | 0.7726      |
| 0.1837        | 13.0  | 2912 | 1.5136          | 0.7817   | 0.7907    | 0.7817 | 0.6231          | 0.6472       | 0.0979    | 0.0851       | 0.8733               | 0.9137            | 0.7817               | 0.6472            | 0.7817   | 0.6332   | 0.7848      |
| 0.1212        | 14.0  | 3136 | 1.6569          | 0.7506   | 0.8138    | 0.7506 | 0.6380          | 0.6499       | 0.1039    | 0.0997       | 0.8911               | 0.9104            | 0.7506               | 0.6499            | 0.7506   | 0.6327   | 0.7764      |
| 0.1212        | 15.0  | 3360 | 1.5305          | 0.7661   | 0.7714    | 0.7661 | 0.5965          | 0.6203       | 0.1054    | 0.0923       | 0.8710               | 0.9093            | 0.7661               | 0.6203            | 0.7661   | 0.6068   | 0.7669      |
| 0.0793        | 16.0  | 3584 | 1.4931          | 0.7996   | 0.7896    | 0.7996 | 0.6016          | 0.6193       | 0.0947    | 0.0771       | 0.8625               | 0.9155            | 0.7996               | 0.6193            | 0.7996   | 0.6085   | 0.7933      |
| 0.0793        | 17.0  | 3808 | 1.4582          | 0.8018   | 0.7911    | 0.8018 | 0.6143          | 0.6131       | 0.0963    | 0.0761       | 0.8523               | 0.9135            | 0.8018               | 0.6131            | 0.8018   | 0.6132   | 0.7958      |
| 0.0473        | 18.0  | 4032 | 1.6772          | 0.7795   | 0.7924    | 0.7795 | 0.6154          | 0.6342       | 0.0990    | 0.0862       | 0.8742               | 0.9134            | 0.7795               | 0.6342            | 0.7795   | 0.6224   | 0.7843      |
| 0.0473        | 19.0  | 4256 | 1.5707          | 0.7929   | 0.7890    | 0.7929 | 0.6409          | 0.6339       | 0.0966    | 0.0801       | 0.8666               | 0.9149            | 0.7929               | 0.6339            | 0.7929   | 0.6348   | 0.7892      |
| 0.0473        | 20.0  | 4480 | 1.4891          | 0.8018   | 0.8136    | 0.8018 | 0.6441          | 0.6284       | 0.0916    | 0.0761       | 0.8768               | 0.9196            | 0.8018               | 0.6284            | 0.8018   | 0.6355   | 0.8073      |
| 0.0476        | 21.0  | 4704 | 1.5064          | 0.8062   | 0.8181    | 0.8062 | 0.6511          | 0.6320       | 0.0896    | 0.0742       | 0.8754               | 0.9204            | 0.8062               | 0.6320            | 0.8062   | 0.6407   | 0.8117      |
| 0.0476        | 22.0  | 4928 | 1.5076          | 0.8107   | 0.8003    | 0.8107 | 0.6296          | 0.6247       | 0.0913    | 0.0722       | 0.8642               | 0.9187            | 0.8107               | 0.6247            | 0.8107   | 0.6265   | 0.8050      |
| 0.0366        | 23.0  | 5152 | 1.5891          | 0.7973   | 0.8113    | 0.7973 | 0.6455          | 0.6382       | 0.0929    | 0.0781       | 0.8763               | 0.9184            | 0.7973               | 0.6382            | 0.7973   | 0.6407   | 0.8038      |
| 0.0366        | 24.0  | 5376 | 1.6779          | 0.7951   | 0.7990    | 0.7951 | 0.6306          | 0.5982       | 0.0994    | 0.0791       | 0.8581               | 0.9133            | 0.7951               | 0.5982            | 0.7951   | 0.6123   | 0.7956      |
| 0.0368        | 25.0  | 5600 | 1.6211          | 0.8040   | 0.8024    | 0.8040 | 0.6420          | 0.6223       | 0.0952    | 0.0751       | 0.8570               | 0.9152            | 0.8040               | 0.6223            | 0.8040   | 0.6313   | 0.8023      |
| 0.0368        | 26.0  | 5824 | 1.4841          | 0.8062   | 0.8060    | 0.8062 | 0.6364          | 0.6416       | 0.0894    | 0.0742       | 0.8775               | 0.9209            | 0.8062               | 0.6416            | 0.8062   | 0.6385   | 0.8058      |
| 0.0252        | 27.0  | 6048 | 1.6841          | 0.7884   | 0.8028    | 0.7884 | 0.6408          | 0.6436       | 0.0956    | 0.0821       | 0.8781               | 0.9166            | 0.7884               | 0.6436            | 0.7884   | 0.6410   | 0.7953      |
| 0.0252        | 28.0  | 6272 | 1.7185          | 0.7929   | 0.8006    | 0.7929 | 0.6386          | 0.6338       | 0.0954    | 0.0801       | 0.8725               | 0.9163            | 0.7929               | 0.6338            | 0.7929   | 0.6355   | 0.7964      |
| 0.0252        | 29.0  | 6496 | 1.6500          | 0.7996   | 0.7989    | 0.7996 | 0.6338          | 0.6276       | 0.0942    | 0.0771       | 0.8678               | 0.9168            | 0.7996               | 0.6276            | 0.7996   | 0.6306   | 0.7992      |
| 0.0147        | 30.0  | 6720 | 1.6506          | 0.7973   | 0.7965    | 0.7973 | 0.6320          | 0.6238       | 0.0958    | 0.0781       | 0.8648               | 0.9155            | 0.7973               | 0.6238            | 0.7973   | 0.6277   | 0.7968      |
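
The accuracy, precision/recall, and F1 columns above are standard `scikit-learn` aggregates over the evaluation set. Below is a sketch of a `compute_metrics` function producing the micro/weighted/macro columns; this illustrates how such values are typically computed rather than reproducing the exact evaluation code, and the specificity and FPR columns would additionally require the per-class confusion matrix:

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    """Accuracy plus weighted- and macro-averaged precision, recall, and F1."""
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    prec_w, rec_w, f1_w, _ = precision_recall_fscore_support(
        labels, preds, average="weighted", zero_division=0
    )
    prec_m, rec_m, f1_m, _ = precision_recall_fscore_support(
        labels, preds, average="macro", zero_division=0
    )
    return {
        # For single-label classification, accuracy equals micro-averaged F1.
        "accuracy": accuracy_score(labels, preds),
        "precision": prec_w,
        "recall": rec_w,
        "f1_weighted": f1_w,
        "precision_macro": prec_m,
        "recall_macro": rec_m,
        "f1_macro": f1_m,
    }
```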


### Framework versions

- Transformers 4.40.2
- Pytorch 2.2.1+cu121
- Datasets 2.19.1
- Tokenizers 0.19.1
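
Reproducing this environment typically amounts to `pip install transformers==4.40.2 datasets==2.19.1 tokenizers==0.19.1` plus a CUDA 12.1 build of PyTorch 2.2.1.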