roberta-large-bne-livingner1

This model is a fine-tuned version of roberta-large-bne for the livingner1 dataset, used in the benchmark presented in the paper TODO. The model achieves an F1 score of 0.939.

Please refer to the original publication for more information: TODO LINK
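The model can be loaded for inference with the standard transformers token-classification pipeline. This is a minimal sketch, assuming the checkpoint ships a token-classification head and that the input is Spanish clinical text (the example sentence is illustrative, not taken from the dataset):

```python
from transformers import AutoTokenizer, AutoModelForTokenClassification, pipeline

model_id = "IIC/roberta-large-bne-livingner1"

# Load the fine-tuned checkpoint and build an NER pipeline.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForTokenClassification.from_pretrained(model_id)
ner = pipeline(
    "token-classification",
    model=model,
    tokenizer=tokenizer,
    aggregation_strategy="simple",  # merge sub-word tokens into entity spans
)

# Hypothetical example sentence (Spanish clinical text).
entities = ner("El paciente presenta una infección por Escherichia coli.")
print(entities)
```

Each element of `entities` is a dict with the predicted entity group, score, and character offsets.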

Parameters used

| Parameter | Value |
| --- | --- |
| batch size | 16 |
| learning rate | 3e-05 |
| classifier dropout | 0.2 |
| warmup ratio | 0 |
| warmup steps | 0 |
| weight decay | 0 |
| optimizer | AdamW |
| epochs | 10 |
| early stopping patience | 3 |

BibTeX entry and citation info

TODO