TFG Collection
Datasets and models leveraged and developed during my final degree work (TFG). Info and code can be found at https://github.com/enriquesaou/tfg-lm-qa
This model is a fine-tuned version of google/flan-t5-small on an unknown dataset. It achieves a final validation loss of 0.7555 on the evaluation set (see the training results below).

Model description: more information needed.
Intended uses & limitations: more information needed.
Training and evaluation data: more information needed.
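As a minimal usage sketch (the checkpoint id below is a placeholder, since this card does not state the published model name), the fine-tuned model can be loaded with the standard transformers seq2seq classes and prompted in the FLAN-T5 text-to-text style:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Hypothetical repo id; substitute the actual fine-tuned checkpoint from the collection.
model_id = "enriquesaou/flan-t5-small-qa"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# FLAN-T5 is a text-to-text model, so QA is phrased as a plain text prompt.
prompt = (
    "question: Where can the TFG code be found? "
    "context: Info and code can be found at https://github.com/enriquesaou/tfg-lm-qa"
)
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

If the fine-tuned checkpoint is not available, google/flan-t5-small works as a drop-in base model for the same code.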
Training hyperparameters: more information needed.

The following results were obtained during training:
| Training Loss | Epoch | Step | Validation Loss |
|---|---|---|---|
| No log | 0.9984 | 416 | 0.8355 |
| 1.0359 | 1.9992 | 833 | 0.7848 |
| 0.8826 | 3.0 | 1250 | 0.7680 |
| 0.819 | 3.9984 | 1666 | 0.7622 |
| 0.7761 | 4.9992 | 2083 | 0.7541 |
| 0.77 | 6.0 | 2500 | 0.7569 |
| 0.77 | 6.9888 | 2912 | 0.7555 |
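For context, a per-epoch loss table like the one above is what the transformers Trainer logs when both logging and evaluation run once per epoch. The sketch below is only an illustration of that setup: the dataset, hyperparameter values, and output paths are assumptions, since the card does not record them.

```python
from datasets import Dataset
from transformers import (
    AutoTokenizer,
    AutoModelForSeq2SeqLM,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
)

base_id = "google/flan-t5-small"
tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForSeq2SeqLM.from_pretrained(base_id)

# Tiny placeholder QA dataset; the real training/evaluation data is not documented.
raw = Dataset.from_dict({
    "question": ["Where is the TFG code hosted?"],
    "context": ["Info and code can be found at https://github.com/enriquesaou/tfg-lm-qa"],
    "answer": ["https://github.com/enriquesaou/tfg-lm-qa"],
})

def preprocess(batch):
    # Build FLAN-style prompts and tokenize inputs and targets.
    inputs = [f"question: {q} context: {c}" for q, c in zip(batch["question"], batch["context"])]
    model_inputs = tokenizer(inputs, truncation=True, max_length=512)
    labels = tokenizer(text_target=batch["answer"], truncation=True, max_length=64)
    model_inputs["labels"] = labels["input_ids"]
    return model_inputs

tokenized = raw.map(preprocess, batched=True, remove_columns=raw.column_names)

args = Seq2SeqTrainingArguments(
    output_dir="flan-t5-small-qa-sketch",  # hypothetical output directory
    num_train_epochs=7,                    # roughly matches the ~7 epochs in the table above
    eval_strategy="epoch",                 # "evaluation_strategy" in transformers < 4.41
    logging_strategy="epoch",              # one training-loss / validation-loss row per epoch
)

trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    eval_dataset=tokenized,  # placeholder; a held-out split would be used in practice
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()
```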
Base model: google/flan-t5-small