{
    "epoch": 0.42,
    "train_loss": 3.4143439737955728,
    "train_runtime": 16849.095,
    "train_samples": 114599,
    "train_samples_per_second": 2.849,
    "train_steps_per_second": 0.178
}