# whisper-tiny-aug-26dec
This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny); the training dataset is not documented. It achieves the following results on the evaluation set:

- Loss: 0.2720
- WER: 82.4497
## Model description

More information needed
## Intended uses & limitations

More information needed
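Pending fuller documentation, below is a minimal, assumed usage sketch with the `transformers` pipeline API; the audio file path is a placeholder, not part of the card:

```python
from transformers import pipeline

# Assumed usage sketch (not from the card); "sample.wav" is a placeholder path.
asr = pipeline(
    "automatic-speech-recognition",
    model="PhanithLIM/whisper-tiny-aug-26dec",
)
print(asr("sample.wav")["text"])
```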
## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a `Seq2SeqTrainingArguments` sketch follows the list):
- learning_rate: 1e-05
- train_batch_size: 32
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: constant
- lr_scheduler_warmup_steps: 1000
- num_epochs: 15
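As a rough reconstruction, the values above map onto `Seq2SeqTrainingArguments` as sketched below. The `output_dir` and `eval_strategy` values are assumptions, not taken from the card, and the Adam betas/epsilon listed above are the library defaults:

```python
from transformers import Seq2SeqTrainingArguments

# Hypothetical reconstruction of the hyperparameter list above.
training_args = Seq2SeqTrainingArguments(
    output_dir="whisper-tiny-aug-26dec",       # assumed
    learning_rate=1e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=16,
    seed=42,
    gradient_accumulation_steps=2,             # 32 x 2 = total train batch size 64
    lr_scheduler_type="constant_with_warmup",  # constant schedule after the 1000 warmup steps
    warmup_steps=1000,
    num_train_epochs=15,
    eval_strategy="epoch",                     # assumed: the results table reports per-epoch metrics
)
```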
### Training results

| Training Loss | Epoch | Step | Validation Loss | WER (%) |
|:-------------:|:-----:|:----:|:---------------:|:-------:|
| 1.4654 | 1.0 | 272 | 1.2543 | 113.9215 |
| 1.0788 | 2.0 | 544 | 0.8559 | 105.8815 |
| 0.7043 | 3.0 | 816 | 0.5695 | 97.9140 |
| 0.5241 | 4.0 | 1088 | 0.4696 | 94.4734 |
| 0.4376 | 5.0 | 1360 | 0.4099 | 90.9894 |
| 0.3814 | 6.0 | 1632 | 0.3720 | 90.8953 |
| 0.3414 | 7.0 | 1904 | 0.3490 | 87.5489 |
| 0.3127 | 8.0 | 2176 | 0.3282 | 85.8395 |
| 0.2900 | 9.0 | 2448 | 0.3132 | 85.5425 |
| 0.2691 | 10.0 | 2720 | 0.3018 | 84.6516 |
| 0.2525 | 11.0 | 2992 | 0.2949 | 84.6733 |
| 0.2365 | 12.0 | 3264 | 0.2878 | 85.1586 |
| 0.2235 | 13.0 | 3536 | 0.2787 | 82.1744 |
| 0.2117 | 14.0 | 3808 | 0.2770 | 81.7181 |
| 0.2004 | 15.0 | 4080 | 0.2720 | 82.4497 |
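The WER column is a percentage (which is why early epochs can exceed 100). For reference, a minimal sketch of how such a score can be computed with the `evaluate` library; the strings are placeholders, not model output:

```python
import evaluate

# Illustrative WER computation; predictions/references are placeholder strings.
wer_metric = evaluate.load("wer")
wer = 100 * wer_metric.compute(
    predictions=["a sample transcription"],
    references=["a reference transcription"],
)
print(f"WER: {wer:.4f}")
```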
### Framework versions
- Transformers 4.45.2
- Pytorch 2.4.0
- Datasets 3.2.0
- Tokenizers 0.20.3
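A quick, assumed sanity check that a local environment matches these versions:

```python
import datasets
import tokenizers
import torch
import transformers

# Compare the local environment against the versions listed above.
print(transformers.__version__)  # expected: 4.45.2
print(torch.__version__)         # expected: 2.4.0
print(datasets.__version__)      # expected: 3.2.0
print(tokenizers.__version__)    # expected: 0.20.3
```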