whisper-small-Fleurs_AMMI_AFRIVOICE_LRSC-ln-1hrs-v1

This model is a fine-tuned version of openai/whisper-small. It achieves the following results on the evaluation set (a sketch of how such metrics are computed follows the list):

  • Loss: 1.3019
  • WER: 0.3808
  • CER: 0.1461
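
These are error rates, so lower is better: a WER of 0.3808 means roughly 38.1% of words are wrong, and a CER of 0.1461 roughly 14.6% of characters. Below is a minimal sketch of how such metrics are typically computed, assuming the Hugging Face `evaluate` package (not listed under Framework versions) and hypothetical transcripts:

```python
# Minimal WER/CER sketch using the `evaluate` package (assumed installed;
# its "wer" and "cer" metrics require `jiwer` under the hood).
import evaluate

wer_metric = evaluate.load("wer")  # word error rate
cer_metric = evaluate.load("cer")  # character error rate

references = ["mbote na yo"]    # hypothetical ground-truth transcript
predictions = ["mbote na oyo"]  # hypothetical model output

print("WER:", wer_metric.compute(predictions=predictions, references=references))
print("CER:", cer_metric.compute(predictions=predictions, references=references))
```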

Model description

More information needed

Intended uses & limitations

More information needed
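
Until the card is filled in, here is a minimal transcription sketch using the transformers ASR pipeline; the checkpoint id matches this repository, while "audio.wav" is a hypothetical input file:

```python
# Minimal inference sketch; "audio.wav" is a hypothetical input file.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="asr-africa/whisper-small-Fleurs_AMMI_AFRIVOICE_LRSC-ln-1hrs-v1",
)
result = asr("audio.wav")  # file path; in-memory audio arrays also work
print(result["text"])
```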

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a Seq2SeqTrainingArguments sketch follows the list):

  • learning_rate: 5e-05
  • train_batch_size: 4
  • eval_batch_size: 2
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 8
  • optimizer: adamw_hf with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 100
  • mixed_precision_training: Native AMP
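
As a rough reconstruction (the original training script is not part of this card), the hyperparameters above map onto Seq2SeqTrainingArguments like this; `output_dir` is hypothetical:

```python
# Sketch of the hyperparameters above as Seq2SeqTrainingArguments;
# this approximates, not reproduces, the original training script.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="whisper-small-ln-1hrs",  # hypothetical
    learning_rate=5e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=2,
    seed=42,
    gradient_accumulation_steps=2,  # effective train batch size: 4 * 2 = 8
    optim="adamw_hf",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=100,
    fp16=True,  # "Native AMP" mixed precision
)
```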

Training results

| Training Loss | Epoch | Step | Validation Loss | WER    | CER    |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|
| 3.6864        | 1.0   | 39   | 1.6821          | 0.9572 | 0.4224 |
| 1.103         | 2.0   | 78   | 1.0007          | 0.8219 | 0.3945 |
| 0.5188        | 3.0   | 117  | 0.8848          | 0.7838 | 0.3935 |
| 0.2456        | 4.0   | 156  | 0.8893          | 0.9181 | 0.5470 |
| 0.1259        | 5.0   | 195  | 0.8927          | 0.8016 | 0.4502 |
| 0.0786        | 6.0   | 234  | 0.9356          | 0.6412 | 0.3213 |
| 0.0595        | 7.0   | 273  | 0.9471          | 0.4937 | 0.1938 |
| 0.064         | 8.0   | 312  | 0.9619          | 0.5049 | 0.2139 |
| 0.0757        | 9.0   | 351  | 1.0240          | 0.5936 | 0.3168 |
| 0.0767        | 10.0  | 390  | 1.0239          | 0.4224 | 0.1501 |
| 0.0707        | 11.0  | 429  | 1.0876          | 0.4315 | 0.1580 |
| 0.0597        | 12.0  | 468  | 1.0901          | 0.4502 | 0.1838 |
| 0.0618        | 13.0  | 507  | 1.1516          | 0.4403 | 0.1795 |
| 0.0533        | 14.0  | 546  | 1.1576          | 0.4911 | 0.2228 |
| 0.0495        | 15.0  | 585  | 1.1172          | 0.4157 | 0.1594 |
| 0.0371        | 16.0  | 624  | 1.2017          | 0.4681 | 0.1940 |
| 0.0336        | 17.0  | 663  | 1.2032          | 0.4096 | 0.1584 |
| 0.0305        | 18.0  | 702  | 1.2111          | 0.4655 | 0.2039 |
| 0.0339        | 19.0  | 741  | 1.2462          | 0.5809 | 0.2864 |
| 0.0414        | 20.0  | 780  | 1.2423          | 0.4540 | 0.1958 |
| 0.0264        | 21.0  | 819  | 1.2491          | 0.4665 | 0.2037 |
| 0.0227        | 22.0  | 858  | 1.3357          | 0.4811 | 0.2178 |
| 0.0233        | 23.0  | 897  | 1.2492          | 0.4088 | 0.1621 |
| 0.0241        | 24.0  | 936  | 1.2684          | 0.4039 | 0.1572 |
| 0.0226        | 25.0  | 975  | 1.3104          | 0.4419 | 0.1942 |
| 0.0281        | 26.0  | 1014 | 1.3006          | 0.4270 | 0.1864 |
| 0.024         | 27.0  | 1053 | 1.3125          | 0.4359 | 0.1696 |
| 0.0214        | 28.0  | 1092 | 1.2786          | 0.4141 | 0.1667 |
| 0.0196        | 29.0  | 1131 | 1.2767          | 0.4007 | 0.1506 |
| 0.0224        | 30.0  | 1170 | 1.3100          | 0.3914 | 0.1503 |
| 0.0149        | 31.0  | 1209 | 1.2602          | 0.3949 | 0.1483 |
| 0.0114        | 32.0  | 1248 | 1.3024          | 0.3870 | 0.1457 |
| 0.0137        | 33.0  | 1287 | 1.3193          | 0.4201 | 0.1736 |
| 0.0154        | 34.0  | 1326 | 1.2994          | 0.4051 | 0.1610 |
| 0.0088        | 35.0  | 1365 | 1.3062          | 0.3989 | 0.1559 |
| 0.015         | 36.0  | 1404 | 1.2826          | 0.3967 | 0.1599 |
| 0.0055        | 37.0  | 1443 | 1.3145          | 0.3866 | 0.1503 |
| 0.0106        | 38.0  | 1482 | 1.3164          | 0.3952 | 0.1577 |
| 0.007         | 39.0  | 1521 | 1.3097          | 0.3957 | 0.1558 |
| 0.0035        | 40.0  | 1560 | 1.2840          | 0.3975 | 0.1577 |
| 0.001         | 41.0  | 1599 | 1.3069          | 0.3927 | 0.1560 |
| 0.0038        | 42.0  | 1638 | 1.2932          | 0.3915 | 0.1532 |
| 0.002         | 43.0  | 1677 | 1.3026          | 0.3911 | 0.1543 |
| 0.0015        | 44.0  | 1716 | 1.2793          | 0.3853 | 0.1485 |
| 0.0002        | 45.0  | 1755 | 1.2758          | 0.3850 | 0.1476 |
| 0.0005        | 46.0  | 1794 | 1.2775          | 0.3852 | 0.1481 |
| 0.0007        | 47.0  | 1833 | 1.2861          | 0.3861 | 0.1470 |
| 0.0001        | 48.0  | 1872 | 1.2858          | 0.3815 | 0.1425 |
| 0.0001        | 49.0  | 1911 | 1.2878          | 0.3849 | 0.1469 |
| 0.0001        | 50.0  | 1950 | 1.2892          | 0.3837 | 0.1461 |
| 0.0001        | 51.0  | 1989 | 1.2907          | 0.3835 | 0.1466 |
| 0.0001        | 52.0  | 2028 | 1.2923          | 0.3826 | 0.1462 |
| 0.0001        | 53.0  | 2067 | 1.2937          | 0.3827 | 0.1465 |
| 0.0001        | 54.0  | 2106 | 1.2951          | 0.3810 | 0.1455 |
| 0.0001        | 55.0  | 2145 | 1.2965          | 0.3810 | 0.1455 |
| 0.0           | 56.0  | 2184 | 1.2978          | 0.3820 | 0.1466 |
| 0.0           | 57.0  | 2223 | 1.2988          | 0.3817 | 0.1463 |
| 0.0           | 58.0  | 2262 | 1.3000          | 0.3815 | 0.1464 |
| 0.0           | 59.0  | 2301 | 1.3010          | 0.3811 | 0.1462 |
| 0.0           | 60.0  | 2340 | 1.3019          | 0.3808 | 0.1461 |

Framework versions

  • Transformers 4.48.2
  • PyTorch 2.5.1+cu121
  • Datasets 3.2.0
  • Tokenizers 0.21.0
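
A quick way to confirm a matching environment (expected values taken from the list above):

```python
# Environment check against the versions listed above.
import datasets
import tokenizers
import torch
import transformers

print(transformers.__version__)  # expected: 4.48.2
print(torch.__version__)         # expected: 2.5.1+cu121
print(datasets.__version__)      # expected: 3.2.0
print(tokenizers.__version__)    # expected: 0.21.0
```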