# Moroccan-Darija-STT-large-v1.5.2
This model is a fine-tuned version of openai/whisper-large-v3 on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.3054
- WER: 117.4489
- CER: 98.8307
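The fine-tuned checkpoint can be loaded with the `transformers` automatic-speech-recognition pipeline. A minimal sketch, assuming a local 16 kHz audio file (`darija_sample.wav` is a placeholder path; the first call downloads roughly 3 GB of weights):

```python
import os

MODEL_ID = "BounharAbdelaziz/Moroccan-Darija-STT-large-v1.5.2"

def build_asr_pipeline(model_id: str = MODEL_ID):
    """Build an automatic-speech-recognition pipeline for this checkpoint."""
    from transformers import pipeline  # Transformers >= 4.48 per the card
    return pipeline("automatic-speech-recognition", model=model_id)

# "darija_sample.wav" is a hypothetical input file, not part of this repo.
if os.path.exists("darija_sample.wav"):
    asr = build_asr_pipeline()  # downloads the model weights on first use
    print(asr("darija_sample.wav")["text"])
```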
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-07
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 60
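The hyperparameters above imply a few derived quantities. The steps-per-epoch value is not stated on the card; it is inferred from the results table below (step 100 corresponds to epoch 0.6135, i.e. about 163 optimizer steps per epoch). A sketch:

```python
# Derived training quantities; steps_per_epoch is an inference from the
# results table, not a documented value.
train_batch_size = 16
gradient_accumulation_steps = 2
num_epochs = 60
warmup_ratio = 0.1

effective_batch_size = train_batch_size * gradient_accumulation_steps  # 32
steps_per_epoch = round(100 / 0.6135)        # ~163 optimizer steps/epoch
total_steps = steps_per_epoch * num_epochs   # ~9780 steps over training
warmup_steps = int(total_steps * warmup_ratio)  # ~978 warmup steps

print(effective_batch_size, total_steps, warmup_steps)
```

This matches the table, whose final logged step (9700) falls just short of the estimated 9780 total steps.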
## Training results

| Training Loss | Epoch | Step | Validation Loss | WER | CER |
|---|---|---|---|---|---|
2.8823 | 0.6135 | 100 | 0.8263 | 92.5391 | 49.8173 |
2.81 | 1.2270 | 200 | 0.7744 | 92.9001 | 49.7929 |
2.6462 | 1.8405 | 300 | 0.6980 | 93.1408 | 49.4032 |
2.46 | 2.4540 | 400 | 0.6228 | 91.3357 | 49.5250 |
2.3255 | 3.0675 | 500 | 0.5622 | 91.3357 | 50.4750 |
2.1054 | 3.6810 | 600 | 0.5389 | 94.9458 | 57.1985 |
2.0243 | 4.2945 | 700 | 0.5230 | 95.6679 | 60.2436 |
1.9601 | 4.9080 | 800 | 0.5091 | 95.5475 | 63.3374 |
1.8677 | 5.5215 | 900 | 0.4955 | 101.0830 | 74.3727 |
1.793 | 6.1350 | 1000 | 0.4824 | 109.6270 | 80.2680 |
1.7006 | 6.7485 | 1100 | 0.4703 | 105.0542 | 84.5311 |
1.6489 | 7.3620 | 1200 | 0.4552 | 106.2575 | 87.8197 |
1.6782 | 7.9755 | 1300 | 0.4509 | 141.3959 | 119.3666 |
1.6137 | 8.5890 | 1400 | 0.4477 | 138.7485 | 118.9769 |
1.5515 | 9.2025 | 1500 | 0.4256 | 162.6955 | 137.9537 |
1.5284 | 9.8160 | 1600 | 0.4199 | 169.3141 | 140.8039 |
1.4676 | 10.4294 | 1700 | 0.4220 | 145.1264 | 120.5116 |
1.4378 | 11.0429 | 1800 | 0.4228 | 142.7196 | 120.9013 |
1.3755 | 11.6564 | 1900 | 0.4007 | 133.6943 | 112.8136 |
1.3006 | 12.2699 | 2000 | 0.3947 | 111.0710 | 93.3496 |
1.289 | 12.8834 | 2100 | 0.3747 | 110.8303 | 93.2034 |
1.3037 | 13.4969 | 2200 | 0.3641 | 121.5403 | 105.1888 |
1.2583 | 14.1104 | 2300 | 0.3621 | 111.7930 | 93.8611 |
1.2245 | 14.7239 | 2400 | 0.3598 | 135.6197 | 115.3228 |
1.1955 | 15.3374 | 2500 | 0.3530 | 122.1420 | 105.3350 |
1.2061 | 15.9509 | 2600 | 0.3466 | 121.4200 | 103.3130 |
1.1644 | 16.5644 | 2700 | 0.3409 | 119.7353 | 100.3898 |
1.1885 | 17.1779 | 2800 | 0.3421 | 121.0590 | 104.4336 |
1.115 | 17.7914 | 2900 | 0.3356 | 119.3742 | 102.8258 |
1.1449 | 18.4049 | 3000 | 0.3299 | 109.0253 | 89.6955 |
1.1613 | 19.0184 | 3100 | 0.3255 | 116.4862 | 96.3459 |
1.1437 | 19.6319 | 3200 | 0.3256 | 116.0048 | 95.2741 |
1.0599 | 20.2454 | 3300 | 0.3169 | 105.6558 | 86.2119 |
1.0489 | 20.8589 | 3400 | 0.3173 | 106.0168 | 85.7978 |
1.0779 | 21.4724 | 3500 | 0.3145 | 104.5728 | 86.9184 |
1.0808 | 22.0859 | 3600 | 0.3137 | 104.3321 | 86.6261 |
1.0569 | 22.6994 | 3700 | 0.3134 | 103.0084 | 86.0414 |
1.0907 | 23.3129 | 3800 | 0.3120 | 105.2948 | 87.5761 |
1.092 | 23.9264 | 3900 | 0.3075 | 103.7304 | 86.2850 |
1.012 | 24.5399 | 4000 | 0.3081 | 105.1745 | 87.9659 |
1.0455 | 25.1534 | 4100 | 0.3042 | 111.0710 | 89.8417 |
0.993 | 25.7669 | 4200 | 0.3100 | 111.1913 | 89.9147 |
1.0574 | 26.3804 | 4300 | 0.3007 | 111.0710 | 88.7698 |
0.9836 | 26.9939 | 4400 | 0.3042 | 109.9880 | 87.7467 |
1.0063 | 27.6074 | 4500 | 0.3054 | 102.1661 | 86.1389 |
1.0172 | 28.2209 | 4600 | 0.3042 | 102.1661 | 85.9196 |
0.9869 | 28.8344 | 4700 | 0.3047 | 110.8303 | 88.6480 |
1.0173 | 29.4479 | 4800 | 0.3042 | 111.0710 | 88.6967 |
1.007 | 30.0613 | 4900 | 0.3039 | 102.4067 | 85.9440 |
0.9428 | 30.6748 | 5000 | 0.3052 | 102.4067 | 86.5043 |
0.9475 | 31.2883 | 5100 | 0.3023 | 102.8881 | 86.3094 |
0.9538 | 31.9018 | 5200 | 0.3034 | 121.7810 | 103.7028 |
0.9769 | 32.5153 | 5300 | 0.3049 | 131.0469 | 105.9927 |
0.9242 | 33.1288 | 5400 | 0.3026 | 111.4320 | 88.2582 |
0.927 | 33.7423 | 5500 | 0.3049 | 127.1961 | 101.1206 |
0.9576 | 34.3558 | 5600 | 0.3030 | 122.5030 | 103.8246 |
0.9709 | 34.9693 | 5700 | 0.3041 | 122.8640 | 104.0926 |
0.9365 | 35.5828 | 5800 | 0.3048 | 122.8640 | 104.4823 |
0.9723 | 36.1963 | 5900 | 0.3029 | 122.0217 | 103.7759 |
0.9383 | 36.8098 | 6000 | 0.3040 | 122.6233 | 104.3118 |
0.948 | 37.4233 | 6100 | 0.3027 | 138.9892 | 117.2716 |
0.8709 | 38.0368 | 6200 | 0.3026 | 139.1095 | 117.5883 |
0.965 | 38.6503 | 6300 | 0.3033 | 122.6233 | 104.2875 |
0.9689 | 39.2638 | 6400 | 0.3036 | 138.9892 | 117.6370 |
0.9515 | 39.8773 | 6500 | 0.3044 | 139.3502 | 117.5639 |
0.8778 | 40.4908 | 6600 | 0.3042 | 121.9013 | 103.9951 |
0.9073 | 41.1043 | 6700 | 0.3057 | 139.4705 | 117.5883 |
0.8688 | 41.7178 | 6800 | 0.3047 | 118.7726 | 99.3910 |
0.9335 | 42.3313 | 6900 | 0.3047 | 139.2298 | 118.1242 |
0.9439 | 42.9448 | 7000 | 0.3037 | 138.6282 | 117.9050 |
0.8685 | 43.5583 | 7100 | 0.3042 | 117.0878 | 98.5627 |
0.9241 | 44.1718 | 7200 | 0.3049 | 117.9302 | 98.7820 |
0.9078 | 44.7853 | 7300 | 0.3041 | 117.5692 | 98.7333 |
0.8858 | 45.3988 | 7400 | 0.3064 | 117.0878 | 98.5140 |
0.871 | 46.0123 | 7500 | 0.3048 | 117.8099 | 99.1717 |
0.9462 | 46.6258 | 7600 | 0.3046 | 118.5319 | 99.3666 |
0.9373 | 47.2393 | 7700 | 0.3046 | 117.2082 | 98.6358 |
0.9336 | 47.8528 | 7800 | 0.3044 | 118.0505 | 99.2205 |
0.8448 | 48.4663 | 7900 | 0.3061 | 117.4489 | 99.1474 |
0.8969 | 49.0798 | 8000 | 0.3053 | 117.5692 | 98.8794 |
0.8706 | 49.6933 | 8100 | 0.3051 | 118.0505 | 98.8794 |
0.9211 | 50.3067 | 8200 | 0.3053 | 117.6895 | 98.8794 |
0.9141 | 50.9202 | 8300 | 0.3049 | 116.9675 | 98.0755 |
0.8708 | 51.5337 | 8400 | 0.3059 | 117.4489 | 99.0743 |
0.9129 | 52.1472 | 8500 | 0.3057 | 117.9302 | 99.1717 |
0.8726 | 52.7607 | 8600 | 0.3052 | 117.4489 | 98.2217 |
0.8889 | 53.3742 | 8700 | 0.3050 | 118.4116 | 98.9525 |
0.9593 | 53.9877 | 8800 | 0.3053 | 118.2912 | 98.8063 |
0.9276 | 54.6012 | 8900 | 0.3057 | 116.8472 | 97.9050 |
0.8554 | 55.2147 | 9000 | 0.3055 | 117.5692 | 98.7820 |
0.8811 | 55.8282 | 9100 | 0.3055 | 118.1709 | 98.5384 |
0.9238 | 56.4417 | 9200 | 0.3051 | 116.7268 | 97.8319 |
0.8927 | 57.0552 | 9300 | 0.3055 | 117.3285 | 98.0755 |
0.8539 | 57.6687 | 9400 | 0.3054 | 117.3285 | 98.0512 |
0.9293 | 58.2822 | 9500 | 0.3055 | 117.9302 | 98.7820 |
0.9069 | 58.8957 | 9600 | 0.3055 | 117.0878 | 98.5140 |
0.8801 | 59.5092 | 9700 | 0.3054 | 117.4489 | 98.8307 |
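A WER above 100%, as in the later rows of the table, is not an error: WER is the word-level edit distance divided by the number of reference words, so heavy insertion by the model can push it past 100. A minimal sketch of the metric (the actual evaluation likely used a library such as `evaluate` or `jiwer`; that is an assumption):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate in percent: word-level edit distance / reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # Dynamic-programming (Levenshtein) edit distance over words.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return 100.0 * d[len(ref)][len(hyp)] / len(ref)

# Three inserted words against a two-word reference -> 150% WER.
print(wer("salam labas", "salam labas bikhir hamdullah walakin"))  # 150.0
```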
### Framework versions
- Transformers 4.48.0.dev0
- PyTorch 2.5.1+cu124
- Datasets 3.1.0
- Tokenizers 0.21.0