# whisper-large-turkish-v6
This model is a fine-tuned version of [openai/whisper-large-v3](https://huggingface.co/openai/whisper-large-v3); the fine-tuning dataset is not specified in this card. It achieves the following results on the evaluation set:
- Loss: 0.3222
- WER: 14.1454%
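The reported WER is the word error rate in percent. The evaluation most likely used a standard metric library (e.g. `evaluate` or `jiwer`); purely for illustration, the metric can be sketched in plain Python as word-level edit distance divided by reference length:

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate in percent: word-level edit distance / reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # Dynamic-programming edit distance over words (Levenshtein).
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(
                d[i - 1][j] + 1,         # deletion
                d[i][j - 1] + 1,         # insertion
                d[i - 1][j - 1] + cost,  # substitution
            )
    return 100.0 * d[len(ref)][len(hyp)] / len(ref)
```

So a WER of 14.15 means roughly one word-level error (substitution, insertion, or deletion) per seven reference words.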
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
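Pending fuller documentation, a minimal transcription sketch is shown below. The repo id is taken from the model tree of this card, and the audio file path is a placeholder assumption:

```python
# Hedged usage sketch: the repo id comes from this card's model tree;
# "audio.wav" is a placeholder for your own audio file.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="samil24/whisper-large-turkish-v6",
)

# Whisper can auto-detect the language, but pinning it to Turkish
# avoids misdetection on short or noisy clips.
result = asr("audio.wav", generate_kwargs={"language": "turkish"})
print(result["text"])
```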
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch implementation, `adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1500
- num_epochs: 12
- mixed_precision_training: Native AMP
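The hyperparameters above roughly correspond to a `Seq2SeqTrainingArguments` configuration like the following sketch; `output_dir`, the evaluation cadence, and any flags not listed in this card are assumptions:

```python
from transformers import Seq2SeqTrainingArguments

# Reconstruction of the listed hyperparameters. output_dir and the
# eval settings (the results table evaluates every 1500 steps) are
# assumptions; everything else mirrors the list above.
training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-large-turkish-v6",  # assumed
    learning_rate=1e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",
    lr_scheduler_type="linear",
    warmup_steps=1500,
    num_train_epochs=12,
    fp16=True,  # "Native AMP" mixed precision
    eval_strategy="steps",  # assumed from the results table
    eval_steps=1500,
)
```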
### Training results

| Training Loss | Epoch | Step | Validation Loss | WER (%) |
|---|---|---|---|---|
| 0.1865 | 0.3680 | 1500 | 0.1917 | 16.9584 |
| 0.1707 | 0.7360 | 3000 | 0.1882 | 16.4130 |
| 0.1248 | 1.1040 | 4500 | 0.1851 | 15.9936 |
| 0.1096 | 1.4720 | 6000 | 0.1901 | 15.8933 |
| 0.1269 | 1.8400 | 7500 | 0.1803 | 15.3985 |
| 0.0753 | 2.2080 | 9000 | 0.1928 | 16.3164 |
| 0.0793 | 2.5761 | 10500 | 0.1905 | 15.4878 |
| 0.0817 | 2.9441 | 12000 | 0.1862 | 16.0507 |
| 0.0455 | 3.3121 | 13500 | 0.2044 | 15.4315 |
| 0.0493 | 3.6801 | 15000 | 0.2017 | 15.2111 |
| 0.0221 | 4.0481 | 16500 | 0.2220 | 15.3297 |
| 0.0245 | 4.4161 | 18000 | 0.2217 | 15.7953 |
| 0.0245 | 4.7841 | 19500 | 0.2284 | 15.2865 |
| 0.0115 | 5.1521 | 21000 | 0.2452 | 14.9110 |
| 0.0131 | 5.5201 | 22500 | 0.2391 | 14.9674 |
| 0.0137 | 5.8881 | 24000 | 0.2388 | 15.5083 |
| 0.0058 | 6.2561 | 25500 | 0.2544 | 14.8847 |
| 0.0063 | 6.6241 | 27000 | 0.2539 | 14.8627 |
| 0.0062 | 6.9921 | 28500 | 0.2661 | 15.2250 |
| 0.0047 | 7.3602 | 30000 | 0.2638 | 14.8086 |
| 0.0038 | 7.7282 | 31500 | 0.2751 | 14.9571 |
| 0.0018 | 8.0962 | 33000 | 0.2799 | 14.6197 |
| 0.0026 | 8.4642 | 34500 | 0.2776 | 14.8656 |
| 0.0014 | 8.8322 | 36000 | 0.2874 | 14.5150 |
| 0.0015 | 9.2002 | 37500 | 0.2841 | 14.5055 |
| 0.0022 | 9.5682 | 39000 | 0.2965 | 14.6570 |
| 0.0005 | 9.9362 | 40500 | 0.3031 | 14.4089 |
| 0.0003 | 10.3042 | 42000 | 0.3126 | 14.2486 |
| 0.0003 | 10.6722 | 43500 | 0.3153 | 14.2325 |
| 0.0002 | 11.0402 | 45000 | 0.3122 | 14.1622 |
| 0.0001 | 11.4082 | 46500 | 0.3201 | 14.1593 |
| 0.0001 | 11.7763 | 48000 | 0.3222 | 14.1454 |
### Framework versions

- Transformers 4.51.3
- PyTorch 2.7.1+cu128
- Datasets 3.6.0
- Tokenizers 0.21.4
Model tree for samil24/whisper-large-turkish-v6:
- Base model: openai/whisper-large-v3