Hi everyone,
I’m fine-tuning the Whisper Small model with the whisper-finetuning repository on GitHub. My dataset is only about 5 minutes of audio, yet training takes more than 30 hours on Google Colab (T4 GPU).
Is that normal, or could there be an issue with my setup or configuration?
Any advice on reducing training time or identifying the cause?
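For context, here is the rough back-of-envelope math that makes me think something is wrong (a sketch only; the batch size and epoch count are my assumptions, not values I’ve confirmed from the repo’s defaults):

```python
# Whisper pads/segments audio into 30-second chunks, so ~5 minutes
# of audio yields only about 10 training samples.
dataset_minutes = 5
chunk_seconds = 30
num_samples = dataset_minutes * 60 // chunk_seconds

batch_size = 8   # assumption: a typical default, not verified against the repo
epochs = 10      # assumption: a typical small fine-tuning run

steps_per_epoch = -(-num_samples // batch_size)  # ceiling division
total_steps = steps_per_epoch * epochs

print(f"{num_samples} samples -> ~{total_steps} optimizer steps total")
```

Even if each step took a full minute on a T4, that would be well under an hour, nowhere near 30 hours. So I suspect either training is silently running on CPU, or some configuration (e.g. max steps) is far larger than I intend.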
Thanks in advance to anyone who can help!