Is oversampling important in fine-tuning?

I plan to fine-tune a model for classification. I have quite a few classes, but some of them have too few samples for training. Should I use oversampling (duplicating samples) to achieve a balanced training set, or is that just a waste of money with GPT?
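For context, here is a minimal sketch of the kind of oversampling I mean: duplicating minority-class examples (sampling with replacement) until every class matches the largest one. The `text`/`label` field names are just placeholders for whatever format the fine-tuning data uses.

```python
import random
from collections import Counter, defaultdict

def oversample(examples, seed=0):
    """Duplicate minority-class examples until all classes
    reach the size of the largest class."""
    by_label = defaultdict(list)
    for ex in examples:
        by_label[ex["label"]].append(ex)
    target = max(len(items) for items in by_label.values())
    rng = random.Random(seed)
    balanced = []
    for label, items in by_label.items():
        balanced.extend(items)
        # sample with replacement to make up the shortfall
        balanced.extend(rng.choice(items) for _ in range(target - len(items)))
    rng.shuffle(balanced)
    return balanced

# hypothetical imbalanced data: 5 examples of class "a", 1 of class "b"
data = [{"text": f"t{i}", "label": "a"} for i in range(5)]
data += [{"text": "u", "label": "b"}]
balanced = oversample(data)
print(Counter(ex["label"] for ex in balanced))  # both classes now have 5 examples
```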