Is it possible to fine-tune the babbage model with 2 or more training files

Is it possible to fine-tune the babbage model with two or more training files, and if so, how?
Also, what is the best way to upload a lot of data for fine-tuning? I'm reaching the limit of 50 000 000 tokens.

For trainable base models (NOT including GPT-3.5) you can continue training an already fine-tuned model, so in theory you could do this many times. You simply specify the already fine-tuned model's name as the model you wish to train.
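Another option, since a single fine-tuning job takes one training file, is to concatenate your JSONL files into one before uploading. Below is a minimal sketch that merges several legacy-format training files (with `prompt`/`completion` keys) while validating each line; the file names are hypothetical:

```python
import json
from pathlib import Path

def merge_jsonl(inputs, output):
    """Concatenate several JSONL training files into one, checking each line parses."""
    count = 0
    with open(output, "w", encoding="utf-8") as out:
        for path in inputs:
            with open(path, encoding="utf-8") as f:
                for line in f:
                    line = line.strip()
                    if not line:
                        continue  # skip blank lines
                    record = json.loads(line)  # raises if a line is malformed JSON
                    # legacy completions fine-tuning expects these two keys
                    assert "prompt" in record and "completion" in record
                    out.write(json.dumps(record) + "\n")
                    count += 1
    return count

# demo with two small hypothetical files
Path("part1.jsonl").write_text('{"prompt": "Q1 ->", "completion": " A1"}\n')
Path("part2.jsonl").write_text('{"prompt": "Q2 ->", "completion": " A2"}\n')
n = merge_jsonl(["part1.jsonl", "part2.jsonl"], "merged.jsonl")
print(n)  # → 2
```

You would then upload `merged.jsonl` as the single training file for the job.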


Okay, but the results get worse in that case. How can I prevent the model from forgetting the old training data?

How do you mean forgetting?

Fine-tuning is not a way to get the model to remember facts or information; it's a way of teaching the model new ways to think: new styles, new patterns to recognise. It teaches it how to write like an author, not the author's works.