Continue fine-tuning from a fine-tuned model

Hi everyone, you can now continue fine-tuning an existing fine-tuned model!

You can give it a try by referencing an existing fine-tuned model name rather than ada/babbage/curie/davinci. For example:

openai api fine_tunes.create -t <TRAIN_FILE_ID_OR_PATH> -m <YOUR_MODEL_NAME>
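For scripting, the same call can be assembled programmatically. A minimal sketch that only builds the argument list without running it — the training file and model name below are hypothetical placeholders (a fine-tuned model name typically follows the base:ft-org-timestamp pattern):

```python
# Build the CLI invocation for continuing a fine-tune from an existing
# fine-tuned model. Both values below are hypothetical placeholders.
train_file = "new_examples.jsonl"
fine_tuned_model = "curie:ft-acme-2022-06-01-00-00-00"

cmd = [
    "openai", "api", "fine_tunes.create",
    "-t", train_file,        # training data (JSONL path or uploaded file ID)
    "-m", fine_tuned_model,  # an existing fine-tuned model, not a base model
]
print(" ".join(cmd))
```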

You can find details in our documentation; we’ll add more tips and examples as we learn more about how folks are using this!

8 Likes

Some people have asked if it’s possible to fine-tune models like text-davinci-002.

4 Likes

We’re working on adding a fine-tunable version of text-davinci-002.

We likely won’t have a fine-tunable version of every base model (e.g. text-davinci-001, text-davinci-002); instead, we’re focusing on delivering the best fine-tuning model for each capability level.

5 Likes

Is the ‘Continue fine-tuning from a fine-tuned model’ feature a type of continuous learning? In other words, will the new data for the second iteration of fine-tuning be merged with the (cached) data used for the first iteration, and the merged data then used to fine-tune the pre-trained model?

3 Likes

Can a fine-tuned model reasonably be fine-tuned again indefinitely, or is doing so likely to cause problems? Basically, can fine-tuning be used for iterative learning?

5 Likes

Any update on when this might be available? My company is in the process of developing a fine-tuned model, but we’re realizing that fine-tuning davinci isn’t financially justifiable, since text-davinci-002 delivers equally good (and sometimes better) results than the fine-tuned model, which is more expensive to use. Any advice or insight would be greatly appreciated, thanks!

I have tried it and realized that this is useless! I thought it would give me the ability to extend the existing data of the previous fine-tuned model, but it restarts another model from scratch, meaning the previous fine-tuned model’s dataset is not included.

2 Likes

When you fine-tune an existing model, do we send a new JSONL file with just the new prompt-completions, or do we add the new prompt-completions to the previous file (which then contains all the previous ones plus the new ones)?

I tried with just the new ones, and it seems the model completely forgot the old ones. I also tried adding the old ones to the new file, and the cost it showed me was for the whole file (old + new). What am I missing?

2 Likes
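Given the reports above that the earlier dataset is not carried over automatically, one pragmatic workaround is to concatenate the old and new training files before uploading. A minimal sketch, with hypothetical file names and tiny in-line sample records so it is self-contained (in practice these would be your real JSONL training files):

```python
import json

# Hypothetical sample data standing in for real training files.
old = [{"prompt": "Q: 1+1 ->", "completion": " 2"}]
new = [{"prompt": "Q: 2+2 ->", "completion": " 4"}]
with open("old_data.jsonl", "w") as f:
    for record in old:
        f.write(json.dumps(record) + "\n")
with open("new_data.jsonl", "w") as f:
    for record in new:
        f.write(json.dumps(record) + "\n")

def merge_jsonl(paths, out_path):
    """Concatenate JSONL training files, checking that each line parses."""
    with open(out_path, "w") as out:
        for path in paths:
            with open(path) as src:
                for line in src:
                    line = line.strip()
                    if not line:
                        continue  # skip blank lines
                    record = json.loads(line)  # raises on malformed JSON
                    out.write(json.dumps(record) + "\n")

# The combined file (old + new examples) is what you would upload
# with -t; note the cost is then computed over the whole merged file.
merge_jsonl(["old_data.jsonl", "new_data.jsonl"], "combined.jsonl")
```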