Lifecycle of fine-tunes: how long until current fine-tunes are deprecated?

Is there any public commitment to the lifecycle of gpt-4o-mini and gpt-4o fine-tunes? Is the expectation that we will need to re-train new fine-tunes soon, or is there any commitment to how long they will remain available? If not, any educated guess? Older models have been deprecated surprisingly quickly.


I’d say it is probably difficult to put an exact timeline on it, but I’d imagine that fine-tuned models will remain available for consumption for a significant period of time.

The approach taken with older models is a good case in point, in my view. While fine-tuning itself is set to be deprecated for these, the already fine-tuned models remain available. I’d expect a similar stance for gpt-4o-mini and gpt-4o fine-tunes.
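If you want to confirm this for your own deployments, you can periodically check whether your fine-tuned model is still listed by the API. Here is a minimal sketch assuming the official OpenAI Python SDK (v1.x) with an API key in the environment; the fine-tuned model id is a hypothetical placeholder, not a real fine-tune.

```python
from openai import OpenAI

client = OpenAI()

# Hypothetical fine-tuned model id -- replace with your own.
FINE_TUNED_MODEL = "ft:gpt-4o-mini-2024-07-18:my-org::example123"

# models.list() returns every model the key can access, including your fine-tunes.
available_ids = {model.id for model in client.models.list()}

if FINE_TUNED_MODEL in available_ids:
    print(f"{FINE_TUNED_MODEL} is still available.")
else:
    print(f"{FINE_TUNED_MODEL} is no longer listed; it may have been deprecated.")
```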



Thanks! I had misunderstood the notification and thought the fine-tunes would also go away once the base model is no longer running in production.
