With gpt-4o being deprecated, what does that mean for gpt-4o fine-tunes, given there is no fine-tuning on GPT-5?
OpenAI offered the API a "here's the same version we run on ChatGPT" model for experimentation. Only that mirror of what ChatGPT was running has been shut down.
gpt-4o-2024-08-06, the version that can be fine-tuned, has no deprecation announcement and no hint that it will go away. The only worrying sign is the ChatGPT shutdown itself, which suggests OpenAI doesn't much like offering delusion-causing AI models as their own product.
Hi @Ash_Alliants and welcome!
The official documentation states that as of 13 Feb the GPT-4o and 4.1 models are retired for ChatGPT purposes only. Regarding the API (including fine-tuning), there is no information, nor any published plans:
They will continue to be available through the OpenAI API, and we’ll provide advance notice ahead of any future API retirements.
I am confused, as there is a deprecation date of 17 February set for chatgpt-4o-latest.
Pick one of the non-deprecated models (those without strikethrough in the docs):
chatgpt-4o-latest
gpt-4o-2024-05-13
gpt-4o-2024-08-06 (supports fine-tuning)
gpt-4o-2024-11-20
Your code can also surface data from this Python dict I made: some fine-tunings have been allowed to run past the official shutoff, while other deprecation announcements haven't made any distinction about their fine-tuning variants.
DEPRECATIONS = {
"gpt-3.5-turbo-instruct": {"shutoff": "2026-09-28"}, # not known to be offered for API fine-tuning
"babbage-002": {"shutoff": "2026-09-28", "ft_shutoff": "2031-09-28"}, # new FT training disabled 2024-10-28; FT persistence after shutoff not specified -> +5y
"davinci-002": {"shutoff": "2026-09-28", "ft_shutoff": "2031-09-28"}, # new FT training disabled 2024-10-28; FT persistence after shutoff not specified -> +5y
"gpt-3.5-turbo-1106": {"shutoff": "2026-09-28"}, # ft_shutoff - undocumented end of fine tuned models
"gpt-4-0314": {"shutoff": "2026-03-26"},
"gpt-4-1106-preview": {"shutoff": "2026-03-26"}, # ft supported - could have been fine tuned
"gpt-4-0125-preview": {"shutoff": "2026-03-26", "alias": "gpt-4-turbo-preview"}, # maybe not offered for ft?
"gpt-4o-audio-preview-2025-06-03": {"shutoff": "2026-03-24"},
"gpt-4o-mini-audio-preview": {"shutoff": "2026-03-24"},
"chatgpt-4o-latest": {"shutoff": "2026-02-17"},
"codex-mini-latest": {"shutoff": "2026-01-16"}, # still working 2026-01-20
"o1-mini-2024-09-12": {"shutoff": "2025-10-27", "alias": "o1-mini"},
"gpt-4o-audio-preview-2024-10-01": {"shutoff": "2025-10-10"},
"o1-preview-2024-09-12": {"shutoff": "2025-07-28", "alias": "o1-preview"},
"gpt-4.5-preview": {"shutoff": "2025-07-14"},
"gpt-4-32k-0613": {"shutoff": "2025-06-06", "alias": "gpt-4-32k"}, # deprecation table separately lists "gpt-4-32k"
"gpt-4-32k-0314": {"shutoff": "2025-06-06"},
"gpt-4-1106-vision-preview": {"shutoff": "2024-12-06", "alias": "gpt-4-vision-preview"},
"gpt-3.5-turbo-0613": {"shutoff": "2024-09-13", "ft_shutoff": "2029-09-13"}, # doc notes FT models not affected (still OK); FT persistence not specified -> +5y
"gpt-3.5-turbo-16k-0613": {"shutoff": "2024-09-13", "ft_shutoff": "2029-09-13"}, # doc notes FT models not affected; FT persistence not specified -> +5y
"gpt-3.5-turbo-0301": {"shutoff": "2024-09-13"},
}
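As a sketch of how code might surface that data, here is one way to query it with the standard library only. The helper names (`models_shutting_off_by`, `effective_shutoff`) are my own, not anything from OpenAI's SDK, and the excerpt below copies just three entries from the full dict above; the `ft_shutoff` fallback logic reflects my reading of the table (fine-tuned variants use `ft_shutoff` when one is listed, otherwise the base date), which may not match OpenAI's actual behavior.

```python
from datetime import date

# Excerpt of the DEPRECATIONS dict above; dates are ISO "YYYY-MM-DD" strings.
DEPRECATIONS = {
    "chatgpt-4o-latest": {"shutoff": "2026-02-17"},
    "gpt-4-0314": {"shutoff": "2026-03-26"},
    "gpt-3.5-turbo-0613": {"shutoff": "2024-09-13", "ft_shutoff": "2029-09-13"},
}

def models_shutting_off_by(cutoff: date, table: dict = DEPRECATIONS) -> list[str]:
    """Return model names whose base shutoff falls on or before `cutoff`."""
    return sorted(
        name for name, info in table.items()
        if date.fromisoformat(info["shutoff"]) <= cutoff
    )

def effective_shutoff(name: str, fine_tuned: bool = False,
                      table: dict = DEPRECATIONS) -> date:
    """Assumption: a fine-tuned variant lives until ft_shutoff when one is
    listed; otherwise it shares the base model's shutoff date."""
    info = table[name]
    key = "ft_shutoff" if fine_tuned and "ft_shutoff" in info else "shutoff"
    return date.fromisoformat(info[key])

print(models_shutting_off_by(date(2026, 3, 1)))
# → ['chatgpt-4o-latest', 'gpt-3.5-turbo-0613']
print(effective_shutoff("gpt-3.5-turbo-0613", fine_tuned=True))
# → 2029-09-13
```

Swapping in the full dict above works unchanged, since every entry has a `shutoff` key and `ft_shutoff` is optional.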