Yesterday GPT-4-Turbo stated that as of February 28th, 2025, GPT-3.5 has been phased out. Interestingly enough, the GPT dialogue still references GPT-3.5 as the active free-tier fallback version, and likely will throughout the transition phase.
The latest GPT rollout appears to let people continue on a lower GPT-4 tier after reaching the free limit instead of imposing a hard cutoff. Hard cutoffs still exist for image generation and more resource-intensive, “PRO tier” requests.
When asked, however, what version of GPT is being used after the limit is reached, GPT-4 still doesn’t seem able to tell you definitively. It can only repeat that it’s operating on GPT-4, and it also states that GPT-4 isn’t available for public use. It doesn’t appear to correctly distinguish between GPT-4o, GPT-4-Turbo, and GPT-4o mini. This has been an ongoing annoyance; I’m aware it’s been mentioned here several times over the years.
There are definite differences in processing speed and response quality pre- and post-limit. It would be really nice to be able to ask GPT what version it’s running and have it answer correctly and transparently. I’m requesting that this finally be allowed.
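For what it’s worth, the developer API already reports the exact model that served a request in the response metadata, so the information clearly exists on the backend. A minimal sketch using the official openai Python package (the model name is just an example; the ChatGPT app itself may route requests differently):

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",  # example model; any available chat model works
    messages=[{"role": "user", "content": "Which model is answering this?"}],
)

# The `model` field reports which model actually served the request,
# regardless of what the assistant claims in its reply text.
print(response.model)
print(response.choices[0].message.content)
```

Something equivalent surfaced in the ChatGPT interface would resolve the ambiguity.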
A clearly visible version label, either in the dialogue area or in a submenu, would be even better.