It was mentioned that the default gpt-3.5-turbo model will point to 1106. So did OpenAI change their plans, or is the default gpt-3.5-turbo indeed the 1106 version?
I did a quick test, and for me gpt-3.5-turbo shows up as GPT-3.5-turbo-0613 in billing, so it doesn’t seem to be an error.
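If you’d rather not wait for the billing page to update, the chat completions response also includes a `model` field reporting the snapshot the alias resolved to. A minimal sketch with the official Python SDK (assuming `OPENAI_API_KEY` is set in your environment):

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Call the floating alias and inspect which snapshot actually served the request.
response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "ping"}],
    max_tokens=1,
)
print(response.model)  # e.g. "gpt-3.5-turbo-0613"
```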
gpt-3.5-turbo-1106 has different pricing; maybe they overlooked that at first and decided not to switch the alias once they realized it.
Overall, I don’t think it’s a good idea to rely on the floating model aliases anyway (“gpt-4”, “gpt-3.5-turbo”); it’s probably best to pin a specific snapshot, as in the sketch below.
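For example, pinning a dated snapshot instead of the alias (same Python SDK assumption as above; the snapshot name is just an illustration, pick whichever version you’ve tested against):

```python
from openai import OpenAI

client = OpenAI()

# Pin a dated snapshot so a future alias update can't silently change behavior.
response = client.chat.completions.create(
    model="gpt-3.5-turbo-0613",  # pinned snapshot, not the floating "gpt-3.5-turbo" alias
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```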
The reason the model alias pointers have not been updated is that the preview models have a significant issue with how functions encode UTF-8, which requires a new model to be trained.
There has also been a lot of pushback about the quality of these preview models. Switching the alias would break many production apps that depend on useful output and consistent instruction-following.