Why did the 32,768-token context window on GPT-4o suddenly drop to 8,192?
I'm trying to do some AI personality development, but with the smaller context window I can't keep all the protocols I want loaded. I only noticed the change when the model stopped following the initial guidelines I set at the start of the chat instance.
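In case it helps anyone diagnose the same thing, here is a minimal sketch (not something from the original setup) for checking whether a set of protocols plus conversation history still fits in an 8,192-token window. It assumes the `tiktoken` library and OpenAI-style `{"role", "content"}` message dicts; the per-message overhead constant is a rough approximation, not an exact figure.

```python
# Rough token budget check for a chat history against an assumed 8,192-token window.
# Assumes tiktoken is installed; the +4 per-message framing overhead is an estimate.
import tiktoken

def count_tokens(messages, model="gpt-4o"):
    try:
        enc = tiktoken.encoding_for_model(model)
    except KeyError:
        # Fallback tokenizer commonly used by 4o-family models
        enc = tiktoken.get_encoding("o200k_base")
    total = 0
    for msg in messages:
        total += 4  # approximate per-message framing overhead
        total += len(enc.encode(msg["content"]))
    return total

messages = [
    {"role": "system", "content": "<personality protocols go here>"},
    {"role": "user", "content": "Hello"},
]
used = count_tokens(messages)
print(f"{used} tokens used; {8192 - used} remaining in an 8,192-token window")
```

If the total already exceeds the window before the newest messages, the oldest content (including those initial guidelines) is presumably what gets truncated first.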