GPT-4 Token Limit Reduced?

I am getting strange responses from both GPT-4 Browsing and GPT-4 Default: size-limitation errors on prompts far below 8K tokens. Is this just the result of today’s server issues, or has anyone else been noticing this?

I asked it to summarize a ~4K-token document (4,512 tokens exactly) in three sentences.
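
For anyone wanting to reproduce the count, here is a rough sketch of how I check token totals with the tiktoken package (the filename is just a placeholder for my document):

```python
# Rough token-count check; assumes tiktoken is installed and
# "document.txt" stands in for the document I tried to summarize.
import tiktoken

enc = tiktoken.encoding_for_model("gpt-4")

with open("document.txt", encoding="utf-8") as f:
    text = f.read()

# For my document this prints 4512 -- well under the advertised 8K context.
print(len(enc.encode(text)))
```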

Same here – a GPT-4 ChatCompletion API call with max_tokens=1024 and a messages payload of ~4,000 tokens returns an error saying that the messages plus the completion exceed the maximum context length of 4096 tokens.
I don’t think I ever had 8K tokens context size available.
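
Roughly what my call looks like, trimmed down to a minimal sketch (pre-1.0 openai Python package; the message content is a placeholder for my ~4,000-token input):

```python
import openai

openai.api_key = "sk-..."  # placeholder

long_document = "..."  # placeholder for ~4,000 tokens of input text

response = openai.ChatCompletion.create(
    model="gpt-4",
    messages=[{"role": "user", "content": long_document}],
    max_tokens=1024,  # completion budget; input + completion must fit the context window
)

# Instead of a completion, this raises an InvalidRequestError saying the
# request exceeds the model's maximum context length of 4096 tokens.
print(response["choices"][0]["message"]["content"])
```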

I don’t get it. Everyone’s anxiously awaiting GPT-4 32K, and we’ve got GPT-4 and aren’t even getting 8K?

So… I created a new key, and now I’m seeing 8K context.
My old key was from April.
Not sure if this explains it.
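
For what it’s worth, this is roughly how I check which context length a key actually gets: deliberately over-ask and read the real limit out of the error message (again a minimal sketch with the pre-1.0 openai Python package):

```python
import openai

openai.api_key = "sk-..."  # the key you want to test

try:
    openai.ChatCompletion.create(
        model="gpt-4",
        messages=[{"role": "user", "content": "hi"}],
        max_tokens=100_000,  # intentionally far too large so the request is rejected
    )
except openai.error.InvalidRequestError as e:
    # The error text reports the model's actual maximum context length,
    # e.g. "This model's maximum context length is 8192 tokens ...".
    print(e)
```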

I think it might have been a glitch in the system. I went to the Playground, pasted in ~4.5K tokens of text with a 2K maximum length, and it worked without issue on GPT-4.

No, I take that back. Is the full 8K GPT-4 context available in ChatGPT?