As a Plus member, it’s frustrating that the context window is so small, even with GPT-4o, which is supposed to be a larger model.
I actually prefer using GPT-4o mini—it works great! But the limited context window is still a major drawback, even though the mini model should be lighter and more efficient.
It feels like OpenAI isn’t prioritizing Plus members. We’re paying for access to GPT-4o, but we only get up to 8k tokens in the best-case scenario, while the marketing always highlights the 128k-token capacity of GPT-4o and GPT-4o mini.
If the mini version is so lightweight, why not offer Plus members a larger context window with it? It would be a way to give us more value, especially since we’re not seeing a significant difference in benefits compared to free members. Some of us really need that extra capacity, even with the mini model, and we’re paying for it.
I agree. ChatGPT is my favorite model in its class, but the small context window lags behind its peers and makes it unusable for some applications. 8k is fast becoming outdated. It would be great to get an upgrade in context window size, at least for Plus subscribers. I may shift my paid account to a different AI with a higher context limit until ChatGPT catches up.
I think for free users the context window should be 32k tokens. For Plus users it should be 64k tokens. For Team and Enterprise users it should be 128k tokens.
I just need a context window indicator in chat sessions. We’re halfway through 2025, and I just realized ChatGPT does not use the full model context size but is capped at 32k for all sessions.
We could at least have something like this embedded in the chat interface: Tokenizer - OpenAI API
I just realized a long conversation session I have here has reached 51k tokens, so the model may now only be inferring what was mentioned at the beginning of the session.
We really need a context indicator, just as we need a model usage counter, so that when we are about to reach the limit we know we will either lose the information from the beginning of the chat or need to move to a new session.
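In the meantime, an indicator like this can be approximated client-side. A minimal sketch, assuming a crude ~4-characters-per-token heuristic (not OpenAI's actual tokenizer) and a hypothetical 32k session cap; the function names are illustrative, not part of any API:

```python
def estimate_tokens(text: str) -> int:
    """Crude token estimate: roughly 4 characters per token for English prose.

    This is only a heuristic; the real count requires OpenAI's tokenizer
    (e.g. the tiktoken library or the web Tokenizer tool).
    """
    return len(text) // 4

def context_report(messages: list[str], limit: int = 32_000) -> str:
    """Summarize estimated token usage against an assumed context cap."""
    used = sum(estimate_tokens(m) for m in messages)
    pct = 100 * used / limit
    return f"~{used} / {limit} tokens ({pct:.0f}% of context)"
```

Pasting a conversation's messages into `context_report` gives a rough warning signal before the earliest messages start falling out of the window.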
Instead of that, we now have a warning not to start new topics about ChatGPT here. Great!