ChatGPT Plus context window

As a Plus member, I find it frustrating that the context window is so small, even with GPT-4o, which is supposed to be a larger model.

I actually prefer using GPT-4o mini; it works great. But the limited context window is still a major drawback, especially since the mini model is supposed to be lighter and more efficient, which should leave room for a larger window.

It feels like OpenAI isn’t prioritizing Plus members. We’re paying for access to GPT-4o, but we only get up to 8k tokens in the best-case scenario, while the marketing always highlights the 128k-token capacity of GPT-4o and GPT-4o mini.

If the mini version is so lightweight, why not offer Plus members a larger context window with it? It would be a way to give us more value, especially since we’re not seeing a significant difference in benefits compared to free members. Some of us really need that extra capacity, even with the mini model, and we’re paying for it.
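For anyone who wants to see how quickly 8k tokens get used up, here is a rough sketch using the tiktoken library. It assumes GPT-4o models use the o200k_base encoding, and "transcript.txt" is just a placeholder for whatever conversation or document you want to measure:

```python
# Rough sketch: count how many tokens a saved conversation uses,
# to see how quickly an 8k context window fills up.
# Assumes tiktoken is installed and that GPT-4o models use the
# o200k_base encoding; "transcript.txt" is a placeholder file name.
import tiktoken

enc = tiktoken.get_encoding("o200k_base")

with open("transcript.txt", encoding="utf-8") as f:
    text = f.read()

tokens = enc.encode(text)
print(f"Token count: {len(tokens)}")
print(f"Fits in an 8k window:   {len(tokens) <= 8_000}")
print(f"Fits in a 128k window:  {len(tokens) <= 128_000}")
```

In my experience, a long back-and-forth with a few pasted documents blows past 8k very quickly, which is exactly the problem.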

5 Likes

I agree. ChatGPT is my favorite model in its class, but the small context window lags behind its peers and makes it unusable for some applications. 8k is fast becoming outdated. It would be great to get an upgrade in context window size, at least for Plus subscribers. I may shift my paid account to a different AI with a larger context window until ChatGPT catches up.

2 Likes

I think the context window should be 32k tokens for free users, 64k for Plus users, and 128k for Team and Enterprise users.

1 Like