As a Plus member, it’s frustrating that the context window is so small, even with GPT-4o, which is supposed to be a larger model.
I actually prefer using GPT-4o mini, and it works great. But the limited context window is still a major drawback, which is surprising given that the mini model is supposed to be lighter and more efficient.
It feels like OpenAI isn't prioritizing Plus members. We're paying for access to GPT-4o, yet we only get up to 8k tokens of context in the best case, while the marketing always highlights the 128k-token capacity of GPT-4o and GPT-4o mini.
If the mini version is so lightweight, why not offer Plus members a larger context window with it? That would add real value, especially since we're not seeing a significant difference in benefits compared to free members. Some of us genuinely need that extra capacity, even with the mini model, and we're paying for it.