GPT-4.1 Supports 1M Token Context—ChatGPT Still Capped at 32K

I’ve been a long-time ChatGPT Plus subscriber and began using GPT-4.1 in Chat as soon as it was released.

One of GPT-4.1’s core features is its support for a 1 million token context window—an upgrade that dramatically expands what the model is capable of. But in ChatGPT, GPT-4.1 is still restricted to 32,000 tokens. This is the same limit as GPT-4o and far below the model’s full capacity.

While the 1M context window is available through the API, this distinction was never clearly stated in the ChatGPT UI, subscription page, or original rollout announcement. That’s a serious issue for transparency, especially for paying users.

I’m not just requesting clarity. I’m asking OpenAI to:

  1. Update ChatGPT to support the full 1 million token context.
  2. Clearly label current token limits within the ChatGPT UI.
  3. Share a roadmap for when (or if) full-context support will be brought to ChatGPT users.

If the model supports it and the infrastructure already delivers it through API, paying users of ChatGPT should not be left behind. We’re a significant part of OpenAI’s user base and deserve access to the full capabilities we’re funding.


What do you think would happen if they made ChatGPT support 1M context? If everything were available in ChatGPT with no tiering, what would the subscription be for? Why else would they publish a page comparing models and their per-token charges?

I don’t think it will ever be flat rate. At the API level, we can see that costs differ between the models.

Just my thoughts.

OpenAI really needs to do better at differentiating what’s available in ChatGPT versus the API models.

The 1M context window is only available through the API models, which can be accessed in the Playground after you deposit $5 in credits.

I think this is unlikely. People notoriously keep conversations running for very long periods, and every new message means re-prompting the model with the entire context all over again. OpenAI also can’t have all those law firms switching from the API to ChatGPT to summarize their documents and save enormous amounts of money.
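To make the cost argument concrete: if each turn re-sends the full history, total prompt tokens processed grow roughly quadratically with conversation length. Here is a minimal sketch of that arithmetic; the per-turn token count is an illustrative assumption, not OpenAI's actual accounting or pricing.

```python
# Sketch: why long conversations are expensive to serve.
# Assumption (hypothetical): each turn adds ~500 tokens, and every new
# message re-sends the entire prior history as part of the prompt.

def total_prompt_tokens(turns: int, tokens_per_turn: int) -> int:
    """Total tokens processed when turn i re-sends all i-1 previous
    turns plus its own message (i * tokens_per_turn tokens)."""
    return sum(i * tokens_per_turn for i in range(1, turns + 1))

# A short chat stays cheap; a long-running one balloons:
print(total_prompt_tokens(10, 500))    # 10 turns  -> 27,500 tokens
print(total_prompt_tokens(100, 500))   # 100 turns -> 2,525,000 tokens
```

Note that a 100-turn conversation processes nearly 100x the tokens of a 10-turn one, not 10x, which is one plausible reason a flat-rate subscription caps the context well below what the model supports.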

However, they’re still responsible for how they market their products and conduct business, and this is only one of a few problems they currently have.