Complimentary tokens: GPT-5.4-mini is charged against the 1M token group rather than the 10M token group

Hello,

As the title says, I get charged once I use more than 1M tokens with GPT-5.4-mini, and the limit seems to be shared with GPT-5.4. However, the popup in the settings clearly states:

You’re eligible for free daily usage on traffic shared with OpenAI.

  • Up to 1 million tokens per day across gpt-5.4, gpt-5.2, gpt-5.1, gpt-5.1-codex, gpt-5, gpt-5-codex, gpt-5-chat-latest, gpt-4.1, gpt-4o, o1, and o3

  • Up to 10 million tokens per day across gpt-5.4-mini, gpt-5.4-nano, gpt-5.1-codex-mini, gpt-5-mini, gpt-5-nano, gpt-4.1-mini, gpt-4.1-nano, gpt-4o-mini, o1-mini, o3-mini, o4-mini, and codex-mini-latest.

Usage beyond these limits, as well as usage for other models, will be billed at standard rates. Some limitations apply.

11 Likes

I have the exact same issue, but my limits are up to 2.5M for smaller models and 250K for bigger models. Looks like something is wrong on OpenAI's side of things.

1 Like

Same issue here. Charged after using 1M tokens.

1 Like

Same issue. I confirmed with two days of data that it starts charging at the large-model cap, despite the documentation placing it in the small-model grouping.
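For anyone else wanting to confirm this with their own data, here's a rough sketch of a local daily tally against the caps. The model lists and the 10M figure are taken from the settings popup quoted above; the `usage.total_tokens` field is what the official Python SDK exposes on responses. This only tracks what you send, not how OpenAI classifies it:

```python
# Rough sketch: tally your own daily token usage per complimentary pool,
# so you can compare your numbers against when charges actually start.
from collections import defaultdict
from datetime import date

# Models the settings popup says share the 10M/day complimentary pool.
MINI_POOL = {
    "gpt-5.4-mini", "gpt-5.4-nano", "gpt-5.1-codex-mini",
    "gpt-5-mini", "gpt-5-nano", "gpt-4.1-mini", "gpt-4.1-nano",
    "gpt-4o-mini", "o1-mini", "o3-mini", "o4-mini", "codex-mini-latest",
}
MINI_POOL_CAP = 10_000_000  # tokens/day per the popup (2.5M on some tiers)

daily_totals = defaultdict(int)  # (date, pool) -> tokens

def record_usage(model: str, total_tokens: int) -> int:
    """Add one response's token count to today's tally; return the running total."""
    pool = "mini" if model in MINI_POOL else "main"
    key = (date.today().isoformat(), pool)
    daily_totals[key] += total_tokens
    return daily_totals[key]

# After each API call, pass response.model and response.usage.total_tokens:
running = record_usage("gpt-5.4-mini", 250_000)
if running > MINI_POOL_CAP:
    print("Over the complimentary cap; further usage should be billed.")
```

If your dashboard shows charges while this tally is still well under the cap for the mini pool, that's the discrepancy worth attaching to a support ticket.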

1 Like

I can confirm the same issue on my side. It started for me about an hour after the official release. It seems gpt-5.4-mini may be sharing the 1M complimentary pool with gpt-5.4, as if it were still being billed under the non-mini route. I already opened a support ticket and I’m currently in contact with support, with screenshots and a detailed explanation already provided.

1 Like

The same applies to gpt-5.4-nano. The model should have 2.5M or 10M complimentary tokens (depending on tier), but it only has 250K or 1M.

1 Like

I'm using GPT-5.4 nano and mini on complimentary daily usage, but they are being charged.

This isn't the first time I've had this problem with new models.

1 Like

Will bump this to say I am also having the exact same issue.

GPT-5.4 Mini and Nano were correctly under the incentivised tier for me until 26 March 2026 and are now being charged, while the main GPT-5.4 is still under the incentivised tier.

If this could be fixed, that would be great.

2 Likes

Is there somewhere to see whether somebody is working on a fix for this? Or will somebody let us know when it's fixed?

1 Like

I raised a ticket and got a reply from a human saying I should be set up with free tokens. They asked for a load of details, which I provided. In my case it's not due to the mini tokens being charged in the 1M token group, though; I'm well under that.

There was a further bug for me yesterday: the main GPT-5.4 model started being charged as well.

1 Like

Same issue here, but charging starts at around 0.5M. My settings popup says:

You’re eligible for free daily usage on traffic shared with OpenAI.

  • Up to 250 thousand tokens per day across gpt-5.4, gpt-5.2, gpt-5.1, gpt-5.1-codex, gpt-5, gpt-5-codex, gpt-5-chat-latest, gpt-4.1, gpt-4o, o1, and o3

  • Up to 2.5 million tokens per day across gpt-5.4-mini, gpt-5.4-nano, gpt-5.1-codex-mini, gpt-5-mini, gpt-5-nano, gpt-4.1-mini, gpt-4.1-nano, gpt-4o-mini, o1-mini, o3-mini, o4-mini, and codex-mini-latest.

Usage beyond these limits, as well as usage for other models, will be billed at standard rates. Some limitations apply. Learn more.

1 Like

Same issue… I'm making a giant stink about it and going back and forth with support, who don't seem to know which models are in the program, given the list they sent me. It's insane that they can't just get someone to check on this and update the support KB.

I have been told it has gone into further review, and they're giving me tips for planning token usage…

Are they being serious? Imagine another company charging for a product they said was free.

1 Like

I'm pissed. Support is not handling this well at all. This is a them problem and they keep trying to make it a me problem. They sent me an out-of-date list of models today. It's not okay.

I use GPT-5.4 and not GPT-5.4-mini. And I’m wondering why they make things so damned complex for you guys…

2 Likes

I use 5.4 through Codex; it just runs little dinky automations all day. I wanted to use the complimentary tokens for it with 5.4-mini, and did for two days, until I found they weren't holding up their end of the bargain on the 10M free tokens with data sharing at my tier. I swapped back to 4.1-mini and all is going fine again. They're just lying about the usage you get on the data-sharing page, and nobody seems to be able to do anything about it. I've started pinging employees who follow me on X at this point. Support is just making me angry.

1 Like

I understand that you’ve identified a specific example on March 29, 2026 where requests using the same model, project, API key, and similar configuration were split between incentivized-tier and default, and you’re asking why this occurred and whether those charges are correct.

I appreciate you providing those details, and I’ll help clarify this for you.

I’ve reviewed your request and taken a closer look at the example you shared, including the request IDs and usage pattern.

Based on how the data sharing incentives program is applied, each request is evaluated independently at the time it is processed. This typically occurs when eligibility conditions are applied at the request level, including quota evaluation and how usage is classified at processing time.

In the example you provided:

  • The request at 06:01 (XXXXXXXX…) was classified under the incentivized tier

  • The subsequent requests (07:09, 07:18, 07:22, 08:24) were classified under the default tier

Although these requests appear similar in configuration and token size, the classification difference is consistent with request-level evaluation behavior, where eligibility is determined per request rather than across a batch or workflow.


I am disgusted at how this is being handled. I gave them five examples of the exact same process, where one is marked as incentivised tokens and the others as default (charged), and they give me boilerplate responses after I asked for specific answers.

I know there are OpenAI staff around here as well. No help given at all?
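For anyone else building a case for support, a minimal sketch of per-request logging so splits like the 06:01 vs 07:09 classification described above can be documented. The request ID and token count here are hypothetical placeholders; with the OpenAI Python SDK the request ID comes from the raw-response interface, and `usage.total_tokens` is on the response:

```python
# Minimal sketch: keep a per-request log of ID, model, and token usage,
# timestamped in UTC, to hand to support as evidence of inconsistent
# incentivized/default classification.
from datetime import datetime, timezone

request_log: list[dict] = []

def log_request(request_id: str, model: str, total_tokens: int) -> dict:
    """Record one request's details with a UTC timestamp."""
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "request_id": request_id,
        "model": model,
        "total_tokens": total_tokens,
    }
    request_log.append(entry)
    return entry

# Log each call right after it returns (placeholder values shown):
entry = log_request("req_XXXXXXXX", "gpt-5.4-mini", 1200)
```

With timestamps and request IDs for every call, you can point at two near-identical requests that were classified differently, rather than arguing from totals alone.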

Thank you for reaching out to OpenAI support and for sharing the screenshots and additional context regarding your token usage.

We understand your concern about being charged unexpectedly, especially when using gpt-5.4-mini under the impression that it should fall within a specific token tier.

Upon checking your account, we can confirm that your usage exceeded the daily free tier limit on March 22, 2026. As a result, the usage that went beyond the complimentary token allocation was billed, and the charge appeared at the end of the day, processing on March 23, 2026.

While we understand that you’ve enabled shared usage and believe the tiering should align with the 10M token group, the billing system calculates based on actual usage across projects and applicable token thresholds. Our pricing and limits are applied according to the documented structure and may result in charges if complimentary limits are surpassed.

We appreciate your vigilance and for pointing out the ongoing discussion in the community. Feedback like yours is valuable as we continue refining transparency and clarity in usage reporting and billing.

Let us know if you’d like us to walk you through your usage breakdown or assist further.

I am also very disappointed by their reply.

Guys, it would be nice if you could use formatting to make the quotations stand out more clearly.

E.g. Markdown Cheatsheet · adam-p/markdown-here Wiki · GitHub