ChatGPT Plus and Usage Limits

I have a ChatGPT Plus Subscription and today I started to create my own GPTs.

By doing so, I received this message: “You’ve reached the current usage cap for GPT-4, please try again after…” and a link to “Learn More”. When I follow this link, I find this information: “To give every Plus user a chance to try the model, we’re currently dynamically adjusting usage caps for GPT-4 as we learn more about demand and system performance.”

First of all, I wasn’t aware of the fact that there are limits for ChatGPT Plus Subscriptions. This was not clearly stated anywhere.

On your website, which provides information about ChatGPT Plus, you don’t say a single word about any limits or restrictions.

Why are you unilaterally changing the contract terms? As a paying customer, I was not informed about this practice. Changing contract terms (e.g. varying limits) requires the customer’s consent - at least, that’s what European consumer protection law states (I’m an EU citizen).

How can I monitor current limits and how much I have already used? Finding out the limit just by trial and error is not particularly helpful. This is a very user-unfriendly practice.

If this limitation and lack of transparency persist, I ask myself: what’s the benefit of a ChatGPT Plus subscription? Or, to put it another way: why am I paying if I still have a message limit?


The usage cap of GPT-4 is indicated in the GPT-4 FAQ.


Two things:

  1. Why was this not clearly stated when I was deciding on a subscription? And I’m not sure whether this rule was already in place when my subscription started last year.
  2. As I found out now, I have a limit of 40 messages per 3 hours. On the website you linked, it says 50. So the statement on the website is practically worthless.

Yes, the information on the official website is meaningless.
They dynamically adjust the interaction limit depending on server demand,
but it’s hard to find where on the official blog that is mentioned, unless you bookmark it when you find it.
They might be serious, but in reality it feels like they’re poking fun at their users.


Hi, I had access to GPT-4 and turbo but now I don’t. Did I run out of tokens? How can I get it back?


I had access to GPT-4 and turbo but now I don’t, did I run out of tokens? How can I get it back?

Do you mean the GPT-4 and “turbo” models that you can access through the API, or do you mean ChatGPT Plus?
ChatGPT Plus currently has a limit of 40 messages per 3 hours.
You can click on GPT-4 on the ChatGPT screen to see the conversation limit, but it changes depending on demand at the time.
Right now, the token allowance for responses seems to be very short.
I don’t think it matters if you are interacting via the API…

I was not aware that they had changed the plan to a dynamic cap. I think I’m not the only one surprised; here a user mentions that GPTs have a lower usage cap (Error in input stream... all day long - #12 by joshuad), but perhaps they were just experiencing the dynamic usage cap in play.

To OpenAI, please don’t be like Tinder. In any case, you need to let us know when we are approaching the cap and how many messages remain. On a few occasions I have run into the cap after an hour or two and been unable to save my work from the code interpreter. By the time the usage resets, the code interpreter session has expired along with any files. I have lost a good few hours’ time from this, and it makes me nervous about using the advanced features. If I am not using those features, I am better served with the API.
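For what it’s worth, the API side does expose its quota: OpenAI’s API documents rate-limit information in response headers (`x-ratelimit-limit-requests`, `x-ratelimit-remaining-requests`, and so on), which is exactly the kind of visibility missing from ChatGPT Plus. Here is a minimal sketch of reading those headers; the header names are from OpenAI’s rate-limit documentation, but the sample values below are made up for illustration, not taken from a real response.

```python
# Sketch: reading the rate-limit headers the OpenAI API attaches to responses.
# Header names follow OpenAI's rate-limit docs; the sample values are invented.

def rate_limit_status(headers: dict) -> dict:
    """Extract remaining request/token quota from API response headers."""
    keys = {
        "limit_requests": "x-ratelimit-limit-requests",
        "remaining_requests": "x-ratelimit-remaining-requests",
        "reset_requests": "x-ratelimit-reset-requests",
        "limit_tokens": "x-ratelimit-limit-tokens",
        "remaining_tokens": "x-ratelimit-remaining-tokens",
    }
    # Header names are case-insensitive in HTTP, so normalize before lookup.
    lower = {k.lower(): v for k, v in headers.items()}
    return {name: lower.get(header) for name, header in keys.items()}

# Fabricated example headers, as they might appear on an API response:
sample = {
    "x-ratelimit-limit-requests": "500",
    "x-ratelimit-remaining-requests": "499",
    "x-ratelimit-reset-requests": "120ms",
}
status = rate_limit_status(sample)
print(status["remaining_requests"])  # prints 499
```

With something like this, an API client can warn itself before it hits the cap, which is the kind of counter posters in this thread are asking ChatGPT itself to provide.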


I agree with you. I also had only 40 messages, but the website said 50. I was confused and unaware of it. I want more usage.


I noticed this change on Monday – it went from 50 messages per 3 hours, to 40.


We pay for this service and now they’re downgrading the usage capacity?!?!

I’d gladly pay for a higher capacity limit; it’s insane that there are no tiers for different plans.


I wish there were no limits, but I guess it’s too expensive for ChatGPT to allow that. I’ve left feedback multiple times asking them to give us a counter showing how many prompts we have left. I hate it when I’m using it for work and then, halfway through what I’m doing, I reach my limit and everything is wasted because I can’t continue.


This was the message when you “pay for the service” eight months ago:

at the same price. No outrage then at “we haven’t paid any more for this service, and now they’re upgrading the usage capacity”. ChatGPT Plus existed before the existence of GPT-4 was revealed.


I would be fine with 40, but it’s actually only 23-24 for custom GPTs (not even the old 25!), which they should at least make clearer. It’s especially frustrating because, as of late, at least 3-4 generations get wasted on random network errors that render the entire generation useless. I could use regular ChatGPT, but I’d rather use the custom one, as I feel I get better results with it, and I don’t understand why it has an arbitrarily lower limit.

ChatGPT Plus may have been for unlimited 3.5 initially, but absolutely no one is subscribing because of that now, not when there are so many free alternatives to 3.5 that are at least as good as that model.


I agree with the comments above about the need to flag how much usage we have left. I hope they add it soon.

Personally I’d be interested to see how my custom GPTs behave with 3.5 (which would presumably be effectively unlimited), so switching between the two models would be interesting, particularly if we could also influence the temperature, i.e. ask it to be less creative.


How big are your instructions, and what tools are you using? It could be that using tools like retrieval counts multiple times, or even that the number is an estimate based on token usage.


Wait, what? I seem to require twice as many prompts with the custom GPT too, and was getting really frustrated at getting the time-out prompt so quickly. I already cancelled my sub because it is effectively a useless service for me with the current limits.

I personally need lots of short simple prompts and responses, as large prompts and responses aren’t very compatible with my ADHD brain, so a message limit instead of a limit based on compute time seems to be disproportionately hostile to my use case.


I thought tokens may have been a possibility as well, but it makes no difference if I get 1024 tokens or 50. I also tested extensively with bare-bones GPTs with no special tools. I only get a maximum of 24 generations unless I use regular ChatGPT-4.

I’m also starting to suspect that editing and saving a custom GPT counts as a use even if it’s done manually, but I’ll have to test that a bit more to confirm.


I see that usage limit now; today I’ve seen 40 messages / 3 days. It’s possible.


I also have 40 messages / 3 DAYS, and this is unacceptable.


I just heard about the 40 messages / 3 days limit and checked it myself;
it seems to be true.
Well, you know what that means…

Either they want to push out users who don’t use the API, or they want to set a price point between Plus and Enterprise.


I agree with the above comments. If we sign up to pay for a service, they shouldn’t revise the user quotas without communicating this to all users. Personally, I didn’t sign up for the restrictive usage caps that are now enforced.

It does feel like they’re changing the rules to deal with their own evolving situation. I appreciate that they’re offering a rapidly evolving service, but if they reduce the quality of service offered, they should be reducing the charges to customers.

Most of my prompts are to correct mistakes in the previous response.