Chat limits too low! 40 messages per 3 hours on GPT-4?

The usage limit on GPT-4 seems to be lower than GPT-3's. I just got a notification that I can post 40 messages per 3 hours. While I was setting up a custom chatbot, my limit was used up in no time. I think this is way too low to use the app efficiently. The message speed has also been extremely slow. I don't see this being worth the fee right now. Is this a bug / temporary thing?



GPT-4, especially with GPTs, is computation-intense. On the API, where you pay for actual usage, GPT-4 costs roughly 10x as much as GPT-3.5.

$20 would be a bargain compared to API costs even if it only bought 40 questions per day for each of 30 days, and you actually get that allotment many times per day.
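The comparison above can be sketched numerically. Note the per-token rates and per-message token counts below are assumptions for illustration, not official OpenAI pricing:

```python
# Back-of-the-envelope comparison: ChatGPT Plus flat fee vs. paying per token
# on the API. The rates and token counts below are ASSUMPTIONS (illustrative
# GPT-4 API figures), not official pricing.

PROMPT_RATE = 0.03 / 1000       # assumed $ per prompt token
COMPLETION_RATE = 0.06 / 1000   # assumed $ per completion token

def api_cost(messages, prompt_tokens=500, completion_tokens=500):
    """Estimated API cost in dollars for `messages` requests at the
    assumed per-message token counts."""
    return messages * (prompt_tokens * PROMPT_RATE
                       + completion_tokens * COMPLETION_RATE)

# 40 messages/day for 30 days, compared against the $20 flat fee:
monthly_messages = 40 * 30
print(f"API estimate: ${api_cost(monthly_messages):.2f} vs. $20 flat fee")
# → API estimate: $54.00 vs. $20 flat fee
```

Even at these modest assumed token counts, metered API usage at that volume would cost more than double the subscription, which is presumably why the flat fee comes with a cap.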

Perhaps something like 60 per 6 hours might align better with actual patterns of use, while still deterring abuse of computation resources.


Today I asked 19 questions over a period of two and a half hours and then got restricted.

This doesn’t align with the 40 questions/ 3 hours limit.

$20 a month is really not worth it for only 19 questions…


Same problem here. Why is the restriction so strict? In my opinion, the paid version should allow more than 40 requests in 3 hours. You don't even have time to finish everything you need before the restriction kicks in.


Same here, I received the message about the limit long before reaching the so-called 40 questions, maybe around twenty!

So, not only are 40 questions per 3 hours already very few, but the fact that even that limit isn't respected devalues the paid product!!

One pays for the premium product expecting better performance, not worse…


I had not used ChatGPT all day. Tonight I asked about 20 questions and got restricted until midnight, a wait of around 3 hours. That doesn't seem right when you're paying $20 per month for a premium experience. It makes you want to unsubscribe and try competitors.


40 messages is way too low. I wouldn't mind if it were 40 messages per hour, but 40 messages per 3 hours is nowhere near enough if you need to hone the results, deal with ChatGPT's laziness, or recover when it goes down the wrong path. It often takes me 5-10 follow-up messages to get the exact answer I need.
Ask it anything to do with PowerShell/Microsoft 365 and it'll take 5 goes before anything remotely sensible comes out.
And now that it's gotten lazy, it takes a lot more extra prompts to deliver anything actually useful.
I asked it 5 questions today. One I had to totally rewrite because it misunderstood what I wanted, and two others hit 'network errors', so I had to redo them after a good 4 or 5 subsequent honing questions. That's it. I'm now locked out for another 3 hours without getting a suitable result for any of the questions I asked. I'm pretty sure I was closer to 20 messages, not 40.

Sure, some of today’s issues were GIGO user error but even then, the limits should cater for people learning how to ‘git gud’ at it.


Can we please have a confirmation…
What counts as a message?
Is that just the initial question or does it include all the subsequent honing messages?
What if I don’t like the answer and ask it to redo it? Does that count as a message?
What happens if I get a network error or something similar screws up the results?


Yeah this is just stupid. I understand there needs to be some limit but this one is way too low. Either this gets adjusted or I’m looking for other options.


Honestly the limit has made me not want to use Custom GPTs. I have a set of personal Assistants that help me in my daily life.

I can’t even use them as they have a halved limit.

I’m seriously treating this like a scarce resource and it has ruined the magic.


The limit is somehow not working. Today I got restricted after something like 20 single prompts. There seems to be a bug in the counting. At the very least, increase the limit so it spans a bigger time frame (for example, 8 hours).

I've literally just asked 10 questions and was cut off, and now have to wait 2 hours for it to recharge.

What the heck is going on? It wasn’t like this a couple of weeks ago. For reference, I’m using GPT-4.

This is extremely frustrating.


Also got capped after 15 messages. Regenerations after network errors also count, I guess, which would make it 20 messages. 5 network errors in 15 messages, though; seems like something is broken…


Why is the limit every 3 hours and not just a daily use cap?

I set up a few GPTs for my writing (story bibles, prose editor, etc.) and while I am enjoying them, 40 messages in 3 hours is insanely low. I only have 2-4 hours a day that I can dedicate to writing, maybe more on the weekend if I'm particularly focused. I work, sleep, do life things; that's so many 3-hour clusters (and 40-message allotments) that I will never use in a day.

It makes it particularly frustrating when I finally sit down and start working, get into a flow, and then BOOM! No more messages for me. Limit reached.

A daily cap would make a lot more sense, and be a lot fairer. Especially when I'm constantly hitting refresh as it runs into errors, or need to remind it of the editing goal a few times.

Either way, the devs need to make this limit super clear on their subscription tier; personally, I never would have paid for ChatGPT Plus if I had known about these limits beforehand. I would have stuck with POE.


What's even more frustrating is when prompts have to be re-asked because the first response was broken!
So many times today the code was only half returned in a code window like this:

and then I get the dreaded network error!


Message cap is down to 20 / 3h again for me right now… that's so annoying.


Not even 40 now, capped after 20 messages :nauseated_face:


Same for me here, capped after exactly 20 messages, not 40.


So is this a temporary thing, or is this the way the app will operate from now on? Does anyone know?


Custom GPTs have a halved limit (20), so I'd recommend anyone use the regular ChatGPT (40) instead.

The limits have been dropped without any warning numerous times before. They were eased back to a more reasonable rate after a couple of weeks.

Saying OpenAI is stretched thin right now would be an understatement.
