Bug in tier limit and RPMs


I am on usage tier 1, but my gpt-3.5-turbo limit is still 500. It should be 3,500 as per the tier documentation. Can someone please let me know what the issue is? I have also been getting Too Many Requests errors from the embedding model even though my requests are well below the limits.

This started happening ever since I was moved to prepaid.


Are you still using your old API key?

Sometimes generating a new one helps :slight_smile:

Nope, that didn’t work. The issue is that the whole usage tier is wrong: it should be 3,500, not 500, for gpt-3.5-turbo.


Tier 1:

| Model | RPM | RPD | TPM |
|---|---|---|---|
| gpt-4 | 500 | 10,000 | 10,000 |
| gpt-4-turbo-preview | 500 | - | 300,000 |
| gpt-4-vision-preview | 80 | 500 | 10,000 |
| gpt-3.5-turbo | 3,500 | 10,000 | 60,000 |
| text-embedding-3-large | 500 | 10,000 | 1,000,000 |

It seems like a message through the help.openai.com bot is in order (the only path for API support is “feedback,” last I checked). Tell them how you arrived at tier 1 and how you arrived at the new limit; it may be an issue with the billing systems that could affect others.

(or it could be a new ploy to motivate more users’ cash to be turned into funbucks.)

Indeed, looks like there’s an inconsistency :thinking:

The documentation doesn’t seem to be getting better, but at least it’s got a dark mode now :laughing:

To get more out of your RPM limits for now, at least for the embeddings, you can try batching: you can send multiple strings as an array of strings, and that counts as a single request.
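For example, here is a minimal sketch of that batching idea (the helper name and batch size are illustrative, not from any official client). The embeddings endpoint accepts a list of strings in `input`, so grouping texts into chunks turns many single-string calls into a handful of requests:

```python
def batch(items, size):
    """Split a list into chunks of at most `size` items each.

    Each chunk can then be sent as one embeddings request, e.g. with
    the official openai client:
        client.embeddings.create(model="text-embedding-3-large", input=chunk)
    """
    return [items[i:i + size] for i in range(0, len(items), size)]

texts = [f"doc {i}" for i in range(10)]
chunks = batch(texts, 4)
# 10 one-string requests collapse into 3 batched requests
print(len(chunks))  # → 3
```

The response’s `data` field holds one embedding per input string, in the same order, so you can flatten the batched results back into your original ordering.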