I am on usage tier 1, but my gpt-3.5-turbo limit is still 500. It should be 3,500 according to the tier documentation. Can someone let me know what the issue is? I have also been hitting Too Many Requests errors with the embeddings model, even though my requests are well below the limits.
It seems like a message through the help.openai.com bot is in order (the only path for API support is "feedback", last I checked). Tell them how you arrived at tier 1 and how you arrived at the new limit; it may be an issue with the billing systems that could affect others.
(Or it could be a new ploy to motivate more users' cash to be turned into funbucks.)
The documentation doesn’t seem to be getting better, but at least it has a dark mode now.
To get more out of your RPM limits for now, at least for the embeddings, you can try batching: send multiple strings as an array of strings, and that counts as a single request.
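Here is a minimal sketch of what that batching looks like with the OpenAI Python SDK; the model name and example texts are just placeholders, so swap in whichever embedding model you are actually calling.

```python
# Minimal sketch: batch several texts into one embeddings request,
# assuming the openai Python SDK (>= 1.0) and an embedding model of your choice.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

texts = [
    "first document to embed",
    "second document to embed",
    "third document to embed",
]

# One request, many inputs: this counts once against your RPM limit,
# though the tokens in every string still count toward your TPM limit.
response = client.embeddings.create(
    model="text-embedding-3-small",  # assumption; use your embedding model
    input=texts,
)

# response.data comes back in the same order as the inputs
vectors = [item.embedding for item in response.data]
print(len(vectors), "embeddings,", len(vectors[0]), "dimensions each")
```

If you are currently looping and embedding one string per call, switching to this pattern should cut your request count by roughly the batch size.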