Currently getting a deluge of 429s in a row

I’ve implemented an API integration that is compliant with the 429 backoff delays, and I currently keep getting this in my logs:

Retrying in 1267.2214920773931 ms
Retrying in 2403.5505048134955 ms
Retrying in 4091.2252655656825 ms
Retrying in 8350.424172878827 ms
Retrying in 16170.189194991199 ms

As you can see, I back off quite nicely and add some random jitter to make it even more polite, yet I still get five 429s in a row.
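
For context, the retry logic is roughly the sketch below (plain Node with the global fetch from Node 18+ assumed; the function name, endpoint, and jitter range are illustrative, not my exact code):

    // Rough sketch: exponential backoff with random jitter on 429/5xx responses.
    // Produces "Retrying in <n> ms" lines like the ones above.
    async function createCompletionWithBackoff(body, maxRetries = 6) {
      let baseDelay = 1000; // start around 1 s
      for (let attempt = 0; attempt < maxRetries; attempt++) {
        const res = await fetch('https://api.openai.com/v1/completions', {
          method: 'POST',
          headers: {
            'Content-Type': 'application/json',
            Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
          },
          body: JSON.stringify(body),
        });
        if (res.ok) return res.json(); // success
        if (res.status !== 429 && res.status < 500) {
          throw new Error(`Request failed with status ${res.status}`); // don't retry client errors
        }
        const wait = baseDelay + Math.random() * 500; // jitter to be polite
        console.log(`Retrying in ${wait} ms`);
        await new Promise((resolve) => setTimeout(resolve, wait));
        baseDelay *= 2; // double the base delay each attempt
      }
      throw new Error('Still getting 429s after all retries');
    }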

This is far from an isolated incident.

On top of this, response speed has fallen off a cliff: a ~4k-token request now averages around 20-30 seconds.

This makes using the API quite… painful.

Can devops do something?

I have the same problem… Cannot get a single completion. Far from reaching my daily limit.

Status: 429
Status Text: Too Many Requests
Data:

error: {
    message: 'The server had an error while processing your request. Sorry about that!',
    type: 'server_error',
    param: null,
    code: null
}
Retrying in 1782.7578082978512 ms
Retrying in 3040.943940289724 ms
Retrying in 4165.420140334169 ms
Retrying in 8189.548413993593 ms
Retrying in 16468.125458574475 ms
Retrying in 32170.131443313177 ms
Retrying in 64043.089451589505 ms
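
For what it’s worth, the Status / Status Text / Data lines above come from logging the error roughly like this (an axios-style err.response object is assumed; the helper name is just for illustration):

    // Sketch of the logging/retry check behind the output above
    // (axios-style error shape assumed; not the exact code).
    function logOpenAIError(err) {
      const res = err.response;
      if (!res) throw err; // no HTTP response at all (e.g. network failure)
      console.log('Status:', res.status);
      console.log('Status Text:', res.statusText);
      console.log('Data:', res.data);
      // Note: the status is 429, but the body says type 'server_error',
      // so this doesn't look like the account rate limit being exceeded.
      return res.status === 429 || res.status >= 500; // true => retry with backoff
    }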

Same. A friend is using a paid account and getting this too. In fact, they are charging him for prompt tokens! OpenAI is literally stealing money at this point. Ridiculous.

Unable to use the API as well. 429 error with the “The server had an error while processing your request. Sorry about that” message. Paid account.

Been having this issue with both the API and the Playground since this morning. Paid account here as well.

API unusable at this point. Perhaps scaling for Bing is the culprit?

Are the limits per IP or key, or both?

Wow… what are the people at OpenAI doing? See the screenshot showing that I can’t even post replies unless I write them longer.

Discourse has nothing to do with OpenAI… stay on topic plz ser

The OpenAI status page suggests they are aware of this.

UPDATE: Just checked on my end. There’s an ongoing issue with text-davinci-003 on both the API and the Playground.

That was yesterday, buddy. Still waiting for a response.

I have the same issue: I’m getting a RateLimit error, but if I log in to my account I can see that I’m far from reaching my maximum. I hope they fix this soon. It’s also a little unexpected to see the fees going up when you’re getting an error from the server.

Those are text-moderation calls and have no charge… FYI…

> Wow… what are the people at OpenAI doing? See the screenshot showing that I can’t even post replies unless I write them longer.

This is to reduce spam; the idea is that spammers won’t take the time to write something longer. Sorry you got hit by this!

Overall, we are aware of the 429 issue and it seems to be recovering. Sorry for the delayed response.
