All of my runs on messages that contain the assistant role suddenly drop from in_progress to failed while they are running.
Run example: {"run_id": "run_Cn0J4OPCG8Y5J21h6nRd6PVq", "created_at": 1714745218, "status": "failed", "thread_id": "thread_p5H1XC3YDFybFG7Ljl11qEUt", "failed_at": "2024-05-03 14:06:58", "incomplete_details": null, "last_error": {"code": "server_error", "message": "Sorry, something went wrong."}}
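For anyone hitting the same thing: the only detail the API gives you is the run's last_error field. A minimal sketch of pulling it out of a retrieved run payload (the JSON is the failed run above; describe_failure is just a hypothetical helper name):

```python
import json

def describe_failure(run: dict) -> str:
    """Return a one-line summary of a run's status and last_error."""
    err = run.get("last_error") or {}
    return f"{run['status']}: {err.get('code')} - {err.get('message')}"

# The failed run payload from above.
run = json.loads(
    '{"run_id": "run_Cn0J4OPCG8Y5J21h6nRd6PVq", "created_at": 1714745218, '
    '"status": "failed", "thread_id": "thread_p5H1XC3YDFybFG7Ljl11qEUt", '
    '"failed_at": "2024-05-03 14:06:58", "incomplete_details": null, '
    '"last_error": {"code": "server_error", "message": "Sorry, something went wrong."}}'
)

print(describe_failure(run))
```

As you can see, even with the full payload there is nothing actionable in the error beyond the generic code and message.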
It’s deterministic behavior, so it seems to be a bug on the OpenAI side.
By the way, the message “Sorry, something went wrong.” is not useful, as you can see. We are paying for this product and you are not making it easy at all.
The same thing is happening to me. I’m trying to figure out if it’s on my side, because I see no outage reported.
I can’t even start up a chat, it just returns
last_error: { code: 'server_error', message: 'Sorry, something went wrong.' }
We are paying for this product, and they don’t provide a better error explanation. This is very bad. I would like to switch to another company’s LLM product at this point.
@_j I see you’re a regular contributor in the forum. Any idea what could be happening here?
My assistant has been working fine for months, but now I’m never able to create a run. It stalls for a while and then enters the ‘failed’ state with a generic error message:
last_error: { code: 'server_error', message: 'Sorry, something went wrong.' }
Any help would be greatly appreciated!
This has been happening for the past ~24 hours. I was using gpt-4-turbo-preview, but the error occurs no matter what model I use.
I also tried using a new API key but that didn’t fix it.
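Since server_error is nominally a transient failure, one way to rule out flakiness on your own side before blaming the platform is to wrap run creation in a retry loop with exponential backoff. A hedged sketch (with_retries is a hypothetical helper, not an OpenAI SDK function; the callable you pass in would be your own run-creation call, raising on a failed run):

```python
import time

def with_retries(fn, max_attempts=4, base_delay=1.0):
    """Call fn(); on RuntimeError, retry with exponential backoff.

    Genuinely transient errors usually recover within a few attempts;
    a deterministic failure (like the bad deploy in this thread)
    exhausts all attempts and re-raises the last error.
    """
    for attempt in range(max_attempts):
        try:
            return fn()
        except RuntimeError:
            if attempt == max_attempts - 1:
                raise  # still failing after all attempts
            time.sleep(base_delay * 2 ** attempt)
```

In my case the failure survived every retry, which is why I suspect the problem is server-side rather than an intermittent glitch.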
OpenAI had introduced a problem where images generated by the code interpreter were placed as image attachments on assistant messages (with a message parameter not documented in chat completions), and subsequent runs on those threads then failed. Perhaps this is a symptom of an attempted fix for that?
My assistant isn’t using the code interpreter. No file search or anything else, just normal functions. So it’s possibly related, but I’m unsure.
Hi folks – this was a bad deploy on our end. Sincere apologies. We were trying to make some improvements to our truncation logic and introduced this bug. We’ve rolled things back and will fix things before we roll out again.
You shouldn’t see these errors going forward.
Awesome, thanks @nikunj for the fix!
@nikunj I’m getting the same error again. No chats can be created. Please help ASAP, as this is mission-critical for our clients. Thanks.
last_error: { code: 'server_error', message: 'Sorry, something went wrong.' }