All of a sudden, JSON parsing of responses in JSON mode is failing because the JSON string is wrapped in ```json
{
…
}```
anyone else seeing this?
Yes, we just started seeing the same thing about 15 minutes ago - every single one of our calls fails because the output is wrapped in Markdown formatting instead of being pure JSON.
Same problem here since 20 minutes ago. We are applying a workaround in our code to extract the JSON from the string until they fix it.
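For anyone who needs a stopgap, a minimal sketch of that kind of workaround: strip a Markdown fence if one is present, otherwise pass the text through untouched (function name and regex are my own, not from any official SDK):

```python
import re

def strip_json_fence(text: str) -> str:
    """Remove a Markdown ```json ... ``` fence wrapping the payload,
    if present; otherwise return the text unchanged (stripped)."""
    match = re.search(r"`{3}(?:json)?\s*(.*?)\s*`{3}", text, re.DOTALL)
    return match.group(1) if match else text.strip()
```

This stays safe to leave in place after the incident is fixed, since unfenced responses pass through unchanged.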
It is not working in the Playground either.
Yes, I saw this as well, and had to quickly implement and deploy a workaround. I'd love to see an official response to this from OpenAI explaining what happened. It seems to be ignoring response_format={ "type": "json_object" }.
We started seeing issues with structured outputs at 12:42.
Happening to us too. Started a little while ago. We have a backup on Azure OpenAI and those calls are processing correctly. Seems to be an issue with the OpenAI API.
We are seeing the same issue with structured output. It's working fine on Azure OpenAI though, and our fallback is handling it.
Shoot, alright, well, thanks for responding. Is this the new norm or a bug? It's gotta be a bug.
Has to be a bug. Surprising though, because usually OpenAI is pretty on top of their status.openai.com page.
I’m guessing no alarms are going off on their end, as the responses are still 200s. But definitely has to be a bug.
They seem to be on it now: OpenAI Status - Invalid Structured Output Responses
Anyone think there will be a post-mortem for this one?
Not to gloat or anything, but it really is mandatory to do a basic cleanup of any responses.
In the case of JSON, you should look for the opening brace, then scan the string, incrementing the count on each opening brace and decrementing it on each closing brace until it reaches zero, and discard the rest of the string.
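The up/down count described above can be sketched roughly like this. One addition not in the original description: braces inside string literals (and escaped quotes) have to be skipped, or the count goes wrong (a minimal sketch, my own naming):

```python
import json

def extract_json_object(text: str) -> dict:
    """Find the first '{', track brace depth until it returns to zero,
    and parse just that slice, discarding anything before or after.
    Braces inside JSON string literals are ignored."""
    start = text.find("{")
    if start == -1:
        raise ValueError("no JSON object found")
    depth = 0
    in_string = False
    escaped = False
    for i, ch in enumerate(text[start:], start):
        if escaped:
            escaped = False          # previous char was a backslash
        elif ch == "\\":
            escaped = True
        elif ch == '"':
            in_string = not in_string
        elif not in_string:
            if ch == "{":
                depth += 1
            elif ch == "}":
                depth -= 1
                if depth == 0:       # matched the opening brace
                    return json.loads(text[start : i + 1])
    raise ValueError("unbalanced braces in response")
```

Because it ignores everything before the first brace and after the matching close, this also happens to survive the current Markdown-fence wrapping.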
Also noticed the occasional response JSON body without the braces: instead of `{"key": "value"}`, just `"key": "value"`.
I'm not exactly sure what that example is supposed to demonstrate. That request should fail because neither the system message nor the user message specifies that the output should be JSON, or a JSON object.
I like to do this with a validation check using `json.loads`, followed by a JSON schema validation check; on failure, it can invoke a gpt-4o-mini call to repair the output by providing the failed output and the intended JSON schema.
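A minimal sketch of that parse-validate-repair flow. Assumptions of mine: `repair_fn` stands in for the gpt-4o-mini repair call, and a required-keys check stands in for a full JSON Schema validation:

```python
import json

def validate_or_repair(raw: str, required_keys: set[str], repair_fn) -> dict:
    """First gate: json.loads. Second gate: a minimal schema check
    (required keys only, standing in for full JSON Schema validation).
    On failure, pass the bad output and the expected keys to repair_fn
    (e.g. a gpt-4o-mini call) and retry once."""
    try:
        data = json.loads(raw)
        if isinstance(data, dict) and required_keys <= data.keys():
            return data
    except json.JSONDecodeError:
        pass
    repaired = repair_fn(raw, required_keys)   # one repair attempt
    data = json.loads(repaired)
    if not (isinstance(data, dict) and required_keys <= data.keys()):
        raise ValueError("repair did not produce the required keys")
    return data
```

Capping it at one repair attempt keeps a bad model output from turning into an unbounded retry loop.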
This topic was automatically closed 2 days after the last reply. New replies are no longer allowed.