This is a symptom most commonly seen when using `{ "type": "json_object" }` as the text output format type, instead of providing a strict schema with `json_schema`.
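For contrast, here is a minimal sketch of what an enforced-schema request looks like, assuming the Python SDK's Responses API and its `text.format` parameter; the schema name and fields are invented for illustration:

```python
from openai import OpenAI

client = OpenAI()

# Hypothetical schema for illustration. Strict mode requires every property
# to be listed in "required" and additionalProperties to be false.
answer_schema = {
    "type": "object",
    "properties": {
        "title": {"type": "string"},
        "tags": {"type": "array", "items": {"type": "string"}},
    },
    "required": ["title", "tags"],
    "additionalProperties": False,
}

response = client.responses.create(
    model="gpt-4o-2024-08-06",
    input="Summarize the article as JSON.",
    text={
        "format": {
            "type": "json_schema",  # constrained to the schema, not just "try to emit JSON"
            "name": "answer",
            "schema": answer_schema,
            "strict": True,         # grammar-enforced structured output
        }
    },
)
print(response.output_text)
```

With `json_object`, none of that enforcement happens; the model is only nudged toward emitting some valid JSON object.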
If you do that, or use `"strict": false`, then your system prompting must be very elaborate and specific: mandate JSON as the only allowed response type, and, if you are not sending a schema, lay out exactly the JSON format required.
Make it resilient enough to survive stealth changes by OpenAI to the AI model quality offered under the same model name…
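If you are stuck with `json_object` (or `strict: false`), the instructions themselves have to carry the contract a schema would otherwise enforce. A rough sketch of that kind of system prompt, with an invented output format, might look like:

```python
from openai import OpenAI

client = OpenAI()

# Hypothetical system prompt: spell out JSON-only output and the exact keys,
# since nothing else is constraining the model.
system_prompt = """You are an API backend. You respond ONLY with a single JSON object.
Never produce markdown, code fences, explanations, or leading/trailing whitespace.
The JSON object must contain exactly these keys:
  "title": string
  "tags": array of 1-5 strings
No other keys are permitted. If you cannot answer, return {"title": "", "tags": []}."""

response = client.responses.create(
    model="gpt-4o-2024-08-06",
    input=[
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": "Summarize the article as JSON."},
    ],
    text={"format": {"type": "json_object"}},
)
print(response.output_text)
```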
If you don’t have a defined format and simply expect plain text, and might not even be using functions, then this is just very bad AI output and a model regression. The Responses API also doesn’t offer `logit_bias` to counter the tabs (`\t`) you’d never want.
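Without that sampling-level lever, about the best you can do is validate and clean up after the fact. A hypothetical helper along these lines:

```python
import json

def parse_model_json(raw: str) -> dict:
    """Hypothetical helper: validate output that was only *asked* to be JSON.

    With no logit_bias on the Responses API to bias against tabs, the
    cheapest defense is post-hoc: strip markdown fences, parse, and fail
    loudly instead of passing malformed text downstream.
    """
    cleaned = raw.strip()
    # Models sometimes wrap "json_object" output in a markdown code fence.
    if cleaned.startswith("```"):
        cleaned = cleaned.strip("`").removeprefix("json").strip()
    try:
        return json.loads(cleaned)
    except json.JSONDecodeError as exc:
        raise ValueError(f"Model did not return valid JSON: {exc}") from exc

# Re-serializing with json.dumps(parse_model_json(raw)) also discards any tab
# indentation the model chose, since formatting whitespace is not part of the data.
```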