AMA on the 17th of December with OpenAI's API Team: Post Your Questions Here

Hey Lou -- sorry about this unfortunate behavior. This can happen when the model's output is sufficiently improbable under your schema that, at some step, the only token the schema still permits is the newline token, so the model emits newlines repeatedly. It's not really a bug, in the sense that engineering changes on our side can't fix it. When this happens, I'd suggest using a simpler schema, or changing your prompting so it's clearer to the model what it should respond with. Let me know if this helps!
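To make the "simpler schema" suggestion concrete, here's a minimal sketch (the schemas and field names are hypothetical, not from Lou's original post): a deeply nested schema with tight per-field constraints leaves the sampler very few valid continuations at each step, while a flatter schema keeps more tokens valid and moves the strict validation into application code.

```python
import json

# Hypothetical "strict" schema: nesting plus a regex pattern constrain
# the grammar tightly, so at some decoding steps whitespace may be the
# only valid continuation, producing runs of newlines.
complex_schema = {
    "type": "object",
    "properties": {
        "result": {
            "type": "object",
            "properties": {
                "items": {
                    "type": "array",
                    "items": {
                        "type": "object",
                        "properties": {
                            "code": {"type": "string", "pattern": "^[A-Z]{3}-\\d{4}$"}
                        },
                        "required": ["code"],
                    },
                }
            },
            "required": ["items"],
        }
    },
    "required": ["result"],
}

# Flatter alternative: same information, far fewer constraints per token.
simple_schema = {
    "type": "object",
    "properties": {
        "codes": {"type": "array", "items": {"type": "string"}}
    },
    "required": ["codes"],
}

def parse_codes(raw: str) -> list[str]:
    """Enforce the strict format in application code instead of the grammar."""
    data = json.loads(raw)
    return [c.strip().upper() for c in data["codes"]]
```

With the flat schema, the model just produces a list of strings, and `parse_codes('{"codes": ["abc-1234"]}')` normalizes them to `["ABC-1234"]` after the fact.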