I'm using the Responses API with streaming. This occurs on the first request/response (though I think it can occur on later ones too). I often get a corrupted function call — not every time, but on at least 30% of requests. It looks like this:
read_file({
"line_numbers": false,
"path": "codeai/tools/go_pkg_public_docs_test.go}'}{ errors: 'JSON decode error'}? Wait path wrong due to braces. need proper. Should be good path. Retry. maybe file path. We'll first list? we can use ls? go mode says use go tools first? But need to inspect file. maybe read_file. path ",
"request_permission": null
})
This is from OpenAI’s platform logs UI (I also see it on my end). I think that rules out a client-side bug: the corrupted arguments appear in OpenAI’s own logs, so I’m not simply parsing a correct stream incorrectly.
You can see the LLM inserts a stray ‘}’ that shouldn’t be there — and whenever this happens, it’s always that character. From there, the LLM “realizes” the ‘}’ shouldn’t be there and has a mini breakdown inside the string value.
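For anyone who wants to check this on their own streams: here’s a minimal sketch of how I validate the accumulated `function_call_arguments` deltas client-side (the helper name and the sample delta strings are mine, just for illustration — the real deltas come off the streaming events):

```python
import json

def accumulate_arguments(deltas):
    """Concatenate streamed function-call argument deltas and
    validate the final string as JSON, which is how the corruption
    surfaces as a 'JSON decode error' on my end."""
    buf = "".join(deltas)
    try:
        return json.loads(buf), None
    except json.JSONDecodeError as e:
        # Report where parsing failed, with a little surrounding context.
        return None, f"JSON decode error at pos {e.pos}: {buf[max(0, e.pos - 10):e.pos + 10]!r}"

# A well-formed stream parses cleanly:
args, err = accumulate_arguments(
    ['{"line_numbers": false, ', '"path": "foo.go", ', '"request_permission": null}']
)
assert err is None and args["path"] == "foo.go"

# A stream with an extra '}' appended (like the stray brace above,
# illustrative only) fails validation instead of silently corrupting:
args, err = accumulate_arguments(['{"path": "foo.go"}', '}'])
assert args is None and "JSON decode error" in err
```

In my case the stray brace lands *inside* the string value, so the breakage shows up slightly differently, but the same validate-on-completion check catches it.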
I understand that LLMs are nondeterministic and sometimes do weird things, but this is very reproducible.
Can anyone offer advice, or confirm/deny whether this is an OpenAI bug?