Hello everyone! After switching to the new model, 'gpt-4o-2024-08-06', my application started throwing a lot of errors when analyzing pictures against a prompt and returning an answer about what is depicted in them.
Please tell me how to fix it?
In particular, the error looks like this:
Log context "root":
{
  "message": null,
  "error": {
    "type": "Seld\JsonLint\ParsingException",
    "code": 0,
    "message": "Parse error on line 1: ^ Expected one of: 'STRING', 'NUMBER', 'NULL', 'TRUE', 'FALSE', '{', '['",
    "details": {
      "caller": "parseResponse",
      "file": "/home/dai3e3voeroc/attributor/vendor/seld/jsonlint/src/Seld/JsonLint/JsonParser.php",
      "line": "407",
      "context": {
        "object": "StockItem",
        "object_id": 37624,
        "user_id": 50,
        "event": "openai.client.response.parse",
        "data": { "message": null }
      }
    }
  }
}
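A caret at position 1 with nothing before it is typically what jsonlint reports when it is handed an empty or non-JSON string, so one thing worth checking is whether the new model sometimes returns empty content that then gets parsed. Below is a minimal sketch of that guard, written in Python rather than the application's PHP; analyze_image, the prompts, and the image-URL plumbing are illustrative assumptions, not the application's real code.

import json

from openai import OpenAI

client = OpenAI()

def analyze_image(prompt: str, image_url: str) -> dict:
    """Hypothetical helper: ask the model to describe an image, then JSON-decode the reply."""
    response = client.chat.completions.create(
        model="gpt-4o-2024-08-06",
        # Asking for a JSON object explicitly makes an empty or non-JSON reply less likely.
        response_format={"type": "json_object"},
        messages=[
            {"role": "system", "content": "Describe the image as a JSON object."},
            {
                "role": "user",
                "content": [
                    {"type": "text", "text": prompt},
                    {"type": "image_url", "image_url": {"url": image_url}},
                ],
            },
        ],
    )
    choice = response.choices[0]
    content = choice.message.content
    # An empty reply is exactly what produces "Expected one of: 'STRING', 'NUMBER', ..."
    # in a JSON parser, so fail with the finish_reason instead of a bare parse error.
    if not content or not content.strip():
        raise ValueError(f"Empty model response (finish_reason={choice.finish_reason})")
    return json.loads(content)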
Welcome to the dev forum!
I’ve moved your post to a new thread.
Is the error consistent? Easy to reproduce?
The models are trained differently, so it's usually good to test before your default branch switches to a new model.
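A minimal sketch of that kind of test, assuming the official openai Python package; the pinned snapshot names and the compare helper are illustrative, not a prescribed workflow:

from openai import OpenAI

client = OpenAI()

# Pin explicit snapshots rather than a floating alias such as "gpt-4o",
# so the model only changes when you decide it should.
CURRENT_MODEL = "gpt-4o-2024-05-13"
CANDIDATE_MODEL = "gpt-4o-2024-08-06"

def compare(prompt: str) -> None:
    """Run the same prompt against both snapshots so outputs can be diffed before switching."""
    for model in (CURRENT_MODEL, CANDIDATE_MODEL):
        result = client.chat.completions.create(
            model=model,
            messages=[{"role": "user", "content": prompt}],
        )
        print(f"--- {model} ---\n{result.choices[0].message.content}\n")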
_j (October 8, 2024, 10:44pm)
Same symptom, same kind of application expecting JSON back, same source model.
Hello guys, I am using OpenAI's Structured Outputs to get the response in a structured way, and I am also using streaming to improve responsiveness and send response chunks to the client faster.
with client.beta.chat.completions.stream(
    model="gpt-4o-mini-2024-07-18",
    messages=user_messages,
    response_format=ViniReply,
    stream_options={"include_usage": True},
) as stream:
This is how I initialize the stream; ViniReply is my structured response class.
and I am processing chunks as
for chunk in …
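For reference, here is a minimal self-contained sketch of how that stream is typically consumed with the beta helper, assuming ViniReply is a Pydantic model; the ViniReply field, the user_messages content, and the event handling are illustrative, not the poster's actual code:

from openai import OpenAI
from pydantic import BaseModel

class ViniReply(BaseModel):
    answer: str  # illustrative field; the real ViniReply schema was not shown

client = OpenAI()
user_messages = [{"role": "user", "content": "Summarize the product in one sentence."}]

with client.beta.chat.completions.stream(
    model="gpt-4o-mini-2024-07-18",
    messages=user_messages,
    response_format=ViniReply,
    stream_options={"include_usage": True},
) as stream:
    for event in stream:
        # The helper yields typed events rather than raw chunks.
        if event.type == "content.delta":
            print(event.delta, end="", flush=True)  # forward partial text to the client

    # Once the stream is exhausted, the accumulated, schema-validated reply is available.
    completion = stream.get_final_completion()
    reply = completion.choices[0].message.parsed  # a ViniReply instance, or None on refusal
    usage = completion.usage                      # populated because include_usage=True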
I’m inclined to say OpenAI broke something in the brains.
Thanks, @_j . Your help around here is very much appreciated.
I agree that sometimes things break when these new models are rolled out. People reporting the problems helps get eyes on them…
Thanks again for your valuable contributions to our community garden!