Hi,
I keep running into HTTP Request: POST https://api.openai.com/v1/chat/completions "HTTP/1.1 400 Bad Request" errors from the Python API on some queries with GPT-4 Vision Preview.
It worked fine until a couple of days ago without ever returning errors.
Now it happens very inconsistently: some messages fail while others work. The messages payload is valid JSON every time and has the same number of tokens.
Any ideas on why this happens? Thanks a lot in advance!
GPT-4 Turbo works fine, and so does GPT-4V Preview with the example images. Some of my own inputs also work, but others give the HTTP 400 Bad Request response, usually after 2-3 requests that go through okay. Then I have to wait and restart.
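For what it's worth, here is a minimal sketch of how I can surface the API's error details when the 400 comes back (assuming the v1 Python SDK; the prompt and image URL below are placeholders, not my actual payload):

```python
# Minimal sketch: print the API's explanation when a 400 is returned.
import openai
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

try:
    response = client.chat.completions.create(
        model="gpt-4-vision-preview",
        max_tokens=300,
        messages=[
            {
                "role": "user",
                # Vision requests pass user content as a list of parts.
                "content": [
                    {"type": "text", "text": "Describe this image."},
                    {"type": "image_url",
                     "image_url": {"url": "https://example.com/image.jpg"}},
                ],
            }
        ],
    )
    print(response.choices[0].message.content)
except openai.BadRequestError as e:
    # The exception message includes the API's description of what was invalid.
    print(f"400 Bad Request: {e}")
```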
OMG! I love Turbo. I understand you're all so critical, but man, for us users and mere mortals like this one, it's a game-changing life changer! To all those minds at OpenAI, thank you.
Hi, I have run into this 400 Bad Request problem with gpt-4-1106-preview. It seems to happen with non-trivial 'content' in the 'system' and 'assistant' messages; when the content comes from the user role, it seems to work fine.
Has anyone found a solution? It'd be greatly appreciated; this is a blocking issue for me.
-UM
You have different model names in your title and body.
However, you can follow this link to the API Reference, expand the messages parameter, and discover that the only role that supports anything other than string content is user.
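As a hedged sketch of what that means in practice (placeholder text only, not the reference's own example): system and assistant content should be plain strings, and only user content may be a list of parts.

```python
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-4-vision-preview",
    max_tokens=300,
    messages=[
        # system messages: content must be a plain string.
        {"role": "system", "content": "You are a helpful assistant."},
        # user messages: content may be a string OR a list of parts
        # (text and image_url), which is what vision requests need.
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "What is in this image?"},
                {"type": "image_url",
                 "image_url": {"url": "https://example.com/photo.png"}},
            ],
        },
        # assistant messages: again a plain string, never a list of parts.
        {"role": "assistant", "content": "It looks like a placeholder image."},
        # a follow-up user message can also be a plain string.
        {"role": "user", "content": "Summarize that in one sentence."},
    ],
)
print(response.choices[0].message.content)
```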