Image_url is only supported by certain models

Hi Team,
I am using gpt-4o-mini with the Chat Completions API, and I am getting the error below:

BadRequestError: 400 Invalid content type. image_url is only supported by certain models.

Please help me fix this.


Hi @aman.gupta1 and welcome to the community!

Can you post the full API call that you used? It could be that the request was not formed properly at your end, e.g. see here.


This is the API request I am using:

    response = await openai.chat.completions.create({
        model: "gpt-4o-mini",
        messages: [
            {
                role: "system",
                content: [
                    {
                        type: "text",
                        text: `prompt`
                    }
                ]
            },
            {
                role: "user",
                content: [
                    {
                        type: "image_url",
                        image_url: {
                            url: base64Image
                        }
                    }
                ]
            }
        ],
        temperature: 1,
        max_tokens: 2048,
        top_p: 1,
        frequency_penalty: 0,
        presence_penalty: 0,
        response_format: {
            type: "json_object"
        },
    });

For base64-encoded images you have to specify the MIME type as well, as part of a data URL:

    {
        "type": "image_url",
        "image_url": {
            "url": f"data:image/jpeg;base64,{base64_image}"
        }
    }
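
In the Node.js code from the original post, that would look something like the sketch below, assuming the image is read from a local JPEG file (the file name is just a placeholder):

    import fs from "node:fs";

    // Read the image and build a base64 data URL (assumes a JPEG on disk).
    const imageBuffer = fs.readFileSync("image.jpg");
    const base64Image = imageBuffer.toString("base64");
    const dataUrl = `data:image/jpeg;base64,${base64Image}`;

    // Pass the data URL, not the raw base64 string, as the image_url.
    const userMessage = {
        role: "user",
        content: [
            {
                type: "image_url",
                image_url: { url: dataUrl }
            }
        ]
    };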

Yes, I had used the same, @sps:
data:data:image/jpeg;base64,

I’ve been encountering a similar issue with the gpt-4o-2024-08-06 model, where I receive random “image_url is only supported by certain models” errors when sending queries sequentially in response to a stream of video frames. This issue started occurring only recently and wasn’t a problem before yesterday.

To troubleshoot, I’ve confirmed that there are no formatting issues with my messages. Also, when I resend the same message that previously caused an error, it works without any issues on the second try. I’ve even tried downgrading the python-openai package to v1.50.2, but the problem persists, which leads me to believe that recent changes to the Realtime API might have affected the backend server behavior.
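
Since a straight resend succeeds, wrapping the call in a simple retry seems like a reasonable stopgap until this is fixed. Here is a rough sketch against the Node client used earlier in the thread (the helper name, retry count, delay, and the error check are all arbitrary choices, not an official recommendation):

    // Rough retry wrapper for the intermittent "image_url is only supported
    // by certain models" 400s. Names and parameters here are illustrative.
    async function createWithRetry(openai, params, maxAttempts = 3) {
        let lastError;
        for (let attempt = 1; attempt <= maxAttempts; attempt++) {
            try {
                return await openai.chat.completions.create(params);
            } catch (err) {
                lastError = err;
                // Only retry this specific transient 400; rethrow anything else.
                const isTransient = err.status === 400 &&
                    String(err.message).includes("image_url is only supported");
                if (!isTransient) throw err;
                // Small backoff before resending the same request.
                await new Promise((resolve) => setTimeout(resolve, 500 * attempt));
            }
        }
        throw lastError;
    }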

I hope this issue gets resolved soon.

Yes, the day before yesterday the API was working fine; since yesterday some requests pass while others fail with the same error.


Same here; it just started happening yesterday. It happens sporadically, roughly once every 10 to 20 requests, and then it works fine.


Thank you for reporting! This behavior should be fixed now.
