JSON mode with gpt-4-vision-preview

Hello,

Can JSON mode be set when using gpt-4-vision-preview?
Currently, when I try to set it with response_format = {"type": "json_object"},
I get a failure.

Thanks

Yes, JSON mode is available for both the gpt-4-turbo and gpt-3.5-turbo models.

https://platform.openai.com/docs/guides/text-generation/json-mode

What failure are you getting? Something like the one below?

Failed to get a response: 400 {
  "error": {
    "message": "'messages' must contain the word 'json' in some form, to use 'response_format' of type 'json_object'.",
    "type": "invalid_request_error",
    "param": "messages",
    "code": null
  }
}

To fix this, you can include the word "JSON" somewhere in your prompt, in any role.

{
  "model": "gpt-3.5-turbo",
  "messages": [
    {
      "role": "system",
      "content": "You are a helpful assistant. Generate a JSON response."
    },
    {
      "role": "user",
      "content": "Create a list of Item for Travel Kit"
    }
  ],
  "max_tokens": 150,
  "temperature": 0.7,
  "response_format": {
    "type": "json_object"
  }
}
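
For reference, here is roughly the same request made through the official Python SDK. This is a minimal sketch assuming the v1 openai client and an OPENAI_API_KEY environment variable; the messages are the illustrative ones from the JSON body above.

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Mirrors the JSON body above: the system message mentions "JSON" (which
# JSON mode requires) and response_format forces a JSON object reply.
response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": "You are a helpful assistant. Generate a JSON response."},
        {"role": "user", "content": "Create a list of items for a travel kit"},
    ],
    max_tokens=150,
    temperature=0.7,
    response_format={"type": "json_object"},
)

print(response.choices[0].message.content)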

Output:

{
  "model": "gpt-3.5-turbo-0125",
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "content": {
          "travel kit": {
            "items": [
              "Passport",
              "Travel itinerary",
              "Wallet with cash and cards",
              "Travel insurance",
              "Phone charger",
              "Travel adapters",
              "Travel-size toiletries",
              "Medications",
              "Snacks",
              "Water bottle",
              "Travel pillow",
              "Earplugs and eye mask",
              "Travel-sized umbrella",
              "Travel documents (tickets, reservations)",
              "Portable luggage scale",
              "Notebook and pen"
            ]
          }
        }
      },
      "logprobs": null,
      "finish_reason": "stop"
    }
  ],
  "usage": {
    "prompt_tokens": 28,
    "completion_tokens": 115,
    "total_tokens": 143
  }
}
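
The assistant's reply arrives in message.content as a JSON-encoded string (the pretty-printed view above shows it already expanded), so you would typically decode it before use. A small sketch, reusing the response object from the snippet earlier:

import json

# Decode the JSON string into a Python dict; the exact keys ("travel kit",
# "items") depend on what the model chose to generate.
travel_kit = json.loads(response.choices[0].message.content)
print(travel_kit["travel kit"]["items"])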

Example images: response without the word JSON, and response with the word JSON.

Actually, the word "json" does appear in the prompt.

Here’s the error I am getting:
Error code: 400 - {'error': {'message': "Invalid parameter: 'response_format' of type 'json_object' is not supported with this model.", 'type': 'invalid_request_error', 'param': 'response_format', 'code': None}}

Notice I am using gpt-4-vision-preview
Can you try the same with this specific model?

Can you try the same with this specific model?

Wait, let me test this…

I have the same problem using gpt-4-vision-preview when I include:

"response_format": {
  "type": "json_object"
}

in my request body. The call fails with:

{
    "error": {
        "message": "1 validation error for Request\nbody -> response_format\n  extra fields not permitted (type=value_error.extra)",
        "type": "invalid_request_error",
        "param": null,
        "code": null
    }
}

Use the gpt-4-turbo model (not the preview) and it should be fine.
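
For example, something along these lines should accept both an image and JSON mode. This is an untested sketch assuming the v1 Python SDK; the image URL and prompts are illustrative placeholders.

from openai import OpenAI

client = OpenAI()

# gpt-4-turbo takes the same image-style messages as gpt-4-vision-preview,
# but also accepts response_format, so vision and JSON mode can be combined.
response = client.chat.completions.create(
    model="gpt-4-turbo",
    messages=[
        {"role": "system", "content": "Describe the image as a JSON object."},
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "List the objects you see, as JSON."},
                {"type": "image_url", "image_url": {"url": "https://example.com/photo.jpg"}},
            ],
        },
    ],
    max_tokens=300,
    response_format={"type": "json_object"},
)

print(response.choices[0].message.content)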

Simply mentioning JSON is often insufficient; you are still just prompting the model and hoping it outputs valid JSON.
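
So if you have to fall back to prompting alone (for example on gpt-4-vision-preview, which rejects response_format), it is worth validating the reply rather than trusting it. A small sketch; parse_json_reply is just an illustrative helper name:

import json

def parse_json_reply(content: str):
    """Return the decoded reply, or None if the model did not produce valid JSON."""
    try:
        return json.loads(content)
    except json.JSONDecodeError:
        # Without JSON mode there is no guarantee of valid output, so the
        # caller should retry, repair, or reject the reply here.
        return None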
