GPT-4 Turbo Vision: invalid max_tokens parameter

The GPT-4 Turbo Vision model gpt-4-vision-preview returns an error when the value of max_tokens is null.

According to the API reference documentation, null is a valid value for max_tokens: https://platform.openai.com/docs/api-reference/chat/create#chat-create-max_tokens

Also, if I don’t send the max_tokens param at all, it does not default to infinite.
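
For anyone trying to reproduce this, here is a minimal sketch with the Python SDK (the model name is from the report above; the image URL is a placeholder). Explicitly passing max_tokens=None serializes to "max_tokens": null and triggers the error described here, while an explicit integer works:

    from openai import OpenAI

    client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

    messages = [
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "What is in this image?"},
                {
                    "type": "image_url",
                    "image_url": {"url": "https://example.com/photo.png"},
                },
            ],
        }
    ]

    # Sends "max_tokens": null -> rejected by gpt-4-vision-preview.
    # response = client.chat.completions.create(
    #     model="gpt-4-vision-preview", messages=messages, max_tokens=None
    # )

    # Sending an explicit integer (or omitting the parameter) succeeds.
    response = client.chat.completions.create(
        model="gpt-4-vision-preview", messages=messages, max_tokens=300
    )
    print(response.choices[0].message.content)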


As I already stated on the other thread, this is documented here:
Vision - OpenAI API

Currently, GPT-4 Turbo with vision does not support the message.name parameter, functions/tools, response_format parameter, and we currently set a low max_tokens default which you can override.
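
A quick way to see the override in action, assuming the Python SDK (the prompt and the token value are arbitrary): when max_tokens is omitted, the low default tends to truncate longer answers (finish_reason == "length"); passing an explicit integer lifts that limit.

    from openai import OpenAI

    client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

    messages = [{"role": "user", "content": "Describe the Mona Lisa in detail."}]

    # Parameter omitted: the vision preview model applies its own low default,
    # so longer answers can come back truncated.
    short = client.chat.completions.create(
        model="gpt-4-vision-preview", messages=messages
    )
    print(short.choices[0].finish_reason, len(short.choices[0].message.content))

    # Explicit integer overrides that default (value chosen arbitrarily).
    full = client.chat.completions.create(
        model="gpt-4-vision-preview", messages=messages, max_tokens=1024
    )
    print(full.choices[0].finish_reason, len(full.choices[0].message.content))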


It is possible to omit max_tokens, but if it is included, its data type is validated against the API schema. From the Azure OpenAI schema:

      "max_tokens": {
        "description": "The maximum number of tokens allowed for the generated answer. By default, the number of tokens the model can return will be (4096 - prompt tokens).",
        "type": "integer",
        "default": 4096
      },