As I already stated on the other thread, this is documented here: Vision - OpenAI API
"Currently, GPT-4 Turbo with vision does not support the message.name parameter, functions/tools, response_format parameter, and we currently set a low max_tokens default which you can override."
You can omit max_tokens entirely, but if you include it, its data type is validated against the API schema. Azure's schema, for example, defines it as:
```json
"max_tokens": {
  "description": "The maximum number of tokens allowed for the generated answer. By default, the number of tokens the model can return will be (4096 - prompt tokens).",
  "type": "integer",
  "default": 4096
},
```
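As a sketch of the two options, here is how a request payload might be built either without max_tokens (accepting the default) or with it validated as a positive integer before sending. The function name and model string here are illustrative, not part of any SDK:

```python
# Minimal sketch: build a chat completions payload, validating max_tokens
# as an integer before sending, since the API schema rejects other types.

def build_payload(prompt, max_tokens=None):
    payload = {
        "model": "gpt-4-turbo",  # illustrative model name
        "messages": [{"role": "user", "content": prompt}],
    }
    if max_tokens is not None:
        # Schema declares "type": "integer"; fail fast on anything else.
        if not isinstance(max_tokens, int) or max_tokens <= 0:
            raise TypeError("max_tokens must be a positive integer")
        payload["max_tokens"] = max_tokens  # overrides the low default
    return payload
```

Omitting the argument simply leaves the key out of the payload, so the server-side default applies.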