Chat completions API - max_tokens default value is missing

What is the default value for the max_tokens parameter in the chat completions API? See the documentation link below. This is the only parameter for which the OpenAI dev team didn't indicate a default value - why?

https://platform.openai.com/docs/api-reference/chat/create#chat-create-max_tokens

Interesting. Thanks for the feedback @dilshat.

Per earlier docs, on the chat completions endpoint, max_tokens by default could use whatever tokens remained in the model's context window after the prompt, capped at 4096 output tokens, as far as I remember. Vision models were different: their max_tokens defaulted to 256 tokens.
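
If you'd rather not depend on the undocumented default, you can always pass max_tokens explicitly. A minimal sketch, assuming the openai Python package (v1-style client), an OPENAI_API_KEY in the environment, and a placeholder model name:

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Set max_tokens explicitly instead of relying on the default,
# so the completion length is predictable regardless of the model.
response = client.chat.completions.create(
    model="gpt-4o",  # example model; substitute whichever model you use
    messages=[
        {"role": "user", "content": "Explain max_tokens in one sentence."}
    ],
    max_tokens=256,  # hard cap on completion tokens
)

print(response.choices[0].message.content)
print(response.usage)  # prompt_tokens, completion_tokens, total_tokens
```

The usage object in the response lets you verify how many completion tokens were actually generated against the cap you set.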