I fine-tuned a model based on gpt-3.5-turbo-0125, which is said to have a 16,385-token context window. However, I got the error below when I set max_tokens to 15000. Is there something wrong?
openai.BadRequestError: Error code: 400 - {'error': {'message': 'max_tokens is too large: 15000. This model supports at most 4096 completion tokens, whereas you provided 15000.', 'type': 'invalid_request_error', 'param': 'max_tokens', 'code': None}}
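For context, the error comes from the distinction between the two limits: 16,385 is the model's total context window (prompt plus completion), while the completion itself is capped separately at 4,096 tokens, which is what max_tokens controls. A minimal sketch of a guard for this, with the fine-tuned model id as a hypothetical placeholder:

```python
# gpt-3.5-turbo-0125 has a 16,385-token context window, but completion
# length is capped separately at 4,096 tokens; max_tokens must not exceed that.
COMPLETION_CAP = 4096  # max completion tokens for gpt-3.5-turbo-0125

def clamp_max_tokens(requested: int, cap: int = COMPLETION_CAP) -> int:
    """Clamp a requested max_tokens value to the model's completion cap."""
    return min(requested, cap)

# The failing request would then be made like this (commented out here;
# the fine-tune id is a hypothetical placeholder):
# resp = client.chat.completions.create(
#     model="ft:gpt-3.5-turbo-0125:my-org::abc123",
#     messages=messages,
#     max_tokens=clamp_max_tokens(15000),  # 15000 is clamped to 4096
# )

print(clamp_max_tokens(15000))  # 4096
print(clamp_max_tokens(2000))   # 2000
```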