Can I set max_tokens for chatgpt turbo?

I don’t mind the billing; I just can’t handle a response that has more than 1600 characters (letters/spaces). Is there a way to tell the OpenAI API to take that into account and not generate responses longer than that?
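
Here’s a rough sketch of what I have in mind, assuming the Python openai package and its ChatCompletion interface (the prompt and the 400-token value are just placeholders; as I understand it, max_tokens counts tokens rather than characters, and ~4 characters per token is only a rough rule of thumb):

```python
import openai

openai.api_key = "sk-..."  # my API key

# max_tokens caps the length of the generated reply.
# ~400 tokens is my rough guess for a ~1600-character limit
# (roughly 4 characters per token on average).
response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "user", "content": "Summarize the plot of Hamlet."},
    ],
    max_tokens=400,
)

print(response["choices"][0]["message"]["content"])
```

Is this the right way to do it, or will the model just get cut off mid-sentence when it hits the cap?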