I don’t mind the billing, I just can’t handle a response that has more than 1600 characters (letters/spaces). Is there a way to tell OpenAI to consider that and not generate responses longer than that?
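One common workaround, sketched below: `max_tokens` limits tokens rather than characters, so you can request a conservative token budget (roughly 4 characters per token for English text is a common heuristic) and then truncate client-side as a safety net. The helper functions and the 4-chars-per-token ratio are assumptions for illustration, not part of the OpenAI SDK.

```python
CHAR_LIMIT = 1600
APPROX_CHARS_PER_TOKEN = 4  # rough heuristic for English text; actual ratio varies

def token_budget(char_limit: int) -> int:
    """Approximate a max_tokens value for a desired character limit."""
    return char_limit // APPROX_CHARS_PER_TOKEN

def truncate_reply(text: str, char_limit: int = CHAR_LIMIT) -> str:
    """Hard-cap a reply at char_limit characters, cutting at a word boundary."""
    if len(text) <= char_limit:
        return text
    cut = text[:char_limit]
    # prefer cutting at the last space so a word is not split mid-way
    space = cut.rfind(" ")
    return cut[:space] if space > 0 else cut
```

You would pass `max_tokens=token_budget(1600)` in the chat completion request, then run the model's reply through `truncate_reply` before use. Instructing the model in the system prompt to keep answers under 1600 characters helps too, though the model does not follow such limits reliably, which is why the client-side cap matters.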