Longer limit of responses


I can’t get an API 3.5 turbo response longer than 2500 characters despite asking it for 7500 characters and allocating 3500 tokens.

Does anyone know why, please?


All the models only write text until they think they are done. They can't count, so if you ask for 7500 characters, that number will be ignored. (`max_tokens` is only a cap on the response, not a target, which is why allocating 3500 tokens doesn't help.)

You have to suggest that it write long, verbose, and comprehensive content to get long answers (descriptive words rather than numbers).


In fact, it’s much better that way, thanks.

For those looking for the answer:

Rather than telling the API “I want a text of 7000 characters/words”, I will tell it “I want five chapters, each consisting of three paragraphs”.
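A minimal sketch of that structural-prompt idea. The helper name `build_structured_prompt` and the example topic are my own invention, not anything from the API itself; the point is just to describe the desired length in structure (chapters, paragraphs) rather than in character counts the model cannot measure:

```python
def build_structured_prompt(topic: str, chapters: int = 5, paragraphs: int = 3) -> str:
    """Request length via structure (chapters/paragraphs), not a character count."""
    return (
        f"Write a long, verbose, and comprehensive piece about {topic}. "
        f"Structure it as {chapters} chapters, each consisting of "
        f"{paragraphs} paragraphs."
    )

prompt = build_structured_prompt("the history of espresso")
print(prompt)

# Sketch of the API call (assumes the openai Python package and an API key):
# from openai import OpenAI
# client = OpenAI()
# resp = client.chat.completions.create(
#     model="gpt-3.5-turbo",
#     messages=[{"role": "user", "content": prompt}],
#     max_tokens=3500,  # remember: a cap on output, not a target length
# )
```

If the result is still too short, asking for one chapter per request and stitching the pieces together tends to be more reliable than one very long completion.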