How to limit the number of words in ChatGPT 3.5 output?

Hi,

I have a stupid question, but I can’t find anything helpful on Google :frowning: The following prompt completely ignores the part “Make it no longer than 300 words,” and the output is about 500 words. How can I limit the output length, please?

Thank you for your help!

Write a cover letter based on my resume and job description mentioned above. Also please use style: “formal”, tone: “professional”, position name: “Project Manager”, company name “Facebook”. Make it no longer than 300 words

My resume:
[COPY/PASTE]

Job description:
[COPY/PASTE]

If you specifically want the output to stay within a certain limit, max_tokens can help you out there.

max_tokens caps the length of the generated completion only; your prompt doesn’t count against it, but the prompt plus max_tokens must fit within the model’s context window. Estimate how many tokens you want and set max_tokens accordingly. For English prose, a word is usually around 1.3 tokens (roughly 4 characters per token), so 300 words comes out to about 400 tokens.

However, if the model would otherwise generate a longer response, it gets cut off when it hits the limit, so the last sentence can feel unfinished or incomplete. Watch out for that.
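As a rough sketch of that arithmetic, here’s a small helper that turns a desired word limit into a max_tokens value. The function name, the 1.3 tokens-per-word ratio, and the headroom factor are all illustrative assumptions, not anything from the API itself; for exact counts you’d tokenize the text properly (e.g. with tiktoken).

```python
# Sketch: derive a max_tokens value from a desired word limit.
# The 1.3 tokens-per-word ratio is only a rule of thumb for English
# text, not an exact conversion.
import math

TOKENS_PER_WORD = 1.3  # rough average for English prose

def max_tokens_for_words(word_limit: int, headroom: float = 1.1) -> int:
    """Estimate a max_tokens cap for a response of about word_limit words.

    headroom adds a small buffer so the reply is less likely to be
    cut off mid-sentence at the token limit.
    """
    return math.ceil(word_limit * TOKENS_PER_WORD * headroom)

print(max_tokens_for_words(300))  # roughly 430 tokens for 300 words

# You would then pass the result as the max_tokens parameter of your
# API call (hypothetical call, requires the openai package and a key):
# response = client.chat.completions.create(
#     model="gpt-3.5-turbo",
#     messages=messages,
#     max_tokens=max_tokens_for_words(300),
# )
```

Remember the caveat above: this caps the output hard, so the model may still be cut off mid-sentence if it tries to write more.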

Other than that, have you tried phrasing the constraint positively in the prompt, e.g. “The maximum length of the output must be 300 words”?


I tried

"Write a cover letter less than 300 words..."

and it simply complied (not counting the address, etc.)


Thank you, guys, for the ideas. “Write a cover letter less than 300 words…” really does work for me. Appreciate your help!

It’s actually quite bad at counting. It can’t count while writing, and it thinks in tokens, not words. Asking for a maximum number of sentences or paragraphs may be easier to control.

It can loosely gauge common lengths like 100 and 500, but accuracy drops fast past 500 words.
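Since the model can’t count reliably, one pragmatic option is to check the word count yourself after generation and trim (or re-prompt) when it runs over. This is a minimal sketch of such a guard; the function name and the trim-at-last-sentence heuristic are my own illustrative choices, not part of any API.

```python
# Sketch of a post-hoc guard: verify the word count yourself and trim
# the text when the model runs over the requested limit.

def enforce_word_limit(text: str, limit: int) -> str:
    """Return text unchanged if it is within limit words; otherwise cut
    it down, preferring the last full sentence that still fits."""
    words = text.split()
    if len(words) <= limit:
        return text
    truncated = " ".join(words[:limit])
    # Prefer ending on a sentence boundary if one exists in the kept part.
    cut = truncated.rfind(".")
    return truncated[: cut + 1] if cut != -1 else truncated
```

Instead of trimming, you could also send the over-long draft back with a follow-up prompt asking the model to shorten it.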


Are you sure? I’ve had max_tokens set to 400 while feeding it prompts much larger than that, and I still sometimes receive a 200+ token response. I want to cap it at 200, which limiting max_tokens should theoretically do, but like I said it’s been at 400 this whole time and hasn’t been enforcing that at all.