Can I set max_tokens for chatgpt turbo?

Hi @Levatron

I have tried that before and it has never worked because the models cannot count chars in their own completions.

So, I tried it again just now, with the same results as the last time I checked. Here I inject a system message to limit the chars, but it does not work. I have tried counting chars, words, etc. before, and it has never worked as required. The models just do not count. This has been discussed at length here in our community before, BTW.

system: Do not reply with more than 20 characters.

The reason the completion was cut-off was that I set max_tokens to 100, FYI.
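You can detect that kind of cut-off programmatically: the Chat Completions response carries a `finish_reason` per choice, which is `"length"` when `max_tokens` truncated the output and `"stop"` when the model finished naturally. A minimal sketch, assuming a response dict shaped like the API payload (the sample response below is hypothetical):

```python
def was_truncated(response: dict) -> bool:
    # "stop" means the model finished on its own;
    # "length" means max_tokens cut the completion off.
    return any(
        choice.get("finish_reason") == "length"
        for choice in response.get("choices", [])
    )

# Hypothetical response where max_tokens=100 stopped the model mid-sentence.
resp = {
    "choices": [
        {"message": {"content": "Monkeys can live up"}, "finish_reason": "length"}
    ]
}
print(was_truncated(resp))  # → True
```

So if you must hard-cap output, check `finish_reason` rather than trusting the model to count.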

:slight_smile:


Hi Ruby!

Yeah, generating code is practically impossible with those limitations. I’ve only tried it for simple questions like:

{"role": "system", "content": "Explain your answer within 150 characters."},
{"role": "user", "content": "How old does a monkey get?"},

temperature=0.9,
max_tokens=500,
top_p=0.1,
frequency_penalty=0.2,
presence_penalty=0.0,

Which seemed to work fine. :slightly_smiling_face:
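For reference, here are the messages and sampling parameters above assembled into one request. The model name and the (legacy, pre-1.0 SDK) `openai.ChatCompletion.create` call are assumptions; the point to note is that `max_tokens` is a hard cap in *tokens*, while the "150 characters" system prompt is only a soft hint the model may or may not follow:

```python
# Request assembled from the snippet above. Note: max_tokens limits the
# completion in tokens; the character limit in the system prompt is advisory.
request = dict(
    model="gpt-3.5-turbo",  # assumed model for this thread
    messages=[
        {"role": "system", "content": "Explain your answer within 150 characters."},
        {"role": "user", "content": "How old does a monkey get?"},
    ],
    temperature=0.9,
    max_tokens=500,
    top_p=0.1,
    frequency_penalty=0.2,
    presence_penalty=0.0,
)

# Actually sending it requires the openai package (<1.0) and an API key:
# response = openai.ChatCompletion.create(**request)
print(request["max_tokens"])  # → 500
```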


That is good to know, @Levatron, thanks!

Since I work with code on a daily basis, that is cool to know.

:slight_smile: