Text-davinci-003 always returning 16 tokens regardless of max_tokens

@kiersten, max_tokens is only a ceiling: it stops the model from generating more than that many tokens, but it doesn't make it generate more. What you're looking for is a minimum-length setting, which is not currently a parameter.
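
To make that distinction concrete, here's a minimal sketch using the legacy openai Python library (pre-1.0 Completions endpoint); the API key and prompt are placeholders. Even with a large max_tokens, the model can finish well short of the limit:

```python
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder

# max_tokens is an upper bound on the completion length,
# not a target or a minimum -- the model may stop much earlier.
response = openai.Completion.create(
    model="text-davinci-003",
    prompt="Write a poem about the ocean.",
    max_tokens=1000,  # ceiling only
)

print(response["choices"][0]["text"])
print("completion tokens used:", response["usage"]["completion_tokens"])
```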

Paul makes a good point: asking explicitly for a longer poem with descriptive words like long, detailed, and epic helps a lot. In addition to changing the frequency_penalty to 0, you might consider trying a different model like Curie. Davinci is generally better at following instructions, but I find Curie sometimes helps when trying for longer responses.
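
As a rough illustration of those suggestions (explicit length cues in the prompt, frequency_penalty set to 0, and trying a different model), here's a hedged sketch with the same legacy library; the prompt wording and the choice of text-curie-001 are just examples, not a recommendation:

```python
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder

# Ask explicitly for length in the prompt and relax the frequency penalty.
response = openai.Completion.create(
    model="text-curie-001",  # example of swapping in a different model
    prompt="Write a long, detailed, epic poem about the ocean.",
    max_tokens=1000,
    frequency_penalty=0,
)

print(response["choices"][0]["text"])
```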