New to the API – Getting Empty or Cut-Off Responses?

Hi everyone,
I’ve just started experimenting with the OpenAI API and running into a small issue. Sometimes when I send a prompt, I get either a very short reply or just a few words when I expect a longer answer.

Here’s what I’m doing:

  • Using the gpt-4 model via the chat/completions endpoint
  • Prompt example: “Write a 200-word article on climate change”
  • Max tokens is set to 300, temperature 0.7

Sometimes I get good output, but other times it feels like the response gets cut off. Is there something I might be doing wrong with the token settings or prompt structure?

Any guidance would be appreciated!

Thanks 🙏


When you set a max output tokens limit, the model can only generate within that window; once the reply hits the cap, it simply stops, even mid-sentence. Also, tokens are not the same as words: an English word is often more than one token, so a 200-word article can use close to 300 tokens on its own.

If you loosen the limit to something like 1000 max tokens, that should fix it.
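To see why 300 is tight, here's a quick sketch. The words-per-token ratio is just a common rule of thumb (roughly 0.75 English words per token; the real count depends on the tokenizer), but the `finish_reason` field is what the API actually returns — `"length"` means the reply was cut off by the token cap:

```python
# Rule of thumb: 1 token ≈ 0.75 English words. This is an estimate only;
# the exact count depends on the model's tokenizer.

def estimate_tokens(word_count: int, words_per_token: float = 0.75) -> int:
    """Estimate how many tokens a text of `word_count` words will use."""
    return round(word_count / words_per_token)

def was_truncated(finish_reason: str) -> bool:
    """The API sets finish_reason to "length" when the reply hit max_tokens."""
    return finish_reason == "length"

# A 200-word article needs roughly this many output tokens:
print(estimate_tokens(200))  # ≈ 267 — already close to a 300-token cap
```

So your 300-token cap leaves almost no headroom, which is exactly the "sometimes fine, sometimes cut off" behavior you're seeing. Checking `finish_reason` on each response is the reliable way to confirm it.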

PS: by gpt-4, do you mean gpt-4o or gpt-4.1? The original gpt-4 is much more expensive and outdated at this point.
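Putting that together, a request body along these lines should avoid the truncation (the model name and values here are just illustrative choices, not the only valid ones):

```python
# Illustrative chat/completions request body — model and numbers are
# example values, adjust to taste.
payload = {
    "model": "gpt-4o",  # current, much cheaper alternative to the original gpt-4
    "messages": [
        {"role": "user", "content": "Write a 200-word article on climate change"}
    ],
    "max_tokens": 1000,   # roomy cap so a ~270-token article is never cut off
    "temperature": 0.7,
}
print(payload["max_tokens"])
```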
