Error: GPT3 responding with finish_reason == None

This behaviour is undocumented and something I have never seen before; it only appears with text-davinci-003.

  • Occurs intermittently (roughly 1 in 5 requests).
  • When finish_reason is returned as None, the summary quality appears normal.
  • Endpoint is completions.
  • Started noticing this in the last hour.

Sample response when handling responses with the openai Python library:

{'choices': [{'finish_reason': None, 'index': 0, 'logprobs': None, 'text': X (but normal)}], 'created': X, 'id': X, 'model': 'text-davinci-003', 'object': 'text_completion', 'usage': X}

Is this a permanent change, an OpenAI error, or a model error? How should I handle it? (The same as "stop"?)


Is there a fast way of contacting OpenAI support about this?

You can sometimes get help on their Discord, if you can navigate the flood of ongoing conversations.

Might be something new they’re testing… or a bug…

Is it affecting your code somehow?

Yeah, we don't know whether or how we should handle the case where None is returned. Previously the code was set up to treat anything other than "stop" as a failure, but the summaries that come back with None seem fine…


Yeah, the verbiage makes me think it's temporary. You might add it to your whitelist, though, if it's giving good completions… maybe with an extra warning to users?
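A minimal sketch of that whitelist approach, assuming you're working with the choice dicts the openai library returns (the handle_choice helper and the accepted-reason set are hypothetical names, not part of any library):

```python
import logging

# Provisionally whitelist None alongside "stop", since completions
# returned with finish_reason == None look normal.
ACCEPTED_FINISH_REASONS = {"stop", None}

def handle_choice(choice: dict) -> str:
    """Return the completion text, or raise if the finish reason is bad."""
    reason = choice.get("finish_reason")
    if reason == "length":
        raise ValueError("completion truncated: hit max_tokens")
    if reason not in ACCEPTED_FINISH_REASONS:
        raise ValueError(f"unexpected finish_reason: {reason!r}")
    if reason is None:
        # Keep the odd case visible while accepting the output.
        logging.warning("finish_reason was None; accepting completion anyway")
    return choice["text"]
```

Usage would be something like `text = handle_choice(response["choices"][0])`, so a None reason is logged but no longer treated as a hard failure.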

Good luck!

Hi @guan

I always specify a stop sequence (normally ####) and have only ever seen completions end for one of two reasons:

  1. ‘stop’
  2. ‘length’

Honestly, I have never experienced a "None" reason for a finished completion, and like you I mostly use the text-davinci-003 model.

If you have a specific prompt you have experienced issues with, please post the prompt (and all your API completion parameters) and I will test it for you.

Please note that I can reproduce the "None" finish reason by including the stop sequence in the prompt itself, so I recommend you only send the stop sequence via the API's stop parameter and do not include it in the prompt text.
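To make that concrete, here is a small sketch that keeps the stop sequence out of the prompt and passes it only as the API's stop parameter (the build_completion_kwargs helper and the #### sequence are my own conventions, not from any library):

```python
STOP = "####"  # the stop sequence I normally use

def build_completion_kwargs(prompt: str) -> dict:
    """Build kwargs for a completions call with the stop sequence
    passed as a parameter, never embedded in the prompt text."""
    # Strip any stray copy of the stop sequence from the prompt itself;
    # including it there is what seems to trigger the None finish_reason.
    clean_prompt = prompt.replace(STOP, "").rstrip()
    return {
        "model": "text-davinci-003",
        "prompt": clean_prompt,
        "stop": STOP,        # stop sequence goes here only
        "max_tokens": 256,
    }

# Then call the API with these kwargs, e.g.:
# response = openai.Completion.create(**build_completion_kwargs(my_prompt))
```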

Hope this helps.