Sample Python code:

import openai

response = openai.Completion.create(
    model='code-davinci-002',
    prompt=['<|start|>1<|end|><|start|>2<|end|><|start|>'],
    max_tokens=10,
    stop=['<|end|>'],
    temperature=0,
)
Around 20-30% of the time, it returns:
"choices": [
  {
    "finish_reason": null,
    "index": 0,
    "logprobs": null,
    "text": "3"
  }
]
Otherwise, it returns the expected result:
"choices": [
  {
    "finish_reason": "stop",
    "index": 0,
    "logprobs": null,
    "text": "3"
  }
]
In some cases, for longer prompts, it fails ~100% of the time. If the start/end tags are changed from <|this|> to <this>, the problem never happens.
Any ideas why this might be the case? It seems to have started occurring today, since some older prompts of mine, which I know were working correctly just a couple of days ago, no longer work.
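In case it helps anyone reproduce this, here is a rough sketch of how I've been counting the failures (assuming the legacy pre-1.0 openai Python SDK; the loop, run count, and counter are just for illustration):

import openai

# Repeat the same request and count how often finish_reason comes back
# as null (None in Python) instead of "stop".
runs = 20
null_count = 0
for _ in range(runs):
    response = openai.Completion.create(
        model='code-davinci-002',
        prompt=['<|start|>1<|end|><|start|>2<|end|><|start|>'],
        max_tokens=10,
        stop=['<|end|>'],
        temperature=0,
    )
    if response['choices'][0]['finish_reason'] is None:
        null_count += 1

print(f'{null_count}/{runs} responses had finish_reason = null')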
Hi @ivan.bestvina
What completion are you expecting with a prompt like the one above?

I wrote the expected result above, which I get ~70% of the time. As far as the completion text goes, it always seems to be correct, but the finish_reason is sometimes, randomly, null instead of 'stop'.
I see, sorry I missed it
Expected result:
3
Are these all the prompt params you are using?
model='code-davinci-002',
prompt=['<|start|>1<|end|><|start|>2<|end|><|start|>'],
max_tokens=10,
stop=['<|end|>'],
temperature=0
No problem, thanks for responding! Yes, those are the only parameters I set; all the others are left at their defaults. I tried other parameters as well, but the problem still seems to occur with frequency_penalty and similar parameters changed.
FWIW (not much): if I remove the list wrappers from your prompt and stop parameters, so far I get 100% expected completions. Tried 10 times.
[Screenshots: Prompt Setup and Example Output]
Update: Tried 20 times, 100% expected results.
Probably does not help much, but just thought you might like to know.
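The prompt setup was essentially your call with the list wrappers dropped, something along these lines (legacy SDK assumed; this snippet is a sketch, not a copy of my screenshot):

import openai

# Same request, but with prompt and stop passed as plain strings
# rather than single-element lists.
response = openai.Completion.create(
    model='code-davinci-002',
    prompt='<|start|>1<|end|><|start|>2<|end|><|start|>',
    max_tokens=10,
    stop='<|end|>',
    temperature=0,
)
print(response['choices'][0]['finish_reason'])  # "stop" in every run so far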

Thanks for noting that! Sadly, I'm running a bunch of prompts in parallel, so I cannot remove the array. I just put a single example here for simplicity. But maybe your info helps someone from OAI fix the problem.
Yeah, gotcha… I noticed problems with arrays a few weeks ago.
No idea why either.
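If you can't drop the array, one thing that might be worth trying is sending the strings individually but concurrently, e.g. with a thread pool. Just a sketch under those assumptions (the second prompt and the worker count are made up):

import openai
from concurrent.futures import ThreadPoolExecutor

prompts = [
    '<|start|>1<|end|><|start|>2<|end|><|start|>',
    '<|start|>a<|end|><|start|>b<|end|><|start|>',  # hypothetical second prompt
]

def complete(prompt):
    # One request per prompt, with prompt and stop as plain strings.
    return openai.Completion.create(
        model='code-davinci-002',
        prompt=prompt,
        max_tokens=10,
        stop='<|end|>',
        temperature=0,
    )

with ThreadPoolExecutor(max_workers=4) as pool:
    responses = list(pool.map(complete, prompts))

for r in responses:
    print(r['choices'][0]['text'], r['choices'][0]['finish_reason'])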

Just started having this same problem today; it's happening with “length” as well.