Does the text-davinci-003 model support 4000 or 4096 tokens?

I’m just trying to confirm that the documentation is correct, because the docs often seem to omit information needed to fully understand something.

The docs say that the context length for text-davinci-003 is 4000 tokens. There would be no reason to doubt this, except that most models have a context length of 2048, and 4096 = 2048 × 2.

So, can an official OpenAI employee/developer confirm that the context length for text-davinci-003 is really 4000 and not 4096?
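The exact number matters for request budgeting: the prompt tokens plus the requested `max_tokens` for the completion must together fit inside the context window. A minimal sketch of that arithmetic (the limits in the table are illustrative assumptions, which is exactly the point of this question; verify them against the official docs or the API's own error messages):

```python
# Assumed context limits for illustration only -- the 4000-vs-4096
# discrepancy asked about above is unresolved here, so do not treat
# these values as authoritative.
ASSUMED_LIMITS = {
    "text-davinci-003": 4000,  # per the docs; possibly actually 4096
    "text-curie-001": 2048,
}

def fits_context(model: str, prompt_tokens: int, max_tokens: int) -> bool:
    """True if prompt tokens plus requested completion tokens fit the window."""
    return prompt_tokens + max_tokens <= ASSUMED_LIMITS[model]

print(fits_context("text-davinci-003", 3800, 200))  # 4000 total -> True
print(fits_context("text-davinci-003", 3900, 200))  # 4100 total -> False
```

If the true limit is 4096 rather than 4000, the second request above would in fact succeed, which is why pinning down the exact figure is worth the question.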

Checking with the team. I wrote those docs based on the information I was given, which I am fairly confident is correct, but I'm double-checking to be sure.
