New 4,000 token context window (prompt+output)

I think I missed the announcement, but it appears you can now use a total of 4,000 tokens, shared between prompt and output. This should help chat bots and some other applications. Thanks to the OpenAI team! Keep it up!
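As a rough illustration of how a shared budget works (assuming the 4,000 tokens are a combined prompt-plus-completion limit, as described above; `max_completion_tokens` is a hypothetical helper, not part of the OpenAI API):

```python
# Sketch of the 4,000-token budget: prompt and completion draw from one limit.
# All names here are illustrative, not actual API calls.

CONTEXT_LIMIT = 4000  # total tokens available for prompt plus completion

def max_completion_tokens(prompt_tokens: int, limit: int = CONTEXT_LIMIT) -> int:
    """Return how many output tokens remain once the prompt is counted."""
    if prompt_tokens >= limit:
        raise ValueError("prompt alone exceeds the context window")
    return limit - prompt_tokens

# e.g. a ~3,000-token prompt leaves 1,000 tokens for the completion,
# which is the most you could ask for via the request's max_tokens setting
print(max_completion_tokens(3000))
```

In other words, a longer prompt directly shrinks the longest completion you can request, and vice versa.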




Paul, do you know if the new, higher 4K token limit also applies to embeddings and fine-tunes? I couldn’t find any indication in the documentation. Thanks.

AFAIK, it’s only text-davinci-002 so far… I have high hopes, though, so we shall see.

You are right, confirmed with support. Thanks.


This doesn’t apply to live apps though, does it? I asked support and they said the maximum is 250 tokens. The app review form also has a checkbox confirming you will not use more than 250 tokens or 1,000 characters of user input, so I’m a bit confused about this. I do see that it’s available in the Playground, however. Any clarification? Thank you!
