Idea of context for GPT-3 API

Hi there,

The token limit is a hard technical constraint: we only support 2049 tokens for the prompt and completion combined, so if your prompt is 2,000 tokens, your completion can be at most 49 tokens.
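If it helps, here's a rough sketch of how you might budget the completion against that limit in Python. The use of the GPT-2 tokenizer from `transformers` for counting and the small safety margin are my own assumptions, so double-check the counts against your setup:

```python
# Rough sketch: budget the completion against the 2049-token shared limit.
# Token counts from the GPT-2 tokenizer may differ slightly from the API's
# accounting, so a small margin is left as a buffer.
import openai
from transformers import GPT2TokenizerFast

MODEL_MAX_TOKENS = 2049  # shared budget for prompt + completion

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")

def completion_budget(prompt: str, margin: int = 5) -> int:
    """Return how many completion tokens remain for a given prompt."""
    prompt_tokens = len(tokenizer.encode(prompt))
    return max(0, MODEL_MAX_TOKENS - prompt_tokens - margin)

prompt = "Summarize the conversation so far:\n..."
response = openai.Completion.create(
    engine="davinci",
    prompt=prompt,
    max_tokens=completion_budget(prompt),
)
print(response["choices"][0]["text"])
```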

One creative solution, besides prompt chaining, may be to use the Answers endpoint, which lets you upload a JSONL file of documents and then query them. You could write your past chats out as documents in such a file and query them for relevant context.
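As a rough sketch of that flow (the file name, the chat documents, and the example Q&A pair below are all made up, and the parameters are worth checking against the Answers documentation), you would write the chats to a JSONL file, upload it with the `answers` purpose, and then query it:

```python
# Sketch of the Answers endpoint flow; documents and examples are placeholders.
import json
import openai

# Each line of the JSONL file is one document, e.g. one past chat exchange.
chats = [
    {"text": "User: What's your return policy? Bot: 30 days with a receipt."},
    {"text": "User: Do you ship overseas? Bot: Yes, to most countries."},
]
with open("chats.jsonl", "w") as f:
    for doc in chats:
        f.write(json.dumps(doc) + "\n")

# Upload the file for use with the Answers endpoint.
upload = openai.File.create(file=open("chats.jsonl"), purpose="answers")

# Query the uploaded documents: the API searches them and answers
# using the most relevant ones as context.
answer = openai.Answer.create(
    search_model="ada",
    model="curie",
    question="What did the user ask about shipping?",
    file=upload["id"],
    examples_context="User: Is the store open on Sundays? Bot: Yes, 10am to 4pm.",
    examples=[["When is the store open on Sundays?", "10am to 4pm."]],
    max_tokens=50,
)
print(answer["answers"][0])
```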

Edit: There’s another thread on this that you may be interested in. Another option is to create summaries of past conversations, which are naturally shorter and can let you fit more context into the prompt.
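Something like this minimal sketch could work for the summarization approach (the prompt wording, engine, and settings here are just placeholders): condense the older turns, then prepend the summary to the next prompt.

```python
# Minimal sketch: summarize the older part of a conversation, then prepend
# that summary to the prompt for the next turn so it fits in the token limit.
import openai

def summarize(conversation: str) -> str:
    response = openai.Completion.create(
        engine="davinci",
        prompt=f"Summarize this conversation in a few sentences:\n\n{conversation}\n\nSummary:",
        max_tokens=150,
        temperature=0.3,
    )
    return response["choices"][0]["text"].strip()

older_turns = "User: ...\nBot: ...\n"          # turns that no longer fit
recent_turns = "User: What did we decide?\n"   # keep the latest turns verbatim

summary = summarize(older_turns)
next_prompt = f"Conversation summary: {summary}\n\n{recent_turns}Bot:"
```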
