How to handle long prompts that exceed the token limit?

For example, if I want to build a doc summarization tool on top of ChatGPT, some docs are super long. Reading through the API reference, it requires me to put the doc content in the prompt and the summary in the completion. However, if the content is too long, the summary can lose context.

I know ChatGPT is able to keep track of sequential inputs. Is there a way to link several prompts to one shared completion? That way I could, for example, divide my content into several smaller parts that all map to the same expected completion, and ChatGPT could learn from that.


You’ll need to break the document into sections and merge the resulting completions.

If the prior context is necessary for the summarization of subsequent sections, you can try adding shortened context to your summarization prompt. This might include the document title, headings, or even summaries of prior sections.
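A minimal sketch of that chunk-and-carry-context approach. The `summarize` function here is a stub standing in for an actual completion call (the real prompt template and API call are assumptions you'd fill in); the chunking and the running-summary loop are the part being illustrated.

```python
def chunk_text(text, max_chars=3000, overlap=200):
    """Split text into overlapping chunks so each fits within the prompt limit.

    The overlap carries a little raw context across chunk boundaries.
    """
    chunks = []
    start = 0
    while start < len(text):
        end = min(start + max_chars, len(text))
        chunks.append(text[start:end])
        if end == len(text):
            break
        start = end - overlap  # step back so chunks share some text
    return chunks


def summarize(section, context=""):
    # Stub: replace with a real completion call, e.g. build a prompt like
    #   f"Summary of earlier sections: {context}\n\nSummarize:\n{section}"
    # and send it to the API. Here we just truncate so the sketch runs.
    return (context + " " + section)[:200].strip()


def summarize_document(text):
    """Summarize each section in turn, feeding the prior summary in as context."""
    running_summary = ""
    for section in chunk_text(text):
        running_summary = summarize(section, context=running_summary)
    return running_summary
```

The final merge can be as simple as summarizing the concatenated section summaries one more time, or, as noted above, prepending the title, headings, or earlier summaries as shortened context for each call.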
