Is it possible to summarize large amounts of text?

I would like to use ChatGPT’s API to read and summarize an entire novel.

However, I cannot do this in a single request, because the novel exceeds the limit on how much text the model can read at one time.

Please let me know what technology or method I can try to achieve this.

Right now I am imagining using LlamaIndex, Pinecone, etc., together with the embeddings API.

Thank you in advance.

Yes, summarize each page separately and then put them together.

Set up an API client, then process the novel programmatically, one page at a time.
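A minimal sketch of that loop, assuming a `summarize()` callable that wraps the ChatGPT API (a stub stands in here so the chunking flow can run offline; the 1,000-word page size is illustrative, not an API limit):

```python
def split_into_pages(text, max_words=1000):
    """Split text into chunks of at most max_words words each."""
    words = text.split()
    return [" ".join(words[i:i + max_words])
            for i in range(0, len(words), max_words)]

def summarize_pages(pages, summarize):
    """Summarize each page independently and join the results."""
    return "\n".join(summarize(page) for page in pages)

# Stub in place of a real API request; a real version would send the page
# with a "summarize this" prompt and return the model's reply.
def stub_summarize(page):
    return " ".join(page.split()[:20])

novel = "word " * 2500          # a 2,500-word stand-in for the novel text
pages = split_into_pages(novel, max_words=1000)
print(len(pages))               # → 3
```

Each call to `summarize` is an independent request, so no single request ever exceeds the limit.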

Thank you very much.

I understand how to summarize a long novel: summarize it one chapter at a time, then summarize the collected summaries.
I have two further questions.

  1. If the word count of Chapter 1 exceeds the token limit, do I need to divide that chapter further and summarize each part?
  2. Similarly, if the combined summaries of 10 chapters exceed the token limit, would it be correct to split them into two groups of 5 chapters, summarize each group, and then summarize those results?
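Both questions describe the same recursive pattern, which could be sketched like this (the 1,000-word threshold, the halving split, and the stub summarizer are all assumptions for illustration):

```python
def recursive_summarize(text, summarize, max_words=1000):
    """If text fits under max_words, summarize it directly; otherwise
    split it in half, summarize each half recursively, and summarize
    the combined result (this relies on summarize() shrinking its input)."""
    words = text.split()
    if len(words) <= max_words:
        return summarize(text)
    mid = len(words) // 2
    left = recursive_summarize(" ".join(words[:mid]), summarize, max_words)
    right = recursive_summarize(" ".join(words[mid:]), summarize, max_words)
    return recursive_summarize(left + " " + right, summarize, max_words)

# Stub summarizer; a real one would make one API request per call.
def stub_summarize(text):
    return " ".join(text.split()[:50])

summary = recursive_summarize("word " * 5000, stub_summarize)
```

The same function answers both questions: an oversized chapter gets split and re-summarized, and an oversized pile of chapter summaries gets the same treatment.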

Thank you in advance for your cooperation.

1- Yes, you can summarize pages with the model. Just provide text of no more than 5,000 words (roughly 8,000 tokens), and the model will return a shorter version of it.

2- Token limits apply per request. Make a new request for each chapter or page, then combine the results; the combined summaries are usually coherent.
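As a toy illustration of both points, one could check each chapter against the rough 5,000-word figure before sending it, and combine the per-request results afterwards (the limit constant and helper names are assumptions, not part of the API):

```python
WORD_LIMIT = 5000  # rough per-request limit quoted above

def fits_in_one_request(text, limit=WORD_LIMIT):
    """Cheap word-count check before sending a chapter to the API."""
    return len(text.split()) <= limit

def combine_summaries(summaries):
    """Join per-chapter summaries into one document for a final pass."""
    return "\n\n".join(summaries)

chapters = ["chapter one " * 10, "chapter two " * 10]  # toy chapters
assert all(fits_in_one_request(ch) for ch in chapters)
# One API request per chapter would go here; stubbed with truncation:
summaries = [" ".join(ch.split()[:5]) for ch in chapters]
print(combine_summaries(summaries))
```

Word counts only approximate token counts; a tokenizer such as tiktoken would give an exact figure.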