PDF summarizer using OpenAI

Hi @jr.2509 @SomebodySysop, thanks for the input. Did you use a chat completion model or a summarization chain with an LLM?

Technically, chat completion, although I do sometimes use chunk summarization within this framework.
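
For anyone curious, here is a rough sketch of what that kind of per-chunk summarization can look like with the Chat Completions API. This assumes the openai v1 Python SDK and an `OPENAI_API_KEY` in the environment; the chunk size and prompt wording are just illustrative, not a specific recipe from this thread.

```python
from openai import OpenAI

client = OpenAI()

def summarize_chunks(text: str, chunk_size: int = 8000) -> str:
    # Naive character-based chunking; a token-aware splitter is better in practice.
    chunks = [text[i:i + chunk_size] for i in range(0, len(text), chunk_size)]
    partial_summaries = []
    for chunk in chunks:
        response = client.chat.completions.create(
            model="gpt-3.5-turbo-1106",
            messages=[
                {"role": "system", "content": "Summarize the following text concisely."},
                {"role": "user", "content": chunk},
            ],
        )
        partial_summaries.append(response.choices[0].message.content)
    # Combine the per-chunk summaries into one final summary.
    final = client.chat.completions.create(
        model="gpt-3.5-turbo-1106",
        messages=[
            {"role": "system", "content": "Combine these partial summaries into one coherent summary."},
            {"role": "user", "content": "\n\n".join(partial_summaries)},
        ],
    )
    return final.choices[0].message.content
```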


You are filling up the context window with your input text and leaving the model no tokens to reply with. You need to keep your input comfortably under the context limit so there is room left for a response.
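
A quick way to check this before sending a request is to count tokens with tiktoken and leave a reserve for the reply. This is only a sketch: the context limit and reserve below are illustrative, so check the model docs for the actual numbers.

```python
import tiktoken

CONTEXT_LIMIT = 16385       # context window for gpt-3.5-turbo-1106 (verify in the docs)
RESERVED_FOR_REPLY = 1000   # tokens kept free for the model's response

def fits_in_context(prompt: str, model: str = "gpt-3.5-turbo") -> bool:
    enc = tiktoken.encoding_for_model(model)
    n_tokens = len(enc.encode(prompt))
    return n_tokens <= CONTEXT_LIMIT - RESERVED_FOR_REPLY
```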

I would use the gpt-3.5-turbo-1106 model, as well. Take a look at the docs: https://platform.openai.com/docs/models/gpt-3-5 … for context window limits.

Either keep summarizing small chunks of your document (keeping each prompt short enough to leave room for a response) or use embeddings with a vector database.
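
If you go the embeddings route, the idea is to embed the chunks once and then retrieve only the most relevant ones for a given question instead of sending the whole document. A minimal sketch, assuming the openai v1 SDK, with plain numpy cosine similarity standing in for a real vector database; the model name and top-k value are illustrative:

```python
import numpy as np
from openai import OpenAI

client = OpenAI()

def embed(texts: list[str]) -> np.ndarray:
    response = client.embeddings.create(model="text-embedding-ada-002", input=texts)
    return np.array([item.embedding for item in response.data])

def top_chunks(question: str, chunks: list[str], k: int = 3) -> list[str]:
    chunk_vecs = embed(chunks)
    q_vec = embed([question])[0]
    # Cosine similarity between the question and every chunk.
    sims = chunk_vecs @ q_vec / (np.linalg.norm(chunk_vecs, axis=1) * np.linalg.norm(q_vec))
    best = np.argsort(sims)[::-1][:k]
    return [chunks[i] for i in best]
```

You would then pass only those top chunks to the chat model, which keeps the prompt well under the context limit regardless of how long the PDF is.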