Practical Tips for Dealing with Large Documents (>2048 tokens)

Ok first, @Jacques1, what you said doesn’t make sense; this is a place to ask questions.

Anyway, a while ago I asked a similar question here:

Basically, there are a few methods for doing what you want. One is to break the text into chunks and summarise each chunk. If you then feed GPT the summarised information plus the last few sentences of the original text, it can generate a decent response. There are a few difficulties with this, the main one being building an accurate, consistent summariser, but it is a workable approach.
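A minimal sketch of that chunk-then-summarise idea, in Python. The `summarise` function here is just a placeholder (it keeps the first sentence); in practice you would replace it with an actual API call, and use a real token count instead of the rough word count:

```python
def chunk_text(text, max_words=500):
    """Split text into chunks of at most max_words words
    (a rough stand-in for a real token count)."""
    words = text.split()
    return [" ".join(words[i:i + max_words])
            for i in range(0, len(words), max_words)]

def summarise(chunk):
    # Placeholder summariser: keep the first sentence.
    # Swap this for a real GPT call in practice.
    return chunk.split(".")[0] + "."

def build_prompt(document, tail_sentences=3, max_words=500):
    """Combine chunk summaries with the last few sentences verbatim,
    so the model has both global and local context."""
    summaries = [summarise(c) for c in chunk_text(document, max_words)]
    sentences = [s.strip() for s in document.split(".") if s.strip()]
    tail = ". ".join(sentences[-tail_sentences:]) + "."
    return ("Summary so far:\n" + "\n".join(summaries)
            + "\n\nRecent text:\n" + tail)
```

The prompt that comes out (summaries plus recent text) is then what you send to GPT to continue or respond.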

This is a project @daveshapautomator has worked on in the past. It is not fully functional but is still impressive. It might be a good place to start.

Also, this is basically a summary of a couple of other posts with an opinion or two thrown into the mix, so if you want more information, check out some past posts.

Edit: the new davinci model also has a 4000-token limit, so that may help.
