Any idea how to input more than 8k tokens into GPT-4?

Actually, I am waiting for GPT-4 32k for a specific task, but I only have access to GPT-4 (8k). Still, it's better than nothing. Any idea how to work with more than 8k tokens of input? Maybe something like splitting and merging?

You can process the data in chunks.
You can summarize the chunks, and then summarize the summaries (see the first sketch below).
You can use a knowledge database that retrieves only the data relevant to a query (see the second sketch below).
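
Here is a minimal sketch of the chunk-then-summarize idea, assuming the openai Python package (v1 client) with `OPENAI_API_KEY` set in the environment; the chunk size, prompts, and file handling are illustrative, not tuned:

```python
import openai  # pip install openai

client = openai.OpenAI()  # reads OPENAI_API_KEY from the environment

def summarize(text: str, max_tokens: int = 500) -> str:
    # Ask the model for a compact summary of one chunk.
    resp = client.chat.completions.create(
        model="gpt-4",
        max_tokens=max_tokens,
        messages=[
            {"role": "system", "content": "Summarize the user's text concisely."},
            {"role": "user", "content": text},
        ],
    )
    return resp.choices[0].message.content

def summarize_long(text: str, chunk_chars: int = 12_000) -> str:
    # ~12,000 characters is very roughly 3,000 tokens for English text,
    # which stays safely inside the 8k context window.
    chunks = [text[i:i + chunk_chars] for i in range(0, len(text), chunk_chars)]
    partials = [summarize(c) for c in chunks]
    # Summarize the summaries into one final answer.
    return summarize("\n\n".join(partials))
```

If the joined partial summaries are still too long, apply the same step recursively until the result fits in one request.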
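
And a sketch of the knowledge-database route, again just illustrative: embed the chunks once, then retrieve only the most relevant ones for a given query and send just those to the model. This assumes the same v1 client and the text-embedding-ada-002 model, with plain cosine similarity standing in for a real vector store:

```python
import numpy as np  # pip install numpy
import openai

client = openai.OpenAI()

def embed(texts: list[str]) -> np.ndarray:
    # One embedding vector per input string.
    resp = client.embeddings.create(model="text-embedding-ada-002", input=texts)
    return np.array([d.embedding for d in resp.data])

def top_k_chunks(query: str, chunks: list[str], k: int = 3) -> list[str]:
    # Cosine similarity between the query and every chunk embedding.
    vecs = embed(chunks + [query])
    chunk_vecs, q = vecs[:-1], vecs[-1]
    sims = chunk_vecs @ q / (np.linalg.norm(chunk_vecs, axis=1) * np.linalg.norm(q))
    # Indices of the k most similar chunks, best first.
    return [chunks[i] for i in np.argsort(sims)[::-1][:k]]
```

In practice you would embed the chunks once up front and cache the vectors, then only embed the query at question time.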

The 8k context window (8,192 tokens) is shared between input and output, so you cannot load more than about 8,190 tokens if you want even a 1-token answer.
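
You can check where you stand before sending anything using tiktoken, OpenAI's tokenizer library; a quick sketch (the file name is a placeholder):

```python
import tiktoken  # pip install tiktoken

enc = tiktoken.encoding_for_model("gpt-4")
text = open("document.txt").read()  # whatever you plan to send
n_tokens = len(enc.encode(text))
# The 8,192-token window is shared between prompt and completion,
# so n_tokens plus the reply's max_tokens must stay under 8,192.
print(n_tokens)
```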

Is that response still accurate? Yes, I want to summarize texts and draw conclusions from them based on my query.

An AI summary will typically be much smaller than the input. Roughly 500-7,000 tokens in → 500 tokens out is how OpenAI's models have been trained, a size limitation that is hard to overcome. So feeding in 1,000-token chunks and reassembling the ~500-token summaries that come back necessarily discards some words and nuance.

Also, the output will be rewritten in natural AI language. For example, if you were to put Edgar Allan Poe in and ask about a particular archaic usage in one passage, that detail may have been paraphrased away, losing the turn of phrase or symbolism needed to answer questions about the literary style.
