Welcome to the Community!
We’ve got quite a bit of material here on the Forum discussing summarization strategies.
Note that if your input file exceeds the context window, you really don't have much of a choice but to look into chunking. Chunking is also critical if you'd like to achieve a certain granularity in your summaries. There are different ways you can go about chunking.
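To make the idea concrete, here's a minimal sketch of one way to chunk a long text. It splits on words with a sliding overlap so context isn't lost at chunk boundaries; the function name `chunk_text` and the word-based counting are my own simplification (a real implementation would count tokens, e.g. with tiktoken, since models bill and limit by tokens, not words):

```python
def chunk_text(text, max_words=1000, overlap=100):
    """Split text into overlapping word-based chunks.

    Word counts are a rough proxy for tokens; swap in a real
    tokenizer (e.g. tiktoken) for production use.
    """
    words = text.split()
    chunks = []
    step = max(1, max_words - overlap)  # guard against overlap >= max_words
    for start in range(0, len(words), step):
        chunks.append(" ".join(words[start:start + max_words]))
        if start + max_words >= len(words):
            break  # last chunk already covers the tail
    return chunks
```

Each chunk can then be summarized independently, which is also how you control granularity: smaller chunks give you finer-grained summaries.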
I'm sharing below links to a few threads / posts discussing the topic:
Where have you gotten stuck?
There are multiple levels of sophistication, depending on your budget and the complexity of the work you're handling.
summary of summaries with gpt-3.5
```mermaid
graph TD
    C1[Chapter 1] --> S1[Summary of Chapter 1 with GPT-3.5]
    C2[Chapter 2] --> S2[Summary of Chapter 2 with GPT-3.5]
    C3[Chapter 3] --> S3[Summary of Chapter 3 with GPT-3.5]
    C4[Chapter 4] --> S4[Summary of Chapter 4 with GPT-3.5]
    C5[Chapter 5] --> S5[Summary of Chapter 5 with GPT-3.5]
```
S1…
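The diagram above boils down to two passes: summarize each chapter on its own, then summarize the concatenated chapter summaries. A minimal sketch of that control flow, with the model call injected as a plain function so nothing here depends on a particular SDK (`summarize_book` is a hypothetical name; in practice `summarize` would wrap one GPT-3.5 call):

```python
def summarize_book(chapters, summarize):
    """Summary-of-summaries: `summarize` is one model call (str -> str)."""
    # First pass: summarize each chapter independently.
    chapter_summaries = [summarize(chapter) for chapter in chapters]
    # Second pass: summarize the concatenated chapter summaries.
    return summarize("\n\n".join(chapter_summaries))
```

Injecting the model call also makes the pipeline easy to test with a stub, and lets you use a cheaper model for the first pass and a stronger one for the final pass if you want.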
Google has a million of them, but I haven't used them, so I can't tell you anything about them. I used the free phone-app version, but not with anything of value.
Simply put, I built my own data structure using graph databases that run locally, and they serve as unlimited long-term memory for my personal GPT client. You can find it in the forums under the name kruel.ai. It can take voice input, has its own messaging application, and has a doc ingester to fill out its brain with manuals etc. It's still in development…
Hi there!
Summarization is a frequent topic here in this forum and there are a few tried and tested methods for summarization of longer inputs. See the following post by @Diet for an illustrative overview of how to approach it.
In general, the approach depends somewhat on the level of granularity you are looking for in the summary. Technically, if your input text is only 10,000 tokens, you can generate a summary via one API call. However, the maximum length of your summary is bound by com…
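For the single-call case, the knob that caps the summary's length is the completion limit you pass with the request. A hedged sketch, assuming the OpenAI Python SDK (v1+) and the `gpt-3.5-turbo` model; `build_summary_request` and `run_summary` are hypothetical helper names of my own:

```python
def build_summary_request(text, model="gpt-3.5-turbo", max_tokens=500):
    """Assemble parameters for one summarization call.

    `max_tokens` is the upper bound on the summary's length --
    the completion limit referred to above.
    """
    return {
        "model": model,
        "max_tokens": max_tokens,
        "messages": [
            {"role": "system", "content": "Summarize the user's text concisely."},
            {"role": "user", "content": text},
        ],
    }

def run_summary(text):
    """Execute the call (requires `pip install openai` and OPENAI_API_KEY set)."""
    from openai import OpenAI  # imported lazily; assumed SDK >= 1.0
    client = OpenAI()
    resp = client.chat.completions.create(**build_summary_request(text))
    return resp.choices[0].message.content
```

Note that input tokens and `max_tokens` together must fit inside the model's context window, which is exactly why longer inputs push you back toward chunking.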