Summarizing large input text at once

I have a large set of questions and answers that I need to summarize…

The positive news is that it works really well and produces good summaries for at least 80% of the text. But I have been facing a major challenge when summarizing questions/answers with OpenAI, and it gets worse as the input grows with the number of question/answer pairs. In that case it throws an error about invalid token length (please see attached). We can do exception handling, but is there a way for OpenAI to handle large input and return the summarized text at once? Also, could you suggest any other (reasonably cheap) application I could use on top of my software to get large input summarized at ONCE?

I have researched the fine-tuning option, but I don't think it's appropriate for my use case, and for now there are no examples for my model to learn from.

I have been curious about and researching this for a few days now but haven't found a good solution. Could anyone please help me with this issue?

It would be helpful if you provided examples of what you’re trying to achieve. It’s not entirely clear from your description what problem you’re trying to overcome.

My solution to this problem when I was experimenting was to build out a list of text sections capped at x tokens, then have a function that iterates over that list and builds a new list of responses. It’s not a perfect solution, especially because of the potential discontinuity between outputs, but it’s one example of a token limit workaround.
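A minimal sketch of that workaround, with two stated assumptions: whitespace-split words stand in for tokens (a real tokenizer such as tiktoken would be more accurate), and `summarize` is a hypothetical placeholder for the actual OpenAI completion call.

```python
def chunk_by_tokens(text, max_tokens=500):
    """Build a list of text sections, each capped at max_tokens words."""
    words = text.split()
    return [" ".join(words[i:i + max_tokens])
            for i in range(0, len(words), max_tokens)]

def summarize(section):
    # Placeholder: replace with a call to the OpenAI API.
    return section

def summarize_all(text, max_tokens=500):
    """Iterate over the capped sections and build a new list of responses."""
    return [summarize(chunk) for chunk in chunk_by_tokens(text, max_tokens)]
```

As noted above, the outputs can be discontinuous across chunk boundaries; one common mitigation is to concatenate the per-chunk responses and run them through the summarizer once more.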

You could also potentially have GPT-3 break the text up into sections, each assigned a value for its perceived relevance. From there you could apply the same list-iteration strategy.
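A sketch of that filtering step, assuming (hypothetically) that GPT-3 has already assigned each section a relevance score in [0, 1]; only sections above a threshold are kept before the iteration strategy runs.

```python
def filter_relevant(scored_sections, threshold=0.5):
    """Keep sections whose relevance score meets the threshold.

    scored_sections: list of (section_text, relevance_score) pairs,
    where the scores are assumed to come from a prior GPT-3 pass.
    """
    return [text for text, score in scored_sections if score >= threshold]
```

This keeps the total token count down before summarization, at the cost of an extra scoring pass over the input.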