I have a large set of questions and answers that I need to summarize…
The good news is that it works quite well: at least 80% of the summaries come out fine. However, I have been facing a major challenge when summarizing question/answer pairs with OpenAI, and it gets worse as the input grows with the number of question/answer pairs. In that case it throws an error about invalid token length (please see attached). I can add exception handling, but is there a way for OpenAI to accept large input and return the summarized text at once? Also, can anyone suggest another (reasonably cheap) application I could use on top of my software to summarize large input at once?
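One workaround I've seen suggested for token-limit errors is to split the input into chunks, summarize each chunk, and then summarize the summaries (a map-reduce style approach). Below is a rough sketch of the idea; the character-based size limit is a crude stand-in for a real token count, and `summarize` is a placeholder you would wire to whatever completion call you already use:

```python
# Map-reduce summarization sketch. Assumptions: max_chars is a crude
# proxy for a token budget, and summarize() is a placeholder for your
# existing OpenAI completion call.

def chunk_pairs(pairs, max_chars=8000):
    """Group (question, answer) pairs into text chunks under a rough size cap."""
    chunks, current, size = [], [], 0
    for question, answer in pairs:
        entry = f"Q: {question}\nA: {answer}"
        # Start a new chunk if adding this entry would exceed the cap.
        if current and size + len(entry) > max_chars:
            chunks.append("\n\n".join(current))
            current, size = [], 0
        current.append(entry)
        size += len(entry)
    if current:
        chunks.append("\n\n".join(current))
    return chunks

def summarize_all(pairs, summarize, max_chars=8000):
    """Map: summarize each chunk. Reduce: summarize the partial summaries."""
    partials = [summarize(chunk) for chunk in chunk_pairs(pairs, max_chars)]
    if len(partials) == 1:
        return partials[0]
    # Combine the partial summaries and summarize them once more.
    return summarize("\n\n".join(partials))
```

This keeps every individual request under the model's context limit at the cost of extra API calls; if the combined partial summaries are still too long, the reduce step can be applied recursively.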
I have researched the fine-tuning option, but I don't think it is appropriate for my use case, and for now I have no examples for my model to learn from.
I have been curious about this and researching it for a few days now, but I haven't found a good solution. Could anyone please help me with this issue?