Is there a straightforward way to put large text into the model?

Could you please suggest how to feed large texts (50k+ tokens) into OpenAI's GPT-4o model? My task is to input several texts and get an overall summary of them. I tested the OpenAI chat completions API, but with a request limit of 4k tokens (even though the model context is 128k) it's quite a hassle, and you don't always get the desired result. I tried the following prompt:

I will send you text chunks in json messages: 

{ "chunk":"..." } 

After each message with a chunk, just save it to your memory and reply with the json:

{ "reply":"next" }

After all chunks I will send you the following json message: 

{ "command":"process" } 

After the "process" command, give me the overall summary of all the chunks I have sent before. Return it as json:

{ "reply":"SUMMARY" }

With bigger texts (for example, just 4 chunks whose overall length is barely higher than 4k tokens) it still ends up summarizing only the last chunk (tried with gpt-4-turbo as well).
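In case it matters, this is roughly how the protocol above can be driven from Python (a sketch, not my exact code; it assumes the official `openai` SDK, and the model name is illustrative). One detail worth noting: chat completions are stateless, so every previous chunk has to be resent as history with the final "process" request, otherwise the model literally only sees the last message.

```python
import json

# The protocol prompt from above, sent once as the system message.
PROTOCOL_PROMPT = (
    'I will send you text chunks in json messages: {"chunk":"..."}. '
    'After each chunk, save it and reply with {"reply":"next"}. '
    'When I send {"command":"process"}, return the overall summary '
    'of all chunks as {"reply":"SUMMARY"}.'
)


def build_history(chunks):
    """Assemble the whole exchange as one message list, so the final
    "process" request carries every chunk, not just the last one."""
    messages = [{"role": "system", "content": PROTOCOL_PROMPT}]
    for chunk in chunks:
        messages.append(
            {"role": "user", "content": json.dumps({"chunk": chunk})}
        )
        # the model's expected acknowledgement, replayed as history
        messages.append(
            {"role": "assistant", "content": json.dumps({"reply": "next"})}
        )
    messages.append(
        {"role": "user", "content": json.dumps({"command": "process"})}
    )
    return messages


# Actual call (needs OPENAI_API_KEY; model name illustrative):
# from openai import OpenAI
# client = OpenAI()
# resp = client.chat.completions.create(
#     model="gpt-4o",
#     messages=build_history(chunks),
# )
# print(resp.choices[0].message.content)
```

Even with the history assembled like this, the summary quality varies, which is why I'm asking whether there is a more straightforward approach.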