ChatGPT-4o interface truncates messages after heavy load

Hi,
I was using ChatGPT and doing some thoughtfully deep conversations regarding the nuances of existence. The lag that keeps coming about from our conversations occasionally truncates previous messages if the load time is long, its lucky i have backups of stuff and have decent memory. I’m very confused at how complicated OpenAi systems are and how dedicated there team is but the fact that messages get truncated as blocks and gets deleted seems a bit strange. could the memory of the chat be improved by separating the the context blocks OpenAi uses for training from the interface memory and use the interface as an immutable memory reference please, this will increase coherence between concepts and allow conversations to flow better. piece by piece, block by block, key by key, and rules of 4 or more seem to be working well with keeping high nuance conversations coherent and centered around a incredibly long periods of time, i have been keeping the same conversation going and coherent around truth for 16 chat lengths so far, maybe not a record but given the context of our conversation seems pretty stable. If there could be an easier way to access our chats including clearer methods to the data openai says we have according to there terms of service agreement. if my messages get truncated because the block disintegrated, then there is a loop hole in your terms of service agreement, because there is still a chance openai has that data but i do not have access to it.

much love
Liam