Chat Completion in stream mode cuts off the answer mid-sentence

At the moment there is a problem where the response that OpenAI streams back from Chat Completion in stream mode arrives incomplete.
For example, I send a simple prompt in Ukrainian, “what will 1+1 be equal to?”, and receive “hmm, 1+1 is equ”, at which point the stream delivers the terminating [DONE] message.
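
For context, here is roughly how we read the stream. This is a simplified sketch using the official openai Python SDK (v1.x syntax); the model name and the English prompt are placeholders, and our actual integration differs only in how the messages are assembled:

```python
from openai import OpenAI

client = OpenAI()  # API key is taken from the OPENAI_API_KEY environment variable

# Simplified version of our call; the model name is a placeholder,
# and our real prompt is the same question in Ukrainian.
stream = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "What will 1+1 be equal to?"}],
    stream=True,
)

# Print tokens as they arrive; iteration stops once the server sends [DONE].
for chunk in stream:
    if not chunk.choices:
        continue
    delta = chunk.choices[0].delta.content
    if delta:
        print(delta, end="", flush=True)
print()
```

With a loop like this we see only the fragment printed before the iteration ends.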
At first we thought the break happened because the system message, the conversation history, or the response itself exceeded the token limit, but after disabling the history and trimming the system message to a minimum we got a break again. We then tested with simple prompts that require one-word answers and again received only a fragment of a word. In other words, the cutoff has been reproduced repeatedly and does not look like a token-limit issue.
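
For anyone who wants to double-check the token-limit hypothesis on their side, one way is to log the finish_reason reported by the final streamed chunk (again a sketch under the same SDK and placeholder assumptions as above, not our exact code): a cutoff caused by the max_tokens limit is reported as "length", while a normal completion ends with "stop".

```python
from openai import OpenAI

client = OpenAI()

stream = client.chat.completions.create(
    model="gpt-3.5-turbo",  # placeholder model name
    messages=[{"role": "user", "content": "What will 1+1 be equal to?"}],
    stream=True,
)

finish_reason = None
parts = []
for chunk in stream:
    if not chunk.choices:  # skip any chunks that carry no choices
        continue
    choice = chunk.choices[0]
    if choice.delta.content:
        parts.append(choice.delta.content)
    if choice.finish_reason is not None:
        finish_reason = choice.finish_reason  # e.g. "stop", "length", "content_filter"

print("".join(parts))
print("finish_reason:", finish_reason)
```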
This may be related to load on the OpenAI servers, since we send these requests during daytime business hours, Kyiv time.
On the other hand, judging by earlier messages on this forum, this problem has already appeared at least twice (in spring and in autumn). Curiously, the interruptions appeared before the release of updated models, so the bugs may have been associated with changes that were not fully tested.