Error in the translation

The problem I'm facing is that I use the prompt f"translate to {target_language} as if you were a tech blogger: {input_text}", and the input text is chunked so that it doesn't exceed the 4097-token limit, but the translation of the text uses 10x the tokens. How is that possible? What should I do to move forward?
This model’s maximum context length is 4097 tokens, however you requested 26780 tokens (1780 in your prompt; 25000 for the completion). Please reduce your prompt; or completion length.
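Note that the error itself gives the breakdown: 1780 tokens in the prompt plus 25000 requested for the completion, and the 25000 is whatever max_tokens value was sent with the request. The prompt and the completion budget must fit in the 4097-token context together. Below is a minimal sketch of that budgeting, using a rough ~4-characters-per-token heuristic as a stand-in for a real tokenizer (in practice you would count with a library such as tiktoken); the function names are illustrative, not part of any API:

```python
# Sketch: keep prompt tokens + max_tokens inside a 4097-token context.
# Token counts here are approximated at ~4 characters per token, which is
# a rough English-only heuristic; use a real tokenizer for exact counts.

CONTEXT_LIMIT = 4097


def approx_tokens(text: str) -> int:
    """Very rough token estimate: ~4 characters per token."""
    return max(1, len(text) // 4)


def completion_budget(prompt: str, context_limit: int = CONTEXT_LIMIT) -> int:
    """Largest max_tokens value that still fits alongside the prompt."""
    return max(0, context_limit - approx_tokens(prompt))


def chunk_text(text: str, max_prompt_tokens: int) -> list[str]:
    """Split the input into pieces whose estimated size fits the budget."""
    max_chars = max_prompt_tokens * 4  # invert the 4-chars-per-token heuristic
    return [text[i : i + max_chars] for i in range(0, len(text), max_chars)]
```

Passing max_tokens=completion_budget(prompt) instead of a fixed large value avoids requesting more tokens than the context can hold.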

Some languages tokenize far less efficiently than English, which GPT's tokenizer has mainly been trained on.

Some people report that Russian, for example, can use as much as one token per letter. The same goes for some Asian languages.

There may not be a way around this using GPT alone. My only suggestion is to do the rewriting step English to English, then use a third-party translation service to take the rewritten English version and convert it to your target language.
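The two-stage approach above can be sketched as a small pipeline. Both stages are stubbed out here: rewrite_as_tech_blogger stands in for a GPT call with an English-to-English prompt, and third_party_translate stands in for a dedicated translation API (DeepL, Google Translate, etc.); neither name is a real API.

```python
# Sketch of the suggested pipeline: rewrite in English with GPT first,
# then translate the English result with a separate translation service.
# Both functions are hypothetical placeholders, not real library calls.


def rewrite_as_tech_blogger(text: str) -> str:
    # Placeholder for a GPT call, e.g. with the prompt
    # f"Rewrite as if you were a tech blogger: {text}"
    return text  # stub: returns the input unchanged


def third_party_translate(text: str, target_language: str) -> str:
    # Placeholder for a dedicated translation API.
    return f"[{target_language}] {text}"  # stub


def translate_post(text: str, target_language: str) -> str:
    english_version = rewrite_as_tech_blogger(text)
    return third_party_translate(english_version, target_language)
```

Keeping the GPT step English-only sidesteps the token inflation, since the expensive completion never has to be produced in the less token-efficient target language.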