Transparency about Context Length?!? (ChatGPT)

Why the hell are you guys not transparent about context length??

I mean… I've only read on some unofficial sites that it's around 32,000 tokens, but the chat window tells me the context is 4,000 tokens. I also tested it with larger documents, and the results are disappointing. The document has around 25,000 tokens (18,800 words), so it should be easily handled if the token window were 32,000, with a buffer of around 7,000 tokens for the output. But it doesn't follow my instructions and tells me the document is too long… I have a Pro account and am using GPT-4.
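For what it's worth, those numbers line up with the common rule of thumb that one English token is roughly 0.75 words (so tokens ≈ words × 4/3). A quick back-of-the-envelope sketch of the math in the paragraph above (a heuristic only, not an exact tokenizer count):

```python
# Rough token estimate from a word count, using the common rule of thumb
# that one English token is about 0.75 words (tokens ≈ words * 4/3).
# This is a heuristic, not an exact tokenizer count.

def estimate_tokens(word_count: int) -> int:
    """Estimate the token count for a given English word count."""
    return round(word_count * 4 / 3)

doc_tokens = estimate_tokens(18_800)  # the document from the post
print(doc_tokens)                     # roughly 25,000 tokens

budget = 32_000 - doc_tokens          # leftover for the model's output
print(budget)                         # roughly 7,000 tokens
```

So if the window really were 32,000 tokens, the document should fit with room to spare, which is exactly the point being made.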

What is the official token limit for ChatGPT, and why not be open and transparent about it, as your obviously misleading name "Open" AI implies??