ChatGPT Pro Context and Thought Throttling

Not sure if this is a short-term issue or a downgrade, but ChatGPT used to allow a ~150k-token context window easily; now I am sending 56k tokens and getting

[Screenshot: CleanShot 2025-02-13 at 19.27.16]
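For reference, here's a rough way I'd verify the prompt size locally with tiktoken before pasting it in. The `o200k_base` encoding is an assumption on my part (it's what recent GPT-4o / o-series models use); the exact tokenizer behind the ChatGPT UI may differ, so treat the count as an estimate:

```python
# Rough local token count for a prompt file.
# Assumes the o200k_base encoding; the tokenizer actually used by the
# ChatGPT UI is an assumption, so this is only an estimate.
import tiktoken

def count_tokens(text: str, encoding_name: str = "o200k_base") -> int:
    enc = tiktoken.get_encoding(encoding_name)
    return len(enc.encode(text))

if __name__ == "__main__":
    with open("prompt.txt", "r", encoding="utf-8") as f:
        prompt = f.read()
    print(f"~{count_tokens(prompt)} tokens")
```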

Also, there seems to be less thinking/compute time in o3-mini-high; I compared prompts from the first days and got wildly worse results.

What gives?

The token situation appears to be working now, but the reduced thinking issue persists.