Not sure if this is a short-term issue or a downgrade, but ChatGPT used to handle a ~150k-token context window easily; now I'm sending 56k tokens and getting
Also, there seems to be less thinking/compute time in o3-mini-high. I compared prompts from its first days and got wildly worse results.
What gives?