Yes, this is lame, but it's also exactly the thing that costs actual money and that they've already curtailed in training: full-quality attention goes quadratic with sequence length.
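Back-of-the-envelope on that quadratic term (just illustrative arithmetic on the scaling, not figures from OpenAI):

# Self-attention touches every pair of positions, so its cost scales ~n^2
# in sequence length (per layer, per head), while most of the model is ~n.
for n in (4_096, 8_192, 32_768, 75_000):
    print(f"{n:>7} tokens -> attention cost ~{(n / 4_096) ** 2:,.0f}x the 4k baseline")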
So for tasks where the output is roughly as long as the input, like spell checking, this model gains you nothing except lower quality.

edit, confirmed:

    openai.error.InvalidRequestError: max_tokens is too large: 75000. This model supports at most 4096 completion tokens, whereas you provided 75000.
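For what it's worth, the usual workaround is to chunk the input so each request's completion stays under the 4096-token cap. A minimal sketch against the pre-1.0 openai SDK (matching the error above); the chunk size, the 80% headroom, and the rough 4-characters-per-token estimate are my own assumptions, not anything from the docs:

import openai  # pre-1.0 SDK; reads OPENAI_API_KEY from the environment

# Rough heuristic: ~4 characters per token for English text.
# A real implementation would count tokens with tiktoken instead.
CHARS_PER_TOKEN = 4
MAX_COMPLETION_TOKENS = 4096
# Leave headroom so the corrected output (roughly as long as the input)
# still fits inside the completion limit.
CHUNK_CHARS = int(MAX_COMPLETION_TOKENS * CHARS_PER_TOKEN * 0.8)

def spellcheck(text: str, model: str = "gpt-3.5-turbo") -> str:
    corrected = []
    for start in range(0, len(text), CHUNK_CHARS):
        chunk = text[start:start + CHUNK_CHARS]
        resp = openai.ChatCompletion.create(
            model=model,
            messages=[
                {"role": "system",
                 "content": "Fix spelling errors. Return only the corrected text."},
                {"role": "user", "content": chunk},
            ],
            max_tokens=MAX_COMPLETION_TOKENS,
        )
        corrected.append(resp["choices"][0]["message"]["content"])
    return "".join(corrected)

Splitting on whitespace or sentence boundaries instead of raw character offsets would avoid breaking words across chunks, but the point stands: you end up paying per chunk either way, so the big context window buys you nothing here.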