Beyond Frustrating - Where is gpt-4-32k, which is mentioned everywhere on the internet?

The gpt-4o model doesn’t work well for us compared to gpt-4. Our only problem with gpt-4 is that the 8K context is simply too small, which is why we must use gpt-4-32k. Yet I don’t see it in the playground, it has only worked twice via the API, and it isn’t supposed to be deprecated until June 2025 (according to the site).
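For reference, here is a minimal sketch of how we check whether the model is still exposed to our key and then call it. It assumes the current `openai` Python SDK, an `OPENAI_API_KEY` in the environment, and that the account actually has gpt-4-32k access:

```python
# Minimal sketch (assumes the current `openai` Python SDK and an
# OPENAI_API_KEY in the environment; model access depends on the account).
from openai import OpenAI

client = OpenAI()

# List the models this key can see, to check whether gpt-4-32k is still exposed.
available = {m.id for m in client.models.list()}
print("gpt-4-32k visible:", "gpt-4-32k" in available)

# Attempt a call; this is the step that only works intermittently for us.
if "gpt-4-32k" in available:
    response = client.chat.completions.create(
        model="gpt-4-32k",
        messages=[{"role": "user", "content": "Summarize this long document..."}],
    )
    print(response.choices[0].message.content)
```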

Why can’t I access it, and why would it be deprecated when gpt-4o doesn’t provide the same results or value?

I’m a highly frustrated user. We spend hundreds of thousands of dollars with OpenAI, yet we can’t use the one API we need, or even talk to a human on their team. How ironic.

Access to the 32k models was granted only to a select few partners early on; there was no way to apply, and it effectively ended when OpenAI released gpt-4-turbo a year ago.

At full context, it can bill around $3 per API call…
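A quick back-of-the-envelope behind that figure, assuming the published gpt-4-32k rates of $0.06 per 1K prompt tokens and $0.12 per 1K completion tokens (check the current pricing page if they’ve changed):

```python
# Back-of-the-envelope cost for a single gpt-4-32k call, assuming its published
# rates of $0.06 per 1K prompt tokens and $0.12 per 1K completion tokens.
def call_cost(prompt_tokens: int, completion_tokens: int) -> float:
    return prompt_tokens * 0.06 / 1000 + completion_tokens * 0.12 / 1000

# Input-heavy call: long document in, short answer out.
print(f"${call_cost(30_000, 2_000):.2f}")   # $2.04

# Output-heavy call: the completion rate is what pushes a call toward $3+.
print(f"${call_cost(8_000, 24_000):.2f}")   # $3.36
```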