Error 400: context length is 16k, not 128k

Hi, I am using the Chat Completions API. OpenAI's page says the context limit is 128k, but when I use the API and send some messages I get an exception about a 16k limit. Why does this happen? Can I increase it?

I am using gpt-4o-mini.
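
For context, my call is essentially the following (a minimal sketch assuming the official openai Python SDK, not my exact code):

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# I expect this to use gpt-4o-mini and its 128k context window.
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "A long prompt..."}],
)
print(response.choices[0].message.content)
```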

Hi @BrokenSoul

The context length depends on the model that you're consuming. Only models released in Nov 2023 and later have a 128k context length.

I’d recommend reading the models guide before you proceed.

Hi @sps, I am using gpt-4o-mini; OpenAI's page says its context length is 128k.

Based on the context limit in the error message, it looks like you're using a gpt-3.5-turbo model. Can you share your code?


Oh, thank you! The error was caused by an environment variable that changed the model being used. I have switched it back to gpt-4o-mini. Thank you @sps.
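
For anyone who hits the same thing, here is a hedged sketch of how an environment variable can silently override the model (the `OPENAI_MODEL` variable name is just an illustration, not necessarily what your project uses):

```python
import os
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# OPENAI_MODEL is a hypothetical variable name used for illustration.
# If it is set to e.g. "gpt-3.5-turbo" somewhere in the deployment,
# every request silently uses that model and its ~16k context limit.
model = os.environ.get("OPENAI_MODEL", "gpt-4o-mini")
print(f"Resolved model: {model}")  # log it so overrides are easy to spot

response = client.chat.completions.create(
    model=model,
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```

Logging the resolved model name before each request makes this kind of misconfiguration much easier to catch.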