Maximum Tokens limit dropped to 2048 (gpt-4o-mini fine-tuned model)

Hi!

I am using a fine-tuned model of gpt-4o-mini. Until last week, I was able to set the Maximum Tokens to 16384 when testing in the Playground; today, however, it is capped at 2048. Did this limit change, or am I doing something wrong?

Thank you!

Welcome @gcalper

Can you share the exact error message that you are getting?

Thanks for the reply!

I am not getting an error message, because the UI does not let me set it above 2048.

In the Playground, there is a Maximum Tokens setting that I can set to 16384 for gpt-4o-mini. However, it is capped at 2048 for my fine-tuned model (ft:gpt-4o-mini-2024-07-18:personal::XXXXXXXX).

I was able to set it to 16384 until two days ago, which I believe was the last time I used it. I am wondering if something has changed.

That is likely a Playground UI limitation and not a limit on max_tokens for the fine-tuned model itself.

Thank you! It seems that was the case: it worked in the Python script. Still, the Playground screen was pretty convenient to use, and I hope they make it possible to set it to 16384 there again.
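For anyone who runs into the same thing, here is a minimal sketch of the kind of API call that worked for me, assuming the openai Python SDK (v1.x) and with the fine-tuned model ID masked as a placeholder you would replace with your own:

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Placeholder fine-tuned model ID; replace with your own fine-tune.
MODEL = "ft:gpt-4o-mini-2024-07-18:personal::XXXXXXXX"

response = client.chat.completions.create(
    model=MODEL,
    messages=[{"role": "user", "content": "Write a detailed summary of the topic."}],
    # The API accepts the full 16384 even though the Playground slider caps at 2048.
    max_tokens=16384,
)

print(response.choices[0].message.content)
```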

Hello @gcalper, we’re aware of the issue and are working on a fix. I can confirm that no model is prevented from outputting up to 16k tokens; it’s just a Playground limitation for now. Thanks for reporting.
