Max_tokens not being respected

I’ve noticed that for the past few days the max_tokens parameter is no longer respected by the API, in either GPT-3.5 or GPT-4. When I set it to an invalid value, I do receive an error, which indicates that I’m sending the parameter correctly, and as mentioned, it was working until a few days ago.

Now, no matter what value I set, the response no longer stops with finish_reason “length”. Has anyone else seen this?

I’ve been scripting away on something else since this was posted, requesting many small max_tokens values and getting responses of the requested length.

Do you have older code that was working that you can retry, to see whether there’s an unnoticed modification in what you’ve got presently?


Indeed, I checked again, and it turns out that in another part of my code the accounting had been changed from completion tokens to total tokens, so it always reported the total token count. My bad for not noticing it sooner.
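For anyone who hits the same confusion: the usage block in a chat completion response reports prompt, completion, and total tokens separately, and max_tokens caps only the completion. A minimal sketch of the mix-up, using a mocked response payload (hypothetical numbers) instead of a live API call:

```python
# Mocked chat-completion response, shaped like the API's JSON
# (token counts here are made up for illustration).
response = {
    "choices": [{"finish_reason": "length", "message": {"content": "..."}}],
    "usage": {"prompt_tokens": 50, "completion_tokens": 16, "total_tokens": 66},
}

max_tokens = 16  # the cap sent with the request

usage = response["usage"]

# max_tokens limits only the completion, not the total:
print(usage["completion_tokens"] <= max_tokens)  # True

# Comparing the cap against total_tokens (prompt + completion) makes
# the limit look ignored -- the accounting mistake from this thread:
print(usage["total_tokens"] <= max_tokens)  # False
```

So if your logging switches from completion_tokens to total_tokens, every response will appear to blow past the cap even though the API is honoring it.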

Yes, please close it, found the error on my end, thanks & sorry!
