Slower response depending on 'user' param

Hi everyone!

I’m using gpt-3.5-turbo in my application. When making requests, I noticed a huge difference in response time depending on the `user` parameter of the Python SDK’s `openai.ChatCompletion.create` method: I’m talking about 4–5 seconds vs. 0.6 seconds, depending on the user string.

It seems, based on this, that OpenAI keeps some sort of internal database of users (associated with an account or with an API key?) that causes the slow responses.

As a workaround, I currently randomize that parameter before making each call. Is there a way to clear this kind of persisted state? Would passing an empty string or None resolve it?
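For reference, here’s a minimal sketch of the randomization workaround (assuming the legacy `openai` Python SDK with the `ChatCompletion` interface; `random_user` is just an illustrative helper name):

```python
import uuid


def random_user() -> str:
    # A fresh UUID4 hex string per call, so no two requests
    # share the same `user` value.
    return uuid.uuid4().hex


# Example usage (requires the openai package and an API key):
# import openai
# response = openai.ChatCompletion.create(
#     model="gpt-3.5-turbo",
#     messages=[{"role": "user", "content": "Hello"}],
#     user=random_user(),  # randomized on every request
# )
```

This avoids reusing any single user string, which is what seemed to correlate with the slow responses on my side.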

Thank you very much

We’re having the exact same issue!! Hope someone can help :pray: