Did API responses from GPT-5 get faster after the release of 5.1?

I have been working extensively with GPT-5 over the past few weeks and noticed high latency (which is acceptable for my use case). Since the release of 5.1, responses from 5 feel much faster.

I hope it doesn't affect output quality! Is this a real phenomenon?

I am a Plus user, and for the last 3 days most of my API calls have been taking forever. Very poor service from OpenAI. Is there a customer portal where I can raise a service ticket?

Either you received a fabricated answer, or you are really asking about ChatGPT (a Plus subscription covers ChatGPT, not the API, which is billed separately).

API users pay for the parameters they select, including the amount of reasoning. Inconsistent behavior between requests can break an app.

API users do often suffer degraded speed and latency, though. Models can get faster when they are available but under-utilized (see how fast gpt-3.5-turbo-instruct is now!).
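If you want to check whether a speedup like this is real rather than perceived, you can log per-request latencies and compare the distributions before and after. A minimal stdlib sketch; the timing values are placeholders standing in for your own logs, and `latency_percentiles` is a hypothetical helper, not part of any OpenAI SDK:

```python
import statistics

def latency_percentiles(samples_s, percentiles=(50, 90, 99)):
    """Summarize request latencies (in seconds) at the given percentiles
    using the nearest-rank method."""
    ordered = sorted(samples_s)
    n = len(ordered)
    out = {}
    for p in percentiles:
        idx = max(0, min(n - 1, round(p / 100 * n) - 1))
        out[p] = ordered[idx]
    return out

# Placeholder timings (seconds), e.g. captured with time.perf_counter()
# around each API call, before and after the 5.1 release:
before = [3.2, 4.1, 3.8, 5.0, 3.5, 4.4, 3.9, 4.7]
after = [1.1, 1.4, 1.2, 1.9, 1.3, 1.6, 1.2, 1.5]

print("before:", latency_percentiles(before))
print("after: ", latency_percentiles(after))
print("median shift:", statistics.median(before), "->", statistics.median(after))
```

Comparing medians and tail percentiles (p90/p99) rather than single requests avoids being fooled by normal run-to-run variance in API latency.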