OpenAI Why Are The API Calls So Slow? When will it be fixed?

There are a few things that could be happening:

  • platform: run the Python locally and compare (there's a bare round-trip timing sketch after this list)
  • datacenter: you could be routed to a slower one by geography
  • account: different accounts, different service levels? (unlikely)
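
To separate raw network latency from generation time, something like this works; a minimal sketch, assuming the requests package is installed (the unauthenticated GET only returns an error body, but the wall-clock time still tells you how far you are from the endpoint):

import time
import requests

# Time a bare HTTPS round trip; the 401 response body doesn't matter,
# only how long the request takes from where you're running.
for _ in range(3):
    start = time.time()
    r = requests.get("https://api.openai.com/v1/models", timeout=10)
    print(f"status {r.status_code} in {time.time() - start:.2f}s")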

I hit DNS servers around the globe and got the same IP for api.openai.com, and they don't advertise anything like an iowa.api.openai.com that would let you reach a particular API endpoint (if there's even more than one).
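
If you want to reproduce that check from code, here's a minimal sketch assuming dnspython is installed; 1.1.1.1 and 8.8.8.8 are just example public resolvers:

import dns.resolver

# Ask different public resolvers what they return for api.openai.com;
# if there were geography-based routing at the DNS level you'd expect
# different answers from different resolvers.
for nameserver in ["1.1.1.1", "8.8.8.8"]:
    resolver = dns.resolver.Resolver()
    resolver.nameservers = [nameserver]
    answer = resolver.resolve("api.openai.com", "A")
    print(nameserver, [rr.address for rr in answer])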

Azure has multiple datacenters where you specifically deploy your OpenAI instance.
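
If you go that route with the same pre-1.0 openai Python package used below, pointing at your own Azure deployment looks roughly like this; the resource name, deployment name, key, and api_version are placeholders for whatever you set up in your chosen region:

import openai

# Azure OpenAI: the endpoint encodes the resource (and therefore the region)
# you deployed into, so you pick the datacenter instead of being routed.
openai.api_type = "azure"
openai.api_base = "https://YOUR-RESOURCE-NAME.openai.azure.com"  # placeholder
openai.api_version = "2023-05-15"  # placeholder; use your deployment's version
openai.api_key = "YOUR-AZURE-KEY"  # placeholder

response = openai.ChatCompletion.create(
    engine="YOUR-DEPLOYMENT-NAME",  # placeholder: the model deployment you created
    messages=[{"role": "user", "content": "ping"}],
    max_tokens=5,
)
print(response["choices"][0]["message"]["content"])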

Then I suppose you could make a second account, feed it $5, and see if it gets discriminated against. Or whether it's my monthly billing and history that gets me routed to the fast machines.

Or fine-tune, for 4x the speed you're getting now at 8x the cost…


BTW, my previous speed test code seemed wordy and expository, so I took care of that:

import openai; from openai.util import convert_to_dict as e
from openai import ChatCompletion as f; from time import time as q
openai.api_key = "sk-1234"  # placeholder key
def g(z):  # pull out the completion token count and the first 80 chars of the reply
 return [z['usage']['completion_tokens'], z['choices'][0]['message']['content'][:80]]
c={"model":"gpt-3.5-turbo","top_p":1e-9,"messages":
[{"role":"system","content":"You are a helpful assistant."},
{"role":"user","content":"write an article on digital transformation, 10000 words."}]}
for n in [1, 128, 512]:  # max_tokens settings to time
 s=q();o=e(f.create(**c,max_tokens=n));d=q()-s;x,preview=g(o)
 print(preview+f"\n[{x} tokens in {d:.1f}s. {(x/d):.1f} tps]")
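
With top_p pinned near zero the completions are close to deterministic, so the tokens-per-second figure at the end of each line is the number to compare across platforms, regions, or accounts.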