OpenAI Response is Slow

I am using the OpenAI API to get text variants. So far it works OK, but it seems slow: it takes 2-30+ seconds to get a response. Is this normal? I see the same thing in both Python and JavaScript. Am I doing something wrong?


If you’re using Davinci then no, I don’t think you’re doing anything wrong. The bigger the model, the slower the response, and I’ve seen pretty slow speeds with Davinci too. I’ve had a lot of success designing prompts with Davinci and then trying to downgrade to Curie. Sometimes Curie does just as well, and it’s cheaper and faster. That way you only use Davinci when you actually need it, and you avoid pounding nails with a sledgehammer.
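The downgrade is usually a one-parameter change. A rough sketch of what that swap looks like (the prompt and settings below are placeholders, not anything from this thread; in real use you’d pass the dict to `openai.Completion.create`):

```python
# Sketch only: the same request, pointed at a cheaper/faster engine.
def build_request(engine, prompt):
    """Assemble the keyword arguments for a completion call."""
    return {
        "engine": engine,        # "davinci" for quality, "curie" for speed/cost
        "prompt": prompt,
        "max_tokens": 64,        # placeholder settings
        "temperature": 0.7,
    }

davinci_req = build_request("davinci", "Rewrite this sentence three ways: ...")
curie_req = build_request("curie", "Rewrite this sentence three ways: ...")
# In real use: openai.Completion.create(**curie_req)
```

Everything else in the request stays the same, so it’s cheap to A/B the two engines on your own prompts and see if Curie holds up.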

If you’re using Curie or a smaller model and getting consistently slow responses, I’m surprised. I’ve had slow responses from Curie before, but it’s generally quite fast. If that’s what you’re seeing, I’d start looking for a bottleneck elsewhere in your stack.
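One way to rule out your own code is to time just the API call itself. A minimal sketch, with a stub standing in for the real network request so it runs on its own (in real use, `fn` would be something like `openai.Completion.create` with your engine and prompt):

```python
import time

def timed_call(fn, *args, **kwargs):
    """Run fn and return (result, elapsed_seconds)."""
    start = time.perf_counter()
    result = fn(*args, **kwargs)
    elapsed = time.perf_counter() - start
    return result, elapsed

# Stub simulating a slow completion call, so the sketch is self-contained.
def fake_completion():
    time.sleep(0.1)  # stand-in for network + model latency
    return {"choices": [{"text": "variant"}]}

result, elapsed = timed_call(fake_completion)
print(f"API call took {elapsed:.2f}s")
```

If the raw call is fast but your end-to-end time is slow, the bottleneck is on your side (serialization, retries, logging, etc.), not the model.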


Welcome to the super-secret cool-kid forums! :slight_smile:

In all seriousness, Davinci + a large prompt will be slower. How big is your prompt (input/output)?

What’s the task? A fine-tuned Curie MIGHT be able to do just as well (with a good enough dataset to fine-tune with), and that can be a lot cheaper…

Again, welcome to the forum. Good to have you with us.
