Slow response time with GPT-4

Hi @ondrassssek,

I think with the switch in infrastructure to Azure, the increase in computing power, and the fact that this is a new model, this is normal. I hope it will improve further on :slight_smile:

Problem with API request (long answer time) - General API discussion - OpenAI API Community Forum

Maybe you can build in some more resilience and fall back to another model if response times exceed a duration that is acceptable for your use case (see the sketch below). I think what @Jeffers did is a great thing:
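
Independent of @Jeffers' example, here is a minimal sketch of such a fallback, assuming the older `openai.ChatCompletion` interface and its `request_timeout` parameter; the timeout value, fallback model, and helper name are just placeholders for illustration:

```python
import openai

def chat_with_fallback(messages, timeout_seconds=20):
    """Try GPT-4 first; fall back to a faster model if it takes too long."""
    try:
        # request_timeout raises openai.error.Timeout if the call exceeds it
        return openai.ChatCompletion.create(
            model="gpt-4",
            messages=messages,
            request_timeout=timeout_seconds,
        )
    except openai.error.Timeout:
        # GPT-4 was too slow for this request, fall back to gpt-3.5-turbo
        return openai.ChatCompletion.create(
            model="gpt-3.5-turbo",
            messages=messages,
        )

response = chat_with_fallback([{"role": "user", "content": "Hello!"}])
print(response.choices[0].message["content"])
```

You could of course also retry GPT-4 once before falling back, or log how often the fallback triggers so you can tell whether response times are actually improving.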
