Function Calling is VERY slow

I’m using function calling to return structured JSON for a trip itinerary app, and the responses are SUPER SLOW — on the order of 2–3 minutes.

Because I’m requesting a deeply nested response object, my request contains a very long description of how I want the response data structured. The request is typically around 800–900 tokens and the response around 1,200–1,300 tokens, which seems well within the API’s limits.
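For context on why the schema itself matters: the full function definition, including every nested `description` string, is sent with each request. A minimal sketch of the kind of nested itinerary schema being described (all names and fields here are hypothetical, not from the actual app):

```python
# Hypothetical nested function schema for a trip itinerary app.
# Every "description" string at every nesting level counts toward the
# prompt tokens on each request, so long descriptions add up quickly.
itinerary_tool = {
    "type": "function",
    "function": {
        "name": "build_itinerary",
        "description": "Return a structured trip itinerary.",
        "parameters": {
            "type": "object",
            "properties": {
                "days": {
                    "type": "array",
                    "description": "One entry per day of the trip.",
                    "items": {
                        "type": "object",
                        "properties": {
                            "date": {"type": "string"},
                            "activities": {
                                "type": "array",
                                "items": {
                                    "type": "object",
                                    "properties": {
                                        "name": {"type": "string"},
                                        "start_time": {"type": "string"},
                                        "location": {"type": "string"},
                                    },
                                    "required": ["name"],
                                },
                            },
                        },
                        "required": ["date", "activities"],
                    },
                },
            },
            "required": ["days"],
        },
    },
}
```

Trimming the per-property descriptions in a schema like this is usually the cheapest way to shrink the prompt, since the whole schema is resent on every call.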

Has anyone else had a similar experience? Any tips on how to improve the response time?


Unfortunately I’m having a similar problem. My request is around 2,000 tokens and my response is around 150.
Basically I’m building an assistant to navigate my app: the user says what they want to see and gets redirected there.
Different features in the app require different arguments for building the URL, which is why I keep defining more functions that can be called.
It started with 2-second requests (with just one function); now, with three functions, I wait 25 seconds for them. I’m guessing we made something too complex, so I’ll try to isolate the problem now. BTW, are you still having this issue?

Hey @cesca.leonardo and @dantheman1

Are you still experiencing function calling to be slow? If not, what did you do to fix it?

Many thanks,
Asser

I’m curious too. We have a similar experience with our chat app (.NET, Azure OpenAI). Hope to hear some good advice from you folks.

Same experience. I had nested queries like you. I ended up building a separate replica of the API’s data in Elasticsearch and exposing it through an API built on Elasticsearch queries. That gets me the least amount of data back, with quick responses, at least locally.

Can someone please advise how I can train my assistant for function calling so that the performance improves? As of now, it is too slow.