Speeding up responses from the OpenAI Assistants API

I am working on a project to build a voice calling bot on AWS. I am using Amazon Connect with the OpenAI Assistants API, but the Assistants API is slow to return responses.

In Amazon Connect, if a response takes more than 8 seconds, the bot assumes the user has disconnected, stops waiting for a response, and ends the chat. This 8-second timeout is the maximum value that can be set in Amazon Connect.

So I need to speed up the responses from the Assistants API so that they arrive within 8 seconds. Most of the time they do, but in roughly 3 out of 10 cases a response takes longer than 8 seconds and the chat ends without a proper reply to the user.


AI is not well suited to time-constrained systems that were never designed to be connected this way. If you want speech output, I’d make use of the OpenAI TTS API and build your code flow so that you send blocks only when they are available, close the connection when a block is finished, and reopen a connection when the next fully complete block is ready to go. Streaming the output may also provide a better latency experience for end users.
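
A minimal sketch of that block-by-block flow with the official `openai` Python SDK; the `speak_blocks` helper and the sample text blocks are hypothetical, standing in for whatever produces completed sentences in your pipeline:

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment


def speak_blocks(text_blocks):
    """Send each fully formed text block to the TTS endpoint as soon as it
    is ready, instead of waiting for the whole answer before making audio."""
    for i, block in enumerate(text_blocks):
        # One request per completed block; the connection is closed when the
        # call returns and a new one is opened for the next block.
        response = client.audio.speech.create(
            model="tts-1",
            voice="alloy",
            input=block,
        )
        # Hand each audio file to the telephony layer as it is produced.
        response.write_to_file(f"block_{i}.mp3")


# Hypothetical blocks produced by the rest of the pipeline
speak_blocks(["Thanks for calling.", "Let me look that up for you."])
```

The point of the design is that the caller starts hearing audio after the first block instead of after the full response.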

The Assistants API now supports streaming as well. That feature solves the problem described above!
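
For reference, a minimal streaming sketch with the `openai` Python SDK, assuming an existing assistant and thread (the IDs below are placeholders):

```python
from openai import OpenAI

client = OpenAI()

# Placeholder IDs for an existing assistant and thread
with client.beta.threads.runs.stream(
    thread_id="thread_abc123",
    assistant_id="asst_abc123",
) as stream:
    # Text arrives incrementally, so output can be forwarded
    # long before the full response is complete.
    for delta in stream.text_deltas:
        print(delta, end="", flush=True)
```

Each delta can be forwarded to Amazon Connect (or on to TTS) as it arrives, so the 8-second window only has to cover the time to the first chunk rather than the whole response.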