Realtime API: Parallel conversation.item.create events?

I have a number of items I want to add to a real-time conversation and get a single response.

I issue a conversation.item.create for each one. Do I need to wait for each item to be created before sending the next, or can they all be sent in parallel? The order in which the LLM receives them doesn’t really matter to me.

After they are all sent, I send a single response.create to get the response.
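For reference, the client events I’m sending look roughly like this (event shapes as I understand the Realtime API docs; the actual websocket send is omitted, each event would go out as one JSON text frame):

```python
import json

def item_create_event(text):
    """Build a conversation.item.create client event carrying a user message."""
    return {
        "type": "conversation.item.create",
        "item": {
            "type": "message",
            "role": "user",
            "content": [{"type": "input_text", "text": text}],
        },
    }

# One event per item, then a single response.create at the end.
events = [item_create_event(t) for t in ("first item", "second item", "third item")]
events.append({"type": "response.create"})

# Each event would be serialized and sent as its own websocket frame.
frames = [json.dumps(e) for e in events]
```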


Funny enough, this is exactly my use case this morning.

My use case is sending the results of tool calls back to the model. When I get an array of function call requests via response.done, I execute them in parallel, issue a separate conversation.item.create for each result, and then send a single response.create.

The way the API is modelled and the asynchronous nature of websocket messages lead me to believe this is correct. I haven’t tested it yet, though.
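As an untested sketch, my response.done handling looks something like this (the function_call_output item shape is as I read it from the Realtime API docs; run_tool is a stand-in for your own executor):

```python
import json
from concurrent.futures import ThreadPoolExecutor

def run_tool(name, arguments):
    # Stand-in for a real tool executor; returns a JSON-encodable result.
    return {"tool": name, "ok": True}

def handle_function_calls(calls):
    """Execute tool calls in parallel and build one conversation.item.create
    event per result, followed by a single response.create."""
    with ThreadPoolExecutor() as pool:
        results = list(pool.map(lambda c: run_tool(c["name"], c["arguments"]), calls))
    events = [
        {
            "type": "conversation.item.create",
            "item": {
                "type": "function_call_output",
                "call_id": call["call_id"],   # echo the model's call_id back
                "output": json.dumps(result),
            },
        }
        for call, result in zip(calls, results)
    ]
    events.append({"type": "response.create"})
    return events
```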


Thanks for replying @johnroy

Very subjectively, I feel I get quirkier behavior when doing this, i.e. when not waiting for the conversation.item.created server event before issuing further conversation.item.create events. I wonder whether there are race conditions in the back end. I’m testing further to get a better idea.

Another question would be: should we wait for the last conversation.item.created server event before issuing a response.create client event? If the answer to the first question is yes, I’d assume the answer to this one should most certainly be yes as well.
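If the conservative answer turns out to be yes, a small bookkeeping layer is enough to enforce it. A sketch, assuming the client supplies its own item ids and that conversation.item.created echoes the item back with that id (feed_server_event would be called from your websocket message handler):

```python
class ItemBarrier:
    """Track outstanding conversation.item.create events and report when it
    is safe to send response.create, i.e. when every item has been acked by
    a conversation.item.created server event."""

    def __init__(self):
        self.pending = set()

    def sent_item(self, item_id):
        # Call when a conversation.item.create with this client-chosen id goes out.
        self.pending.add(item_id)

    def feed_server_event(self, event):
        # Call for each server event; returns True once all items are acked.
        if event.get("type") == "conversation.item.created":
            self.pending.discard(event["item"]["id"])
        return not self.pending
```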


One funny thing: I tried asking this question in a ChatGPT chat. It gave me two answers (and asked which one I preferred). One was “You cannot do this, you need to wait for the ‘created’ server event”, and the other was “you can do this, the API is designed to allow multiple parallel create events” :rofl: