Why is it not possible to execute parallel runs using the Assistants API?

I am playing around with the Assistants API and I was surprised to learn that I can only execute one run at a time per thread. Doesn’t this defeat the purpose a bit? I thought the point of a thread was to let multiple assistants work on the same thread, doing different things at the same time. One assistant could be working on a 30-minute task such as researching a topic, while another assistant plays a game with the user in the same thread. No?

Will this be possible in the future or should I build my own logic on top?


I guess you would run into a race condition. My assumption is that a run consumes all user/assistant messages added up to that point. These are then “processed,” and only after that does the next run know what the new “context” is.

Just out of curiosity, what do you need multiple runs for? I guess you need shared “knowledge,” so wouldn’t you be better served by separate threads plus files shared via the assistant?

I want to have different assistants working on the same thread. I understand that if I execute two runs in parallel, each assistant would start working before the other one finishes, but I don’t mind that. I don’t get why it is blocked. I can build another layer on top that provides this behavior, but it would have been easier if OpenAI just removed this limitation from the API.
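For anyone wanting to build that layer themselves, here is a minimal sketch of the idea: queue up run requests per thread ID so that only one executes at a time on a given thread, while runs on different threads proceed in parallel. This is purely illustrative; `execute_run`, `fake_run`, and `ThreadRunQueue` are hypothetical names, and `execute_run` is a stand-in for whatever function actually calls the Assistants API and polls until the run completes.

```python
import threading

class ThreadRunQueue:
    """Serialize runs per thread_id, mirroring the one-active-run limit."""

    def __init__(self):
        self._locks = {}
        self._guard = threading.Lock()

    def _lock_for(self, thread_id):
        # Lazily create one lock per thread_id.
        with self._guard:
            return self._locks.setdefault(thread_id, threading.Lock())

    def submit(self, thread_id, assistant_id, execute_run):
        # Block until no other run is active on this thread_id,
        # then execute. Different thread_ids do not block each other.
        with self._lock_for(thread_id):
            return execute_run(thread_id, assistant_id)

# Demo with a stand-in for the real API call:
results = []

def fake_run(thread_id, assistant_id):
    results.append((thread_id, assistant_id))
    return f"{assistant_id}@{thread_id}"

q = ThreadRunQueue()
workers = [
    threading.Thread(target=q.submit, args=("thread_1", f"asst_{i}", fake_run))
    for i in range(3)
]
for w in workers:
    w.start()
for w in workers:
    w.join()

print(len(results))  # all 3 runs completed, one at a time on thread_1
```

In a real implementation you would likely also want per-thread FIFO ordering and timeouts, but a lock per thread ID is enough to show where the serialization layer sits.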