The tricky part is that an assistant on a given conversation (thread) can only process a single request at a time, so if you send two requests to the same thread simultaneously, the second one will fail. You have to poll (which sucks) and wait for the current request to finish, so you need some way of blocking the other users until it does. The best way to do that would be to put a message queue between the users and the assistant. I couldn't do that, so I created a virtual mutex instead. It works.
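For reference, here's a minimal sketch of the idea in Python/asyncio: one lock per thread acts as the "virtual mutex", and `send_to_assistant` is a hypothetical placeholder for the actual Assistants API calls (add message, start run, poll until the run completes).

```python
import asyncio
from collections import defaultdict

# One lock ("virtual mutex") per assistant thread. Requests for the same
# thread are serialized; requests for different threads still run concurrently.
_thread_locks: defaultdict[str, asyncio.Lock] = defaultdict(asyncio.Lock)

async def run_request(thread_id: str, user_message: str) -> str:
    # Only one caller at a time gets past this point for a given thread,
    # so the API never sees two in-flight runs on the same thread.
    async with _thread_locks[thread_id]:
        return await send_to_assistant(thread_id, user_message)

async def send_to_assistant(thread_id: str, user_message: str) -> str:
    # Placeholder for the real work: create the message on the thread,
    # start a run, poll until it completes, then return the assistant's reply.
    await asyncio.sleep(1)  # stands in for the polling delay
    return f"reply to {user_message!r} on {thread_id}"

async def main() -> None:
    # Two users hitting the same thread: the second waits for the first.
    replies = await asyncio.gather(
        run_request("thread_123", "first question"),
        run_request("thread_123", "second question"),
    )
    print(replies)

asyncio.run(main())
```

This only blocks within one process; if you run multiple workers you'd need a shared lock (e.g. in Redis or a database) or the message queue mentioned above.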