This seems to contradict the documentation, which says multiple parallel tools should be returned as an array of `tool_calls`, and I can’t find any documentation on this function. This looks like unexpected behavior at the very least!
hey, @kraemahz did you end up finding what could be causing that? I want to run multiple functions but it seems that I get the `multi_tool_use.parallel` error
@kraemahz got the same output — it looks like it occurs when the model’s function calling decides it needs to run multiple tools, so probably something internal, yep
I’ve also been running into this issue every now and then. In my experience the glitched calls follow a predictable structure that can be reliably hotfixed. I’m guessing OpenAI will fix this eventually, but in the meantime I built a quick patch that’s easy to drop into a project to make the problem go away:
Usage is as follows:

```shell
pip install openai_multi_tool_use_parallel_patch
```

In your code:

```python
import openai_multi_tool_use_parallel_patch  # import applies the patch
import openai

client = openai.AsyncOpenAI(...)  # sync client will be patched too
...
response = await client.chat.completions.create(...)  # no changes to the call signature or response vs vanilla OpenAI client
```
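For anyone who’d rather handle this without an extra dependency, here’s a minimal sketch of the unpacking logic. It assumes the glitched call follows the commonly reported shape: a single tool call named `multi_tool_use.parallel` whose JSON arguments contain a `tool_uses` array of `recipient_name`/`parameters` entries. Those field names come from reported examples of the glitch, not from official documentation, so treat this as an assumption:

```python
import json

def unpack_parallel_tool_call(tool_call: dict) -> list[dict]:
    """Unpack a hallucinated multi_tool_use.parallel call into ordinary
    tool calls. Any other tool call is returned unchanged (wrapped in a list).

    Assumes the commonly reported glitch shape:
      arguments = {"tool_uses": [{"recipient_name": "functions.X",
                                  "parameters": {...}}, ...]}
    """
    if tool_call.get("function", {}).get("name") != "multi_tool_use.parallel":
        return [tool_call]

    args = json.loads(tool_call["function"]["arguments"])
    unpacked = []
    for i, use in enumerate(args.get("tool_uses", [])):
        # "functions.get_weather" -> "get_weather"
        name = use["recipient_name"].split(".")[-1]
        unpacked.append({
            # synthesize distinct ids so each unpacked call can be answered
            "id": f"{tool_call.get('id', 'call')}_{i}",
            "type": "function",
            "function": {
                "name": name,
                "arguments": json.dumps(use.get("parameters", {})),
            },
        })
    return unpacked
```

You’d run each tool call in a response through this before dispatching, e.g. `calls = [c for tc in message_tool_calls for c in unpack_parallel_tool_call(tc)]`. The actual patch linked above does roughly this transparently inside the client, but the core idea is just this flattening step.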
@atwoodjw_agh Yeah, my bad. It also turns out I published the patch with a typo and it would not have worked anyway — it’s fixed now and I just re-released to PyPI. I also tested with Python 3.7, so things should work okay now. Unfortunately I cannot reproduce the issue with a real OpenAI API call at the moment, but if these glitchy responses still get generated in the same format, the patch should work.