Hello,
Is function calling incompatible with streaming?
I tried the example from the docs, but as soon as I set stream=True, the chat completion no longer provides any arguments for the function call:
{'role': 'assistant', 'content': None, 'function_call': <OpenAIObject at 0x117515770> JSON: {
"name": "test_function",
"arguments": ""
}}
Am I doing something wrong?
Thanks.
msanto
Hi all,
I'll join @reginald.l here.
I would also like to use function calling with streaming.
Kind regards,
Michele
So it turns out the arguments are streamed: a single response yields multiple function_call deltas, and you have to concatenate the argument fragments yourself. I ended up doing something like this:
import sys

func_call = {
    "name": None,
    "arguments": "",
}
for res in response:
    delta = res.choices[0].delta
    if "function_call" in delta:
        # The name arrives in the first chunk; the arguments
        # come in fragments that must be concatenated.
        if "name" in delta.function_call:
            func_call["name"] = delta.function_call["name"]
        if "arguments" in delta.function_call:
            func_call["arguments"] += delta.function_call["arguments"]
    if res.choices[0].finish_reason == "function_call":
        # function call here using func_call
        return
    if not delta.get("content", None):
        continue
    sys.stdout.write(delta.content)
    sys.stdout.flush()
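To see the accumulation logic in isolation, here is a minimal sketch run against simulated stream chunks; the plain dicts below are hypothetical stand-ins for the delta objects the API actually streams:

```python
# Hypothetical delta payloads mimicking a streamed function call:
# the name arrives first, then the arguments string in fragments.
chunks = [
    {"function_call": {"name": "test_function", "arguments": ""}},
    {"function_call": {"arguments": '{"lo'}},
    {"function_call": {"arguments": 'cation": "Paris"}'}},
]

func_call = {"name": None, "arguments": ""}
for delta in chunks:
    fc = delta.get("function_call")
    if fc:
        if "name" in fc:
            func_call["name"] = fc["name"]
        if "arguments" in fc:
            # Concatenate argument fragments across chunks.
            func_call["arguments"] += fc["arguments"]

print(func_call)  # name plus the reassembled arguments string
```

Once the stream reports finish_reason == "function_call", the reassembled arguments string can be parsed and dispatched.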
Thanks for sharing this. Could we assume the API would never return both non-empty content and a non-empty function_call?
I guess it’s best not to assume this…
Also, in your code I assume you would use json.loads() to parse the arguments.
I’m not sure if we can assume that, this is all implicit behavior as far as I can tell.
I had another post about json.loads: the API doesn't always return a valid JSON object, but sometimes a Python-style literal (with triple-quoted strings, for instance).
In their examples they use eval to parse arguments which, to me, seems incredibly dangerous.
I would suggest to use ast.literal_eval for now, as it is much safer.
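A minimal sketch of that approach: try strict JSON first, then fall back to ast.literal_eval, which only evaluates literals and so avoids the arbitrary-code-execution risk of eval (parse_arguments is a hypothetical helper name, not from the API):

```python
import ast
import json


def parse_arguments(raw: str):
    """Parse a function_call arguments string.

    Tries strict JSON first; falls back to ast.literal_eval for
    Python-style literals (e.g. triple-quoted strings). Unlike
    eval(), literal_eval never executes arbitrary code.
    """
    try:
        return json.loads(raw)
    except ValueError:
        return ast.literal_eval(raw)


print(parse_arguments('{"city": "Paris"}'))      # valid JSON path
print(parse_arguments("{'q': '''hello'''}"))     # Python-literal fallback
```

json.JSONDecodeError subclasses ValueError, so a single except clause covers the fallback.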
Agreed. In fact, when I coded this I assumed we could get both, and also that future iterations might return multiple function calls (which I have not seen yet).