A non-deterministic bug, but it still needs noting

I have a chatbot idly pretending to be a barkeeper. Nothing much. While reviewing its performance today, I noticed that in response to one of the user's prompts it spewed out the JSON for its tool use:

```json
{
  "tool_uses": [
    {
      "recipient_name": "functions.get_rumour",
      "parameters": {}
    },
    {
      "recipient_name": "functions.get_price",
      "parameters": {
        "item": "meal"
      }
    }
  ]
}
```

It inserted that inside a code block in the text response. My bot wrapper wasn't looking for or expecting anything like that, so it simply passed it through to the user.

It also included a normal text response, as follows:
“You’re welcome, eur. Enjoy your meal. If you’re interested, I’ve got a fresh batch of rumors from the city. And if you’re eyeing the board, the meal comes to 1 gold piece. Just settle up at the bar when you’re ready.”

The only way I can see this happening from my code is that a function call came back merged with an actual response in `message.content` rather than in `message.tool_calls`:
```python
tool_calls = response.choices[0].message.tool_calls
if tool_calls:
    prompt.append(response.choices[0].message)
    for tool_call in tool_calls:
        functionName = tool_call.function.name
        print("Calling function : " + functionName)
        callFunction = getattr(botdata, functionName)
        if callable(callFunction):
            # function.arguments arrives as a JSON string; parse it so the
            # handler always receives a dict
            functionArgs = json.loads(tool_call.function.arguments) if tool_call.function.arguments else {}
            functionResult = callFunction(data_dict, server_dict, functionArgs)
            prompt.append(
                {
                    "tool_call_id": tool_call.id,
                    "role": "tool",
                    "name": functionName,
                    "content": str(functionResult),
                }
            )
    response = client.chat.completions.create(
        model=model,
        messages=prompt,
        max_tokens=1024,
        n=1,
        stop=None,
        temperature=temperature,
    )
return response.choices[0].message.content
```
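Since the model can occasionally leak a tool-call payload into `message.content` like this, one defensive option is to scan the text for fenced JSON with a `tool_uses` key before handing it to the user. The sketch below is hypothetical (the `strip_leaked_tool_calls` helper is not part of my bot, and it assumes the leaked block matches the shape shown above):

```python
import json
import re

# Matches a fenced code block (optionally tagged "json") containing an object.
LEAKED_BLOCK = re.compile(r"```(?:json)?\s*(\{.*?\})\s*```", re.DOTALL)

def strip_leaked_tool_calls(content):
    """Remove fenced blocks that parse as tool-call JSON.

    Returns the cleaned user-facing text plus any tool uses that were
    found, so the caller can decide whether to execute them.
    """
    leaked = []

    def replace(match):
        try:
            payload = json.loads(match.group(1))
        except json.JSONDecodeError:
            return match.group(0)  # not valid JSON, leave the block alone
        if isinstance(payload, dict) and "tool_uses" in payload:
            leaked.extend(payload["tool_uses"])
            return ""  # drop the leaked block from the user-facing text
        return match.group(0)

    cleaned = LEAKED_BLOCK.sub(replace, content).strip()
    return cleaned, leaked
```

This doesn't fix the underlying non-determinism, but it stops raw tool JSON from reaching the user when the model misbehaves.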

I will keep an eye out for it happening again, but the application logs don’t show anything unusual.

gpt-3.5-turbo-1106