I’m having a major issue where function calling in OpenAI’s API is completely ignored. It’s not just failing; it’s as if the assistant doesn’t even recognize that the function exists. No function call is being made at all.
I’m using Markdown formatting inside the prompt to describe the function, but instead of calling it, the assistant just returns a normal text completion and ignores all function-related instructions.
FYI, this was working fine before. Not with this exact function, but with similarly structured ones.
How the custom function was created on the server:
Function Name: determine_status
Description: Categorizes client responses to assess their interest in selling a property.
Column: Status
Description of Column: Stores the client’s interest status regarding selling their property.
How it’s described in the Markdown prompt:
determine_status
Purpose: Categorize the client’s status based on their response.
Possible Outcomes:
interested
not interested
not for sale
invalid contact
invalid property
auto-reply
Usage:
Use this function after each client response to categorize their interest.
Do not invent any statuses; use only the provided options.
Examples of interested responses:
The client says “Yes,” “Maybe,” or expresses willingness to discuss further.
They ask for more details or agree to provide information.
They suggest scheduling a call or meeting.
They provide property details, financial information, or share documents.
They engage in discussions about selling terms or pricing.
They ask if the buyer is motivated.
They ask about your company.
They provide contact information or request yours.
Handling Objections and Questions:
Client Asks If the Buyer Is Motivated:
Respond: “whatever the message is, interested lets say”
Actions:
Wait for their response.
If they proceed:
Use determine_status with status: interested.
The Issue:
Despite everything being correctly set up, functions are not being called at all. There’s no function execution, no logs showing a function request, just plain text responses as if function calling doesn’t exist.
The function is properly defined on the server
Markdown in the prompt is structured correctly
No function execution is happening; the function is completely ignored
I checked the logs; no function call attempt is even registered
Possible issue with eval? Could this be interfering with execution?
Model used for the AI assistant: gpt-4o-2024-05-13
Question:
Has anyone else faced this issue where OpenAI’s function calling is completely ignored? How do you debug something that doesn’t even seem to attempt function execution?
I would suggest a different approach, one I used to get the same functionality before OpenAI implemented function calling. Here is some pseudo code:
prompt1 = '
You are a helper in a RAG pipeline.
Your purpose is to select a database table to get data from, to enrich a prompt we are going to send to a GPT model to generate an answer.
We have the following database structure:
"products" - contains our products
description: we are selling clothes and boots, so when the user has questions about those we always look up our products
"faq" - if the user has any other question that does not belong to product-related questions, then we always call this
"blackhole" - if the message is not a question, we will call this database
Here is the message:
[message]
You can only select one table. Answer with one word only! No explanation or introduction or other bullshit.
'
table = getresponse(prompt1)
if table is not one of the possibilities then add “§$%§$%$ WTH YOU PIECE OF §%§$% - I said ONE WORD!!!” as another user message to the conversation
if that doesn’t work, send a follow-up question to the user like “Sorry, I don’t understand what you want, could you explain it a little bit?”
and repeat that
if blackhole - no database call needed
… continue with more agentic analysis, e.g. if it is product-related, make a prompt where you give the model a list of categories to search for, then features, etc. The more you do, the longer it takes; that’s why the programmer gods invented parallel processing.
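If you want to turn that pseudo code into something runnable, here is a minimal sketch in Python. It assumes the official openai client (v1 style) and the gpt-4o model mentioned in this thread; fetch_from_table is a hypothetical placeholder for your own database lookup, not a real library call.

from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

ALLOWED_TABLES = {"products", "faq", "blackhole"}

ROUTER_PROMPT = """You are a helper in a RAG pipeline.
Select the database table to pull data from to enrich the next prompt.
Tables: "products" (we sell clothes and boots), "faq" (any other question), "blackhole" (not a question).
Answer with one word only.

Here is the message:
{message}"""


def pick_table(message: str, max_retries: int = 2) -> str:
    """Ask the model to route the message to a table, pushing back if it rambles."""
    messages = [{"role": "user", "content": ROUTER_PROMPT.format(message=message)}]
    for _ in range(max_retries + 1):
        response = client.chat.completions.create(
            model="gpt-4o-2024-05-13",
            messages=messages,
            temperature=0,
        )
        answer = response.choices[0].message.content.strip().lower()
        if answer in ALLOWED_TABLES:
            return answer
        # Not one of the allowed words: complain and retry, as in the pseudo code above.
        messages.append({"role": "assistant", "content": answer})
        messages.append({"role": "user", "content": "One word only, from: products, faq, blackhole."})
    return "blackhole"  # give up on the lookup if routing keeps failing


# Usage:
# table = pick_table("Do you have winter boots in size 44?")
# if table != "blackhole":
#     context = fetch_from_table(table)  # hypothetical database helper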
It sounds like you are simply doing a lot of prompting. That’s not how functions work.
Are you not using OpenAI’s function calling, providing a full function schema specification with descriptions, describing the code that powers the function and what it is supposed to do?
The AI will call functions when they serve a purpose. A function can be described as: taking some real-world action and returning a status or information. Then, if the user asks something the function will be useful for, the query will be sent to the function instead of a message to the user.
I used a function generator to make a function based on just pasting everything you described in your first post, not even optimized. Then, an AI that wants to help “your company” buy houses and get leads. It just takes a mention of being interested in selling for the AI to invoke the function for sales lead collection.
That is using gpt-4o-2024-05-13. There is no prompting at all about the function. The system prompt is “You are a helpful property real estate assistant for our ‘we buy homes’ company, actively screening for leads to obtain homes for us to buy while helping users.” The function is completely self-contained.
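For reference, here is a minimal sketch of what a self-contained tool definition like that can look like against the Chat Completions API. The schema below is reconstructed from the descriptions in the first post, not the exact output of my generator, so treat the field wording as an assumption.

import json

from openai import OpenAI

client = OpenAI()

# Tool schema built from the name, description, and status list in the original post.
tools = [{
    "type": "function",
    "function": {
        "name": "determine_status",
        "description": "Categorizes client responses to assess their interest in selling a property.",
        "parameters": {
            "type": "object",
            "properties": {
                "status": {
                    "type": "string",
                    "description": "The client's interest status regarding selling their property.",
                    "enum": ["interested", "not interested", "not for sale",
                             "invalid contact", "invalid property", "auto-reply"],
                },
            },
            "required": ["status"],
        },
    },
}]

response = client.chat.completions.create(
    model="gpt-4o-2024-05-13",
    messages=[
        {"role": "system", "content": (
            "You are a helpful property real estate assistant for our 'we buy homes' company, "
            "actively screening for leads to obtain homes for us to buy while helping users."
        )},
        {"role": "user", "content": "Yes, I'd consider selling. Can we set up a call?"},
    ],
    tools=tools,
)

message = response.choices[0].message
if message.tool_calls:
    # The model decided to call determine_status; arguments arrive as a JSON string.
    call = message.tool_calls[0]
    print(call.function.name, json.loads(call.function.arguments))
else:
    print(message.content)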