How does function calling actually work for the Assistants API?

Another take…
You provide the LLM with the definition, or blueprint, of a function you have. The example the docs use is a get_weather function that takes a location parameter.
Once GPT determines it should call this function to generate a better response, it replies to you (the system sending and processing these OpenAI API calls) with a function-call message that essentially tells your code: "Hey, run this function and return the results to me".
Your code sees this message, knows (because you programmed it) which local function to call (remember, you told the model these functions exist), runs it, and returns the results via a special function-output message.
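A minimal sketch of that round trip, with no real API calls. The `get_weather` implementation, the tool-call payload, and the `call_123` id are all made up for illustration; the payload shape loosely mirrors the tool-call format the API returns, but check the official docs for the exact fields:

```python
import json

# Hypothetical local function -- OpenAI never runs this itself.
def get_weather(location):
    # A real app would call a third-party weather API here.
    return {"location": location, "forecast": "sunny", "temp_c": 22}

# What the model actually receives: a JSON description of the function.
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a location",
        "parameters": {
            "type": "object",
            "properties": {"location": {"type": "string"}},
            "required": ["location"],
        },
    },
}]

# Simulated assistant reply: "run this function and give me the result".
tool_call = {
    "id": "call_123",
    "function": {"name": "get_weather", "arguments": '{"location": "Boston"}'},
}

# Your code dispatches the call to the matching local function...
available = {"get_weather": get_weather}
fn = available[tool_call["function"]["name"]]
args = json.loads(tool_call["function"]["arguments"])
result = fn(**args)

# ...then packages the result to send back to the model.
tool_output = {"tool_call_id": tool_call["id"], "output": json.dumps(result)}
print(tool_output["output"])
```

The key point the sketch makes concrete: the model only ever sees the JSON blueprint and the JSON result; the actual execution happens entirely in your code.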

I think one misconception is that OpenAI/GPT actually calls the function for you on their systems. It doesn't; your code does.

In the get_weather example from the docs, the sample code calls a third-party weather API, following the same process I described above.
