As in the title. Unlike adding custom actions to a GPT, function calling in the Assistants API does not even have a server endpoint field? How does it even work behind the scenes?
I tried the example get_weather() in playground but it did not work as expected.
The Assistant API's function calling feature allows you to define and call custom functions directly within the Assistant's responses, without the need for an external server endpoint. This is distinct from custom actions in GPT, where you might typically define an action that calls an external API or service.
Here's how function calling in the Assistant API generally works:
Defining Functions: You define functions in the tool's configuration. These functions are written in Python and are executed in a secure, isolated environment managed by OpenAI. This environment is not the same as running a server endpoint; instead, it's more akin to running a script in a sandboxed environment.
Calling Functions: When the Assistant needs to execute a function, it calls the function with the necessary arguments. The function then executes in the isolated environment and returns its output back to the Assistant.
Integrating Results: The Assistant integrates the output from the function into its response.
The key difference from custom actions is that the function execution happens within OpenAI's infrastructure, not on an external server. This simplifies the process as you don't need to manage an API endpoint, but it also means you're limited to the capabilities and resources of the provided execution environment.
If the get_weather() function in the playground didn't work as expected, there could be several reasons:
Function Definition: Ensure that the function is correctly defined in the tool's configuration. Syntax errors or logical errors in the Python code could cause the function to fail.
Function Scope and Capabilities: The function's capabilities are limited to what's allowed within the OpenAI execution environment. It won't have access to external APIs unless explicitly provided by OpenAI.
Assistant Configuration: Make sure that the Assistant is correctly configured to use the function. This includes the correct invocation of the function in the conversation or instruction.
If you're still facing issues, it might be helpful to look at the specific error messages or behaviors you're encountering and adjust the function or its usage accordingly. If the issue is complex, reaching out to OpenAI's support with specific details can also be a good step.
Is that created by ChatGPT? It looks like it. It's helpful if you mark ChatGPT replies as such, since they may contain errors, so people know to check for them.
Assistants cannot access the internet except through you.
That "any API" would be any function that you specified and support yourself in your own code. You parse the response and see that the AI wants to invoke a function.
If you told the AI that you have a specification for a tool function "get baseball scores", and the AI wants to know them, it is up to you to handle the function query that came back to you, translate it into your API subscription to baseball.stat-site.org, and get the data needed to make the AI happy enough that it can answer the user.
For example, specify a "multiply_big_numbers" function, and that "API call" you can just handle with computation locally.
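To make that concrete, here is a minimal sketch of the client-side dispatch loop, assuming the standard tool-calling response shape (an entry with an `id` and a `function` object carrying `name` and JSON-encoded `arguments`). The `multiply_big_numbers` tool, its schema, and the simulated call below are illustrative, not taken from any official example:

```python
import json

# Local implementation backing the hypothetical "multiply_big_numbers" tool.
def multiply_big_numbers(a: int, b: int) -> int:
    return a * b

# Tool specification you would pass in the `tools` parameter of the request,
# so the model knows the function exists and how to call it.
MULTIPLY_TOOL = {
    "type": "function",
    "function": {
        "name": "multiply_big_numbers",
        "description": "Multiply two large integers exactly.",
        "parameters": {
            "type": "object",
            "properties": {
                "a": {"type": "integer"},
                "b": {"type": "integer"},
            },
            "required": ["a", "b"],
        },
    },
}

def handle_tool_call(tool_call: dict) -> dict:
    """Dispatch a tool call emitted by the model to local code and build
    the tool-result message you send back in the next request."""
    name = tool_call["function"]["name"]
    args = json.loads(tool_call["function"]["arguments"])
    if name == "multiply_big_numbers":
        result = multiply_big_numbers(args["a"], args["b"])
    else:
        raise ValueError(f"unknown tool: {name}")
    return {
        "role": "tool",
        "tool_call_id": tool_call["id"],
        "content": str(result),
    }

# Simulated tool call, shaped like what the API returns when the model
# decides to invoke your function:
fake_call = {
    "id": "call_123",
    "type": "function",
    "function": {
        "name": "multiply_big_numbers",
        "arguments": '{"a": 123456789, "b": 987654321}',
    },
}
print(handle_tool_call(fake_call)["content"])  # 121932631112635269
```

The point is that no OpenAI server ever runs your function: the model only emits the name and arguments, and your code executes whatever the call maps to, locally or against any external service you subscribe to.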
Is calling an external API like "https://xxx.xxx.xx.5:5000/send_data" possible from the Assistants API, via the functions section?
If yes, please provide some examples.