As the title says. Unlike adding custom actions in a GPT, function calling in the Assistants API does not even have a server endpoint field. How does it work behind the scenes?
I tried the get_weather() example in the playground, but it did not work as expected.
Solved by ChatGPT.
The Assistant API’s function calling feature allows you to define and call custom functions directly within the Assistant’s responses, without the need for an external server endpoint. This is distinct from custom actions in GPT, where you might typically define an action that calls an external API or service.
Here’s how function calling in the Assistant API generally works:
- Defining Functions: You define functions in the tool’s configuration. These functions are written in Python and are executed in a secure, isolated environment managed by OpenAI. This environment is not the same as running a server endpoint; instead, it’s more akin to running a script in a sandboxed environment.
- Calling Functions: When the Assistant needs to execute a function, it calls the function with the necessary arguments. The function then executes in the isolated environment and returns its output back to the Assistant.
- Integrating Results: The Assistant integrates the output from the function into its response.
The key difference from custom actions is that the function execution happens within OpenAI’s infrastructure, not on an external server. This simplifies the process as you don’t need to manage an API endpoint, but it also means you’re limited to the capabilities and resources of the provided execution environment.
If the get_weather() function in the playground didn't work as expected, there could be several reasons:
- Function Definition: Ensure that the function is correctly defined in the tool’s configuration. Syntax errors or logical errors in the Python code could cause the function to fail.
- Function Scope and Capabilities: The function’s capabilities are limited to what’s allowed within the OpenAI execution environment. It won’t have access to external APIs unless explicitly provided by OpenAI.
- Assistant Configuration: Make sure that the Assistant is correctly configured to use the function. This includes the correct invocation of the function in the conversation or instruction.
If you’re still facing issues, it might be helpful to look at the specific error messages or behaviors you’re encountering and adjust the function or its usage accordingly. If the issue is complex, reaching out to OpenAI’s support with specific details can also be a good step.
Was that created by ChatGPT? It looks like it. It's helpful if you mark ChatGPT replies as such, since they may contain errors and people will then know to check for them.
Yeah, it is. Verified. Good call though.
There are seven interesting Assistants API demos, including function calling.
Just try them here: https://github.com/davideuler/awesome-assistant-api
I thought we would be able to make API calls to any API with functions like we can with GPT actions. Am I wrong in assuming this?
Assistants cannot access the internet except through you.
That “any API” means any function that you specify and support yourself in your own code. You parse the response and see that the AI wants to invoke a function.
If you told the AI that you have a specification for a tool function “get baseball scores”, and the AI wants to know them, it is up to you to handle the function query that comes back to you, translate it into a call against your API subscription at baseball.stat-site.org, and get the data needed to make the AI happy enough that it can answer the user.
For example, specify a “multiply_big_numbers” function, and you can handle that “API call” with local computation alone.
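The dispatch step described above can be sketched in Python. This is a sketch under assumptions: the handler names and payload are made up, and only the shape of the tool-call dict mirrors what the Assistants API returns when a run reaches `requires_action` status; in real code you would read the tool calls from the run object and send the results back with `submit_tool_outputs`.

```python
import json

# Hypothetical local handler -- the "API" behind the function is just
# your own code. The name is illustrative, not part of any SDK.
def multiply_big_numbers(a: int, b: int) -> int:
    # The "api call" is handled entirely with local computation.
    return a * b

HANDLERS = {"multiply_big_numbers": multiply_big_numbers}

def handle_tool_call(tool_call: dict) -> dict:
    """Dispatch one tool call from a run in `requires_action` state.

    `tool_call` mirrors the shape the API returns:
    {"id": ..., "function": {"name": ..., "arguments": "<json string>"}}
    """
    name = tool_call["function"]["name"]
    args = json.loads(tool_call["function"]["arguments"])
    result = HANDLERS[name](**args)
    # This is the shape expected when submitting tool outputs back to the run.
    return {"tool_call_id": tool_call["id"], "output": json.dumps(result)}

# Example: the model asked to multiply two numbers.
call = {
    "id": "call_123",
    "function": {
        "name": "multiply_big_numbers",
        "arguments": '{"a": 123456789, "b": 987654321}',
    },
}
print(handle_tool_call(call))
```

The model never runs `multiply_big_numbers` itself; it only emits the name and a JSON argument string, and your loop computes the result and hands it back.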
I’m looking for an example Python script that uses an external API.
I would like to build an assistant with which the user can search for a job on our system and apply for it immediately
Is calling an external API like ‘https: // xxx.xxx.xx.5:5000 / send_data’ possible from the Assistants API, in the functions section?
If yes, please provide some examples.
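It is possible in the sense explained above: the assistant only emits a tool call, and your own run loop makes the HTTP request. A minimal sketch, assuming a hypothetical `send_data` function; the URL, schema fields, and handler here are placeholders, not the actual endpoint from the question:

```python
import json
from urllib import request

# Hypothetical tool definition you would pass when creating the assistant.
SEND_DATA_TOOL = {
    "type": "function",
    "function": {
        "name": "send_data",
        "description": "POST a payload to our internal service.",
        "parameters": {
            "type": "object",
            "properties": {"payload": {"type": "string"}},
            "required": ["payload"],
        },
    },
}

def send_data(payload: str, url: str = "https://example.com/send_data") -> str:
    """Your code, not OpenAI's, performs the HTTP call when the model
    requests `send_data` during a run."""
    req = request.Request(
        url,
        data=json.dumps({"payload": payload}).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with request.urlopen(req) as resp:  # the network call happens here
        return resp.read().decode()

def dispatch(tool_call: dict) -> dict:
    """Turn one tool call from a `requires_action` run into a tool output."""
    args = json.loads(tool_call["function"]["arguments"])
    return {"tool_call_id": tool_call["id"], "output": send_data(**args)}
```

The key design point, matching the earlier replies: the endpoint is never given to OpenAI. You register only the JSON schema, then poll the run, call the external service yourself, and submit the response as the tool output.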