How does Assistant API function calling work without a server endpoint?

As the title says. Unlike adding custom actions to a GPT, function calling in the Assistant API does not even have a server endpoint field. How does it even work behind the scenes?

I tried the get_weather() example in the Playground, but it did not work as expected.

2 Likes

Solved by ChatGPT.

The Assistant API’s function calling feature allows you to define and call custom functions directly within the Assistant’s responses, without the need for an external server endpoint. This is distinct from custom actions in GPT, where you might typically define an action that calls an external API or service.

Here’s how function calling in the Assistant API generally works:

  1. Defining Functions: You define functions in the tool’s configuration. These functions are written in Python and are executed in a secure, isolated environment managed by OpenAI. This environment is not the same as running a server endpoint; instead, it’s more akin to running a script in a sandboxed environment.
  2. Calling Functions: When the Assistant needs to execute a function, it calls the function with the necessary arguments. The function then executes in the isolated environment and returns its output back to the Assistant.
  3. Integrating Results: The Assistant integrates the output from the function into its response.

The key difference from custom actions is that the function execution happens within OpenAI’s infrastructure, not on an external server. This simplifies the process as you don’t need to manage an API endpoint, but it also means you’re limited to the capabilities and resources of the provided execution environment.

If the get_weather() function in the playground didn’t work as expected, there could be several reasons:

  • Function Definition: Ensure that the function is correctly defined in the tool’s configuration. Syntax errors or logical errors in the Python code could cause the function to fail.
  • Function Scope and Capabilities: The function’s capabilities are limited to what’s allowed within the OpenAI execution environment. It won’t have access to external APIs unless explicitly provided by OpenAI.
  • Assistant Configuration: Make sure that the Assistant is correctly configured to use the function. This includes the correct invocation of the function in the conversation or instruction.

If you’re still facing issues, it might be helpful to look at the specific error messages or behaviors you’re encountering and adjust the function or its usage accordingly. If the issue is complex, reaching out to OpenAI’s support with specific details can also be a good step.

1 Like

Is that created by ChatGPT? It looks like it, it’s helpful if you can mark ChatGPT replies as such as they may contain errors and people can then know to check for them.

1 Like

Yeah, it is. Verified. Good call, though.

1 Like

There are 7 interesting Assistant API demos, including function calling.
Try them here: https://github.com/davideuler/awesome-assistant-api

3 Likes

I thought we would be able to call any API with functions, like we can with GPT actions. Am I wrong in assuming this?

Assistants cannot access the internet except through you.

That “any API” would be any function that you specify and support yourself in your own code. You parse the response and see that the AI wants to invoke a function.

If you told the AI that you have a specification for a tool function “get baseball scores”, and the AI wants to know the scores, it is up to you to handle the function call that comes back to you, translate it into your API subscription to baseball.stat-site.org, and get the data needed to make the AI happy enough that it can answer the user.

For example, you could specify a “multiply_big_numbers” function, and that “API call” you can just handle with local computation.
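
To make that concrete, here is a minimal sketch using the openai Python SDK (v1.x, Assistants beta). The function names, the model string, and the way the calls are handled are placeholders for whatever you actually support; the point is that the tools list is only a description:

```python
from openai import OpenAI  # openai>=1.2, Assistants beta

client = OpenAI()

# You only *describe* the functions here; OpenAI never calls your code or any URL.
assistant = client.beta.assistants.create(
    model="gpt-4-1106-preview",
    instructions="Answer the user, calling tools when they help.",
    tools=[
        {
            "type": "function",
            "function": {
                "name": "get_baseball_scores",  # placeholder name
                "description": "Get the latest score for a baseball team",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "team": {"type": "string", "description": "Team name"}
                    },
                    "required": ["team"],
                },
            },
        },
        {
            "type": "function",
            "function": {
                "name": "multiply_big_numbers",  # placeholder name
                "description": "Multiply two large integers exactly",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "a": {"type": "string"},
                        "b": {"type": "string"},
                    },
                    "required": ["a", "b"],
                },
            },
        },
    ],
)

# When a run later pauses asking for multiply_big_numbers, "calling the API"
# is entirely your job -- here it is just local arithmetic:
def multiply_big_numbers(a: str, b: str) -> str:
    return str(int(a) * int(b))
```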

1 Like

I’m looking for an example Python script that uses an external API

I would like to build an assistant with which the user can search for a job on our system and apply for it immediately

3 Likes

Is calling an external API like ‘https://xxx.xxx.xx.5:5000/send_data’ possible from the Assistant API, in the functions section, or not?
If yes, please provide some examples.

I’m wondering the same. It would be enough to be able to trigger functions on our server.

I forked someone’s repo and added a function calling example.
A message can be sent to a phone number via the Twilio API through a local Node.js server, although the setup is still a bit rudimentary, sorry.
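
The general pattern (not the exact code in that repo) is just a relay: when the run hands you a function call, your own code forwards it to whatever local server does the real work. A rough Python sketch, where send_sms, the http://localhost:5000/send_data endpoint, and the payload shape are made-up examples:

```python
import json

import requests  # pip install requests


def handle_tool_call(tool_call):
    """Turn one function call from a run into a request to a local server."""
    args = json.loads(tool_call.function.arguments)  # AI-written arguments (JSON)
    if tool_call.function.name == "send_sms":
        # Your own local server (Node.js here, but anything works) talks to Twilio.
        resp = requests.post("http://localhost:5000/send_data", json=args, timeout=10)
        return {"tool_call_id": tool_call.id, "output": resp.text}
    return {"tool_call_id": tool_call.id, "output": "unknown function"}
```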

I’m not sure how the Assistant API detects the words related to functions.
Is it affected by the descriptions in the function JSON?

When you provide a tool function specification while creating your assistant, that JSON specification is translated into language the AI can understand and is injected into what would be the system prompt, all the time.

When you interact with the AI, it will use its training about functions to determine if the function provided should be called in order to satisfy user input.

The AI emits the function call as special language that the API endpoint recognizes; it includes the function name and the AI-written parameters for that call.

The Assistants API then reports the run status as requires_action. You can retrieve the function call, do whatever you told the AI you were going to do with the function offered, and return a value back to the Assistants API with “submit tool outputs” for the run.

The AI then continues calling functions if it wants to, or produces the informed answer for the user as the end product of the run.
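
In Python (openai v1.x SDK, Assistants beta), that whole loop looks roughly like the sketch below. The placeholder assistant ID, the dispatch table, and the sleep-based polling are simplifications, not the only way to do it:

```python
import json
import time

from openai import OpenAI

client = OpenAI()
assistant_id = "asst_..."  # an assistant created with your function tools

# Your own implementations of the functions you described to the assistant.
LOCAL_FUNCTIONS = {
    "multiply_big_numbers": lambda a, b: str(int(a) * int(b)),
}

thread = client.beta.threads.create()
client.beta.threads.messages.create(
    thread_id=thread.id, role="user", content="What is 123456789 times 987654321?"
)
run = client.beta.threads.runs.create(thread_id=thread.id, assistant_id=assistant_id)

while True:
    run = client.beta.threads.runs.retrieve(thread_id=thread.id, run_id=run.id)
    if run.status == "requires_action":
        # The run is paused: collect the AI's function calls, execute them yourself,
        # and hand the results back with submit_tool_outputs.
        outputs = []
        for call in run.required_action.submit_tool_outputs.tool_calls:
            args = json.loads(call.function.arguments)  # AI-written parameters
            result = LOCAL_FUNCTIONS[call.function.name](**args)
            outputs.append({"tool_call_id": call.id, "output": result})
        client.beta.threads.runs.submit_tool_outputs(
            thread_id=thread.id, run_id=run.id, tool_outputs=outputs
        )
    elif run.status in ("completed", "failed", "cancelled", "expired"):
        break
    time.sleep(1)

# The informed answer is the newest message on the thread.
messages = client.beta.threads.messages.list(thread_id=thread.id)
print(messages.data[0].content[0].text.value)
```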

2 Likes

Hello, I am building a bot with Node.js and I want to make several API calls. I only understand half of it, haha, but I know this is the answer I need. I am having trouble understanding how to route the custom function call to my local function that brings back the actual data my assistant needs to continue the logic.

I don’t believe this is correct. In my testing, if you just try to do internal Python, the function call will still be emitted and the run will then ask you to submit the output that it is missing from the external system. The documentation says function calling is for external API use.

I have a question regarding your response: why do we need function calling, instead of a direct API call to get baseball scores via a call to action in the chat window? Other than the context memory and natural language, what difference does it make?

The answer is: no, Assistant API function calling does not support external endpoints.

Reply from GPT-4:

The OpenAI Assistant API, which includes capabilities like the one you’re interacting with now, does not have direct access to the internet for fetching real-time data or accessing external endpoints. The model operates in a closed environment, which means it cannot perform live web searches, access databases, or call external APIs directly to retrieve or send data.

Thank you man,
life-saving advice!