Purpose of having function tools for assistant or chat

Hi. I am slightly confused by the function tools that one can add to an Assistant. From the docs I don't really understand what their intended use cases are. When they became available, I assumed I could basically write a Python (or any other language) function that would be called by an Assistant when asked by the user. Basically something similar to asking a custom GPT with code interpretation enabled to write Python code and execute it, except that here the code is written by the developer for a more consistent and predictable result. Is it possible to achieve this with function tools? If yes, how? If not, what is the intended use case for function tools? Thank you.


Welcome to the Dev Community!

The assistants documentation is a bit confusing for sure. There's the code interpreter tool, which behaves much like it does in ChatGPT, and then there's function calling, which lets the model call out to your own code or external APIs.
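Both tools are declared when you create the assistant. Here's a minimal sketch, assuming the openai Python SDK v1.x and the Assistants API beta; the model name and the `get_stock_price` schema are just illustrative examples:

```python
from openai import OpenAI

client = OpenAI()

assistant = client.beta.assistants.create(
    name="Data helper",
    model="gpt-4-turbo",
    instructions="Use the tools when they help answer the user.",
    tools=[
        {"type": "code_interpreter"},  # sandboxed Python, run on OpenAI's side
        {
            "type": "function",        # your own code, run on your side
            "function": {
                "name": "get_stock_price",
                "description": "Return the latest price for a ticker symbol.",
                "parameters": {
                    "type": "object",
                    "properties": {"ticker": {"type": "string"}},
                    "required": ["ticker"],
                },
            },
        },
    ],
)
```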

You could accomplish what you've described with function calling by writing the function yourself (or putting it behind a Python server) and letting the model request it via the API. It's definitely not as easy or simple, but it could be done.
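Continuing the sketch above, the round trip looks roughly like this: the run pauses with `requires_action`, you execute your own Python function, then submit the result back. The `get_stock_price` implementation is hypothetical, and real code would need better polling and error handling:

```python
import json
import time


def get_stock_price(ticker: str) -> str:
    # Your own deterministic Python code runs here, not the model.
    return json.dumps({"ticker": ticker, "price": 123.45})


thread = client.beta.threads.create()
client.beta.threads.messages.create(
    thread_id=thread.id, role="user", content="What is AAPL trading at?"
)
run = client.beta.threads.runs.create(thread_id=thread.id, assistant_id=assistant.id)

while True:
    run = client.beta.threads.runs.retrieve(thread_id=thread.id, run_id=run.id)
    if run.status == "requires_action":
        outputs = []
        for call in run.required_action.submit_tool_outputs.tool_calls:
            args = json.loads(call.function.arguments)
            if call.function.name == "get_stock_price":
                outputs.append(
                    {"tool_call_id": call.id, "output": get_stock_price(**args)}
                )
        run = client.beta.threads.runs.submit_tool_outputs(
            thread_id=thread.id, run_id=run.id, tool_outputs=outputs
        )
    elif run.status in ("completed", "failed", "cancelled", "expired"):
        break
    time.sleep(1)
```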

Thanks for the reply. Are there any good tutorials on how to make an Assistant call a server's API? I'm almost sure I'll need to let certain traffic through the firewall, so it would be good to know how to set everything up.
Also, is it possible to send code directly to the code interpreter, similar to how we send a user prompt to chat?
