OpenAI Functions vs OpenAI Assistants vs "Agents" vs Plugins

Hello everyone,

I have read a lot about generative AI agents, or so-called assistants, but I don't understand their capability to be orchestrated by a single prompt. Let me explain what I mean/expect:

I have two generative AI agents: 1) one that gathers weather data for a city from a weather API, and 2) a hotel finder for a city.

Assuming the following prompt: “Please tell me about the weather in rome and by the way check for a 3* hotel in milano near the duomo di milano”

So: one single prompt with two questions. Can a GPT handle this? If yes, how?

Speaking as a relative noob to the concept: the API tool functions you provide when creating an assistant are considered whenever a message is analyzed. The assistant may use zero, some, or all of the available tools while formulating an answer.
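For concreteness, here is a minimal sketch of what the tool definitions for those two agents could look like. The function names `get_weather` and `find_hotels` and their parameters are hypothetical; the model reads the JSON Schemas and decides per message which tools, if any, to call:

```python
# Hypothetical tool definitions for the two agents in the question.
# Each JSON Schema tells the model what arguments the function takes.
weather_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",  # hypothetical -- you implement this yourself
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}

hotel_tool = {
    "type": "function",
    "function": {
        "name": "find_hotels",  # hypothetical -- you implement this yourself
        "description": "Find hotels in a city, optionally filtered by star "
                       "rating and a nearby landmark",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string"},
                "stars": {"type": "integer"},
                "near": {"type": "string"},
            },
            "required": ["city"],
        },
    },
}

tools = [weather_tool, hotel_tool]

# With the openai Python package (>= 1.x) you would register both tools on
# a single assistant, roughly like this (needs OPENAI_API_KEY to be set):
# from openai import OpenAI
# client = OpenAI()
# assistant = client.beta.assistants.create(
#     model="gpt-4o", instructions="You are a travel helper.", tools=tools)
```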

Welcome to the community!

You may not need specialized agents: the base model can already do that to an extent:

If you play around, you'll notice that this isn't always super stable. How can you improve that? It's more of an art than a science, but Chain of Thought is typically a good idea: generate a task list, work through the tasks, and then provide an aggregate response.
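One way to picture that pattern as control flow (the model call is stubbed out here, so this only illustrates the structure, not working API code):

```python
# Sketch of the "task list -> work through -> aggregate" pattern.
# ask_model() is a stub standing in for a real chat-completion call.
def ask_model(prompt: str) -> str:
    # A real implementation would call the chat API here.
    return f"[model answer to: {prompt}]"

def answer_compound_request(user_prompt: str) -> str:
    # Step 1: have the model break the request into separate tasks.
    # (Hard-coded here; in reality you would parse the model's task list.)
    tasks = [
        "Report the weather in Rome",
        "Find a 3-star hotel in Milan near the Duomo di Milano",
    ]
    # Step 2: work through each task independently.
    partial_answers = [ask_model(task) for task in tasks]
    # Step 3: aggregate the partial answers into one response.
    return "\n".join(partial_answers)

print(answer_compound_request(
    "Please tell me about the weather in Rome and check for a "
    "3* hotel in Milano near the Duomo di Milano"))
```

The point is that each sub-task gets its own focused pass, which tends to be more reliable than asking the model to juggle both questions in one shot.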

Hi IAMYB, you can do that with a single Assistant. Last time I checked, an assistant can have up to 128 function definitions, so in that respect it could handle both of those cases. The only reason to have multiple assistants, I guess, would be to keep separate specialisations, and even then, 128 functions is a massive number of tools for it to access! :slight_smile:

Can anyone provide an example of a function-calling assistant?

You can refer to the cookbook for some actual code.

Note that in the first example a hypothetical weather API is used.
We have another forum topic where community members actually made it work:
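Until you get the cookbook code running, here is a stubbed sketch of the dispatch step, i.e. what happens after the model has asked for both tools in one reply. The weather and hotel functions are fakes, and `tool_calls` is hard-coded to mimic the shape of a real API response rather than fetched from the API:

```python
import json

# Fake implementations standing in for the two real backends.
def get_weather(city):
    return {"city": city, "forecast": "sunny", "temp_c": 24}

def find_hotels(city, stars=None, near=None):
    return {"city": city, "stars": stars, "near": near,
            "results": ["Hotel Example"]}

AVAILABLE = {"get_weather": get_weather, "find_hotels": find_hotels}

# Hard-coded stand-in for message.tool_calls on a real response;
# the model can request several calls for a single user prompt.
tool_calls = [
    {"id": "call_1", "function": {"name": "get_weather",
        "arguments": json.dumps({"city": "Rome"})}},
    {"id": "call_2", "function": {"name": "find_hotels",
        "arguments": json.dumps({"city": "Milan", "stars": 3,
                                 "near": "Duomo di Milano"})}},
]

# Run each requested function and package the results as "tool" messages,
# which you would append to the conversation and send back to the model
# so it can write the final combined answer.
tool_messages = []
for call in tool_calls:
    fn = AVAILABLE[call["function"]["name"]]
    args = json.loads(call["function"]["arguments"])
    result = fn(**args)
    tool_messages.append({
        "role": "tool",
        "tool_call_id": call["id"],
        "content": json.dumps(result),
    })

print(tool_messages)
```

This is how one prompt with two questions ends up answered: the model emits two tool calls, your code runs both functions, and the model aggregates the results into a single reply.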


I created a blog post that shows how you can use Custom GPTs with the API as “Agents”. Maybe it will help or give you some ideas: