Is it possible to call an Assistant via an action within a custom GPT?
It seems like GPTs are designed to be user facing, and if this is possible it would be an easy way to connect a custom GPT with all the features that Assistants offer. The problem with sharing an Assistant via the Playground is that, as far as I know, you can't tell whether anyone is using it or whether it's working. Of course you could build out your own site with a chat interface and call the API, but it would be great if a custom GPT could be the user interface.
Has anyone thought about this? Is it a terrible idea?
I think it’s a very interesting idea: you could host something on a server that makes use of the Assistants API and then call it from a GPT via an action. Might be worth taking a look at the FastAPI framework.
Unless you mean for a GPT to call the assistant endpoint directly… now that would get interesting.
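For the server-side route, here's a minimal sketch, assuming the openai Python SDK (v1.x) and FastAPI. The /ask path, the Question model, and the ASSISTANT_ID environment variable are placeholders of my own, not anything OpenAI prescribes; the GPT's action would then point at this endpoint.

```python
# Minimal sketch: a FastAPI endpoint that forwards a question to an Assistant
# and returns its reply. Assumes OPENAI_API_KEY and ASSISTANT_ID are set.
import os
import time

from fastapi import FastAPI
from openai import OpenAI
from pydantic import BaseModel

app = FastAPI()
client = OpenAI()  # reads OPENAI_API_KEY from the environment

ASSISTANT_ID = os.environ["ASSISTANT_ID"]  # e.g. an "asst_..." ID from the Playground


class Question(BaseModel):
    message: str


@app.post("/ask")
def ask(question: Question) -> dict:
    # Each request gets its own thread here; a real app would persist thread
    # IDs per user so the Assistant keeps conversation context.
    thread = client.beta.threads.create()
    client.beta.threads.messages.create(
        thread_id=thread.id, role="user", content=question.message
    )
    run = client.beta.threads.runs.create(
        thread_id=thread.id, assistant_id=ASSISTANT_ID
    )

    # Runs are asynchronous, so poll until this one finishes.
    while run.status in ("queued", "in_progress"):
        time.sleep(1)
        run = client.beta.threads.runs.retrieve(thread_id=thread.id, run_id=run.id)

    # The newest message in the thread is the Assistant's reply.
    messages = client.beta.threads.messages.list(thread_id=thread.id)
    return {"reply": messages.data[0].content[0].text.value}
```

One nice side effect of FastAPI is that it serves a generated OpenAPI schema at /openapi.json, which is handy when filling in the action definition for the GPT.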
Assistants are an API product for developers. They are also not always practical: the cost per query can be very high, since threads grow toward the maximum context length, runs are iterative, and output is delayed. Do you want to sell a value-added service where you only find out the next day that user questions were costing you $2 a go?
GPTs are within ChatGPT Plus.
You can share GPTs, and a GPT can be set to use actions that call an API on your own domain (with OAuth if you need it) to provide functionality. This is like plugins: use the Wolfram plugin, and it employs an API that can calculate math.
However, you would bear the cost if your API spends significant money on every use, such as fulfilling general user requests with AI model calls of your own.
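Because of that cost exposure, you'd probably want to gate the endpoint so that only your GPT's action can reach it. Here's a hedged sketch of one way to do that with an API key header in FastAPI; the header name, the ACTION_API_KEY variable, and the /ask path are assumptions carried over from the earlier sketch, and the same key would be entered in the action's authentication settings.

```python
# Hedged sketch: require an API key before doing any paid Assistants work,
# so anonymous traffic can't run up your bill. Names here are assumptions.
import os

from fastapi import Depends, FastAPI, HTTPException
from fastapi.security import APIKeyHeader

app = FastAPI()
api_key_header = APIKeyHeader(name="X-API-Key")


def require_api_key(api_key: str = Depends(api_key_header)) -> None:
    # Compare against a key you set both here and in the action's auth config.
    if api_key != os.environ["ACTION_API_KEY"]:
        raise HTTPException(status_code=401, detail="Invalid API key")


@app.post("/ask", dependencies=[Depends(require_api_key)])
def ask() -> dict:
    # ...call the Assistants API as in the earlier sketch...
    return {"reply": "..."}
```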