The new GPT functionality of ChatGPT offers a kind of LLM agent capability that is far simpler to set up than platforms like LangChain. But I see several problems.
First, browsing is too slow to be really useful for those who just want to swing by for a quick reply. Often one needs to browse several websites for a single prompt (and each one requires first a "google" search and then the actual website retrieval).
Second, with APIs alone it is difficult to build more complex functions. Often the returned JSON is far too big. Also, APIs often use IDs that require some kind of SQL-style join. This could in principle be done by ChatGPT, but it requires putting two large JSONs in the prompt. The join is better done before the result is sent to ChatGPT. If anyone has a solution to this, please let me know.
Hi @sten.ruediger, so a couple of thoughts that may help.
If using browsing, giving very specific queries in the Instructions can often speed up that phase of the experience. If you know the likely end domain, for example, you can skip the "Google" step and instruct it to go straight to a domain or a specific URL path.
To execute more complex actions with multiple API calls, yes, it is best to have a middleware that joins the two large JSONs and then, crucially, prunes the resulting JSON so only the essential information is passed to the GPT via the Action. If you have yet to find a solution for that step, I can suggest a couple of approaches.
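The join-and-prune step can be sketched in plain Python. This is a minimal example under assumed data shapes: two hypothetical API responses (`users` and `orders`) sharing an ID field, with `KEEP_FIELDS` standing in for whatever keys your GPT actually needs.

```python
# Hypothetical field names -- adapt to your actual API responses.
KEEP_FIELDS = ("name", "city", "total")

def join_and_prune(users, orders):
    """Join orders onto users by id, then strip every key the GPT
    does not need, so the Action payload stays small."""
    users_by_id = {u["id"]: u for u in users}
    joined = []
    for order in orders:
        user = users_by_id.get(order["user_id"], {})
        merged = {**user, **order}  # order fields win on collision
        joined.append({k: merged[k] for k in KEEP_FIELDS if k in merged})
    return joined
```

The pruning dict comprehension is what keeps the prompt small: everything not in `KEEP_FIELDS` is dropped before the result ever reaches the GPT.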
Thanks, Cass, that is very helpful. I tried Zapier for the middleware, until I realized their API cannot send back data. Now I am trying Replit, connecting it to RapidAPI. If you have a suggestion for this step…
Well you are definitely on the right track.
I would probably write the custom function myself in a Python project to join the responses according to whatever rules you need, create a simple FastAPI set of endpoints, and then grab the OpenAPI Schema that is automatically created and paste that into the Action of the GPT.
You could also build a custom database with frequently requested information. This database could be updated periodically and would allow for quicker access to information.
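For the database idea, even stdlib `sqlite3` is enough for a first pass: cache API results keyed by query, and treat entries past a freshness window as misses so a periodic job (or the next request) can refresh them. Table and key names here are illustrative.

```python
import json
import sqlite3
import time

def init_cache(path=":memory:"):
    """Create the cache table if it does not exist yet."""
    conn = sqlite3.connect(path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS cache ("
        "key TEXT PRIMARY KEY, value TEXT, updated REAL)"
    )
    return conn

def put(conn, key, value):
    """Store (or refresh) a JSON-serializable value under key."""
    conn.execute(
        "INSERT OR REPLACE INTO cache VALUES (?, ?, ?)",
        (key, json.dumps(value), time.time()),
    )
    conn.commit()

def get(conn, key, max_age=3600):
    """Return the cached value if fresh enough, else None
    (caller then falls back to the slow API call)."""
    row = conn.execute(
        "SELECT value, updated FROM cache WHERE key = ?", (key,)
    ).fetchone()
    if row and time.time() - row[1] < max_age:
        return json.loads(row[0])
    return None
```

The middleware checks `get` first and only hits the upstream API on a miss, which sidesteps the slow multi-request browsing path for frequently asked queries.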