A Pragmatic Approach to Custom GPTs: Why I Chose Airtable Over Building a Store

I'm not sure a store (along the lines of Apple's) is the way to go (though now I've said it out loud this will probably come back and bite me).

I'm not sure I want to hunt down (i.e. search for) a GPT to offer me cheese recipes.

Or a GPT that pretends it's Snoop Dogg being a dungeon master.

What the GPT landscape IS, however, is an excellent way to crowdsource Action generation.

For a while I've been thinking that Zapier (were they to IPO) would be an incredible stock to own.

With them as the communicative glue, and AIs as the reasoning CPU… you have an extremely capable autonomous agent

By bringing those API connections into the OpenAI ecosystem… a GPT becomes the same.

My prediction (for what it's worth) is that (much like DALL·E being blended into ChatGPT)… the Actions ecosystem will be consumed into the main ChatGPT thread.

The only point of having multiple agents (apart from having different personality facades) is when their underlying models and knowledge are tuned to certain domains (e.g. LegalGPT)

And the GPT store is not the place for that

Why don't you just use LangChain with the Airtable API?

The benefit of having GPTs isolated to their purpose is the same programming principle as each function having a single responsibility (or maybe Separation of Concerns is the better term, lol). I agree that having a single thread which can do everything would be ideal, and that may be the case in the future with an AGI.

For now though, for the sake of tokens, clarity, and retrieval, having them isolated just makes things a lot easier and cleaner to work with.

It really depends. I don't think you would need to “hunt [it] down”. If you like to cook, or… have to cook (:laughing:), then you would have a “Chef Assistant” GPT which would take seconds to find and use. It would take the same time as a Google search. More time, actually, once you've read about the recipe owner's dog (because for some reason they always need to talk about their dogs and cats before the recipe).

It could be that there will be a “Gating/Routing GPT” that can select/include the appropriate GPT(s) depending on your query.
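
Just to make that idea concrete, here is a rough sketch of what such a router could look like with the OpenAI Python SDK: one cheap classification call picks the “expert”, then the query is forwarded under that expert's instructions. The expert names, prompts, and model name are placeholders of my own, not anything OpenAI ships.

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Hypothetical "experts": in this sketch each one is just a system prompt.
EXPERTS = {
    "chef": "You are a Chef Assistant. Give concise, practical recipes.",
    "programmer": "You are a programming assistant. Answer with working code.",
    "cats": "You are a feline behaviour expert.",
}

def route(query: str) -> str:
    """Ask the model which expert should handle the query."""
    labels = ", ".join(EXPERTS)
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {"role": "system", "content": f"Reply with exactly one of: {labels}."},
            {"role": "user", "content": query},
        ],
    )
    label = resp.choices[0].message.content.strip().lower()
    return label if label in EXPERTS else "chef"  # arbitrary fallback

def answer(query: str) -> str:
    """Forward the query to the selected expert's instructions."""
    expert = route(query)
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {"role": "system", "content": EXPERTS[expert]},
            {"role": "user", "content": query},
        ],
    )
    return resp.choices[0].message.content

if __name__ == "__main__":
    print(answer("What's a quick recipe that uses leftover cheddar?"))
```

The routing call costs an extra round trip, but it keeps each expert's instructions short and specific, which is exactly the separation-of-concerns argument above.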

It’s almost natural to “categorize” conversations with ChatGPT (vanilla). So I have one instance to discuss programming, another instance to discuss the secrets of cats, and then another to talk about recipes. Separate “experts”. It could be that this philosophy is what led to the creation of MoE (Mixture of Experts).

Admittedly I have never used Zapier. AFAIK it acts as a communicative layer (or glue, as you said) between different apps. It could be that GPTs completely overshadow Zapier if they can communicate with each other (through some sort of gating/routing GPT).

I’m all for predictions! I agree. Actions are going to be very potent. But, if I am offering an API service, I most likely also have a website that I can use to extend the functionality of ChatGPT, and I may not necessarily want my clients to be there.

Plugins were Actions consumed into the main ChatGPT thread. For whatever reason, it’s believed that they just didn’t get traction.

It makes things cleaner. Instructions can be more specific and not generalized. I have been able to identify and separate the common reasons I use ChatGPT and split them into GPTs with specific instructions and functionalities.

Totally agree. I’m in, please share! Let’s do it.

Another great topic! I’ve been reading a lot of good content since I joined yesterday. Nice to be part of a forum with such a calm & knowledgeable crowd.
Thanks for the great content everyone.

That’s a great idea. Thanks!

:pray: Thank you. I try, all in the name of making AI accessible. I guess I would say we are “Early Adopters” who would stop at nothing to try new tech.

I would say be careful with Airtable. You are limited to 5 requests per second per base. This could be a no-go for fast-paced API calls to Airtable.
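
For anyone who does hit that limit, a crude client-side throttle is usually enough at small scale: space your calls so you stay under 5 per second per base, and back off if you get a 429. A minimal sketch against the standard Airtable REST endpoint (the base ID, table name, and token here are placeholders, and the 30-second retry wait is just a conservative assumption):

```python
import os
import time
import requests

AIRTABLE_TOKEN = os.environ["AIRTABLE_TOKEN"]    # personal access token (placeholder)
BASE_ID = "appXXXXXXXXXXXXXX"                    # hypothetical base ID
TABLE = "Recipes"                                # hypothetical table name

MIN_INTERVAL = 1 / 5                             # stay under 5 requests/second/base
_last_call = 0.0

def throttled_get(params: dict) -> dict:
    """GET records from Airtable, spacing calls and retrying once on a 429."""
    global _last_call
    wait = MIN_INTERVAL - (time.monotonic() - _last_call)
    if wait > 0:
        time.sleep(wait)
    _last_call = time.monotonic()

    url = f"https://api.airtable.com/v0/{BASE_ID}/{TABLE}"
    headers = {"Authorization": f"Bearer {AIRTABLE_TOKEN}"}
    resp = requests.get(url, headers=headers, params=params, timeout=10)
    if resp.status_code == 429:                  # rate limited: wait, then retry once
        time.sleep(30)
        resp = requests.get(url, headers=headers, params=params, timeout=10)
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    for record in throttled_get({"maxRecords": 3})["records"]:
        print(record["fields"])
```

If the GPT’s Action calls Airtable through your own middleware rather than directly, that middleware is the natural place to put this throttle (and a small cache).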

That is a potential issue that will arise. Not a matter of “if” but “when”. I’ll figure something out.

Fair response

And on second consideration, RAG gives a clear differentiator between a GPT and vanilla ChatGPT, as do the Actions.

I’ve also heard rumours of fine-tuning being a future aspect of them as well.

And it’s probably not too long until Actions have response handlers through code interpreters, etc.

Maybe I’m just being a bit of a boomer.

Though I still feel I only want one interface.

Let’s assume that Alexa, OK Google, Siri and GPT become, within the next 12 months, our primary interfaces with AI… I don’t want to (nor do I have the capacity to easily) switch context from one GPT to another.

Unless I say something like “OK Google, enter Cheese Recipe Chef mode” or suchlike.

Instead I’d expect the cheese recipe “skill” to be added to my AI (if I’ve paid to add it).

Skills vs GPTs

I guess that’s what I boil my thoughts down to.

And I’m leaning more towards “add-on skills”.

Though, as you’ve said, maybe that’s what the plugins world was/is.

And that didn’t pan out well.

Love it. Savvy thinking.

What would you call such a resource that is connected to a Smart Home?

Yep! That’s exactly how I see it as well.
