Are Custom GPTs replacing Plugins?

Hi there,

It seems pretty obvious by now that Custom GPTs will replace plugins. They allow for pretty much everything you could do via plugins (through actions), plus let you provide more context, combine that with web browsing, and set conversation starters. On top of that, they are much more discoverable than plugins.

Last but not least, I think Custom GPTs provide a much better experience than plugins, given that enabling and disabling plugins inside your conversation all the time felt really cumbersome.

If somebody from OpenAI could chime in and let us know if we should expect plugins to be phased out any time soon (or not), that would be great.

Cheers from Sydney,

Sascha (AI Forms, Checklists & Workflows)

Hey there!

So, I’m not OpenAI staff (and interacting with one on here is rare at best), but I can validate that yes, it is their intention to phase out plugins. They have already asked plugin maintainers to try building a custom GPT version of their plugin, because they do want to eventually deprecate plugins. They’re not shy about that anymore.


The language in the service terms makes the connection between plugins and Custom GPTs (specifically actions) much more explicit.



Is it OK if a moderator closes this topic?


Right now I use several plugins daily in the same context: one to read PDFs, one to search the web, one to analyse GitHub, a couple of different ones to draw diagrams, and sometimes Wolfram Alpha (obviously there is a maximum of three per chat, so I mix and match depending on the task).
None of these are particularly useful on their own, but combined they are amazing.
Am I correct in saying that GPTs will have no way of replicating this functionality of using a variety of plugins together in the same context?


You make an interesting observation.

With plugins users were able to mix-and-match functionality provided by different developers—this was very powerful and very useful.

It was also unpredictable and uncontrollable.

It’s certainly possible that some combination of plugins could result in some kind of unaligned behavior/outputs from the models.

I’m not suggesting this is why plugins are being phased out, just making an observation.

I don’t see “plugins” as modules coming back any time soon—the focus seems to be on discrete GPTs.


As mentioned above, this is one of the drawbacks when it comes to GPTs over Plugins.


If this particular combination is useful to you, have you considered constructing the combination itself into a GPT in some way? :slightly_smiling_face: We can’t mix and match as users, but as developers, knowing a good combination is a good step towards building a practical and functional GPT many users can enjoy.


I do have a hunch that soon we will have the ability to route & chain GPTs, similar(ish) to Plugins.

I recall mixing plugins led to a lot of unfun chaos.


I mean, I hope so.

You can definitely tell OAI is experimenting with this as much as we are lol.

It’s like I have this weird feeling something is just missing from both plugins and GPTs. It’s not the popularity or eagerness to develop, but some way in which novel, useful tools can be built seems to be tough for everyone to figure out right now. We all want to make useful tools, OAI wants developers to make useful tools, yet we’re all kind of bumbling around trying to figure out how to actually make that happen.


I am truly under the belief that the whole purpose behind these GPTs (& ChatGPT in general) is to gather/create/curate training data & metrics.

How can I develop a GPT if I have no idea how it’s being used, still have no clear idea of monetization, and only have a single “initiated conversations” metric?

How can I use this store when I am limited to 25 messages, have no essential metrics like user reviews/feedback, and 80% of the GPTs I see in the top 6 listings can be easily replicated by myself, more personalized to what I want, within 5 minutes?

I can see myself using a GPT as an extension to a third-party service I already use. Such as a scheduling app, or a workout tracker. But… I mean, they would probably have their own AI agent app.

I was really hoping Assistants would take us there. It’s been 3 months without any sort of noticeable update :smiling_face_with_tear:. Which I wouldn’t mind if it wasn’t so closed-in. I seriously do not understand why we cannot add Assistant messages or manipulate the conversation.

So now, I’m under the belief that providers like AWS will come through with some seriously powerful tools that allow us to use specialized models for specialized tasks, that along with RAG will be beast mode.

I’d love to use GPT-4 to take in a message and route it, use specialized models to process it, run the logic, and then have GPT-4 crunch it all together for a response. It’s just that in-between bit & latency that’s hard as ufkc
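That route-then-aggregate idea can be sketched in plain Python. The "specialists" below are stand-in functions, and the keyword-based router is an invented placeholder for what would really be a cheap classifier model call; none of this comes from an actual API:

```python
# Sketch of the route -> specialize -> aggregate pattern described above.
# The routing rules and specialist names are made up for illustration;
# in practice each function would be a call to a model or tool.

def route(message: str) -> str:
    """A cheap router: in practice this could be a small, fast model call."""
    lowered = message.lower()
    if any(w in lowered for w in ("plot", "graph", "chart")):
        return "charting"
    if any(w in lowered for w in ("sum", "solve", "integrate")):
        return "math"
    return "general"

def run_specialist(task: str, message: str) -> str:
    # Each branch stands in for a specialized model or tool.
    specialists = {
        "charting": lambda m: f"[chart spec for: {m}]",
        "math":     lambda m: f"[worked solution for: {m}]",
        "general":  lambda m: f"[direct answer to: {m}]",
    }
    return specialists[task](message)

def respond(message: str) -> str:
    task = route(message)
    partial = run_specialist(task, message)
    # A final model call would "crunch it all together"; here we just format.
    return f"({task}) {partial}"
```

The in-between bit is exactly the hard part: every hop in `route -> run_specialist -> respond` is another round trip, which is where the latency piles up.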


80% of the GPTs I see in the top 6 listings can be easily replicated by myself and be more personalized to what I want within 5 minutes.

It is not just you, my friend. Even media outlets seem to be reporting a similar feeling on all of this.

I think once Assistants evolve a bit, we’ll finally be able to break that dam. These must be like the pain points of the ’90s era nobody talks about as much, because the transitional phase is slow and boring before people can really start making headway.


The journalist’s GPT doesn’t even need to be a Custom GPT :laughing:.

Which is what I eventually found for myself. I had a Coding GPT, Gym GPT, Designer GPT, Research GPT, Car Maintenance GPT.

Mainly because of the halved message limit, I had a moment to myself like “Why am I using these…” So I’ve reverted back to plain ChatGPT :person_shrugging:.

For people willing to test, tinker, and risk: it’s a time to be able to explore uncharted lands. Mostly water & expenses but I’ll be damned if it’s not an adventure.

Hooking onto slow, mostly walled-off ships with no communication can be very boring, but still an investment.

I spent an obviously large amount of time creating an image to represent this (DALL-E apparently can’t do boats on top of a tidal wave). (Total fatalities from both boats in this photo, probably.)


The true spirit of this community summed up in one sentence right here. At least from my POV :rofl:.


I expressed similar sentiment in another topic. I see very little value added in almost any of the GPTs I have seen.

It’s just that everyone’s needs are so different and personalized.

Myself, I made a GPT that has complete details of my desktop machine and all of the software installed on it, so that when I ask it a question about some issue I am having I don’t need to specify details ad nauseam or wait for it to pump out unhelpful Windows or macOS answers before getting to Linux.

I also gave it shell access so it can query for more information or run commands and get back error messages.

Another nice thing is I can use ChatGPT to restart services or do other mundane tasks if I need to.

This wouldn’t be useful to anyone else, but it is super useful to me.

Now, I could set up OAuth, store user PC info in a database and feed it to the models and slap together a relay to a Flask app users would install to give the LLM shell access on their own PCs, but I am not overly entrepreneurial and I have zero interest in being responsible for securing that sort of thing.
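The security worry is fair. For what it’s worth, a shell-access action can be kept sane with a command allowlist. This is just a sketch, assuming the GPT action hits a small local endpoint that wraps something like the function below; the allowlisted command names are examples, not a recommendation:

```python
# Minimal sketch of "shell access as an action" with an allowlist,
# so the model can only run pre-approved commands. In a real setup
# this function would sit behind an authenticated local HTTP endpoint
# that the GPT's action schema points at.

import shlex
import subprocess

ALLOWED = {"uptime", "df", "systemctl", "echo"}  # example allowlist

def run_command(command_line: str) -> str:
    parts = shlex.split(command_line)
    if not parts or parts[0] not in ALLOWED:
        # Refuse anything not explicitly approved.
        name = parts[0] if parts else ""
        return f"refused: '{name}' is not on the allowlist"
    result = subprocess.run(parts, capture_output=True, text=True, timeout=10)
    return result.stdout or result.stderr
```

Even with an allowlist, arguments still need vetting (e.g. `systemctl` can do a lot), which is part of why shipping this to other people’s machines is a liability nobody is eager to own.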

Most of what I have seen in the GPT Store consists of some instruction sets anyone could come up with in a few minutes if they wanted to, or a bunch of knowledge files likely scraped from the internet which just about anyone could acquire if they wanted to.

I’m not seeing very many things where I am like, “Oh, that’s clever/interesting/useful/etc.” Just a lot of going after low-hanging fruit and hoping being “FIRST!” is enough to net a big payday.

Beyond that, with most of these instruction+knowledge GPTs, it is an open question if they can do what they claim to specialize in any better than base GPT-4.


See, it’s conversations like these that make me go:

A. Clearly the way in which we all use GPT to help us is very personalized, making personalization the likely next step in this process after the GPT Store. GPTs won’t seem very complementary after that, aside from the tool use (which could be extremely easy to make a feature of your own ChatGPT experience).

B. The real money is going to be when we can make personalized GPTs talk to each other


What I keep saying is, “in a gold rush, sell shovels.”

Outside of a few of the very best GPTs out there, I didn’t see many that will likely make any real money—at least not so far.

I suspect where the real money will be is in providing backend services to individuals and other builders in a no-code framework.

@matcha72 with gpt-auth has the right idea—even if I don’t particularly care for the implementation. Building tools that builders (and users) can drop into their own GPTs as actions for a nominal fee is a better path forward for profitability as a developer.

Set up your API endpoint, then for each builder/user you service give them an API key, a URL for a unique schema, and a privacy policy link, and they can :hand_with_index_finger_and_thumb_crossed: have a value-added service up and running in no time.

Hell, get a bunch of open or low-cost APIs together, set up your endpoint as a relay to them, and you can just let people buy actions à la carte from a single endpoint, dynamically generating the schema based on what services they want…

The value comes in building or aggregating new, novel, interesting, and useful actions.
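Since GPT actions are configured with an OpenAPI document, the "dynamically generate the schema" part could look something like this. The service catalog, paths, and base URL here are all invented for illustration:

```python
# Sketch: given the actions a builder has "bought", emit a minimal
# OpenAPI 3 document they could paste into their GPT's action config.
# The catalog entries are hypothetical example services.

CATALOG = {
    "weather":  {"path": "/weather",  "summary": "Current weather for a city"},
    "stocks":   {"path": "/stocks",   "summary": "Latest quote for a ticker"},
    "calendar": {"path": "/calendar", "summary": "List upcoming events"},
}

def build_schema(base_url: str, selected: list[str]) -> dict:
    paths = {}
    for name in selected:
        svc = CATALOG[name]
        paths[svc["path"]] = {
            "get": {
                "operationId": name,
                "summary": svc["summary"],
                "responses": {"200": {"description": "OK"}},
            }
        }
    return {
        "openapi": "3.1.0",
        "info": {"title": "A la carte actions", "version": "1.0.0"},
        "servers": [{"url": base_url}],
        "paths": paths,
    }
```

Each customer gets a schema containing only the operations they pay for, all relayed through the one endpoint.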


I really would prefer not to invest hours into replacing functionality which already exists, and it’s not clear that it’s even possible to call vendors’ cGPTs from my own.

I never had any issues with unexpected interactions, and was happily paying one of these plugin providers US$60 per month for access to their RAG service.

Now I won’t be paying for it anymore because it’s not useful without all the other plugins. Good job, OpenAI.

Personally I don’t see what the benefit of cGPTs is even supposed to be.

Each context is a ‘personalised GPT’ already, and if necessary I can copy/paste a prompt to get it into the state I want, which is all a cGPT seems to be doing for the typical user who isn’t crafting API/actions.

I summarized some thoughts around this in a recent blog post of mine: Why Custom GPTs are better than plugins, for creators and users!

I agree with others that being able to enable CustomGPTs talking to each other is the next big milestone.

The good news: that day is (sort of) here. You can use @ to activate a GPT in a thread and then a few messages later use a different @ to call a different GPT. Seems the GPT-to-GPT exchange would need to go via the conversation window (i.e. not direct exchange behind the scenes). But it is way better than what we have now with just one CustomGPT in a thread.

The bad news: this beta feature seems to only be available to a few users. If you are one, please say how you are finding it and how you think you got early access.