GPTs updates. What can we expect in the future?

Hello
It has been a long time since the last update on custom GPTs. We have not received any news about how GPTs will develop in the future, what will happen to the store, or what the future monetization vector will be for GPT authors.

I tried to find at least some information about what awaits us in the future, but unfortunately, there is nothing.

Maybe you know or have read in some sources about upcoming updates?


Hey there and welcome!

Custom GPT news has gone pretty dark for a while. That isn’t exactly unexpected, though, I’ll say that much.

The problem many GPT builders faced is that people loved building them… for themselves. Not many people use other people’s GPTs, but they do tend to use the ones they built themselves.

OpenAI seems squarely focused on agentic stuff right now, along with the rest of the industry. When this era begins to simmer down and things start to mature a little bit, we may see some kind of evolution in the form of a custom agent system, or custom agentic GPTs (maybe even with built-in MCP support or something), but that’s purely my speculation.

After being active around this space for a couple of years now, it’s clear to me that custom AI stuff isn’t going to be like the web. They tried plugins, and that failed. Then they tried custom GPTs, and excluding a couple of outliers, none of them ever really took off. Ultimately, many of the things custom GPTs did could have been done on the base models, and for the stuff that couldn’t, people just developed with APIs and coding tools, either from existing knowledge or vibe coding.

People have very strong opinions about vibe coding at the moment, but regardless, I think what happened was that instead of relying on a limited interface to build roughly what they wanted, people just used the same models to generate the code that actually did what they wanted. Which makes sense if you think about it. Custom GPTs were meant to be an alternative to coding, but they came with a very limited feature set unless you already knew how to code, effectively defeating the purpose. Instead, people realized the better “alternative” to coding something yourself was to copypasta whatever the LLMs made and just keep winging it until it did what you wanted. The barrier to programming that custom GPTs were trying to remove was instead removed by the greater intelligence of the models themselves.


Indeed, it’s a distribution problem. I think GPTs can be incredibly powerful, but discovery is just incredibly difficult for the average user. The name clash with the main product doesn’t help. The marketing department (who gave us o3, 4o, o4-mini and o4-mini-high, by the way) needs to step up their naming game :smiley:

If only there were

  1. MCP support within custom GPTs
  2. An API to query the list of GPTs

Then I could build a custom GPT to find the right custom GPT for the job :smiley:

In any case, there’s something very powerful about safely wrapping up access to a few specific APIs and being able to query them, with OpenAI authentication in front to prevent abuse. I believe this is still unique to OpenAI at this point.
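
For anyone who hasn’t built one: a GPT Action is essentially an HTTP endpoint you describe with an OpenAPI schema, plus an auth setting (API key or OAuth) that ChatGPT attaches when it calls your service. Here’s a minimal sketch of the backend side, not an official example: the /orders endpoint, the X-API-Key header and the ACTION_API_KEY secret are all made up for illustration (FastAPI used just for brevity):

```python
import os

from fastapi import FastAPI, Header, HTTPException

app = FastAPI()

# Hypothetical secret; the same value gets pasted into the Action's API-key auth config.
API_KEY = os.environ.get("ACTION_API_KEY", "change-me")

# Stand-in dataset for whatever internal API the GPT is meant to wrap.
ORDERS = {
    "A-1001": {"status": "shipped"},
    "A-1002": {"status": "processing"},
}


@app.get("/orders/{order_id}")
def get_order(order_id: str, x_api_key: str = Header(default="")):
    # Requests relayed by ChatGPT carry the configured key; anyone hitting the
    # endpoint directly without it gets rejected before touching the data.
    if x_api_key != API_KEY:
        raise HTTPException(status_code=401, detail="Invalid API key")
    order = ORDERS.get(order_id)
    if order is None:
        raise HTTPException(status_code=404, detail="Unknown order")
    return {"order_id": order_id, **order}
```

That key check is obviously not a full abuse-prevention story on its own, but combined with ChatGPT and its login sitting in front, it’s the “safely wrapping up access” part that makes exposing a narrow slice of an API feel reasonable.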

If we move to automatic selection of MCP servers, you still have the authentication and selection issues to deal with. In other words: why would it use MCP server A over server B when both offer the same kind of thing but with slightly different datasets, for example? I’m sure this is not a super simple problem to solve.

Something must be taken into account: the whole boom of GPTs and AI in general is aimed at feeding the models and improving them in the future. OpenAI is not interested in monetizing something that already benefits future models. A GPT is mostly used only by the person who created it, for their own purposes, and commercializing something that has no market makes no sense when the base models already do very well.

So in this AI boom, none of this is locked away in a laboratory, because it needs to feed on thousands of interactions with humans. I’m sorry for those who thought that GPTs would bring economic gains.

I’ve heard rumors that a model switcher will be introduced for GPTs. I’m excited about the potential of using o3 with GPTs because it would expand their scope.

Since I have published many GPTs, I would also like to have my own page on the GPT Store. It would also be useful for users to have a feature that lets them collect their favorite GPTs.

It’s a shame that most people don’t understand the true value of GPTs.
