Is it still worth it to develop and maintain GPTs?

It’s been almost six months since revenue sharing for GPTs was teased, but so far we haven’t heard any news about it.
Is it still worth it to develop and maintain GPTs?

  • Yes
  • Nah, it’s a waste of time and money
0 voters

If you’re looking to make profit: Probably not.

If you just want to augment the current ChatGPT experience w/ a system prompt, RAG, & Actions: yes, 100%.

I have my own customized cGPTs that work wonderfully.
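
For anyone wondering what the Actions part looks like in practice: the GPT editor just points at an OpenAPI schema describing your own backend. Here is a minimal, hypothetical sketch of such a backend in Python/FastAPI; the endpoint name, request fields, and lookup logic are invented for illustration, not a prescribed pattern.

```python
# Minimal sketch of a backend a custom GPT Action could call.
# Endpoint name, fields, and the knowledge-base lookup are hypothetical;
# the real Action is described to ChatGPT via an OpenAPI schema that
# points at a server like this one.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class LookupRequest(BaseModel):
    query: str          # free-text question forwarded by the GPT
    top_k: int = 3      # how many knowledge-base snippets to return

@app.post("/lookup")
def lookup(req: LookupRequest) -> dict:
    # Placeholder for your own retrieval logic (vector store, SQL, etc.).
    snippets = [f"Stub result {i} for: {req.query}" for i in range(req.top_k)]
    return {"results": snippets}
```

The system prompt and RAG files are configured separately in the GPT editor; the Action only covers the call-out to your own service.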

7 Likes

Developing and maintaining GPTs can still be worth it, especially if they serve specific needs or niches that generic models don’t fully address. By creating tailored GPTs, you can ensure the model is fine-tuned to handle specialized tasks, cater to unique audiences, or maintain certain ethical or operational guidelines specific to your project or organization. Additionally, maintaining your own GPTs allows for greater control over updates, privacy, and data handling, which can be crucial depending on the application. However, the decision ultimately depends on the value these custom models bring compared to the cost and effort of development and maintenance.

1 Like

I don’t think there’s profit in the large public GPTs. The ones with the largest conversation bases are all very public-spirited.

I do, however, think there is a great opportunity to profit by helping small businesses operate and maintain their own advanced private GPTs.

2 Likes

This is a very good point. I’m doing a lot of this these days. Not specifically creating GPTs, but rather using our own technology to deliver “custom AI apps” to clients looking for some sort of specialised AI app that lets them solve business needs.

2 Likes

The situation regarding the status of the GPT Store is frustrating. That said, there are still immense benefits to creating GPTs and publishing them to the GPT Store, such as enabling access to your custom Assistants (which is what these GPTs are) through store exploration, rather than solely from your own website.

[Animated demo: cheapest Virtuoso offer via Custom GPT, using the Virtuoso Assistant]

Here are examples of Smart Agents available in OpenAI’s Custom GPT Store that embody these principles:

  • OpenLink Data Twingler: Executes SQL, SPARQL, or GraphQL queries directly from a ChatGPT session using various language models (a rough query sketch follows this list). Click here to watch an animated demo.

  • Virtuoso Support Assistant: Provides expert-level support based on knowledge from curated knowledge bases (or knowledge graphs) and product documentation. Click here to watch an animated demo.

  • ODBC & JDBC Connectivity Assistant: Offers expert product support based on curated sources. Click here to watch an animated demo.

  • News Reading Assistant: Reads news from sources that publish RSS, Atom, or OPML feeds. Click here to watch an animated demo.
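
For the curious, here is a rough idea of what the query-execution side of something like the Data Twingler could look like behind an Action. This is a sketch under my own assumptions, not OpenLink’s actual implementation; the endpoint URL (DBpedia’s public Virtuoso instance) and the query are just examples.

```python
# Illustrative only: how an Action behind a query-running GPT might
# forward a SPARQL query to an endpoint and return JSON results.
# The endpoint URL and query below are examples, not the Twingler's
# actual implementation.
import requests

SPARQL_ENDPOINT = "https://dbpedia.org/sparql"  # a public Virtuoso instance

def run_sparql(query: str) -> list[dict]:
    resp = requests.get(
        SPARQL_ENDPOINT,
        params={"query": query, "format": "application/sparql-results+json"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["results"]["bindings"]

if __name__ == "__main__":
    rows = run_sparql("SELECT ?s WHERE { ?s a <http://dbpedia.org/ontology/City> } LIMIT 5")
    for row in rows:
        print(row["s"]["value"])
```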


Absolutely!
This is one of the most important trends in the community: the ability to share effort and collective wisdom. I don’t expect it to happen overnight, but in time this is one of the ways to leverage network effects and create yet another S-curve in the AI industry.

2 Likes

There are also ways to monetize your GPT on your own, directly in the chat field.

Yes, but the GPT Store is still a very important piece regarding network effects. We need to encourage OpenAI to continue this effort, since they’ve gone somewhat quiet about it of late.

I don’t believe there’s profit in the huge public GPTs; the ones with the biggest conversation bases are all very public-spirited. I do, however, think there is a great opportunity to profit by helping small businesses operate and maintain their own advanced private GPTs. You can also get help from chatbot development companies, who can give you ideas on how to maximize effectiveness and so on.

1 Like

I am this small business and have several ideas that will, at the very least, attract new clients. I think there is an app to be built and sold as well, but we need to understand the most efficient way to pursue monetizing these concepts.

:money_with_wings:

I would love to connect if you have experience and would be interested in us paying you for some consulting time on how we proceed!

Jane

Hi Jane, I posted my email address so that you could contact me, as you requested. But some of the moderators here are a little bit too trigger-happy every now and then. Not sure how we’d get in touch here. Maybe if you post your email address or some contact credentials…?

To the moderators: Jane wants to get in touch with me; she asked to connect with me. How would we go about getting in touch? I posted my email address (only), and the post was flagged. How can I get in touch with Jane, as she requested, without violating policies here or getting flagged?

Discoverability is a challenge that better search options can solve. The concept of publicly accessible Custom GPTs is incredibly beneficial, especially as AI Agents are set to replace traditional apps. The GPT Store could ultimately replace Apple’s App Store, unless Apple evolves in a similar direction.

This is just simply not true.

GPTs and Apps serve completely different purposes.

It may be that GPTs can be used on top of apps, but they are by no means the same thing. I’m sorry for the nitpick, but it’s a common belief that drives me crazy. Since the release of LLMs, everyone has been running around looking for a nail to hit with this new hammer.

It’s far preferable to use a visual interface that cleanly and rapidly delivers information, alongside a set of interactive tools for working with those elements, than to use a GPT.

For example: if I want to book a trip, I want to visualize the trip and see all the locations, costs, and timings within a menu that I can sort and filter. Tap, tap, tap, done. It remembers me, tracks everything, and everything works fine without me spending anything besides the actual fares for the trip.

I do not want to have to explain everything and verbally understand where the locations are, the costs, and the timings. The only exception would be when my hands are occupied.


Here’s where I think the difference is:

If I did want to use an LLM or whatever for booking a trip, I would want to use it WITHIN the app’s framework. So either:

A) The app provides this experience
or
B) I have my own PERSONAL LLM that can communicate with the app through some eventual LLM protocol that apps will have
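
To make option B concrete, here is a purely hypothetical sketch of what an app-exposed tool for a personal LLM could look like, written in the style of OpenAI-style function/tool definitions. The tool name, parameters, and handler are all invented; the protocol itself doesn’t exist yet.

```python
# Hypothetical sketch of option B: an app exposes a small set of "tools"
# (here in the style of OpenAI-style function/tool definitions) that a
# personal LLM could call. The tool name, parameters, and booking logic
# are invented for illustration.
import json

BOOKING_TOOL = {
    "type": "function",
    "function": {
        "name": "search_trips",
        "description": "Search the app's trip inventory and return sortable results.",
        "parameters": {
            "type": "object",
            "properties": {
                "origin": {"type": "string"},
                "destination": {"type": "string"},
                "date": {"type": "string", "description": "ISO date, e.g. 2024-07-01"},
                "max_price": {"type": "number"},
            },
            "required": ["origin", "destination", "date"],
        },
    },
}

def handle_tool_call(name: str, arguments: str) -> dict:
    """App-side handler that a personal LLM's tool call would be routed to."""
    args = json.loads(arguments)
    if name == "search_trips":
        # Placeholder for the app's real search; results would still be
        # rendered in the app's own visual UI.
        return {"results": [{"trip": f"{args['origin']} -> {args['destination']}", "price": 120.0}]}
    raise ValueError(f"Unknown tool: {name}")
```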

1 Like

Monetization concepts could include:

  1. Metered access to Custom GPTs (a minimal metering sketch follows this list).
  2. A revenue-sharing model between OpenAI and Custom GPT providers based on their contributions to GPT usage.
  3. A combination of both of the above.
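
As a thought experiment for concept 1, metered access could be as simple as counting calls per user against a quota before an Action request is served. Everything below (in-memory storage, identifiers, limits) is hypothetical; a real system would persist usage and hook it into billing.

```python
# Hypothetical sketch of "metered access" (concept 1): the Action backend
# counts calls per user and rejects requests past a quota. Storage,
# identifiers, and limits are invented for illustration.
from collections import defaultdict

FREE_CALLS_PER_MONTH = 50  # example quota

class Meter:
    def __init__(self, quota: int = FREE_CALLS_PER_MONTH):
        self.quota = quota
        self.usage: dict[str, int] = defaultdict(int)

    def allow(self, user_id: str) -> bool:
        """Record one call for user_id; return False once the quota is spent."""
        if self.usage[user_id] >= self.quota:
            return False
        self.usage[user_id] += 1
        return True

meter = Meter()
if not meter.allow("user-123"):
    raise PermissionError("Quota exceeded; prompt the user to upgrade.")
```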