A user can wear many hats; it’s not a single piece of functionality.
This is all that needs to be said, in my opinion. If you’re selling to or serving enterprise customers, just build it with the API.
I think OpenAI made a mistake (and I’m sure some of them would agree) by calling it the GPT “Store”. It’s not like Apple’s App Store, not even close really.
Everyone heard the words “Store” and “monetisation”, though, and saw $ signs.
In summary, if you want to make money then just build with the API.
As far as I’m concerned, this is not an all-or-nothing problem.
Nothing prevents you from building a custom action with the API and using the GPT as your first frontend to save time.
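For example, something as small as the hypothetical FastAPI sketch below is already enough to back a custom action, since the GPT Actions configuration just consumes the OpenAPI schema the framework publishes at /openapi.json (the endpoint, names, and data here are made up for illustration):

```python
# Hypothetical backend for a GPT custom action (names and data are illustrative).
# FastAPI serves an OpenAPI schema at /openapi.json, which is what the GPT
# Actions configuration expects; run with: uvicorn main:app
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="Order lookup action (example)")

class Order(BaseModel):
    order_id: str
    status: str
    total: float

# Stand-in for a real database or ERP call.
FAKE_ORDERS = {"SO-1001": Order(order_id="SO-1001", status="shipped", total=129.90)}

@app.get("/orders/{order_id}", response_model=Order, operation_id="getOrder")
def get_order(order_id: str) -> Order:
    """Return one sales order; the GPT calls this endpoint through its custom action."""
    return FAKE_ORDERS.get(order_id, Order(order_id=order_id, status="not_found", total=0.0))
```

The same endpoint can later serve your own frontend or a paid API, so nothing is thrown away if you outgrow the GPT.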
Keymate achieved over $100K in MRR thanks to their plugin/GPT, so saying “don’t use GPT just use the API” seems a bit reductive to me.
I think this is a valid discussion about how to monetize, whether through a “store” or through something like a partner program.
GPTs are a powerful tool that can definitely be extended to several industries, and the entry barrier is quite low. They have their strengths; what’s lacking now is that one missing piece: monetization and governance.
We can also load a GPT with historical data and build knowledge on top of it. We can’t always rely on the API as the single data source for GPTs.
It’s not only about monetization but also about the distribution channels and controls for a GPT. It’s about the governance of the products (GPTs).
Yes, I think many people don’t realize the huge amounts of compute and data required to appear to act like a human.
And ChatGPT launching for free reinforces the assumption that this just happens. I realize that OAI had to seed the market and create awareness, but we got (perhaps too) quickly to the point where the field has to be plowed in neat lines to facilitate the harvest.
An ISV Partner Program makes a lot of sense, and such programs have been around at least since the ’80s. Remember dBase III, anyone? Oops, I just dated myself.
However, the first order of business is for OAI to find a better way to manage capacity, performance and reliability.
A Partner Program could help pay for this. And it helps enlist the troops (us) for OAI before Google or others do.
So don’t use it.
Why not? Why let this opportunity go? I’m confused. Anyway, there will be other AI providers building GPT-like offerings, possibly with even more functionality, and we’re looking at a whole market for these platforms. It’s just a matter of time.
Which, I think, should answer your question as to why OpenAI doesn’t let you just sell their UI any way you want.
Which is PRECISELY why you should develop your IP using the API. With the API, you can use any provider model you choose. If your IP is based on the OpenAI GPT, then it’s not just your IP anymore, is it?
Precisely what I’ve done. I have a complete AI Knowledgebase Chat System coded from the ground up in PHP. Was it hard? Yes. But it works, and while it primarily uses OpenAI models (via API) for compute, it is designed to use any major provider model we want – just in case OpenAI decides to change the rules midstream.
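For the curious, the provider-swap idea is roughly the following; this is just a minimal Python sketch of the pattern (my actual system is PHP, and the class and model names here are illustrative assumptions):

```python
# Minimal sketch of the "swap providers behind one interface" pattern.
# The real system described above is PHP; everything here is illustrative.
from typing import Protocol

class ChatProvider(Protocol):
    def complete(self, prompt: str) -> str: ...

class OpenAIProvider:
    def __init__(self, client):
        self.client = client  # e.g. an openai.OpenAI() instance

    def complete(self, prompt: str) -> str:
        resp = self.client.chat.completions.create(
            model="gpt-4o",  # model choice is an assumption
            messages=[{"role": "user", "content": prompt}],
        )
        return resp.choices[0].message.content

class OtherVendorProvider:
    """Drop-in replacement wired to another vendor's SDK."""
    def complete(self, prompt: str) -> str:
        raise NotImplementedError("plug in the other vendor's API call here")

def answer(question: str, provider: ChatProvider) -> str:
    # Application code depends only on the ChatProvider interface,
    # so switching vendors never touches the rest of the system.
    return provider.complete(question)
```

Everything else in the system sits above that interface, which is what keeps the IP independent of whose model runs underneath.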
However, as icing on the cake, I now have a GPT which accesses this standalone system via its API. The best of both worlds, as they say.
So, if the product is worth anything, I’ll be able to charge people for access via the GPT, via the standalone frontend, or via its API backend. Sweet.
This is the power, and benefit, of developing your own thing.
I don’t think you got my point.
It’s not about selling their UI any way I want; that’s not the case. I’ll sell their engine, configured the way my customers need it. I have to solve their problems, and the GPT is the perfect solution for that. However, there’s a lack of governance: without roles and a deployment model, it’s impossible to monetize.
If there were a minimum of control, the revenue stream for OpenAI would be gigantic. We could develop GPTs connecting to enterprise systems, for example, with all the content and fees flowing to OpenAI, and the customer would pay not only for the usage but also for support from us (the ISVs).
The API is fine, and there are a lot of low-code/no-code solutions to plug in; however, it’s the versatility of the GPT that makes all the difference. I have GPTs that plot charts from sales orders and that list and approve purchase orders in ERP systems. My GPTs are fully functional and integrated with enterprise systems; the only missing piece is governance.
I’m not saying all GPTs should be opened up in the enterprise, but there should at least be access controls that help admins deploy, publish, and monitor GPTs, whether for cost, usage, guardrails, security, or troubleshooting.
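To make that concrete: today the only place we can enforce those controls is in the action backend itself, because the platform does not expose them. A hypothetical sketch (keys, roles, and endpoints are all made up):

```python
# Hypothetical sketch: role checks enforced in the custom-action backend,
# since the GPT platform itself does not offer role-based controls today.
from fastapi import FastAPI, Header, HTTPException

app = FastAPI(title="Purchase order approval action (example)")

# Stand-in for a real identity/role lookup (e.g. against an IdP or the ERP).
API_KEY_ROLES = {"key-for-approver": "po_approver", "key-for-viewer": "viewer"}

@app.post("/purchase-orders/{po_id}/approve", operation_id="approvePurchaseOrder")
def approve_po(po_id: str, x_api_key: str = Header(...)):
    role = API_KEY_ROLES.get(x_api_key)
    if role != "po_approver":
        # Segregation of duties: only approvers may approve, no matter what the GPT asks.
        raise HTTPException(status_code=403, detail="not allowed to approve purchase orders")
    # A real system would now call the ERP to mark the PO approved.
    return {"po_id": po_id, "status": "approved"}
```

That works, but it puts the whole governance burden on each ISV’s backend; what’s missing is the same kind of control at the GPT deployment level.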
I think there is a lot of space for improvement, and I hope they don’t abandon GPTs like Google does with the majority of its projects.
Again, nothing against the APIs, but the GPT idea is killer; it just needs some “fine-tuning”.
This is my point: It was not designed to make you money. It was designed to make OpenAI money. Which, by definition, does not make it the “perfect solution” you believe it to be. It saves you the hassle of coding the interface yourself, and the only cost to you is – loss of the monetization methods you seek.
So, in the end, it’s not the free lunch it appears to be.
Personally, I feel that as builders we should be patient and focus on building cool and useful customized GPTs first.
100%, build cool stuff to share first and foremost. If you’re looking to make a living off it, go ahead and apply what you’ve learned to build a product using the API.
Again, I think you did not understand my point:
OpenAI will still have its revenue stream: it will still be paid for (1) the GPT, (2) new users, and (3) tokens.
It’s not free. They will keep making the same money to fund their operations and make a profit.
However, by providing refined access-control mechanisms for a GPT, you can scale your IP. Instead of rebuilding the same thing over and over, it becomes a matter of deploying it to another customer safely and securely, with controls, and everyone is happy. It’s a win-win situation.
I never considered it a “free lunch.” I’m on the enterprise side of things, and I have customers willing to pay for it, but it’s difficult to scale this operation using the GPT engine.
I know I can use the API, but the GPT engine is already there; it’s just a matter of organizing and deploying it with the right permissions and controls. Not a free lunch. Many companies would pay a lot for these capabilities, which would save them the burden of building their own models to interact with their enterprise systems, but inside companies we need to apply some segregation of duties.
The story is different when building for enterprises: at minimum, some controls are required.
The current problem is that there isn’t a quality search and ranking system to identify high-quality GPTs.
I think that’s one of many problems with the GPT Store.
They have a great idea; they just haven’t executed it as well as most would hope.
I am not worried about that, since my GPTs are not aimed at the general public. I could “sell” my GPTs within my industry. My problem is that I can’t sell them.
The bigger problem is they’re not your GPTs.