Is it possible to create a profitable AI mobile app with OpenAI API costs?

Has anybody built and monetized a successful AI chatbot-style mobile/web app? I am working on a chatbot mobile app for a niche within the therapy space, and while I am still building out broader features and testing it, I can see how quickly this may become difficult to scale due to the cost of the AI (tokens). I am using OpenAI's GPT-3.5 Turbo and have got it to behave as I wanted. I was planning on using a freemium model, i.e., a paid monthly premium subscription plus a limited free tier with rewarded/interstitial ads. However, I have noticed many of the AI chat apps are predominantly free and not ad-supported. I have seen others go the Product Hunt release route.

I'm sure someone here is well-versed in monetization and in reducing the carry (ongoing API cost), and I was hoping you might have some wisdom to impart. Or maybe someone knows where I can find more information on this? I feel like I am stuck at an impasse where the ad revenue/subscription conversion would need to be significant just to break even on the cost of API/token use. How are these free chatbot-style apps carrying the significant running cost, let alone making a profit?
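The break-even tension described above can be put into rough numbers. This is a back-of-envelope sketch, not real pricing data: the message volumes, ad revenue, subscription price, and the roughly GPT-3.5-Turbo-era rate of $0.002 per 1K tokens are all assumptions you would replace with your own figures.

```python
# Back-of-envelope break-even for a freemium chatbot app.
# All numbers are hypothetical assumptions, not real pricing data.

def monthly_token_cost(msgs_per_user, tokens_per_msg, price_per_1k_tokens):
    """API cost per user per month, in USD."""
    return msgs_per_user * tokens_per_msg / 1000 * price_per_1k_tokens

def break_even_conversion(free_cost, sub_price, paid_cost, ad_rev_per_free_user):
    """Fraction of users that must convert to paid for revenue to cover API cost.

    Solves: c*(sub_price - paid_cost) = (1 - c)*(free_cost - ad_rev)
    """
    margin_paid = sub_price - paid_cost            # profit per paying user
    loss_free = free_cost - ad_rev_per_free_user   # loss per free user
    return loss_free / (margin_paid + loss_free)

# Assumed usage: free users send 100 msgs/mo, paid users 1000, ~500 tokens each.
free_cost = monthly_token_cost(100, 500, 0.002)   # $0.10 per free user
paid_cost = monthly_token_cost(1000, 500, 0.002)  # $1.00 per paid user
c = break_even_conversion(free_cost, sub_price=9.99,
                          paid_cost=paid_cost, ad_rev_per_free_user=0.05)
print(f"free-user cost ${free_cost:.2f}/mo, paid-user cost ${paid_cost:.2f}/mo")
print(f"break-even conversion rate: {c:.2%}")
```

Under these particular assumptions the required conversion rate comes out well under 1%, which suggests the math can work; the sensitivity is almost entirely in the free-tier message cap and per-token price.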

I’m running into similar issues. Did you end up solving the problems?

I can’t speak to this with experience because I have only, in the past week, begun advertising the “chat app” I’ve been working on for over two years. No idea who is going to buy in. Fingers crossed.

Monetarily, I am simply marking up the cost of tokens and passing that along to the end user. I believe it’s the “retail” model that businesses have been using for millennia. Will it be successful? Only time will tell.
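The retail markup model can be sketched in a few lines: meter the tokens for each request and bill the user the raw provider cost times a fixed multiple. The provider rate and the 2.5x markup below are illustrative assumptions, not anyone's actual pricing.

```python
# Sketch of the "retail markup" model: bill the end user the metered
# token cost plus a margin. Rate and markup are illustrative only.

OPENAI_PRICE_PER_1K = 0.002  # assumed provider rate, USD per 1K tokens
MARKUP = 2.5                 # 2.5x raw cost, i.e. a 150% markup

def user_charge(prompt_tokens, completion_tokens,
                provider_rate=OPENAI_PRICE_PER_1K, markup=MARKUP):
    """Amount billed to the user for one request: raw API cost times markup."""
    raw_cost = (prompt_tokens + completion_tokens) / 1000 * provider_rate
    return round(raw_cost * markup, 6)

# A 400-token prompt plus a 600-token reply costs the app $0.002 raw,
# so the user is billed $0.005 at the 2.5x multiple.
print(user_charge(400, 600))  # 0.005
```

The appeal of this design is that margin per request is fixed by construction, so heavy users cannot push you underwater the way a flat subscription can.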

As for how these other guys have been doing it for free (this original discussion is over a year old now), I suspect they raised funds and are using the “penetration pricing” tactic. Google it.

The good news is that with the variety of models now available, per-token costs have dropped dramatically. The bad news is that with the rise of “agents” and other mechanisms that use the models to extract more accurate responses, token usage has risen dramatically.

Always something, I guess.
