Did ChatGPT just kill all AI writers?

It seems like you can now get a better output with ChatGPT than with any other AI writer out there.

Do you believe OpenAI just killed all these businesses with the new tool? Why would anyone pay for Jasper/Copy.ai/whatever subscription when they could get better results with ChatGPT for a fraction of the price?

Except ChatGPT doesn’t like to produce NSFW results, so a large segment of the market is still up for grabs.

Of course, you can trick ChatGPT into producing some, as many Twitter threads have shown, but it’s against the terms of service, and I imagine they would crack down on it if people started using it to produce NSFW content commercially.

It does look like it replaces basic “content marketing” articles, though: the cheap ones that used to pay about $5-$10 a pop. It does them better, and instantly, than cheap writers.

Higher-end writing will still need some human involvement, though, whether that’s rewriting, editing, etc.


FWIW, I kept working on the tool I was building for my target market, but my wife, who works in that market, spotted people on her Twitter feed who’d already begun experimenting with exactly the idea I had.

So I have given up on that.

I initially questioned why OpenAI would essentially create a free and useful “form factor” that obsoletes its own customers paying for its API.

However, I speculate that it’s to get in front of other people who were working on a similar, super accessible chat format that could have gone viral. This way they can control the experience and the backend resource usage.

@lukepuplett - yeah, but it appears to me that they’re basically saying to all customers of their customers - “Hey, we have something better, why don’t you ditch your current vendor and use the real stuff for free?”

I believe that you guys are looking at the business and monetary opportunities in NLP and generative AI in the wrong way.

In my opinion, Sam Altman and the OpenAI crew are not in the business of helping people write Ad Copy, generate Amazon E-Books, or anything like that.

If you’ve watched any of the fireside chats where Sam talks about the future of OpenAI and technologies like these, he mentions that we, as humans, cannot even conceive of all the infinite ways to implement and improve the world using technology such as this, and that we are possibly decades away from generative AI using the data it has stored to discover new cures or make real suggestions about how we might change the world at a rapid pace.

If you want to build an AI Amazon e-book writing assistant, go for it. You can probably grow it to be quite large, and GPT Chat won’t take your business, because, again, OpenAI is not going to develop a suite of tools that also converts the AI-generated text into an e-book for you, designs it, etc.

Same goes for pretty much any other generative AI business model. Let’s say you want to use GPT-3 to generate quotes and put them on t-shirts to sell in a Shopify store. Go for it; Sam Altman is not going to create a GPT-3 Chat Shopify integration that completely decimates the value you’re bringing to the market.

There are tons of opportunities flying around in the wind right now my brotha, simply reach out, grab one, and hang on. Pretty much any idea you can think of will be implemented one day by someone, that person might as well be you.

Don’t let competition or GPT Chat scare you from building the future as you see it. If anything, this only creates more people who are willing to try and use other AI products.

If you look at GPT Chat as a 1-dimensional experience, an e-book as a 2-dimensional experience, and a physical item as a 3-dimensional experience, then focus on building 2D and 3D experiences for your users using GPT and similar technologies, and you should do okay.


ChatGPT has proved that having humans in the loop makes for a better product. And DALL-E and ChatGPT both proved that having direct end-user offerings generates a lot of benefits (marketing and potential revenues) from end-users… perhaps more than an API ever could. Prior to this, direct access to end users wasn’t possible because OpenAI hadn’t figured out a way to manage the safety issues, but now they have (mostly). I suspect this means we’ll see a lot more of this… which necessarily means forward integrating into some of the things API users are doing.

I’m also guessing this is the format GPT-4 will arrive in… pre-packaged to manage safety issues, with a direct access option for end users, not just API/account holders. It would make sense that’s what they’re working on, given the timing of ChatGPT and the fact that GPT-4 has also reportedly been accessible internally (and to a handful of outsiders for some months already). Hopefully, when GPT-4 is all safely packaged up, there is still API access for us too. I can’t wait!

I stopped working on my primary project to quickly build a tool to solve a problem my wife’s had for years, but it’s now far less likely her colleagues would pay when they can use ChatGPT instead.

The value prop goes from paying for the “wow” time-saver AI output to paying for the “mhmm that’s useful” assistive/guided UI around it.

It’s no longer worth the trade in focus from my primary project. I’m better off investigating the ways I think my primary app can benefit from an LLM. I’ve noticed from playing with ChatGPT and seeing others that it can do some surprising, weird and useful stuff.

Although your point is appealing for its loftiness, there were people sinking time and money into building tools to write things and save people time. We have to be pragmatic and consider the impact of the proliferation of ChatGPT know-how, and the cost of the opportunity forgone in pursuing an alternative.

It’s true… but don’t forget that OpenAI is eating HUGE costs by making ChatGPT free right now. If they keep signing up people and end up with 50-100 million users, I can’t imagine that wouldn’t sink them. We already see signs of scale challenges. One can only assume they need to find a paying business model to make it sustainable past this test period. Once they begin charging what it actually costs them (I assume they will eventually, though I could be wrong), I suspect this is where the developer community will find its niche. It’s hard to compete with free, yes, but I can only assume they’re taking a bath by doing so, and that it’s only temporary.


There’s an article on Reuters today talking about those costs, with Sam saying that it can’t remain free forever.


My focus was solely related to AI writers.

It most probably won’t remain free; that’s expected. What was unexpected to me is the fact that many people and organizations (including myself) were building apps on top of their infrastructure, which is now open to the public, more powerful than what we have through the API, and basically covers most of the use cases related to writing textual content.

My thinking is that Jarvis, Copy.ai, and others have just been made obsolete by their own service provider, which figured out its own course of action based on these businesses’ experience and use cases.

Currently, I’m in a waiting mode, trying to figure out where they want to go so I can make a pivot and differentiate from the core service.


For those who are building products on top of GPT3, it’s important to have a strong understanding of the underlying AI principles. As Kurt Lewin said, “there is nothing more practical than a good theory.”

Having a solid foundation in AI will not only help you create a better product, it will give you the confidence to adapt and evolve your product even in the current competitive landscape.

So, if you’re just starting out, it might be a good idea to brush up on the basics by reading a textbook (I recommend the one by Russell and Norvig). Trust me, it’ll be worth it in the long run.


The way I look at it is that I’m offering convenience with my generators (DALLE2 and GPT-3)… Yes, you can go to the source (Playground) and do it yourself, but my RPG tools save time. It’s like how you pay more for a small bottle of soda at a convenience store or gas station: you’re paying for the convenience.

There are also other things to think about, like storing content, charging customers, etc., so there’s other infrastructure built around the idea. Jasper/Jarvis got way ahead in the beginning (via a good marketing plan?) and has stayed one of the leaders in the space, I think.


I agree with this.
With digital products you have to take into consideration scaling the product up and working on it continuously.
Because if you rely on an innovation like GPT-3, it’s naive to think that you’ll be the only one using it.

There are cases when indeed, a new technology just makes your digital product obsolete.
The best you can do in this case is to adopt that technology.
Here’s an example: vector search, the technology that makes OpenAI embeddings (and others) usable in practice, became relatively popular in the past couple of years. Algolia has been a leader in the search services market, but it didn’t have vector search technology. So they acquired a company to catch up with the competition.
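To make the vector-search idea concrete, here is a minimal sketch of the core operation: given embedding vectors for some documents, rank them against a query vector by cosine similarity. The document names and the tiny 3-dimensional vectors below are made up purely for illustration; real embeddings have hundreds or thousands of dimensions, and production systems use approximate nearest-neighbor indexes rather than a brute-force scan like this.

```python
import math

def cosine_similarity(a, b):
    # cos(theta) = (a . b) / (|a| * |b|)
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def search(query_vec, doc_vecs):
    # Return (doc_id, score) pairs ranked by similarity, best first.
    scored = [(doc_id, cosine_similarity(query_vec, vec))
              for doc_id, vec in doc_vecs.items()]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

# Toy 3-dimensional "embeddings" (hypothetical values, not real model output).
docs = {
    "pricing page":  [0.9, 0.1, 0.0],
    "api reference": [0.1, 0.9, 0.2],
    "blog post":     [0.4, 0.4, 0.4],
}

ranking = search([0.8, 0.2, 0.1], docs)
print(ranking[0][0])  # -> pricing page (closest direction to the query)
```

The brute-force scan is O(number of documents) per query, which is exactly why dedicated vector-search engines (the kind of technology Algolia acquired) matter once the corpus gets large.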

a new technology just makes your digital product obsolete

I agree, but not when it’s your own partner vendor who is competing with you, and hence killing their own partner, by providing better technology for free while overcharging you for the old tech :)

I just completed a manuscript using fine-tuned GPT-3 models, and ChatGPT was released as I was wrapping up the last few chapters. The main disadvantage I’ve found with ChatGPT is the lack of hyperparameter controls, especially temperature, so a lot of the output tends to be pretty clinical unless you specifically ask for some affectation in the output. I imagine that as they roll out ChatGPT Pro there will be more granular control over the tone of the output.
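For anyone wondering why the temperature setting matters so much for tone: sampling temperature rescales the model’s token scores before sampling, so a low temperature concentrates probability on the single likeliest (most “clinical”) token, while a high temperature flattens the distribution and produces more varied prose. A minimal sketch of the standard softmax-with-temperature formula, using made-up logit values for three candidate tokens:

```python
import math

def softmax_with_temperature(logits, temperature):
    # p_i is proportional to exp(logit_i / T): T < 1 sharpens the
    # distribution, T > 1 flattens it.
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [2.0, 1.0, 0.1]  # hypothetical scores for three candidate tokens

cold = softmax_with_temperature(logits, 0.2)  # near-greedy sampling
hot = softmax_with_temperature(logits, 2.0)   # much more varied sampling
print(round(cold[0], 3), round(hot[0], 3))    # -> 0.993 0.502
```

At temperature 0.2 the top token gets ~99% of the probability mass, which is why temperature-locked output can feel flat; at 2.0 the three tokens are nearly interchangeable.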

I am very curious to see what the price points end up being for monthly usage. Using fine-tuned GPT-3 models is definitely spendy, especially for a large-scale project, so I’ll be watching with great interest to see how this all shakes out. I really do applaud OpenAI for this generous open beta they’ve made available to the world. I just hope there remains an affordable means of accessing it for everyone, since this technology is going to have large-scale repercussions on the world.

Given the disruption and potential creative revolution I predict will result from the masses having a taste of what’s possible from large language models and this calibre of AI, this technology is too significant a breakthrough to allow a handful of corporations to play gatekeeper over who does and doesn’t get access. Once they’ve figured out how to effectively commercialize ChatGPT, I’ll be paying attention to see the winners and losers in terms of pricing and accessibility. I believe it’s important for there to be a balance between profitability and accessibility, as this technology has the potential to greatly benefit society as a whole.

As for my own work, I will continue to experiment with and utilize AI-assisted writing in my projects, as I believe it has the potential to greatly enhance and augment human creativity. I am excited to see the evolution of this technology and the impact it will have on the writing and creative industries. I hope that by sharing my experiences and knowledge, I can help others navigate the use of AI in their own projects and contribute to the ongoing conversation about the ethical implications of this technology.
