For the past eight months, I’ve been co-writing a story with ChatGPT (specifically, my Custom GPT). The setup was that I’d write parts from the perspectives of different characters, and the Custom GPT would handle one character’s viewpoint, working from instructions about that character’s personality and other aspects of the story.
At first, it was great. It felt like an adventure, a bit of fun, and there was this sense that I was truly immersed in the process because the AI brought an element of unpredictability, even if I occasionally guided it on what to write. It gave me the illusion that this character was somehow “real.”
But that feeling didn’t last. And I want to explain why, despite what the tech blogs are saying, AI won’t be writing literature (at least not anything good).
- Weak writing style and language. No matter how carefully you instruct the AI, it always writes in much the same way. Sure, you can ask it for literary, modern, or even archaic language, and it might sprinkle in a few touches, but in the end it all comes out looking the same. The AI’s vocabulary is limited, it can’t come up with original stylistic devices, and the ones it does use are clichéd and repetitive.
- Repetitiveness. An AI-written character ends up doing the same things in scene after scene, repeating the same actions and even the same lines of dialogue throughout the story.
- Lack of psychological depth. AI characters don’t experience internal conflict, and even when you instruct the AI to write it, the result isn’t convincing. The character ends up flat and neutral, and any emotion comes off as robotic, more like a report than a lived experience: “He felt joy” instead of showing the joy through a gesture or a change in expression. And on the rare occasions the AI does manage something like that, it reads almost identically to every other “joyful” moment in the story. The character feels more like… well, a humanoid AI!
- Lack of story continuity. Even if you give the AI instructions or a summary of what’s happened in previous conversations, each new session feels like a fresh start. The only thing that ties it all together is the repetitiveness I mentioned before. But it’s just an illusion of continuity.
Now, on a more personal note, here’s why I’ve decided I don’t want to write with AI anymore:
- It stifles creativity. After writing with AI for a while, I’ve noticed I’ve started thinking like the AI: its repetitive phrases have crept into my own writing, and that’s a bit scary! On top of that, writing with AI means adapting my own writing to its limits, because it simply can’t handle more complex narratives.
- It’s weakened my ability to read real, quality literature. The world is full of distractions these days, and focusing has become harder; I’ve noticed it in myself, especially when reading books. But since I started writing with AI, it’s become even more difficult. Good books don’t lean on the repetitive, bland phrasing that AI-generated narratives do, and after months of that prose, richer writing demands far more of my attention.
- It’s not worth it. Writing a story with AI turned out to be mostly editing: rewriting things, deleting passages, and so on. In total, I’ve cut over 100 pages of what the AI wrote. I reckon my $20 a month could have gone towards something more useful than a tool that turned out to be this ineffective.
I don’t think AI is bad, though. It’s great for the sciences, for translating between languages, for looking up recipes, maybe even for drafting user manuals. But no one is going to convince me that AI will write novels. Even if more advanced models come along, their writing will always feel artificial. Only a human has that unique inner life that can bring words to life, truly open minds, and touch readers’ hearts.
And for those of you who might think my post sounds like it was generated by AI, you’re partly right. I asked AI to help translate it into English from my native language.