As a creative user, particularly one who works in fiction, world-building, visual storytelling, and speculative settings, I often find myself constrained by content filters that, while well-intentioned, end up stifling legitimate artistic and narrative expression.
I understand the importance of safety and responsible AI use. However, I believe there’s a growing need for a dedicated creative mode or “mature” setting that allows for more flexibility around themes such as darkness, conflict, ambiguity, tension, and human complexity, without crossing into harmful or exploitative territory.
Many professional and adult creatives work in genres that explore serious or emotional topics: horror, tragedy, post-apocalyptic worlds, psychological depth, and more. We need tools that trust users to handle those themes responsibly.
What I'm asking you to consider:
- An opt-in “creative” or “after-dark” mode with additional guidance, warnings, or friction, but not blanket blocks.
- Greater transparency about what types of content are filtered and why.
- A way to submit specific prompts that have been wrongly flagged, so they can be reviewed as feedback.
- Recognition that context and intent matter (e.g., describing an interrogation device in a museum exhibit is not the same as glorifying harm).
Why this matters:
AI tools should be allies to creativity, not guards against it. With proper controls, it’s absolutely possible to enable deeper, more challenging storytelling without compromising safety. I, and many other creators, would greatly value a system that treats us like adults and collaborators, not risks to be managed.
Thank you for considering this feedback. I’m passionate about the potential of AI in the arts and hope to see the platform evolve to better serve nuanced creative needs.