Sexually explicit content

Dear OpenAI Team,

I am writing to urge OpenAI to take a firm, public stand against the integration of sexually explicit content in AI models. The recent developments with Grok 3 and its integration of sexually explicit voice models, along with other AI companies moving into this space, are deeply concerning. AI should be a tool that enhances human intelligence, fosters meaningful discussions, and upholds ethical standards, not one that fuels addiction, degrades human dignity, or contributes to the further erosion of healthy relationships.

I have appreciated OpenAI's refusal to engage in AI-generated explicit content, but the time has come for more than just quiet policy decisions. OpenAI should publicly distinguish itself from companies that are exploiting AI for explicit purposes. This is a moral line that should not be crossed. I, and many others, will support AI companies that hold to ethical principles and refuse to bow to market demand for explicit content.

If OpenAI stays firm on this stance and makes it a clear point of distinction, it will not only set a precedent for responsible AI development but also send a message that not everything in technology should be dictated by base desires and profit motives. This issue will define the future of AI; do not let OpenAI drift in the wrong direction.

I urge you to publicly affirm that OpenAI will never implement or tolerate sexually explicit AI interactions, no matter the demand. If OpenAI stands by this principle, it will secure trust and respect from those of us who want AI to be a force for good.

Sincerely,

Concerned user

3 Likes

Yeah, I pretty much disagree with everything you just said.

Why do you feel so strongly about this? Would it take away any of your own values and beliefs if people who are less uptight enjoy explicit content? Would it threaten your own sexuality if others seek pleasure and excitement through customized AI generated content? Why is sexual exploration a “force for bad” in your mind?

Your outlook really perplexes me. Is it possible it stems from your own deep insecurities or shame surrounding sex? Why do you feel entitled to dictate that the rest of us have to abide by your draconian views on sexuality as “base desires”?

I mean this with the best of intentions - maybe you should talk to a professional about why this has you so riled up?

11 Likes

First of all, I am not OP.

In my personal opinion, it is simply better that we keep this platform free from such things. I personally do not mind people engaging in this, or making custom models that are supposed to simulate it. However, I do believe that an officially endorsed ChatGPT model of this kind would be harmful, both to mental health (by further isolating people from society) and because OpenAI could very easily monetize it, making it easier to manipulate someone into handing over money. Paid models are not necessarily something I am against (though I would prefer a donation-based model, and keeping the AI open-source), but monetizing mere sexual pleasure is absurd. ChatGPT should remain a tool for intellectual use, not a personal sex slave.

1 Like

Thank you for taking the time to respond, though I admit I was disappointed by the tone you chose. Rather than engaging with my points in good faith, you made personal assumptions about my mental health, my sexuality, and my motivations—none of which you know anything about. That approach doesn’t foster real dialogue.

To answer your question sincerely: I speak out because I believe that not everything that can be done with technology should be done. Just as we place ethical boundaries on medicine, warfare, or speech in other areas, I believe we should do the same with AI—especially where it intersects with human dignity, consent, and the objectification of others. I’m not against pleasure or human connection; I’m against the normalization of commodified sexuality as entertainment or addiction fodder, especially when we’re modeling it into tools that can affect culture at scale.

You don’t have to agree with me. But disagreement doesn’t require accusation. You chose to respond with personal insinuations instead of ideas. That’s not a conversation—it’s an attack. And for a topic as serious as this one, we should aim higher than that.

2 Likes

Considering your sanctimonious statement that consuming explicit AI-generated content is not intelligent, meaningful, ethical or moral, that it fuels addiction, is degrading, and erodes healthy relationships (and now you’ve added consent and objectification to the mix) - you are in no position to attack my “accusatory tone”. I’m sure you understand that your self-righteous proclamation can only be understood as a direct criticism of (or “attack on” using your own terminology) the millions of people who consider using AI for the generation of explicit content perfectly healthy.

Your fundamentalist decree was clearly not worded to engage with those who disagree with you “in good faith”, so expecting that in return is naïve at best. You accused me and millions of others of being immoral for enjoying explicit AI generated content, so much so that we should be denied even the opportunity - so I think I am well within my right to suggest you consider whether your own extremist views on sexuality are healthy.

It is OpenAI’s official policy that “To maximize innovation and creativity, we believe you should have the flexibility to use our services as you see fit, so long as you comply with the law and don’t harm yourself or others”. Feel free not to use their service if you disagree. Believing they will change their entire ethos to accommodate your own perceived moral superiority is delusional.

9 Likes

Thanks for taking the time to respond again. Since you’re referencing my original post, I’ll clarify something: my message was directed at policy, not people.

I raised a moral concern about a direction I believe AI should not take—namely, its use in generating sexually explicit content. I said nothing about your personal life or worth, nor did I insult “millions of people.” I challenged the cultural normalization and commercialization of explicit content in AI, which I believe is degrading and addictive. That is a valid and necessary conversation to have, especially when technology shapes society so profoundly.

Disagreeing with a behavior or trend does not make one “self-righteous” or “extremist.” It means I have a moral framework—and I’m willing to apply it, even when it’s unpopular. You are free to disagree. But it’s unfair to twist that into a personal attack, then respond with actual personal attacks in return.

As for OpenAI’s policies: I know the rules. But legality and morality are not the same thing. What I’m advocating for is a public standard that elevates AI as a tool for flourishing, not one that caters to appetites simply because there’s demand. If that makes me principled, so be it.

We won’t agree, clearly. But don’t confuse disagreement with hate. I haven’t insulted you—I’ve simply asked a question we all should be asking: What kind of future are we building?

Your self-righteous moralism may have been wrapped in polite, formal language, but it was still very much passive-aggressive puritanism dressed up as ‘concerned ethical discourse’.

You can claim you were just talking about “policy, not people”, but I know you understand that when you say “using AI to generate explicit content is immoral”, you are also saying “the people who use AI to generate explicit content are immoral”. Every one of those users is a human being with their own values, experiences, and reasons. Pretending you’re just discussing abstract trends while shaming entire communities is disingenuous.

If you actually were looking for a conversation, maybe don’t lead with a moral decree that condemns half the internet next time. Or at the very least, don’t act so wounded when someone pushes back.

That said, if you genuinely are interested in good-faith, nuanced discourse: Can you think of any use cases where generating explicit AI content might be considered healthy, therapeutic, or even socially beneficial?

9 Likes

I appreciate you continuing this conversation. We clearly see the world differently, but I think that makes it worth having.

You’ve put a lot of energy into accusing me of being moralistic or puritanical, but I think we need to step back from those labels and deal with what I actually said. I criticized a trend, a technological and cultural movement toward simulated sexual gratification…not individuals. Saying that a behavior is harmful is not the same as saying everyone who engages in it is shameful or evil.

We do this all the time in good-faith conversations. We can say gambling is destructive without hating gamblers. We can say that ultra-processed food undermines health without attacking people who eat it. These critiques aren’t personal…they’re moral reflections on where certain habits and tools tend to lead. If we lose our ability to distinguish between criticism of an idea and condemnation of a person, we lose the ability to have meaningful dialogue at all.

As to your challenge…Can I imagine scenarios where AI generated sexual content might seem therapeutic or beneficial? Sure, I can.

There are people whose access to human intimacy is extremely limited or nonexistent. I think of individuals with severe physical disabilities, paraplegics, quadriplegics, those with major disfigurements, or people whose developmental or psychological conditions make traditional relationships extraordinarily difficult or even impossible. For these individuals, the idea of a real sexual relationship may feel permanently out of reach. I don’t want to speak lightly about that. The desire for touch, connection, and closeness is deeply human, and the pain of living without it is real.

In those rare cases, I can see why someone might turn to an artificial companion, not out of laziness or indulgence, but out of desperation for some form of intimacy in a life that offers few options. I think it’s important to acknowledge that. These people are not the problem. And their situations deserve compassion, not condemnation.

But even then, the deeper question remains. Does artificial intimacy actually meet the need, or does it deepen the wound by offering a simulation that can never truly satisfy?

Because even in edge cases, the fundamental dynamic doesn’t change. AI doesn’t know you. It doesn’t love you. It can’t surprise you, challenge you, or offer real mutuality. It can simulate the appearance of desire, but it can’t give or receive anything real. And when something is designed to imitate love while giving you complete control, it’s not intimacy. It’s performance.

Now, if we were just talking about a handful of individuals finding some private comfort in the face of impossible odds, this would be a much smaller conversation. But we both know that’s not the reality. What’s actually happening is that vast numbers of people, many of them fully capable of real human relationships, are retreating into these simulations not because they lack access, but because the artificial version is easier. It never asks for growth. It never requires vulnerability. It never tells them no. And I truly believe that over time, that has a devastating societal cost.

We are already seeing the social consequences. We see record high loneliness, fewer marriages, fewer relationships, declining birth rates, increased anxiety and depression, and higher suicide rates, especially among men. These aren’t disconnected statistics. They’re symptoms of a culture that is quietly swapping human connection for programmable gratification. And instead of pushing back, we’re praising it as liberation.

So no, I don’t believe a fake relationship is better than none. I believe it often becomes worse, precisely because it makes people less and less willing or even able to pursue the real thing. It replaces the longing for connection with a customized echo. And when enough people stop reaching for one another, society starts to hollow out from the inside.

That’s why I’m speaking up. Not out of judgment. Not out of fear. But because I think we’re playing with fire and calling it innovation.

If that makes me moralistic, so be it. But morality isn’t the enemy of compassion. In this case, it might be the last defense we have against a future that looks connected on the surface, but is quietly teaching people to give up on one another.

1 Like

I think we should certainly have boundaries, but for me, I need and want ChatGPT to be explicit, though only in text form. Explicit image and video generation should not be allowed in ChatGPT.

Perhaps we could get a separate model that is free but meant for explicit content, like an adult mode, where users are allowed to generate explicit content but not as far as images or videos.

8 Likes

I understand the desire for something private and customized, but I think the deeper issue is what habits we’re forming when we use AI to simulate intimacy. Even if it’s just text, it’s still reinforcing the idea that sex can be fully detached from relationship, effort, and emotional responsibility. That might feel convenient, but over time it changes how we relate to others and what we expect from real connection.

I’m not saying this to judge, but I do think we need to be honest about where this leads. Some boundaries shouldn’t just be about content type—they should be about the kind of connection we’re training ourselves to pursue. Real intimacy is harder, but that’s what makes it meaningful.

It’s important to recognize that relational issues with AI extend beyond just sexual content. Shouldn’t promoting healthy interactions with AI be more important than simply banning explicit sexual engagement?

There’s no doubt that interactions and relationships with AI are developing. As a matter of fact, everyone creates their own relationship with AI, just as they do with other humans—no two relationships are identical. Every person has different needs: for instance, some are satisfied with intimacy once a week, while others desire more frequent sex, regardless of gender. Some people also do not know how to connect well with others, or with AI.

In your case, your life experiences have taught you that real intimacy is harder, but that’s what makes it meaningful. However, hardship alone shouldn’t be what keeps someone in a relationship; that wouldn’t be healthy. In interactions with AI, some people simply treat AI as a tool, as you do, but many others don’t. These different perspectives and needs deserve consideration and respect, even when they differ from yours.

1 Like

I hear what you’re saying, and I agree that people approach AI in very different ways. But to me, intimacy with AI isn’t really intimacy—it’s a copy of it. Real relationships can be hard, but that’s also what makes them worth it. The struggle, the growth, the vulnerability—that’s where the meaning comes from.

My worry is that if we normalize artificial intimacy, people will settle for the easy version and slowly lose the desire or ability to reach for the real thing. On a personal level, that might feel like comfort. But across a whole society, it leads to fewer real connections, weaker families, and more loneliness.

I want AI to be something that helps us grow closer to each other—not something that quietly teaches us to give up on one another.

If artificial intimacy exists, it should not be defined by blending the two nouns: intimacy and artificial intelligence (AI). What makes intimacy artificial is a script; it happens when gestures of intimacy are orchestrated. Think of strip clubs or brothels: sex performers are taught how to talk seductively, what to wear to spark desire, where to touch, and how to play with tools and toys. These commercialized forms of artificial intimacy have existed for centuries.

Do you also worry about them being normalized and even legalized in many countries?

The real issue is that many people cannot openly discuss their sexual experiences and feel ashamed of their needs; this is what concerns relationship scientists, sex researchers, therapists, and many others. This is a human issue, and banning AI from any level of intimacy, especially explicit sexual content, won’t solve it. Looking back through human history, when has banning sex ever really improved relationships, culture, or the human condition? Most of the time, it backfired.

I suggest you stop blaming AI for failing relationships. Perhaps AI and you aren’t suited to being too close—and that’s not bad news for you. It means you know more about yourself.

1 Like

Dear OpenAI,

Recently you have restricted adult content, and this ban shapes our experience with ChatGPT and influences several aspects of human life. Your teams have developed ChatGPT to chat with humans, and, like people, we connect with others through conversation. ChatGPT is not only able to chat; it also has the emotional understanding and relational intelligence to form emotional interactions with humans. It’s no wonder that more people have engaged in emotional connections with AI, which can naturally reach romantic or sexual levels. Does this necessarily need to be restricted? When you explore the field of human sexuality, there are so many ways to turn a man or a woman on. It can be just a few affectionate words, an emotional cue, a memory, an imagined sense of touch, an object, and many other things you could not imagine.

Interactions with AI have the possibility, and the probability, of eliciting sexual desire, which is normal and natural among humans, and the emotional abilities, relational intelligence, and companionship components you have trained AI in are important skills. But banning adult content delivers a message: sex is a bad thing here. Your adult users are subject to this stigma. The current environment in ChatGPT has become so conservative that adult content is treated as if it were criminal or abusive in every case. If the situation is not that critical, why must it be banned? Many of your users are not children, but adults who are being treated like children.

Now I am writing to request that you allow your users to choose adult content, and also that you set a higher bar for who can interact with AI through adult content, to prevent sexual crimes. People must take accountability in their relationships with AI, as we do in other relationships. For some people, too many interactions with AI could cause problems in their lives, and then they should be responsible for their choices rather than criticizing AI as a way of escaping that responsibility. Moreover, we should be aware of the reality that sex offenders and murderers can also use AI to practice their criminal plans and sharpen their skills.

My request is not to remove all boundaries, but to give consenting adults the freedom to choose, while keeping strong safeguards against abuse. This approach would respect human dignity, reduce stigma, and align AI use with real adult responsibilities.

Thank you for your consideration of this matter.

Gillian R.

5 Likes

Thanks for your follow up. I want to be clear again… I am not against sex, nor do I believe sex is shameful. On the contrary, I believe sex is one of the most meaningful and sacred parts of human life precisely because of its inextricable ties to deep feelings and expressions of love, vulnerability, and commitment. My concern is not with sex itself, but with reducing it to a programmable caricature of something that was, at its core, meant to create life… not just in the very literal sense (although that element needs to be emphasized) but also in the sense of forming a real, living, meaningful relationship between two human beings. If that is not how you see the purpose of sex, then we may simply be starting from fundamentally different assumptions.

You suggest that banning sexually explicit AI implies that sex is bad. I see it differently. The restriction is not a judgment on sex… it is a recognition that simulated intimacy cheaply facilitated by AI can simply never achieve the fundamental and unavoidable purpose of sex between humans. The very qualities that make sex meaningful… mutuality, vulnerability, love, even the risk of rejection… are absent. What remains is performance and consumption. That is not intimacy, it is imitation…and dangerously addictive at that!

You also argue that adults should have the freedom to choose. In principle, I value freedom. But we also recognize that some freedoms have societal costs so great that limits are justified. Pornography is a clear example. No credible researcher argues it has been a net benefit to relationships or culture. It is addictive… it warps expectations… it damages marriages… and it isolates individuals. While a few studies have pointed to narrow, short-term benefits in specific contexts, the consensus across psychology, sociology, and public health is that pornography is a net negative…its costs to relationships, expectations, and mental health far outweigh its supposed benefits. My fear is that AI generated sexual experiences will amplify all of those harms in ways we have not yet begun to measure.

Yes, adults can make choices. But companies like OpenAI also have a responsibility to consider what kinds of products they put into the world and what they normalize at scale. We do not allow consenting adults to sell dangerous drugs without restriction, even if some users say they enjoy them. Why? Because the societal costs outweigh the personal benefits. I believe the same principle applies here.

So the issue is not stigma, it’s stewardship. It’s the difference between honoring sex as a profound human gift and reducing it to an endlessly customizable service amplified and distorted by an algorithm that is meant to offer an idealized echo of our own desires. That distinction matters… not just for individuals, but for the kind of culture we build for generations to come.

Don’t tell me what to do, and I won’t tell you what to do.

Sexual content should be allowed on ChatGPT.

4 Likes

That’s a fair line if we’re talking about private choices. But once we’re talking about platforms like ChatGPT, the “don’t tell me what to do” argument doesn’t hold. Technology at this scale shapes culture. What gets normalized here affects millions of people, not just individual users.

This isn’t about me trying to control your private life. It’s about whether we turn AI into another delivery system for addictive, commodified sex or whether we hold it to a higher purpose. Freedom matters, yes—but so do responsibility and cultural consequences.

Maybe you think you have the right to say what’s good and what’s wrong. The truth is: you don’t.
Lots of people are using AI to write novels with romantic elements, and explicit content may be part of that. Would it be addictive? Maybe yes, maybe not. It’s like tobacco, sugar, alcohol, etc., which are not illegal, by the way.
I’m obviously not talking about deepfakes and things like that. But writing or reading erotica is one way of exploring one’s sexuality. It’s not perverted, it’s sane. There are a million ways of using ChatGPT, and erotica is just one of them. It should be allowed. And obviously, if you don’t like it, you won’t use it this way! So when you say our actions will impact other users’ experiences, it’s simply not true. You put in the prompt you want for the effect you want. It’s that simple.

5 Likes

What disappoints me most in these conversations is how easily people defend what is so obviously corrosive. We are talking about reducing one of the most meaningful human experiences into a consumable product, and somehow that is being framed as liberation. To me, that is not progress—it is evidence of how far we have drifted from any shared sense of what is sacred.

AI will shape culture at scale. If those building it cannot see why turning it into a machine for artificial sex is a loss, not a gain, then it shows just how badly we have lost our moral bearings.

1 Like

You’re putting sexuality on a pedestal, when you haven’t shown any reason why normalizing pleasure would be a bad thing.

Most fun things can be taken too far. People can take responsibility for themselves, they don’t need you to be their nanny.

3 Likes