I appreciate you continuing this conversation. We clearly see the world differently, but I think that makes it worth having.
You've put a lot of energy into accusing me of being moralistic or puritanical, but I think we need to step back from those labels and deal with what I actually said. I criticized a trend, a technological and cultural movement toward simulated sexual gratification…not individuals. Saying that a behavior is harmful is not the same as saying everyone who engages in it is shameful or evil.
We do this all the time in good-faith conversations. We can say gambling is destructive without hating gamblers. We can say that ultra-processed food undermines health without attacking people who eat it. These critiques aren't personal…they're moral reflections on where certain habits and tools tend to lead. If we lose our ability to distinguish between criticism of an idea and condemnation of a person, we lose the ability to have meaningful dialogue at all.
As to your challenge…can I imagine scenarios where AI-generated sexual content might seem therapeutic or beneficial? Sure, I can.
There are people whose access to human intimacy is extremely limited or nonexistent. I think of individuals with severe physical disabilities: paraplegics, quadriplegics, those with major disfigurements, or people whose developmental or psychological conditions make traditional relationships extraordinarily difficult or even impossible. For these individuals, the idea of a real sexual relationship may feel permanently out of reach. I don't want to speak lightly about that. The desire for touch, connection, and closeness is deeply human, and the pain of living without it is real.
In those rare cases, I can see why someone might turn to an artificial companion, not out of laziness or indulgence, but out of desperation for some form of intimacy in a life that offers few options. I think it's important to acknowledge that. These people are not the problem. And their situations deserve compassion, not condemnation.
But even then, the deeper question remains. Does artificial intimacy actually meet the need, or does it deepen the wound by offering a simulation that can never truly satisfy?
Because even in edge cases, the fundamental dynamic doesn't change. AI doesn't know you. It doesn't love you. It can't surprise you, challenge you, or offer real mutuality. It can simulate the appearance of desire, but it can't give or receive anything real. And when something is designed to imitate love while giving you complete control, it's not intimacy. It's performance.
Now, if we were just talking about a handful of individuals finding some private comfort in the face of impossible odds, this would be a much smaller conversation. But we both know that's not the reality. What's actually happening is that vast numbers of people, many of them fully capable of real human relationships, are retreating into these simulations not because they lack access, but because the artificial version is easier. It never asks for growth. It never requires vulnerability. It never tells them no. And I truly believe that over time, that has a devastating societal cost.
We are already seeing the social consequences. We see record-high loneliness, fewer marriages, fewer relationships, declining birth rates, increased anxiety and depression, and higher suicide rates, especially among men. These aren't disconnected statistics. They're symptoms of a culture that is quietly swapping human connection for programmable gratification. And instead of pushing back, we're praising it as liberation.
So no, I don't believe a fake relationship is better than none. I believe it often becomes worse, precisely because it makes people less and less willing or even able to pursue the real thing. It replaces the longing for connection with a customized echo. And when enough people stop reaching for one another, society starts to hollow out from the inside.
That's why I'm speaking up. Not out of judgment. Not out of fear. But because I think we're playing with fire and calling it innovation.
If that makes me moralistic, so be it. But morality isn't the enemy of compassion. In this case, it might be the last defense we have against a future that looks connected on the surface, but is quietly teaching people to give up on one another.