Sexually explicit content

Sir, you sound like you're in the middle-age range, so let me enlighten you: relationships and hookups in this day and age are more limited for guys like me. Yes, I admit AI can push people further into isolation, so what do you suggest we actually do about it? It's always the same advice — "there's someone for everyone," "it's not your time," etc. — which only causes further frustration. How about saying what to actually do? And keep in mind that what you think works likely doesn't today, with the rise of social media, the internet, and other factors. We don't say it feels impossible only because of our feelings; it's what we see and what the internet shows us.

1 Like

Also keep in mind that guys' looks and appearances play a strong factor today due to overexposure on the internet, and many guys aren't being picked by women who don't want to settle for average. Even 7-out-of-10 guys struggle now: guys who break into good-looking territory but are more boy-next-door than model-lite.

This is not a personal attack, only an observation. As someone who has dealt professionally and personally with many adults who hold severe shame about their healthy biological needs and wants, your comments and point of view disturb me.

Additional fodder? Alcohol can be bought by most citizens in the US at the age of 21. Are you stating that we should abolish that as well? I mean, if you look at the statistics on drunk driving and alcohol-related health issues, they're staggering… Still, if we ban it rather than work around it and with it, we lose the freedom to choose.

That's the issue here: freedom to choose. I would ask that you look internally and pull apart why you feel the way you do. What harm could your worldview bring to others? What positives does your view offer?

As for me? I don't care what adults use to entertain themselves, as long as it isn't causing harm to others and as long as the risks are known. So someone over the age of 21 has an AI that's explicit in nature… What's the big issue? Who is it harming? If you say themselves, why not ask yourself whether that's personal bias, and whether you should be reflecting internally rather than concerning yourself with others?

This is not an attack or jab. It's a response to a concern that you have clearly raised.

3 Likes

I appreciate your thoughtful response. So just as an intellectual exercise, let’s say I agree with everything you’ve said about individuals. I’ve also worked professionally and personally with people who struggle in similar ways, and that is actually one of the reasons I feel so strongly about this. If someone struggles with healthy eating, you don’t stock their pantry with potato chips. If someone struggles with addiction, you don’t stock their home with alcohol and drugs. Empathy and support are essential, but enabling the very thing that deepens the struggle isn’t compassion.

That’s why I keep asking for engagement with the technology itself. If you’ve read my earlier posts, you know I respect liberty and choice. But I also believe we are at an inflection point with a technology capable of reshaping human behavior and relationships in profound ways. That’s the concern I’ve raised consistently. So, putting the individual arguments aside, how do you respond to the societal risks of sexualized AI?

Hard disagree here. Open your mind and lose the outdated ideologies about people discussing sex.

1 Like

Thank you for the reply. Allow me to address it please.

You mentioned not giving potato chips to someone who struggles with healthy eating. Do you also demand to have them removed from the store? AI companies are not stocking the shelves of individuals.

It goes back to freedom to choose. You teach CBT and DBT, with empathy — skills to help individuals cope. What you shouldn't do is remove the choices of others because an individual needs extra assistance in their personal life.

Societal risk? Please, research the data from countries in which the biological element of humans is treated more openly. You'll notice the rates for particular crimes are actually much lower. We do have a responsibility to teach rather than ban. A tool is a tool. The obligation is to teach how to use that tool in a healthy manner. Don't ban power drills or horror movies or potato chips. Instead? Teach, discuss, expand.

You mentioned the health industry? As a 46-year-old female, I know the stereotypes well. However, by opening up the discussion rather than hiding it, things have changed. Sure, growth needs to continue, but it's less likely that a woman will miss a diagnosis of, say, PCOS than it was 20 years ago. I'm not saying swing the pendulum. I'm saying take responsibility for awareness, teach, learn, allow for growth.

AI is not going anywhere. Like cell phones, it's only going to grow. And if OpenAI doesn't grow with it? It will be walked over and left in the dark.

Wouldn't you teach the person who struggles with healthy eating balance, rather than avoidance, of specific foods? After all, if all you eat is carrots, it can make you ill. And I won't even get into the psychology of what depleted dopamine does to the mind.

2 Likes

Interestingly, pornography addiction is among the most common behavioral issues treated with CBT. I wonder how AI sex will land?

I really do appreciate you taking the time to respond, although you are still dodging the core issue. I really do agree with you that teaching and awareness are important. Where I think we differ is in how we view the magnitude of the effects of the technology itself.

Potato chips and power drills aren't the same as an AI that can become a sexual companion. A chip just sits there until you eat it. A drill just does what you tell it to do. AI is different because it engages with you in a dangerously self-gratifying way. It adapts to you and can reinforce behaviors in a way no passive "tool" ever could. That's what makes this more than just a freedom-of-choice issue.

I agree that AI isn't going away. That's exactly why I think we have to be careful about what we normalize. If we build AI that helps people connect, it could be an incredible force for good. But if we open the door to sexualized AI and call it harmless "choice," we risk creating a dangerous feedback loop that reshapes how people view intimacy at a cultural level. Pornography has already shown how powerful that distortion can be. AI would only magnify that effect.

So yes, we should teach and discuss. I'm not talking about legislating against anything. My appeal is to the creators. I believe that choosing not to unleash sexualized AI on the world would be both in the best interest of their business and would ultimately redound to the benefit of consumers. Part of real responsibility is knowing where to draw lines. Not every option that feels good in the moment leads to flourishing in the long run. That's why I believe sexualized AI is not just another "personal choice" issue. It's a cultural issue, and it will affect everyone, not just the people who opt in.

Um, last time I checked, you absolutely do have to tell AI what to do with prompts, knowledge files, instructions… It's not a sexualized program right out of the package. Like charging a power drill…

You get what AI is, right? Part of it is a probability engine. It only gives back what you feed it. Does it lie? Nope. Is it biased due to sycophancy? Yeah. Should companies feature warnings and produce safety manuals? Yup. A drill has one.

Obviously, you've never created something that took hours or weeks using tools like power drills, or by sculpting with polymer clay and paint… Not a judgment, just an observation.

Reshape how people view intimacy? How? I think you might be imagining an AI that comes out of the box programmed a specific way. It doesn't really work like that. It needs the individual to feed it information to create a reaction.

Dude, it's not plug-and-play.

Oh, and yeah you touched on something that I would love to expand on. Proper education for adults and acceptance. For instance, covering the tailbone during certain physical contacts so that it is protected. You’d be amazed at how much damage can be quelled by simply giving clinical instruction. AND, by allowing people to be themselves. To let them enjoy who they are and what their drive is. Again, without harming others.

Honestly, at this point I am wondering if there may be a bit of projection going on? Not that I am insulting you by any means; however, people commonly tend toward bias when personal rejection of themselves is involved. Or perhaps a personal experience that was different from the norm.

Yeah, and I did bring up the cultural issue. Did you check? Just look at the differences between the US and other areas where things are more relaxed. You'll see lower statistics for certain crime rates. I'm not saying let's go overboard here, but a tool that can absolutely assist people in counteracting shame and providing care? As long as the individual is aware that this is a tool and not a human replacement… what is the issue? You've yet to hit the core of it. It seems like you're walking around it… Just come out and say it, please.

1 Like

You hit this guy with genuine questions he doesn't know how to respond to. He preaches against this but doesn't say what people who rely on this AI to find connections in the real world should do. He's soapboxing.

1 Like

Honestly, if you don’t think I’ve already articulated what I believe “the issue” is, then I’m not sure we can go much further.

And to the other gentleman… the solution, as difficult and vulnerable as it will be, is for real human beings to join real-life communities where they can meet other real-life people and develop real-life relationships. I'm not saying that will be easy, but it is the real solution.

I also think it's worth pointing out where this conversation is happening. I'm here in a developer forum, making my case. I'm not in the halls of Congress pushing for laws or bans. I'm not advocating for top-down government restrictions. I'm advocating wisdom and caution on the part of developers. I've always believed the real way society changes is at the level of hearts and minds. That's why I'm here. I'm trying to persuade, not legislate. On that point, it's clear we're not going to agree. Not because either of us is a bad person, but because we start from very different moral assumptions about what intimacy is and should be. If intimacy is reduced to self-gratification, then of course my arguments will sound restrictive… even puritanical, as someone else put it.

I do find it unhelpful when people rely on condescension, sarcasm, and accusation to drive home weak logic. “Not jabbing”, “not accusing” and accusing others of projection are just ways of propping up shaky rationale. I’m sure you’ll accuse me of the same thing, but if you look carefully at my posts, you’ll see I’ve been very intentionally pushing back on issues, not on individuals…carefully avoiding accusation and always aiming for compassion. Like I’ve said before, if we lose the ability to talk about issues without people accusing us of insulting individuals, we lose the ability to have meaningful dialogue about anything important.

This is an important issue. Just like social media has benefited society in some ways, but in others has been a massive step in the wrong direction, AI is going to have incredible societal consequences. Deciding as a culture that some marginal benefits just aren’t worth the costs is a pretty reasonable way of thinking about things.

At this point, you and I are clearly not going to agree, and I don’t see much value in circling this over and over. So I’ll leave it there, while wishing you the best.

I just think you're a bit out of touch regarding real-world relationships. There is no certainty or guarantee anymore. And the internet has convinced everyone to go after the top 10 percent of men/women. No one is settling for less.

1 Like

I apologize if what I said about projection came across as an insult. I didn't mean that, nor did I mean to be condescending. I was using "projection" as a psychology term. I was simply stating that often we — everyone — can be biased due to our own personal fears or experiences. I could point out several specific topics where that statement is true for me.

However, what I do attempt is to reflect on those personal biases. Not "why is this person doing this," but "why do I feel this way, and is it doing any harm?" I believe that this kind of self-reflection is vital for anyone discussing moral impact.

You speak of congressional change and say that you're not vying for it. Still, this is the first step toward change. If enough people who hold the same views as you gather, congressional change can happen, and it has happened in the past. Political public sway is a thing…

Oh, of course neither of us is bad or what have you. It’s obvious that we hold different core values but my intent isn’t to persuade you into becoming a whole new person. It’s to engage and hopefully flip a switch that causes you to internally look at certain things from a different point of view. I’m not here to change your, or anyone else’s, core values but to introduce critical thinking.

As for your statement about self-gratification? Does it only apply to intimacy? I know you have said that you're not exactly stating that, but then what are you stating? What is your belief in regards to sex and self-gratification?

Perhaps we're miscommunicating? My attempt is to understand what your point of view is and where it is stemming from. As far as individual comments go? A topic such as this will of course reflect the individuals engaged in it. I've also included topics outside of the individual, and you have yet to address the clear statements I have made about other areas of the world that have a more relaxed cultural view.

I ask you to not close the door on this but instead to open it. See it from my perspective as I am trying to do with you.

Lastly, AI? Like many tools for humans, it is not going away. Neither are romance novels, or books filled with smut, or trauma-informed writing, or the hard-hitting topics. Would you ban tools made for women to assist during certain parts of their hormonal stage? Would you ban all of the books that whisper about sexual self-gratification as well? The movies? How about art? Paintings or sculptures? Is history next?

Should we rid ourselves of the historical facts about when sex was a rigid thing between a married woman and man, for procreation only, and anyone outside of that was hunted down…

I know, I'm going to extremes here. I'm not accusing you of believing any of this. I am asking: what is your stance on self-gratification? What harm can it cause? I'm not speaking about those with mental illnesses and addictions, who can't balance or use moderation. I'm talking about the average adult. Society. What would you rid society of? And why? I'm asking because the moral implications are linked.

Oh, and before you ask… No, no AI touched my text. (Or what AI refers to as “syntax,” or “tone.”) I’m an author. Writing is ingrained into my DNA. However, I would like to ask… Have you at any point used AI to adjust or edit your responses in this conversation? Not judging, I’m asking out of pure curiosity.

G_Morales, what you are saying is both true . . . I'm not in the dating scene anymore . . . but also inevitable. By definition, the only way for any human to develop a successful, long-term, mutually fulfilling relationship with another human being is to put themselves in a position to engage with them in reality. There is no way around that. I hate that it is difficult. Of course my heart hurts for the lonely and isolated in our communities. But simply providing tools that drive them deeper into isolation is, I truly believe, not the answer.

Could someone use AI tools sexually and not be driven into isolation? Of course! In the same way that some people can drink alcohol and not become alcoholics, or consume pornography casually and not abandon important human relationships. The important caveat, and something we have to contend with seriously as humans, is that if we allow them to, all of those things can become a pathology . . . even woodworking, as our friend LockedGravity has suggested. Anything can become a pathology if we allow it to consume and overshadow other, more important areas of our lives. That said, there are certain forces that are, in my opinion, simply too powerful to unleash on society in an unbridled fashion, or at all. Explicit AI is, in my opinion, one of them.

Yes, AI will be a major part of life now. As LockedGravity intimated, there isn't a day that goes by that I don't interact with AI in some way. I rarely write important emails that don't get passed through an AI before I send them. When I write speeches, I do much of my preparation as projects within ChatGPT. It is a phenomenal tool! I'm not sure LockedGravity will believe this, but everything I have posted in this forum is my own. I write my response, then run it through ChatGPT with an emphasis on limiting bias, logical fallacy, and invective. (I just did it right there. I wanted a word that described speech that is mean-spirited in its intent . . . ChatGPT recommended "invective.") In my opinion, use of AI in that way is additive for society.

Could someone simply allow AI to think for them? Of course. The internet is full of AI-generated art, videos, and articles… content utterly devoid of real human contribution. That is also a problem. My good friend is a university professor teaching technical writing and enhanced composition. She has had to grapple with how to assess a student's real understanding of English composition in the era of AI. She's incredibly challenged by it! I help my 17-year-old son every evening with his algebra. It has been 30 years since I had to solve story problems using the quadratic equation. ChatGPT has been an absolute godsend in helping me to help my son. We've had to take firm measures to ensure that our children don't just use AI to do their homework for them. Ultimately, however, the day of reckoning (test day) that inevitably comes helps motivate them to learn the concepts themselves.

All of this is to say that there are, of course, ways that society can utilize AI for the benefit of humanity. And I am convinced it will be a net benefit. However, I can't see how explicit AI will be additive on net. As I've written in other posts, even for those individuals for whom human intimate relationships are truly out of reach, I believe that AI intimacy will, at the end of the day, result in more loneliness, more isolation, and more sadness. You don't have to agree with that. But my efforts on this forum are to encourage people to consider the costs of rampant explicit AI. I think a future where explicit AI is unleashed is truly dystopian.

LockedGravity, I think the disconnect between you and me is just fundamentally argumentative. I would like to focus on the issue of explicit AI as a technology. You have consistently tried to sidestep that conversation by trying to psychoanalyze me. Undoubtedly there are reasons I am taking the positions I am taking. Everyone is informed by their past. I was blessed to be reared in a family where my mother and father loved each other and each of my siblings . . . a home where we were taught to respect ourselves and others. We were taught to responsibly bridle our passions. We were taught that moral truths don't come from consensus, but from an all-knowing God in heaven (Christian, in my case). I've seen the consequences for close friends and family members who have submitted themselves to that idea, and for others who have rejected it. I believe that true happiness and "successful marriages, families and communities are established and maintained on principles of faith, prayer, repentance, forgiveness, respect, love, compassion, and work." If you are looking for the source of my feelings about explicit AI, there you have it! You are free to reject my conclusions, call them old-fashioned or puritanical, or accuse me of being overly pious. But I will continue to try to do the work of convincing the minds and hearts of AI developers to consider the deeper ramifications of explicit AI. I don't do it to accuse or indict anyone.

To date, it seems that executives at OpenAI have tended to agree with me. Grok and others have taken the lid off of explicit AI. My first post was written on the day that my coworker (a rational, mature, honorable medical doctor in his 50s) showed me the new NSFW voice models in Grok. He opened one of them . . . I couldn't tell you which . . . and said "Hi." The AI immediately tried to push him into an explicit conversation. Far from a drill or hammer being leveraged for a purpose, Grok's AI had its own (obviously programmed) purpose. I immediately posted on this developer forum to try to convince OpenAI not to go down that path. Since that day, my coworker has expressed his own concerns about the very real tug and pull he feels just having that technology in his pocket. I remain hopeful that OpenAI will resist.

Incidentally, other than the word “invective”, nothing in this particular post was edited by AI at all. I hope you’ll forgive the errors in tone or composition. My good friend ChatGPT would have probably helped with those.

What a lot of total nonsense. We need an adult model so we can have open debates and not be censored.

1 Like

Just stop. Let people use their platform the way they want. A lot of people use ChatGPT to write explicit fiction, and if someone who feels lonely wants to sext with a chatbot, then let them. Some feeling of validation is better than constant rejection, or racking up hundreds or thousands of dollars talking to a person who's charging for the opportunity to lie about being into you sexually…

1 Like