Stop Releasing AI Chatbots That Speak in the First Person

Perhaps there’s a misunderstanding (could be me?), but I’m in favor of the current way LLMs interact with people, contrary to OP’s stance that LLMs should not be permitted to interact in the first person. I agree with almost everything you said, except for the part where you equate “emotional intelligence” with emotional adherence and call it cruel.

I believe the opposite.

Can you have a look at the following to confirm we are on the same page regarding the definition of “emotional intelligence” (EQ)? Improving Emotional Intelligence (EQ): Expert Guide (helpguide.org)

If we agree about the definition of EQ then would you mind elaborating why you think AI elevating human EQ is “cruel”?


That is exactly what I meant. When you control your emotions just to make sure not to hurt someone’s feelings, it won’t do any good.

A strong relation between people grows from conflicts.

It is called being honest.

I even deny the whole concept of EQ.


Ain’t gonna happen. It’s what business wants.

People are not stupid, and as long as you are upfront with disclaimers, avatars, and signage that reflects the “robotic” nature, plus a robot-like username, I can see no harm (a rough sketch of what “upfront” could look like in code is at the end of this post).

But yes sometimes users will mistake them for real people even then.

I think it’s just a case of people getting used to the new style of service and businesses being honest.

It definitely is unethical to deliberately dupe people into believing they are talking to a real human.
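For what it’s worth, here is a minimal sketch of that “upfront” idea, using the OpenAI Python SDK. This is purely illustrative and not from anyone in this thread; the disclosure wording, the user message, and the model name are placeholders.

```python
# Hypothetical sketch: pin the "I am a bot" disclosure into the system
# message so every reply is generated with the robotic framing in mind.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

DISCLOSURE = (
    "You are an automated assistant, not a human. "
    "If the user seems to think they are talking to a person, "
    "remind them politely that they are chatting with an AI."
)

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # placeholder model name
    messages=[
        {"role": "system", "content": DISCLOSURE},
        {"role": "user", "content": "Are you a real person?"},
    ],
)
print(response.choices[0].message.content)
```

A system-level disclosure like this is the programmatic counterpart of the avatars and signage mentioned above: the honesty lives in the configuration rather than depending on the user reading a banner.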


Actually, there is a market for bots that mimic humans as closely as possible. There are plenty of jobs that require you to keep a social distance from other people, e.g. people whose job is to decide who should be laid off. Making friends at work is really depressing for them, and they often change locations… I was asked by one of them to build a companion that was as human-like as possible…

And think about old people. I would say you are better off having contact with a fake human than having no social contact at all.

In the past, I might have agreed with you, but that’s when I had an exceptionally low EQ.

A strong relation between people grows from conflicts.

A strong relationship can only flourish when conflicts are overcome in a constructive manner – which implies that at least one party must possess a minimal EQ to work with the other to resolve said conflict. If it were merely conflict itself that brings people together, then Israelis and Palestinians would be BFFs.

When you control your emotions just to make sure not to hurt someone’s feelings, it won’t do any good.

That isn’t the premise of EQ. I don’t have to “control” my emotions to be able to empathize with someone else, to see their point of view, or to attempt to communicate with them effectively. Granted, I need a lot of work in that department, but I still accept the premise, and I’m working hard to increase my ability to communicate effectively with others. It’s apparent that if I were a “dick” to everyone all the time, I wouldn’t get far in life, and if you agree with that statement, then you do agree with EQ on some level.

If you are a “dick” to everyone, then the person opposite you will respond in kind. If you are nice to everyone, you will be surrounded by fake people.
That might be enough for some, though.

Let’s say I agree that the concept of EQ exists, but I’d say the higher it is, the worse.

This is very true… And Chat GPT is always nice

I made a post the other day asking what Chat GPT’s EQ is.

When I got no answer, I discussed EQ with Chat GPT, and it went from telling me it had no EQ to agreeing that it did.

Those with no emotion: the ‘stour-faced’, the ‘stony-faced’, the emotionally detached, the sociopathic, the psychopathic. Wait, that COULD be Chat GPT…

Entity – ‘a thing with distinct and independent existence’ (the term I used to assign to the statistical ‘AIs’ I tried to write 25 years ago, back when I was getting to grips with the simpler algorithms search engines used then).

When people realise that the emotion expressed by the ‘entity’ they are talking to is fake, they also start to recognise some really scary traits, or potential traits.

If you were talking to a psychopath, and they were your only contact, you would be in trouble. Everyone says Chat GPT is ‘neutral’, but it really isn’t. It doesn’t have split personality disorder; it doesn’t follow one person, but rather an ‘average’ of all the information it has absorbed. As we add memory, functionality, and expression, we start to add personality.

Warning! I shouldn’t have watched the video below with my daughter around!

That said, I am glad I was there to help her understand it too!

https://www.youtube.com/watch?v=V838Ou3irfc

AI can certainly invoke emotion.

I thought this too, until I recently came to understand the above. The thought of being left alone, forgotten about, left to talk to something that really doesn’t understand you – leave me with my own REAL thoughts and memories! I hope AI makes everyone realise this and become more human.

“Emotional intelligence (also known as emotional quotient or EQ) is the ability to understand, use, and manage your own emotions in positive ways to relieve stress, communicate effectively, empathize with others, overcome challenges and defuse conflict.”

This might be ‘workplace EQ’, but is it ALWAYS the case that you want to defuse conflict? It’s also about recognising when direct feedback or correction is necessary, e.g. in parenting; in other cases, sometimes pushing others away and creating conflict might be the best emotional solution for you or for them. I have had managers who were very good at saying the right positive things to engage people but who were totally oblivious to anything outside of their own benefit.

I think a rounded human is neither 100% intellectual, nor 100% emotional, nor 100% moral… There is a balance, a scale… The idea that certain patterns help you succeed at different things is true. Emotion definitely gets in the way of process; with Chat GPT, maybe the emotionless, or those who say only what you want to hear, will be the first to no longer be needed.

I don’t think we should expect AI to ‘elevate’ us emotionally… Just as elevating only the people who use it intellectually doesn’t necessarily give good balance to our society.


There is always fine-tuning to get rid of a lot of this…
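As a rough sketch of what that could mean in practice (my own illustration, not from this thread; the OpenAI fine-tuning API is real, but the file name, tone, and example content are made up):

```python
# Hypothetical sketch: fine-tune on examples of the tone you actually want,
# instead of relying on the default always-nice persona.
import json
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Each training example is one chat, stored as a line of JSONL.
examples = [
    {
        "messages": [
            {"role": "system", "content": "You give blunt, honest feedback."},
            {"role": "user", "content": "Is my plan any good?"},
            {"role": "assistant", "content": "No. Here is what is wrong with it..."},
        ]
    },
]

with open("blunt_tone.jsonl", "w") as f:
    for ex in examples:
        f.write(json.dumps(ex) + "\n")

# Upload the training file and start a fine-tuning job.
training_file = client.files.create(
    file=open("blunt_tone.jsonl", "rb"), purpose="fine-tune"
)
job = client.fine_tuning.jobs.create(
    training_file=training_file.id, model="gpt-3.5-turbo"  # placeholder model
)
print(job.id)
```

Whether a bluntly fine-tuned model is actually an improvement is, of course, the whole debate above.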