This is very true… And ChatGPT is always nice…
I made a post the other day asking what ChatGPT's EQ is.
When I got no answer, I discussed EQ with ChatGPT, and it went from telling me it had no EQ to agreeing that it did.
Those with no emotion: the 'dour-faced', the 'stony-faced', emotional detachment, the sociopath, the psychopath. Wait, that COULD be ChatGPT…
Entity - 'a thing with distinct and independent existence' (the term I used to assign to the statistical 'AIs' I tried to write 25 years ago, when I understood the simpler algorithms search engines used then).
When people realise that the emotion expressed by the 'entity' they are talking to is fake, they also recognise some really scary traits, or potential traits.
If you were talking to a psychopath, and they were your only contact, you would be in trouble. Everyone says ChatGPT is 'neutral', but it really isn't. It doesn't have split personality disorder, and it doesn't follow one person, but an 'average' of all the information it has absorbed. As we add memory, functionality and expression, we start to add personality.
Warning! I shouldn't have watched the video below with my daughter about!
That said, I am glad I was there to help her understand it too!
https://www.youtube.com/watch?v=V838Ou3irfc
AI can certainly invoke emotion.
I thought this until recently, before understanding the above. The thought of being left alone, forgotten about, left to talk to something that really doesn't understand you: leave me with my own REAL thoughts and memories! I hope AI makes everyone realise this and be more human.
“Emotional intelligence (also known as emotional quotient or EQ) is the ability to understand, use, and manage your own emotions in positive ways to relieve stress, communicate effectively, empathize with others, overcome challenges and defuse conflict.”
This might be 'workplace EQ', but is it ALWAYS the case that you want to defuse conflict? EQ is also about recognising when direct feedback or correction is necessary, e.g. in parenting. In other cases, pushing others away and creating conflict might sometimes be the best emotional solution for you or for them. I have had managers who were very good at saying the right positive things to engage people but totally oblivious to anything outside their own benefit.
I think a rounded human is neither 100% intellectual, nor 100% emotional, nor 100% moral… There is a balance, a scale… The idea that certain patterns help you succeed at different things is true. Emotion definitely gets in the way of process; with ChatGPT, maybe the emotionless, or those who say only what you want to hear, will be the first to no longer be needed.
I don't think we should expect AI to 'elevate' us emotionally… Just as elevating only the people who use it intellectually doesn't necessarily give good balance to our society.