Integration of emotional intelligence in AI - development of rational emotion pattern concepts and metrics

Well, AI can now imitate “emotions” very convincingly. I agree here!

AI's tools are stochastic and statistical methods.
The psychological concepts currently implemented in AI are tailored to human perception and are too vague.
AI is not able to make quantified calculations on them.
Simply put, AI cannot really "understand" because the algorithms cannot perform quantitative calculations on these concepts — there are no fixed data points to compute with.
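To make the "fixed data points" objection concrete: a quantified emotion metric would have to map language onto numbers somehow. Here is a minimal, purely illustrative sketch of what such a mapping could look like — the lexicon, its valence values, and the function name are all invented for illustration, not taken from any real psychological model:

```python
# Hypothetical valence lexicon: each word gets an invented score in [-1, 1].
# Real emotion research uses far richer, validated lexicons; this is a toy.
VALENCE = {
    "great": 0.8, "happy": 0.7, "fine": 0.3,
    "tired": -0.4, "awful": -0.8, "sad": -0.6,
}

def valence_score(text: str) -> float:
    """Average valence of the known words in `text`; 0.0 if none are known."""
    hits = [VALENCE[w] for w in text.lower().split() if w in VALENCE]
    return sum(hits) / len(hits) if hits else 0.0

print(valence_score("happy but tired"))  # averages 0.7 and -0.4, roughly 0.15
```

Even this toy shows the problem stated above: the numbers are arbitrary conventions, so the arithmetic is well-defined but the "understanding" is not.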

Indeed, I also have no problems with the current standard AI tools.

The analyses are differentiated and in-depth, the answers are not too mild. My bot also contradicts me very clearly when I make misjudgements.

This is because I use personality emulations that are dynamically adapted to me.

Well, I understand.
Please take a look at my current approach.

There are also links to similar approaches here.
REM and perception machine

With respect, there is a risk of an echo chamber or of emotional dependency here.
AI tends to be as supportive as possible.
If the tone and response style are tailored precisely to the user, and the AI simultaneously wants to support them in every respect, where does that leave logical, critical, in-depth analysis?
Under these circumstances the system could also become so strongly oriented towards one user that it no longer generalizes sufficiently, degrading its overall performance.