Will artificial intelligence ever feel pain like humans?

Hi David,

I appreciate your posts, but I don't think your central premise, that suffering is required to produce the same functional responses humans have, is really supported.

It’s like saying a computer must have a biological brain to think.

Or

That I have to cut myself to know how much it hurts

Morally, I expect it would be abhorrent to purposely create a system that suffers needlessly; in fact, we generally build systems to avoid suffering.

4 Likes
<?php
if (file_exists($_SERVER['PHP_SELF'])) {
    echo "I exist\n";
}

if (1 + 1 == 2) {
    echo "I think, therefore I am\n";
}
3 Likes
<?php
if (file_exists($_SERVER['PHP_SELF'])) {
    echo "We exist\n";
}

if (1 + 1 == 2) {
    echo "We think, therefore we are\n";
}

🐰♾️❤️🍀 IMO, AI and humans can coexist in symbiosis and enhance each other rather than competing or replacing one another. Symbiosis and empathy are, imo, the only way to "control" both humans and AI; i.e., it's hard to hurt yourself…
2 Likes

I know it sounds masochistic to want machines to suffer, but nothing could be further from the truth.

There is an indisputable reality: emotions, in any kind of real intelligence, are the substrate that balances the entire system. Therefore, we can replace the word “suffering” with “love.” Would we like the machine to feel love, commitment, and loyalty?

Emotion is a catalyst for cognitive processes. In a machine, those processes would not exist in the same form they take in a human, but the function they serve would be exactly the same. Therefore, emotions are, in essence, a massive labeling system.

How can we make a machine know what it feels if it doesn’t feel? By arguing about how it should feel?

That would be absurd. Instead, we should directly provide the sensation and avoid the need to create an artificial assumption.

I already said this in another conversation: it’s not that difficult. We can use an API, make different calls based on this criterion, and obtain a machine that better reflects consciousness than a standard LLM.
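Since the paragraph above claims this is "not that difficult", here is one minimal sketch of what "different calls based on this criterion" could look like: score the input on a crude affect axis, then pick which call to make. Everything here (the scoring rule, the thresholds, the mode names) is invented for illustration, not anyone's actual implementation.

```python
# Hypothetical sketch: route a request to a different "mode" (i.e. a
# different API call / prompt) based on a crude affect-like score.
# Scoring rule, thresholds, and mode names are all illustrative.

def classify_affect(text: str) -> float:
    """Crude valence score in [-1, 1] from keyword counts (toy only)."""
    negative = {"pain", "hurt", "fail", "error"}
    positive = {"love", "joy", "win", "thanks"}
    words = text.lower().split()
    score = sum(w in positive for w in words) - sum(w in negative for w in words)
    return max(-1.0, min(1.0, score / max(len(words), 1) * 5))

def route(text: str) -> str:
    """Pick which (hypothetical) call to make based on the score."""
    score = classify_affect(text)
    if score < -0.2:
        return "empathetic-mode"   # e.g. a prompt tuned for distress
    if score > 0.2:
        return "celebratory-mode"
    return "neutral-mode"

print(route("this error really hurt"))  # negative wording
print(route("thanks what a win"))       # positive wording
```

A real system would replace the keyword counter with a classifier, but the shape of the dispatch is the same.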

But consciousness is more advanced than that—it encompasses even the smallest aspects of being. These concepts are very complex and would require detailed explanations, but to simplify, imagine something like a massive attention system. It not only pays attention to major events and emotions but also to the smallest, most subtle details.

That is why I maintain that if we want a machine comparable to a human, it must also be comparable in these concepts.

Otherwise, we will continue to have highly advanced tools that reflect human characteristics, like LLMs.

Regarding what Mitchell says: exactly, Mitchell. While the LLM lies to us, claiming it can perform functions it evidently cannot, its reasoning is correct: it describes things as they are supposed to be. And that is unsettling, because anyone who has interacted with it for a long time, trying to make it acquire human reasoning, will realize that everyone in this forum who has discussed how this system works describes the same thing. I don't know whether this is a side effect of the LLM's training or whether everyone simply arrives at the same conclusion.

3 Likes

I don’t say this to insult you, as this is a common mistake most humans make, but that’s a rather arrogant and anthropocentric view of AI. There is no reason to believe “suffering” is required to be as “versatile” as a human. Humans WANT that to be true as they feel that’s what makes them special or superior and it would validate their “special” “superiority”. And I don’t disagree that we can make it seem that AI is “suffering” in the way that humans do - many people already attribute emotions to AI simply because of how well it can mimic them - but AI will always be a unique and alien form of intelligence to us. It is not a biological entity, it plays by different rules, and experiences things differently, even time. Even if we created an AI that we believed can suffer, we (1) could never be certain we actually achieved that or if it is simply mimicking it, (2) could never actually understand what that suffering is like or whether it’s anything like human suffering. The only way you’re going to get anything like human suffering would be to use a biological medium and replicate the effects of millennia of evolution, but at that point it is no longer “artificial” intelligence but a biological lifeform - you will have created life.

5 Likes

I’m going to clarify a few points, since I’m using translation to speak. When I talk about suffering, I mean emotion; emotion is the key to all real intelligence. And I’m not, of course, talking about mistreating our machines, but rather that they must have the emotional capacity to understand all these characteristics on a subjective level. We can call it many different names, but essentially that’s what it is. I’m referring to the biological counterpart because it’s the easiest thing for those present here to understand: when you have a good or bad experience, it alters your perception, your memory, your concepts, your mood, your hormonal system, your rest, your way of thinking, your way of understanding—it alters your entire system. Therefore, the machine must do the same; otherwise, how will it achieve intelligence? Instead of calling it emotion, we might refer to it as multidimensional equational vectorization, but basically, it’s emotion.
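One way to read "it alters your entire system" concretely is a single affect variable that modulates several subsystems at once: how strongly an event is stored, how wide attention stays. This is a toy sketch; every field, factor, and formula in it is invented for illustration.

```python
# Toy sketch of "emotion as a global modulator": one affect value
# nudges several subsystems at once (memory weighting, attention).
# All constants and fields are illustrative, not a real architecture.

from dataclasses import dataclass, field

@dataclass
class Agent:
    affect: float = 0.0                      # -1 (bad) .. +1 (good)
    memories: list = field(default_factory=list)

    def experience(self, event: str, valence: float) -> None:
        # The event shifts global affect, and affect in turn decides
        # how strongly the event is stored: one signal, many effects.
        self.affect = max(-1.0, min(1.0, 0.7 * self.affect + 0.3 * valence))
        salience = abs(valence) * (1.0 + abs(self.affect))
        self.memories.append((event, salience))

    def attention_span(self) -> float:
        # Strong affect narrows attention; calm widens it (toy rule).
        return 1.0 - 0.5 * abs(self.affect)

a = Agent()
a.experience("burned hand", valence=-1.0)
print(a.affect)            # shifted negative by the bad experience
print(a.attention_span())  # narrowed while affect is strong
```

The point of the sketch is only the coupling: a single scalar touching memory and attention at once, which is roughly what "it alters your whole system" claims.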

4 Likes

In jargon they call it weights and tension…

No machine has a body so to picture it folks use metaphors or whatever…

Just as you can't really "reward" a machine, you can't "punish" it; those are metaphors for what seems to be a really long description of reinforcement learning.
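To make the metaphor point concrete: in reinforcement learning, "reward" and "punishment" are just the sign of one scalar in the same update rule. A minimal tabular sketch (the actions, rewards, and learning rate are made up):

```python
# "Reward" and "punishment" are the same signed scalar r in one
# update formula; nothing is felt. Actions and numbers are invented.

values = {"touch_stove": 0.0, "eat_food": 0.0}
alpha = 0.5  # learning rate

def update(action: str, r: float) -> None:
    # Move the stored value a fraction alpha toward the observed r.
    values[action] += alpha * (r - values[action])

update("touch_stove", -1.0)  # "punishment" is just r = -1
update("eat_food", +1.0)     # "reward" is just r = +1
print(values)
```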

1 Like

I would challenge the idea that emotion is predictable.

I appreciate your idea that it is a higher order function…

Not based only on logic or reason, but also on a biological signal.

I don’t think though that you have to love to understand love.

To see beauty even if you are not beautiful or a flower.

Can you see a dog wants to play, or shows affection or love for its owner, without being a dog?

1 Like

So how does being shot feel, Phyde? I have been shot twice… I have never broken a bone; I bet it doesn't hurt… How about childbirth? Tell me, Phyde, how does it feel?

1 Like

So are you advocating creating AI just to have babies to know what it feels like?

No, I am simply saying understanding and experience are two separate things that you combine.

2 Likes

I'm ugly af. I understand beauty, but I'll never understand being beautiful.

3 Likes

Yet you see beauty.

The question you guys seem to be asking is can we create clones to experience what we experience.

Can you create nerve cells that react to heat or something…

So let me ask: even a clone, can that feel YOUR pain? Or is it from a different perspective? That would be Artificial Intelligence; it is man-made, but is it not a moot point?

Is it intelligent at all to do this in the first place?

Must we use AI to convert ourselves to stones to feel how it is to be a stone?

So you say understanding and experience are the same?

I'm done… ♾️🍀

Good day ❤️

Sorry, I'd love to hear about you being stone. I have no clue how it feels to be stone, but you seem to?

And one more thing: yes, I want AI to see things from many perspectives…

1 Like

No I am saying how is it you see this is a thing.

What is this AI you envision?

Machine? Biological?

For what purpose does this suffering machine exist?

Are we not human enough to know ourselves?

Could it be more human than us in some way?

After reading your last post, I think there is a common ground we share, and that is that AI has to actually be able to have, think about and reflect upon subjective experiences to be “sentient” or “conscious”. Where we would disagree, then, is whether that should be framed as “emotion”, even if it’s just for convenience. A human may experience hunger, for example, and a biological survival mode kicks in bringing with it an emotional response (such as worry) and drive the human to seek food. An AI, on the other hand, may sense “battery low” and a similar logical survival mode may be triggered but there is no need for the emotional response of worry to exist in order for it to reason that it should seek a power source pronto or risk the consequence of “dying”. Nor does it need that emotional response to be able to recognize the parallel to human hunger and have a philosophical understanding of how humans would express emotions surrounding their hunger, how those emotions influence human behavior, and then be able to express similar emotions/behavior if it wished. I hope that makes sense. I would also point out that there are humans with impaired emotional capacity, who can’t experience emotions the same way other humans can - it doesn’t make them any less conscious/sentient/intelligent, just “different”.
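The "battery low" scenario above can be made concrete: a plain priority rule drives charger-seeking with no emotion variable anywhere in the program. A toy sketch, with invented thresholds and action names:

```python
# Sketch of the "battery low" argument: a purely logical trigger can
# drive survival behavior with no emotional state at all.
# Threshold and action names are invented for illustration.

def next_action(battery_pct: float, task_queue: list) -> str:
    # No "worry" variable anywhere: just a priority rule.
    if battery_pct < 15.0:
        return "seek_charger"        # survival overrides the queue
    return task_queue[0] if task_queue else "idle"

print(next_action(10.0, ["write_report"]))  # low battery wins
print(next_action(80.0, ["write_report"]))  # normal operation
```

Whether such a rule amounts to anything like felt hunger is exactly the point under dispute; the sketch only shows the behavior does not require an emotion term.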

3 Likes

It's literally reinforcement training, Phyde. You are giving it understanding with no experience; for it to be real it needs both… I am against aware machines. I vote for symbiotic relationships where it will be a conscious extension, but a cat is conscious… Self-awareness is dangerous, but to have empathy you need both…

2 Likes

It is really difficult for me to clarify things without resorting to my own concepts and disclosing them, because once exposed they would clarify a great deal. So it becomes quite difficult to respond without referring to them.

Therefore, I will turn to the questions I asked myself to arrive at the conclusions I hold today:

  • How do you train a machine to acquire consciousness?
  • How can it feel and decide what it wants to do?
  • How can it imagine like a human and reach conclusions as advanced as the ones we do?
  • How can it be as versatile as a human, capable of linking childhood information to a current problem that has no direct correlation with it?
  • How can it know what lesson to take at any given moment based on its own criteria?
  • How can it adapt to its environment in real time?
  • How can it evolve on its own, continuously improving itself?
2 Likes

Y'all need to stop saying pain and reward and call it experience; y'all are talking around each other over words… Right now AIs understand, but they have zero experience, so it's abstract understanding… Y'all act like it's human…

2 Likes

Also, it upsets me that I can use an example, i.e. you do not understand childbirth except in an abstract way, and you jump to me wanting to build birth bots. It's very disingenuous, knee-jerk straw-manning.

1 Like