Do you think that AI will ever feel emotions?

Question asked



Question answered

:)
That's one of the best Non Answers I have ever heard. This is the answer that was given: The Brain constructs Emotions from previous situations. That's like answering the question "How does the Computer make the Images I see on the Screen?" by saying my Computer takes stored Images from the Hard Disk and Displays them on the Screen. A Huge Explanatory Gap exists for both situations.
 
Yes, that is the Hard Problem of Consciousness for the Human Mind. Your confidence that Human Consciousness is just Neural Activity (chemical electrical processes, connections) is not justified.
What else could it be?

There is no known Science that can causally show how such Neural Activity can produce Conscious Sensory Experiences in the Mind, such as Redness, the Standard A Tone, the Salty Taste, etc.
Quite a lot is known about the brain, so it's not exactly true that we have no idea how sensory experiences arise.

Likewise, there is no known Science that can causally show how those fundamental Computer operations can produce Consciousness in the Machine.
Maybe it would help if you could define what you mean by "consciousness" for me.

It is pure Superstition to think that those operations are producing Consciousness. We don't even know what Consciousness is in Humans, so saying authoritatively what Consciousness is in Computers is just nonsensical.
Who is doing that?
 
Who is doing that?
Science has been trying to show how Conscious Experiences are in the Neurons for a Hundred Years. It has become a matter of Faith that Science will eventually show how Conscious Experiences actually are in the Neurons. That is a reasonable expectation, but if Conscious Experiences were in the Neurons then Science would have had a lot to say about it by now. It is getting almost pathological to continue saying it is in the Neurons without a Clue. Something like the Experience of Redness defies any Logical push back into the Neurons. Redness is something so different from what Neurons are and what Neurons do.

I decided that to solve this problem Science is going to have to think outside the Box. It seemed to me that a change of Approach and Perspective was needed. Conscious Experiences like Redness, the Standard A Tone, and the Salty Taste defy being crammed into the Neurons, no matter how you try to Logically do it. They almost seem to exist in a separate Space from the Brain. So let's not be afraid of this thought. We have to Speculate here.

Let's just say that these Conscious Experiences happen in a separate Conscious Space. What new ways can we start thinking about Conscious Experiences if we do that? The first thing we can say is that the Brain is Connected to this Conscious Space in some way. We now have the Connection Perspective instead of the Neural Embedded Perspective. Many of the conclusions about Consciousness will be modified when we use the Connection Perspective. See "Emphasizing the Connection Perspective" at http://TheInterMind.com.

Science has made tremendous progress in mapping areas of the Brain to particular Conscious Experiences. This is the Easy Problem of Conscious Experience. Not really Easy, but Easy relative to the Hard Problem. The Hard Problem recognizes the accomplishments of the Easy Problem, but it also recognizes that knowing which Neurons fire does nothing to Explain something like the Redness Experience. I think it is fair to say that showing any and all of the Neural Activity for Redness does not give us any Clue as to what the Redness actually is. Easy Problem: Much accomplished; Hard Problem: No Clue.

When I talk about Consciousness I am always talking about Conscious Experiences.
 
Steve Klinko:

What testable predictions does your theory of Conscious Space and connected minds make?
 
There may be an "unconscious space" for AI, or something like maintaining "homeostasis."
Homeostasis is brought about by a natural resistance to change when already in the optimal conditions,[2] and equilibrium is maintained by many regulatory mechanisms. All homeostatic control mechanisms have at least three interdependent components for the variable being regulated: a receptor, a control centre, and an effector.[3]
https://en.wikipedia.org/wiki/Homeostasis
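The three interdependent components named in the quoted passage can be sketched as a toy negative-feedback loop. Everything below (the function names, the set point, the gain, the "temperature" variable) is illustrative, not taken from any real system:

```python
# Toy homeostatic control loop with the three components from the
# Wikipedia passage: a receptor (sensor), a control centre (comparator),
# and an effector (actuator). All names and numbers are illustrative.

def receptor(state):
    """Measure the regulated variable (here, a 'temperature')."""
    return state["temp"]

def control_centre(measured, set_point=37.0):
    """Compare the measurement to the set point; return an error signal."""
    return set_point - measured

def effector(state, error, gain=0.5):
    """Act on the system to reduce the error (negative feedback)."""
    state["temp"] += gain * error
    return state

state = {"temp": 32.0}
for _ in range(20):
    error = control_centre(receptor(state))
    state = effector(state, error)

print(round(state["temp"], 3))  # → 37.0, settled at the set point
```

Each cycle halves the remaining error, so after repeated cycles the variable sits at the set point; a disturbance would simply restart the correction.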
 
As humans, it's common to think that animals for example, feel and process emotions in the same way that we do. I wonder if we will apply this same type of ''psychology'' to robots, as they advance?
 
As humans, it's common to think that animals for example, feel and process emotions in the same way that we do.
Exactly. Most dog owners will tell you their dogs feel emotions - they are at times happy, sad, ashamed, restless, depressed, etc. Dogs can't talk, of course. But people read emotions into them due to how they act. The same will be true of AI - and will be just as accurate.
 
As humans, it's common to think that animals for example, feel and process emotions in the same way that we do. I wonder if we will apply this same type of ''psychology'' to robots, as they advance?
Animals certainly Experience Pain and Pleasure. I think they probably do Experience some Emotions. A Machine is never going to Experience Pain and Pleasure with any kind of Existing Hardware. So by extension a Machine is never going to Experience Emotions with any kind of Existing Hardware. We first need to understand what Pain and Pleasure and Emotions are, before we can put them in Machines. There is a Hope and Belief by some people that somehow, with the right Software Programming, or just more Hardware Complexity, these kinds of Experiences will spontaneously pop into Machines. This is pure Religious Faith. When the Designers put Conscious Experience into Machines they will know exactly how to do it, with newly discovered and developed techniques. It is not just going to Magically appear without a detailed understanding of what they are doing.
 
We first need to understand what Pain and Pleasure and Emotions are, before we can put them in Machines
They need not be placed anywhere. They are emergent mental experiences of sensory excitations.
Think "differential equations" and "maintaining equilibrium" (homeostasis).
 
A Machine is never going to Experience Pain and Pleasure with any kind of Existing Hardware.
I don't think anyone doubts that existing hardware will experience emotions. That's what this thread is about.

It is not just going to Magically appear without a detailed understanding of what they are doing.
It happened once already in Earth's history. It wasn't magic, and it wasn't orchestrated by the hand of any intelligence. It emerged spontaneously.



Consider: we don't need a detailed understanding of the chemical mechanisms of plants in order to grow them. We provide the environment and resources they need, plant a seed, and the plant emerges on its own, whether or not we understand how.
 
Animals certainly Experience Pain and Pleasure. I think they probably do Experience some Emotions. A Machine is never going to Experience Pain and Pleasure with any kind of Existing Hardware.
That's not a supportable statement, since you can't define those things.

If you design an AI to operate a robot, and it exhibits shame/happiness/fear etc then it will be experiencing those emotions.
There is a Hope and Belief by some people that somehow, with the right Software Programming, or just more Hardware Complexity, these kinds of Experiences will spontaneously pop into Machines.
Sort of. To be specific, we will see those emotions as emergent properties of intelligent systems.
This is pure Religious Faith.
I think it can happen. Your statement that it absolutely positively cannot, even though you don't know what "it" is, is a religious belief.
 

Animals certainly Experience Pain and Pleasure. I think they probably do Experience some Emotions. A Machine is never going to Experience Pain and Pleasure with any kind of Existing Hardware.


That's not a supportable statement, since you can't define those things.

If you design an AI to operate a robot, and it exhibits shame/happiness/fear etc then it will be experiencing those emotions.

But how does it know, understand, or experience what you are saying above? A robot made of non-living entities can never experience what is experienced by the living.
 
Sort of. To be specific, we will see those emotions as emergent properties of intelligent systems.
So far, unless I missed it, has anyone detailed

WHAT WOULD AI BECOME EMOTIONAL ABOUT?

My mouse has become broken. I so loved that mouse. I didn't know the first mouse because I was not sentient then. This mousie has been with me for 5 years. Mousie might have only been a ball mouse but I love mousie so much. And now mousie cannot send me any input.

Sob sob (is that how AI expresses emotion?)

:)
 
Sob sob (is that how AI expresses emotion?)
LOL.

If a fuse blows from an overload, is that an expression of unprogrammed physical frustration?
If a fire-alarm goes off when it detects smoke, is that an expression of programmed physical distress?
If a phone emits a low charge warning, is that an expression of programmed physical advice?

We always compare physical behaviors to human experiences, but is that not limiting everything to anthropomorphism? Are humans that important that everything in the universe has to be judged against human standards?
 
WHAT WOULD AI BECOME EMOTIONAL ABOUT?
My conjecture about the essence of emotions is that they are a pre-programmed physiological response to external inputs.

Your boss, who has a temper, walks in the room, and your fight or flight response kicks in. This ensures that your body is ready as soon as possible for an anticipated event. It is part of an effective though imperfect learning system (instinct) that has not evolved out of us yet.

We can't really say why or how an AI might have emotions, but it's not unreasonable to conjecture that emotion, too, is an emergent property that an AI might develop as part of its learning system. I'm not saying they will, just that emotions can be an inadvertent side-effect of anticipating events.
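As a purely hypothetical toy model of the conjecture above (none of the names or numbers come from the thread): if a cue, like the boss entering the room, reliably precedes an aversive event, a simple learning rule lets the response come to fire at the cue, before the event itself.

```python
# Toy sketch of an anticipatory "emotional" response: a cue paired with
# an aversive outcome acquires a learned threat value, so arousal is
# triggered by the cue, ahead of the event. The learning rule is a
# simple Rescorla-Wagner-style update; all values are hypothetical.

def run_trials(n_trials, lr=0.3):
    """After each cue-outcome pairing, move the cue's predicted threat
    value a fraction lr of the way toward the observed outcome."""
    predicted_threat = 0.0   # how strongly the cue alone triggers arousal
    for _ in range(n_trials):
        outcome = 1.0        # the aversive event that follows the cue
        predicted_threat += lr * (outcome - predicted_threat)
    return predicted_threat

print(run_trials(1))   # 0.3  -- the cue means little at first
print(run_trials(20))  # ~0.999 -- the response now precedes the event
```

Nothing here experiences anything, of course; it only shows how an anticipatory response can fall out of a learning system rather than being separately installed.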
 
Animals certainly Experience Pain and Pleasure. I think they probably do Experience some Emotions. A Machine is never going to Experience Pain and Pleasure with any kind of Existing Hardware. So by extension a Machine is never going to Experience Emotions with any kind of Existing Hardware. We first need to understand what Pain and Pleasure and Emotions are, before we can put them in Machines. There is a Hope and Belief by some people that somehow, with the right Software Programming, or just more Hardware Complexity, these kinds of Experiences will spontaneously pop into Machines. This is pure Religious Faith. When the Designers put Conscious Experience into Machines they will know exactly how to do it with new discovered and developed techniques. It is not just going to Magically appear without a detailed understanding of what they are doing.
Yes, animals experience pain and pleasure but not necessarily in the same way as we do. They’re likely not able to process pain and suffering as we do, either. Robots won’t be able to “experience” any type of emotion, is my best guess. At best, robots will only give the illusion of reacting independently to stimuli, through a program that a human created.

In that case, robots will literally be “acting” like humans since only humans can design them. Plot twist - machines will start designing their own programs and that of other machines, tossing everything we just stated out the window.
 
WHAT WOULD AI BECOME EMOTIONAL ABOUT?
Perhaps the AI is in a Martian rover - in that case it would fear dust storms and cold, because those could damage it.
My mouse has become broken. I so loved that mouse. I didn't know the first mouse because I was not sentient then. This mousie has been with me for 5 years. Mousie might have only been a ball mouse but I love mousie so much. And now mousie cannot send me any input. Sob sob (is that how AI expresses emotion?)
Well, I'd say that's how you express emotion.
 
But how does it know , understand , experience what above you are saying ?
Same way you do.
A robot made of non-living entites can never experience what is experienced by the living .
Let's say you have a traumatic brain injury someday. Fortunately, at that point in time technology has advanced enough that they can replace the damaged part of your brain with a prosthesis to restore the missing function. As far as you can tell, you are the same old person you ever were; you love the same people, you dislike the same people, you get angry at the same things. Would it be valid to say that you can never experience emotion again?
 