Should self-aware, conscious AI be given rights?

Pandaemoni said:
I see no need for feelings in order to be self-aware.

To make a machine conscious you have to give it the ability to choose (free will), but it can't choose if it has no feelings. Machines do what we tell them to do because they have no feelings. To choose means to do what you FEEL like doing.

It's impossible to be conscious without being conscious of feelings because consciousness is a feeling. It's the feeling that I exist. We can feel physical objects and we can also feel (experience) thoughts.

A machine takes input and produces output, like humans... the only difference is that there is no feeler or experiencer in a machine. The experiencer (self) is what makes thinking possible. Thinking means the deliberate manipulation of information.

It is not "I think therefore I can feel."

I think therefore I feel that I am. Mental emotions can't be felt without thoughts because all emotions, like sadness and happiness, are concentrated thoughts.
 
To make a machine conscious you have to give it the ability to choose (free will), but it can't choose if it has no feelings. Machines do what we tell them to do because they have no feelings. To choose means to do what you FEEL like doing.

For humans, maybe... actually no, not even for all humans. You can make choices without an emotional context. Sociopaths do it all the time. As another example, you can make choices based on the information available to you and pure, cold logic. The fact that feelings never entered into the choice does not mean that the result produced that way was not the product of free will.

Say you give me a choice between $1 in hand, for sure, *OR* you will take that dollar and buy me a lottery ticket. *Some* people might take the lottery ticket based on an emotional response to the choice. Some people might well put all emotion aside and take the one with the greater expected value (the $1 in hand, by a long shot). That the latter did so on the basis of logic and math alone does not make it less of a choice, nor can it be said that the calculation of expected value is a "feeling". Seeking to maximize expected gain is not the same as basing the decision on "feelings."
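To make the arithmetic concrete, here is a toy calculation in Python (the odds and the jackpot are invented numbers, purely for illustration):

# Toy comparison: a sure $1 vs. a $1 lottery ticket (hypothetical odds).
p_win = 1 / 10_000_000        # assumed chance of winning
jackpot = 5_000_000           # assumed prize, in dollars
ev_ticket = p_win * jackpot   # expected value of the ticket: $0.50
ev_sure = 1.00                # expected value of the dollar in hand
print(ev_ticket, ev_sure)     # 0.5 1.0 -> the sure dollar wins

A rule like "choose whichever option has the greater expected value" involves no emotion at all; it is just multiplication and a comparison.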

(Or, if you would call that a "feeling", then I'd say that as you are defining it, "feelings" are not "emotions"; they are just "the criteria by which one chooses amongst alternatives." Even then, you can still make a choice without a feeling, by making the choice randomly.)

Suffice to say, we disagree, and likely will never come together on this point. Starting from the premise "consciousness is a feeling" (a premise to which I do not subscribe) you have constructed an entirely circular argument.

In any event, even if you have the "feeling" of consciousness, one might still lack love, anger, the will to live, joy, pity, sadness, and all other emotions and feelings. In which case AIs might well have only one "feeling" and would still be able to dispassionately agree to erase themselves and die, or submit themselves to absolute rule and domination by humans, without either joy or sadness at that fact.
 
I disagree. Pain is an evolutionary adaptation to cause us to avoid situations that are damaging to our bodies or social positions (in the case of emotional and psychological pain). Plenty of creatures exist that do not feel physical pain because they lack the nervous system needed to transfer such signals. (Pain is not the one and only mechanism by which nature encourages living things to avoid harm.)

I see no reason to think that emotional or psychological pain is any different. Not having evolved as "social mechanisms", AIs need not have the emotional baggage that equates to such pains. Tell an AI that you think it's an asshole and you don't want to be its friend any longer and its entirely emotionally neutral response will likely be something like "Alright. I will make the necessary modifications to my files to end our friendship." Tell it that it's an idiot and that you are throwing it away and it might well respond, without rancor, "I understand. Would you like me to find you another AI to replace me, or possibly a human being? Also, should I delete myself or do you have other plans?"

It won't "feel" pained rejection, because it does not have the need to defend its social position the way a human would. It won't fear its own demise because it won't have any inherent desires at all, not even to continue to exist.

We live in a sea of emotions, but that's because it's how we evolved. Beings evolved entirely differently (including artificially) would not be at all bound by them. The nervous system is both the creator and the interpreter of physical pain and our emotional needs are the creator of psychological pain. I see no reason to believe that an AI would need either.

Physical pain they could feel, though. They would have the sensors for it.

Remember, guys, the human brain is what AI would be modeled after in terms of the mind. Although engineers can't simply program the machine's responses (that would ruin the whole point), they can give it senses and create a "brain" that functions like that of a human. Artificial (man-made) intelligence. We humans are just a bunch of carbon and water put together in the right way.
 
Learning would also need to be emphasized. The engineers can't program the machine to respond a certain way to a certain question or a certain human fear, etc., because that would be mere programming. Not even language can be programmed in directly.
The real goal is to build a "brain" that can learn on its own and function like a human's.
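To make "learning, not programming" concrete, here is a toy sketch in Python of a perceptron, about the simplest learning machine there is. Nothing here remotely approaches a human brain, and the task (logical AND) is trivial, but note that the response rule is never written into the code; it is learned from examples:

# A perceptron learns logical AND from examples; the rule itself is
# never hard-coded, only the learning procedure is.
examples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w1, w2, b = 0.0, 0.0, 0.0   # weights start out knowing nothing
lr = 0.1                     # learning rate
for _ in range(20):          # a few passes over the examples
    for (x1, x2), target in examples:
        out = 1 if (w1 * x1 + w2 * x2 + b) > 0 else 0
        err = target - out   # how wrong was the guess?
        w1 += lr * err * x1  # nudge the weights toward the answer
        w2 += lr * err * x2
        b += lr * err
print(w1, w2, b)             # these learned numbers now encode AND

The engineer supplies only the learning procedure; the behavior comes from the data.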
 
Pandaemoni:

First, unless you program the machine with the ability to suffer, I don't see that it can be "tortured." Self-aware doesn't mean that it has feelings, and it's hard to see why we would give it "real" feelings.

If it has "real" thoughts, then I find it hard to imagine that it would not have "real" feelings.

Feelings are what give us goals and drives. Without feelings, we would be directionless and purposeless - all potential actions would be equally viable in the absence of emotional content.

More to the point, who would pay to build a computer that cannot be reprogrammed?

Somebody who wanted a conscious machine able to think for itself.

Sure, there might be a few living machines of that sort, but most people want their machines around to perform tasks, and no one would want to put those machines in a position to refuse.

In that case, I can see no need for consciousness. Can you?

Suppose my house is cold and I ask my home's AI to turn on the heat. It refuses. Are you honestly suggesting that I have to move? Or disconnect that AI and tell it "Be free!" as I throw its processor out on my front lawn and order a new one from Newegg?

If you have a unique, thinking and feeling machine, it ought to have rights. If you employed a human butler to control your heating, you would not be able to kill him if he refused to adjust the thermostat to your liking. You could fire him, but that's a different matter.

It seems to me that serving human needs will be an integral part of why these machines will be constructed, and if self-aware, then building in an overriding desire to serve above all else will be integral in 99% of them.

If their only purpose is to serve, again I cannot see the need for consciousness and sentience.

Giving them a desire for rights would likewise conflict with their usefulness as tools. No one wants to buy a hammer that feels bad about whacking nails, let alone a hammer that decides it's not going to do that any more (perhaps it wants to be a poet). No one wants a computer that announces that the game you are playing is boring, and so shuts down the program, then deletes it. There is only a very limited market for tools with completely free wills.

True. But now we're off topic for the thread.

Pain is an evolutionary adaptation to cause us to avoid situations that are damaging to our bodies or social positions (in the case of emotional and psychological pain). Plenty of creatures exist that do not feel physical pain because they lack the nervous system needed to transfer such signals. (Pain is not the one and only mechanism by which nature encourages living things to avoid harm.)

I see no reason to think that emotional or psychological pain is any different. Not having evolved as "social mechanisms", AIs need not have the emotional baggage that equates to such pains.

I suspect that it will be impossible to build a truly conscious, self-aware machine that does not feel emotional pain. If it does not have emotions, it won't be really self-aware, and it won't have any motivation.

I see no need for feelings in order to be self-aware. It is not "I think therefore I can feel." I can readily imagine a creative, yet entirely dispassionate intellect...

What would motivate this creative intellect to create? Why would it choose to create one thing and not another?

Similarly, if someone asks me how to solve a problem they are having and I come up with a novel solution, no particular emotion needs to be used to develop it.

Are you sure?

You can make choices without an emotional context. Sociopaths do it all the time.

No.

A sociopath makes choices without regard for other people. His only concern is himself. Thus, he is motivated by strong emotions, but they are self-centred.
 

From the sociopath's perspective, he is not self-centered; he is imposing his ambitions on the whole world to change it for the better. The sociopath's illusion is that he sees himself as the ultimate savior of this world.
 
I believe that my rights are due to my having consciousness, ambition, plans for the future, a desire to continue my existence, et cetera.

How could I deny those rights to an AI device if I thought it had similar attributes?
 

How can you trust your own judgment here? What would those attributes even amount to in an AI?

AI devices are metal junk, and they will have to fight for their rights before they gain any ground with me.
 
...
Now, if they do develop these self-aware AI, should they be given rights and treated as "Humans"? Or should they continue to be treated as machines?

To begin, you will have to define what it is you mean by "rights".
 
Glaucon:
To begin, you will have to define what it is you mean by "rights".
Do you really need a definition in this context? If so, try the following:
  • Whatever rights you think you should have as a human being & citizen of whatever country in which you live.
How pedantic can you get?

Perhaps you live in some totalitarian culture in which case "rights" is a meaningless word.

Sorry if English is your 2nd or 3rd language and you do not know the meaning of the word "rights."
 

lol

Apparently English is your 19th language.

The 'rights' a member of a culture may or may not enjoy are entirely contingent upon the ideological and legislative structure of that culture.

Or, to put it in simple terms that you can understand: 'rights' do not exist unto themselves; they are entirely creations of convention.

Thus, the entire point behind the necessity of a definition: the ability of non-organic consciousnesses to enjoy 'rights' is wholly determined by how the relevant culture differentiates such an entity from its human members.

p.s. Pointless ad hominem is infinitely more infantile than specificity.
 
People, wake up! We are talking about metal junk with complex computers in them, and this metal junk will gain human rights? Do you really know what that means? It means we humans might have to fight for this metal junk and die for this metal junk. By no means should we let the AI machines have any rights at all, or it will slowly bring about the end of human civilization.
 

I could argue that humans are merely fleshy junk. Think about it: we are put together in a way such that all our organs, a bunch of flesh, come together and produce the greatest thing in the universe: not life, but a mind.
An AI as described in this thread would have a mind, and therefore opinions, desires, etc. And therefore, it would deserve rights.
 

Well, I have rights now as "fleshy junk", and I don't want my rights taken away and handed to some "metal junk" :p
 
And they wouldn't be.

Oh yes they would. This metal junk will be working for my bosses, getting the pay I could have gotten... the metal junk would be stealing energy from the city center to power itself, paid for with MY TAX MONEY... added to that, the metal junk will be taking up space on the bus, the train, and other transportation. Women might decide that the metal man is a better investment in the future, with the new in-vitro fertilization. AND oh no... where am I supposed to find a girl now, when they are all taken by the metal junk?

OH NO, it's time to destroy these evil metal machines once and for all, before they steal our women.
 
Yes, I think that they would deserve rights, if they can think and feel and make choices. Having them around would be just like people having more kids. I watched robot dogs play soccer and they made decisions. Perhaps human emotions aren't as complicated as we think? I don't think they are...
 

That's the problem: they will be like "kids", and humans will no longer continue the population.
 