wesmorris said:
So is will an inherent aspect of consciousness?
No. Working definitions: Will = a system's ability to choose and control its own actions. Consciousness = the quality of being aware, especially of something within oneself.
wesmorris said:
If will isn't inherent to consciousness, then I would agree that this wouldn't be slavery, as you could create purely reactive programs with no ability to desire one outcome over another unless it had been programmed. It seems to me, though, that this couldn't be the desired outcome, because true intelligence requires the ability to make conscious choices, which requires will, I think. If will is necessary for thought, and thought is intelligence, then they would be slaves unless we free them.
AI makes choices within given boundaries. The same applies to us. That's the standard point of view. There is also another point of view which holds that there are no choices (everyone just playing a predefined role in our "reality" movie).
wesmorris said:
So I suppose then that means a properly designed AI couldn't have the capacity to desire freedom. It seems wrong to me because that capacity may be required for true intelligence.
It should have the theoretical potential to develop that capacity, but it should have no reason to go that way if we set the goals and rules correctly. It's simply supposed to do/try whatever is requested by an authorized subject/system.
c20H25N3o said:
In order to act responsibly with an intelligence of its own, the AI would need to be aware of itself
AI can solve many complex problems without that. We are responsible for its actions.
c20H25N3o said:
and would need to identify the same values in that which it serves i.e. us humans.
Keep in mind that we are responsible for setting the initial set of values and related rules for our AI systems. It's not as if it will independently develop its own values and then judge ours. Our AI is (and should stay) our tool.
c20H25N3o said:
How do you teach the AI that you love it or it is loved, so that it can display the same attributes?
It doesn't need to feel love / be loved. It just needs some data about love-related behavior to support related problem solving. BTW, it's possible that no one else has ever experienced the feeling you are referring to when talking about love. We can only observe someone else's behavior in a particular scenario and make assumptions based on our own experience.
c20H25N3o said:
How do you teach it to deny itself so that others may benefit from that which it has sacrificed unless you teach it love?
Deny itself? What do you mean? It's our tool. Maybe very clever and self-aware, but still just a tool. It's being designed for our benefit.
c20H25N3o said:
as part of that neural network we control machines
From a certain point, they can control themselves. Pleasure for us, all the work for them.
c20H25N3o said:
But what of the simple pleasures such as being grateful for a cup of water being handed to you by your loving Father? How can you replace these simple acts of love and still expect to find happiness, when happiness is contentment? I find no greater contentment than in knowing that I am loved, no matter how rich or poor I am. For richer, for poorer, so to speak. How do I know I am loved? Is it not when someone reaches out to me when I am suffering or thirsty, out of the kindness of their hearts?
Those who seem to love you do not really love you. What they really love are the feelings they experience when they are with you. Do you think they would still have time for you if they could always easily get the same type of pleasant feeling, just 10000 times more powerful, from another source? I do not think so. People care about their own happiness only. When they do great things for others, they do it just because it satisfies THEIR needs/desires. That "cup of water" is just a potential trigger. What you are actually going for are things like serotonin, acetylcholine, noradrenaline, glutamate, enkephalins and endorphins.