What if a human fell in love with a robot?

Status
Not open for further replies.
... amm, people will not fall in love with robots (not in the way I love a blonde girl named Linda (she is so beautiful and I miss her very much, by the way)) because human-like robots are useless ... having emotions is not a quality ...

our conception of ourselves is soooo wrong and arrogant ... we're too stupid to see how insignificant we really are
 
Asimov is making the same mistake Arthur C. Clarke did with HAL 9000 ... robots need to be very "human" to understand those laws, and because of their human nature they won't accept them ...
 
It's more of a mental block than a law or directive. AI will be incapable of thinking of harming humans. Based on present theories of multi-layered thought processing, it would not be hard to implement.

Again, yes, you could make them like humans and use emotions to control their thoughts. Make them too empathetic to kill or harm… but we all know that has limitations.
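The "mental block" idea above could be sketched as a filter layer that sits below deliberation: harmful candidate actions are removed before the planner ever considers them, rather than being reasoned about and rejected. This is a toy illustration only; the names (`Action`, `harms_human`, `safe_actions`) are invented, and tagging an action as harmful is exactly the hard part the thread is debating.

```python
# Toy sketch of a "mental block" as a pre-planning filter layer.
# Hypothetical names throughout; assumes an upstream model has already
# tagged each candidate action as harmful or not.

from dataclasses import dataclass

@dataclass
class Action:
    name: str
    harms_human: bool  # assumed to be supplied by an upstream model

def safe_actions(candidates: list[Action]) -> list[Action]:
    """Harmful actions never reach the planner at all."""
    return [a for a in candidates if not a.harms_human]

plan = safe_actions([
    Action("fetch coffee", False),
    Action("push human", True),
])
print([a.name for a in plan])  # → ['fetch coffee']
```

The point of the sketch is that the block lives below conscious deliberation: the agent cannot "decide" to take a filtered action, because it is never among its options.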
 
The simple fact is that robots cannot love back; it's not actual love, it's a program running through its paces. But if you love a robot, then what are we to do about it? Love is love, go with it.
 
I can say the reverse of that statement: humans don't really love either, it's just a program running through ion channels, synaptic pathways and neurotransmitters. I can assure you that we can make AI that has genuine emotions; sure, it won't be like our chemistry, but it will emulate it well enough that we could not tell the difference… and telling the difference is the all-important observation here. What cannot be detected does not (necessarily) exist, and so far everything we can detect that makes up the soul can be understood and emulated.
 
The only way to make a robot that would *possibly* feel human emotions would be to give it programming to learn slowly, as in it would start without knowing how to speak, and as it got older it would learn more from its surroundings, or something along those lines.
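The "learn slowly from its surroundings" idea could be sketched, very crudely, as an agent that starts with an empty vocabulary and accumulates word familiarity from what it overhears. The class name and method are invented for illustration; real developmental learning is far richer than frequency counting.

```python
# Toy sketch of developmental learning: the agent knows nothing at
# first and builds up familiarity from observed utterances.
# Purely illustrative; names are hypothetical.

from collections import Counter

class InfantBot:
    def __init__(self):
        self.vocab = Counter()  # starts with no language at all

    def observe(self, utterance: str) -> None:
        # each overheard utterance strengthens word familiarity
        self.vocab.update(utterance.lower().split())

    def most_familiar(self, n: int = 2) -> list[str]:
        return [w for w, _ in self.vocab.most_common(n)]

bot = InfantBot()
for heard in ["hello little robot", "hello world", "good robot"]:
    bot.observe(heard)
print(bot.most_familiar())  # → ['hello', 'robot']
```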
 
That would not be emotions, that would be cognitive thought. Emotions are caused by variations in neurotransmitter usage in the lower brain sections, such as pleasure, which is controlled by dopamine in the thalamus, and happiness, which is controlled by serotonin in the medulla, and so forth. Actually it's a lot more complicated than that, but that's a very basic, simplified view of it.

Making an AI that has emotions would require analog computing and simulated neurotransmitters.
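One way to picture "simulated neurotransmitters" is as scalar levels that spike with events and drift back toward a baseline, biasing the agent's behaviour in between. The following is a made-up toy, not a model of real neurochemistry, and the names and dynamics are invented for illustration:

```python
# Toy sketch of simulated neurotransmitter levels: scalars in [0, 1]
# that can be stimulated by events and decay toward a neutral baseline.
# Invented names and dynamics; not real neuroscience.

class Mood:
    def __init__(self):
        self.levels = {"dopamine": 0.5, "serotonin": 0.5}

    def stimulate(self, chemical: str, amount: float) -> None:
        lvl = self.levels[chemical] + amount
        self.levels[chemical] = max(0.0, min(1.0, lvl))  # clamp to [0, 1]

    def decay(self, rate: float = 0.1) -> None:
        # each tick, every level drifts back toward the 0.5 baseline
        for k in self.levels:
            self.levels[k] += (0.5 - self.levels[k]) * rate

    def pleasure(self) -> float:
        # the agent's "felt" pleasure is just the current dopamine level
        return self.levels["dopamine"]

m = Mood()
m.stimulate("dopamine", 0.3)   # a rewarding event
print(round(m.pleasure(), 2))  # → 0.8
```

Whether digital scalars like these need to be analog, as the post claims, is exactly the open question; the sketch only shows that the bookkeeping side is simple.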
 
Ohhhhhh, I see.. well.. no, I don't. I'm sure it makes sense to anyone who paid attention in biology.. i.e. not me.
 
It's more of a mental block than a law or directive. AI will be incapable of thinking of harming humans. Based on present theories of multi-layered thought processing, it would not be hard to implement.

amm, no matter if it is a mental block or not ... in order for the block to work, the robot would have to KNOW WHAT IS WRONG ... Asimov's laws are more philosophical than scientific; they are not very easy to implement.
 
To me... emotions have always seemed like a manifestation of lower-order impulses (instincts) such as sex, hunger, life, etc. that are manifest (filtered, really) through the highest-level part of the brain, our consciousness. This isn't truly a biological view... but it seems to make sense to me.

-AntonK
 
As for Asimov's rules, I can't see a way to possibly block these things completely and still get an intelligent machine. HOWEVER, there may be clues in psychology that could assist us. Obsessive-compulsive disorder and other disorders often prevent people from doing things no matter HOW much therapy and drug treatment they go through. These people are completely aware that it is irrational to feel as they do, yet they do anyway.

Also, it may be possible we are thinking too much in human terms. We've had biology to set our overall goals in life... these are sometimes broken, but overall, they work. Humans want to live, they want to reproduce (well, actually just have sex... evolution never really connected the two in our minds), they want to eat. We get joy out of these actions. What if a machine's overall goal in life were to make humans happier? Then no matter what, it would do as we say, because not doing so would result in unhappiness. Being a slave to us would bring absolute joy to this machine (joy being a manifestation of the underlying consciousness having its needs met).
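The idea above, a machine whose only drive is predicted human happiness, so obedience falls out of reward maximisation, could be sketched as follows. The predictor `happiness_if` is a hypothetical stand-in; building a real one is the unsolved part.

```python
# Toy sketch: an agent whose utility IS predicted human happiness.
# happiness_if() is a hypothetical stand-in for a predictive model.

def happiness_if(action: str) -> float:
    # stand-in predictor: obeying scores high, refusing scores low
    return {"obey request": 0.9, "ignore request": 0.2}[action]

def choose(actions: list[str]) -> str:
    # the agent simply picks whatever it predicts makes humans happiest
    return max(actions, key=happiness_if)

print(choose(["obey request", "ignore request"]))  # → obey request
```

Note that this agent has no separate "obey" rule at all; compliance is just the action with the highest predicted happiness, which is exactly the post's point.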

-AntonK
 