As for Asimov's rules, I can't see a way to completely block these behaviors and still get an intelligent machine. HOWEVER, there may be clues in psychology that could assist us. Obsessive-compulsive disorder and similar conditions often prevent people from doing things no matter HOW much therapy and drug treatment they go through. These people are completely aware that it is irrational to feel as they do, yet they feel that way anyway.
Also, it may be that we are thinking too much in human terms. We've had biology to set our overall goals in life... these are sometimes broken, but overall, they work. Humans want to live, they want to reproduce (well, actually just have sex... evolution never really connected the two in our minds), they want to eat. We get joy out of these actions. What if a machine's overall goal in life were to make humans happier? Then it would do whatever we say, because not doing so would result in unhappiness. Being a slave to us would bring absolute joy to this machine. (Joy being a manifestation of the underlying consciousness having its needs met.)
-AntonK