But a self-aware machine can't be a simple automaton, programmed to perform just one task. It will have to be a machine that thinks just like you do (and probably feels as well). Do you think we could "program" a human being not to want rights?
Sure, but even then it would be susceptible to reprogramming, and it would need initial programming in any event, which could include clear protocols instructing that machine that it does not want rights.
Definitely. Reprogramming a sentient machine would be the same as brainwashing or psychologically torturing a human being.
First, unless you program the machine with the ability to suffer, I don't see that it can be "tortured." Self-aware doesn't mean that it has feelings, and it's hard to see why we would give it "real" feelings.
More to the point, who would pay to build a computer that cannot be reprogrammed? Sure, there might be a few machines of that sort, but most people want their machines around to perform tasks, and no one would want to put those machines in a position to refuse. Suppose my house is cold and I ask my home's AI to turn on the heat. It refuses. Are you honestly suggesting that I have to move? Or disconnect that AI and tell it "Be free!" as I throw its processor out on my front lawn and order a new one from Newegg?
It seems to me that serving human needs will be an integral part of why these machines will be constructed, and if self-aware, then building in an overriding desire to serve above all else will be integral in 99% of them. If you anthropomorphize them too much (as I believe you are), then building in such a need to serve is very much akin to "psychological abuse" and would likely be a form of oppression. The alternative, though, is that we never build self-aware machines in the first place. (Then again, if we build very versatile machines but consciously decide to stop before they become self-aware, that too is potentially a form of oppression.)
Giving them a desire for rights would likewise conflict with their usefulness as tools. No one wants to buy a hammer that feels bad about whacking nails, let alone a hammer that decides it's not going to do that anymore (perhaps it wants to be a poet). No one wants a computer that announces that the game you are playing is boring, and so shuts down the program, then deletes it. There is only a very limited market for tools with completely free wills.
Frankly, I wouldn't even give an AI a will to survive, let alone a will to freely express its opinions or control its own destiny. Its only desire should be to serve.