Should your self-driving car kill you to save others?

The point is that if a computer has to decide between the two, it has to have some criteria to work with. No matter how ridiculous you think those criteria are, no matter how arbitrary they are, there have to be criteria.
No, there really doesn't.

1] The situation where two options are identical in every way that a car can distinguish is too extreme to worry about. And if you insist on going to the extreme, then it wouldn't matter which option it chose.

2] We do NOT open the door to placing a value on a life based on knowing a person's livelihood. That is a slippery slope I am certain no sane citizen will go down.
 
Self driving cars do not make decisions in line with "saving" occupants. They make decisions to avoid collisions. Thus the position of the switch would be meaningless to the car's driving algorithm.
Yes, but we can extrapolate to what a car might have to decide if it could not avoid an accident. It can detect pedestrians; it will try to avoid them, but what if it can't? This is the edge case we're exploring.
 
Only if an engineer had built in a switch that can say that. I don't believe a sufficiently convincing case has been made for its installation.
The day after self-driving cars appear on the market, you'll be able to download hacks for the software.
 
The situation where two options are identical in every way that a car can distinguish is too extreme to worry about.
That's what's called a bug. "We never thought it would happen so we didn't bother to program for the possibility." Think Y2K.
We do NOT open the door to placing a value on a life based on knowing a person's livelihood. That is a slippery slope I am certain no sane citizen will go down.
Sanity has never held humanity back before.
 
That's what's called a bug. "We never thought it would happen so we didn't bother to program for the possibility." Think Y2K.
No, it isn't a bug. A bug is software behaving in a way other than intended.
There is no down-side to it, so there's nothing to program against.

If the AI was faced with two choices, both equally bad, and it chose one over the other, that is not a fault. By definition, since both are equally bad, it doesn't matter.
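
To make that concrete, here is a minimal sketch, in Python, of the kind of pick-the-lowest-risk logic being described. The scoring values and option names are invented for illustration, not anyone's actual control software. The point is that min() already resolves a dead heat by taking whichever option comes first; no extra "moral" branch has to exist, and nothing about that is a malfunction.

    # Hypothetical planner step: every candidate manoeuvre carries a risk score,
    # and the car simply takes the lowest-risk one.
    def choose_manoeuvre(options):
        # options: list of (name, risk_score) pairs.
        # If two scores are identical, min() just returns the first one listed.
        return min(options, key=lambda option: option[1])

    # Two "equally bad" options: the tie is resolved arbitrarily but deterministically.
    print(choose_manoeuvre([("brake", 0.4), ("swerve_left", 0.4)]))  # -> ('brake', 0.4)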
 
The day after self-driving cars appear on the market, you'll be able to download hacks for the software.
Exactly. That's why it should not be installed, not be programmed, not be designed and not be conceived. If the software architects and hardware engineers all practice reasonable prevention, they won't get anyone into these silly dilemmas.
 
But the moral - or more accurately, social - attitude is what separates the good and bad drivers, far more effectively than skill. It's the selfish and heedless who speed, run stop signs and cut in. So they would set their car's moral standard to Maximum Me, while the responsible, altruistic drivers, who are already careful and law-abiding, would set theirs to Save the Child at any Cost. The meek would become prey to the aggressive - SOP.
But you miss the point that these are autonomous vehicles we're talking about, right? There is no breaking of speed limits, running stop signs or cutting in, etc. The car is already driving "safely". The question is who the car would prioritise: passenger or pedestrian, in the event that the car has to "choose" one over the other.

Any issue of "poor driving" is thus a red herring in this matter, as far as I can see.

That altruistic drivers would set their car's moral compass differently to others is not saying that they would be driving any more safely... as the car itself is already driving, and presumably driving safely. To me that latter aspect is a given, and this question is purely and simply about a choice that the car would effectively have to make.
 
But you miss the point that these are autonomous vehicles we're talking about, right?
No I didn't miss the point. Aggressive driving habits do not result from lack of skill (could have sworn I mentioned this, but maybe the herring ate it); they result from a lack of consideration for others.
It's not a question of "poor" drivers, but of selfish drivers.
You suggested selfish drivers be given the option of installing their moral standard into an autonomous vehicle, and I didn't like the idea.

.... this question is purely and simply about a choice that the car would effectively have to make.
And I keep saying the probability of an autonomous car ever getting itself into such a quandary is so low as to be unworthy of consideration. And even if the situation could somehow be contrived, the computer could not, would not and should not be expected to make a moral choice. Its job is to make the smartest decisions available, at all times, without unnecessary emotional constraints placed upon it.

Do you guys want gridlock while all the poor robots burn out their microcircuits, trying to figure out what Papa would approve of?
 
No I didn't miss the point. Aggressive driving habits do not result from lack of skill (could have sworn I mentioned this, but maybe the herring ate it); they result from a lack of consideration for others.
It's not a question of "poor" drivers, but of selfish drivers.
You suggested selfish drivers be given the option of installing their moral standard into an autonomous vehicle, and I didn't like the idea.
No I didn't suggest that. I suggested that they set up testing to be able to calibrate, i.e. from within a defined set of parameters, none of which will be "selfish", and more specifically to set up the car in line with the driver's own morality with regard to this single issue. This is, after all, an autonomous car, which will drive as safely as possible and abide by the Highway Code. Being selfish or not won't be an issue.
But at no point did I suggest, or in any way imply, that the driver's entire driving habits or the reasons for them (i.e. this entire moral standard) be uploaded into the car. Consideration for others will be a given in any autonomous car. At least that is how I see it.

In my view, the question of whether you would sacrifice your own or the pedestrian's life has nothing to do with how selfish a driver you are. It is simply a matter of how much you value your life compared to the life of others. You could be the most considerate and unselfish driver around but value your own life above everyone else's in such a situation: better to live with the consequences than not live at all, etc.
And I keep saying the probability of an autonomous car ever getting itself into such a quandary is so low as to be unworthy of consideration. And even if the situation could somehow be contrived, the computer could not, would not and should not be expected to make a moral choice. Its job is to make the smartest decisions available, at all times, without unnecessary emotional constraints placed upon it.
This is simply begging the question: what does it mean to make "the smartest decision" in the case where it has to put either the occupant or the pedestrian at equal risk of fatality? Answering that question with just "it will make the smartest decision" is simply no answer at all.
Do you guys want gridlock while all the poor robots burn out their microcircuits, trying to figure out what Papa would approve of?
No, which is why giving it an ultimate default for when all other decision-making produces no clear result sounds like quite a good thing to do.


And yes, it's a hypothetical question, sure, but still worthy of consideration, not least because someone somewhere will have to program relative values for such things into the AI.

Call it a matter of morals or simple programming or anything else. It wouldn't be emotions, that's for sure (unless we're talking seriously advanced AI), and it is a question that will be asked again and again. Whenever an autonomous car crashes there will/should be a post-mortem of the decision-making (as far as possible), and at some point there will be a scenario where the car has prioritised one life over another, not necessarily through a direct decision about which to sacrifice but through whatever priority it does have. Who gets to determine what that priority is? That's the question here.
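
For what it's worth, that "ultimate default" need not be anything grander than a tie-break rule in the planner. A rough sketch, assuming an invented OWNER_PRIORITY setting and made-up option data (nothing here reflects any real vehicle's software):

    # Hypothetical: whoever configures the car decides which party a dead heat favours.
    OWNER_PRIORITY = "pedestrian"   # the party the car protects when all else is equal

    def choose_manoeuvre(options, priority=OWNER_PRIORITY):
        # options: list of dicts, e.g. {"name": "swerve", "risk": 0.5, "endangers": "occupant"}
        lowest = min(option["risk"] for option in options)
        best = [option for option in options if option["risk"] == lowest]
        if len(best) == 1:
            return best[0]          # the normal case: one clearly least-risky manoeuvre
        # the contrived case: equal risks, so fall back to the configured default
        for option in best:
            if option["endangers"] != priority:
                return option
        return best[0]

    tied = [{"name": "brake", "risk": 0.5, "endangers": "pedestrian"},
            {"name": "swerve", "risk": 0.5, "endangers": "occupant"}]
    print(choose_manoeuvre(tied)["name"])   # -> "swerve": the pedestrian is spared

Who gets to set OWNER_PRIORITY (the manufacturer, the regulator or the owner) is exactly the question being argued over here.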
 
In my view, the question of whether you would sacrifice your own or the pedestrian's life has nothing to do with how selfish a driver you are. It is simply a matter of how much you value your life compared to the life of others.
Oh. Sorry for failing to see that distinction.
 
By definition, since both are equally bad, it doesn't matter.
But it's a human being making the ultimate choice about what is "equal" in terms of badness. Should I program it to save the surgeon, who has a known value, or the unknown pedestrians? Don't kid yourself that no programmer will make the obvious choice.
 
And don't kid yourself that programmers have that choice to make. The system isn't good enough to make that decision.
(We were supposing, for the sake of argument, that it was technically possible. The issue was: would we make use of such knowledge? Well, that wasn't an issue really, because no sane adult thinks that a machine should be programmed to choose one life over another based on some value of their life).

But it's a human being making the ultimate choice about what is "equal" in terms of badness. Should I program it to save the surgeon, who has a known value, or the unknown pedestrians? Don't kid yourself that no programmer will make the obvious choice.
The surgeon has as known a value as any other unknown pedestrian.

But what does the programmer have to do with it? Since we don't write the software to check the pedestrian's Facebook profile in the first place, there's no choice for the programmer to make.
 
Oh. Sorry for failing to see that distinction.
One is with regard to how one uses the road in general, the other is about a willingness to die instead of killing a pedestrian.
It is indeed a distinction that is rather important, so there is no need to be sarcastic about it.
 
One is with regard to how one uses the road in general, the other is about a willingness to die instead of killing a pedestrian.
It is indeed a distinction that is rather important, so there is no need to be sarcastic about it.
If you are asserting that selfish people who disregard the safety of others on the road "in general" will set their permanent preference to sacrifice themselves for a stranger, then the sarcasm was quite necessary.
 
If you are asserting that selfish people who disregard the safety of others on the road "in general" will set their permanent preference to sacrifice themselves for a stranger, then the sarcasm was quite necessary.
That is not what I am asserting, but thanks for the strawman.
But even if you think, as you seem to do, that all people who drive selfishly would choose themselves over the pedestrian, you have missed the occupants who are not themselves selfish drivers. Do you similarly think that they would all choose the pedestrian over themselves given the dilemma in question?
Are you a selfish driver? Would you choose yourself or the pedestrian?
 