Should your self-driving car kill you to save others?

The autopilot may have a lot of complex information and be required to make a lot of very fast decisions. Its information will always be limited by security clearance, availability of data and the accuracy of its sensors. (The one in Florida couldn't distinguish a hulking great truck from the sky; the truck driver apparently didn't see a low, dark-coloured car - more like the one in the background of http://www.wcpo.com/news/national/tesla-driver-killed-in-crash-while-using-cars-autopilot - approaching at speed in the opposite lane; all three drivers were inattentive, but only two died.)

Assuming that its programming is sophisticated enough to make the kind of informed choices mentioned in this thread, its decision-making still doesn't require an "ethical" component. It could simply assess relative quantities of damage. Maybe according to the odds of saving the lives of victims in the projected accident. Maybe in $ figures. Maybe in time and material to produce replacements for the personnel, mechanics and road furnishings. Maybe in terms of damage to society or disruption to traffic. Maybe according to a table of human valuation by age, sex, occupation, police and health record.
It's a machine: give it technical terms of reference, not sentimental ones.
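
To put that in concrete terms, here's a minimal sketch of the sort of thing I mean - every name, field and weight below is invented purely for illustration, not a claim about how any real autopilot works:

```python
# A minimal, purely hypothetical sketch: a weighted "damage score" with no
# ethical rules at all, just configurable numbers. All names, fields and
# weights below are invented for illustration.

from __future__ import annotations
from dataclasses import dataclass

@dataclass
class PredictedOutcome:
    """What the planner predicts would happen if a given manoeuvre were taken."""
    expected_fatalities: float          # statistical expectation, not a certainty
    expected_injuries: float
    property_damage_usd: float
    traffic_disruption_minutes: float

# The weights ARE the policy: change them and you change the car's "ethics"
# without ever writing an ethical rule.
WEIGHTS = {
    "expected_fatalities": 10_000_000.0,
    "expected_injuries": 250_000.0,
    "property_damage_usd": 1.0,
    "traffic_disruption_minutes": 50.0,
}

def damage_score(outcome: PredictedOutcome) -> float:
    """Lower is better: a purely technical figure of merit."""
    return (
        WEIGHTS["expected_fatalities"] * outcome.expected_fatalities
        + WEIGHTS["expected_injuries"] * outcome.expected_injuries
        + WEIGHTS["property_damage_usd"] * outcome.property_damage_usd
        + WEIGHTS["traffic_disruption_minutes"] * outcome.traffic_disruption_minutes
    )

def choose_manoeuvre(options: dict[str, PredictedOutcome]) -> str:
    """Pick whichever candidate manoeuvre minimises the weighted damage score."""
    return min(options, key=lambda name: damage_score(options[name]))
```

Swap the weight table for the $-figure version, the replacement-cost version or the human-valuation table and the code doesn't change; only the numbers do.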
 
The simplest example would just be a user input: a checkbox for "save occupant(s) at all costs". James Bond's Aston Martin would have it greyed out so that it couldn't be unchecked, though that could be overridden by editing the Registry. For company cars there might be an Advanced button so that system administrators could choose who needed to be saved and who was expendable (in case of adverse publicity).
Your original example was surgeon versus gang-banger.

Regardless, I think you're waxing fanciful - as in: fun to make conjectures about, but not really a serious discussion. I don't think you honestly believe that, in a real-world, near-future scenario, the auto company or the driver would ever or could ever place a value on a life in such a situation.
 
Your original example was surgeon versus gang-banger.
Well, the example is actually "us" versus "them".

I don't think you honestly believe that, in a real-world, near-future scenario, the auto company or the driver would ever or could ever place a value on a life in such a situation.
I do. I think Asimov's Laws of Robotics are utterly unrealistic.

Look at drones. One of the first uses of the technology was to choose whom to kill. We're only a very small step away from eliminating the remote human operator entirely.
 
Well, the example is actually "us" versus "them".
Precisely. So more a movie plot than real life. There's no realistic separation between those two outside of a contrived movie plot.

Look at drones. One of the first uses of the technology was to choose whom to kill.
You mean military drones?

You've got the cart before the horse. Killing is why they were invented. The 'killing' need came first; the technology was developed to meet that need.


Isn't that like saying 'they put 75mm guns on tanks, so they'll be appearing on cars next'?
 
Isn't that like saying 'they put 75mm guns on tanks, so they'll be appearing on cars next'?
Well, kinda. More like 'they put wifi in restaurants, so they'll be putting it in cars next'.

Are you suggesting that they wouldn't put 75mm guns on cars if there was a demand for them? The NRA and the 2nd Amendment would certainly support it.
 
Well, kinda. More like 'they put wifi in restaurants, so they'll be putting it in cars next'.

Are you suggesting that they wouldn't put 75mm guns on cars if there was a demand for them? The NRA and the 2nd Amendment would certainly support it.
I'm suggesting that we're drifting into pure speculation for the sake of something to talk about. The opening topic is a real, if rare, concern in the real world. Flipping switches to customize a car's weighting system for what lives to spare is sci-fi.
 
Flipping switches to customize a car's weighting system for what lives to spare is sci-fi.
On the contrary, it's a very real possibility. We set up our computers to suit our personal preferences. A computerized car could be set up in exactly the same way. There's no question of sci-fi. The only question is, "Should we?" We can't decide whether we should do it if we pretend it can't be done.
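
For illustration only, here's a toy sketch of the kind of setting I mean - an owner preference that a fleet administrator can lock, like the greyed-out checkbox mentioned earlier. Nothing here corresponds to any real vehicle's software; every name is invented:

```python
# Toy illustration only: an owner-facing preference plus an administrator lock
# (the "greyed-out checkbox"). Invented names; no real vehicle exposes this.

from dataclasses import dataclass

@dataclass
class OccupantPolicy:
    save_occupants_at_all_costs: bool = False
    locked_by_admin: bool = False       # the greyed-out state

    def set_preference(self, value: bool, is_admin: bool = False) -> None:
        """Owners may flip the checkbox unless an administrator has locked it."""
        if self.locked_by_admin and not is_admin:
            raise PermissionError("Setting is locked by your fleet administrator.")
        self.save_occupants_at_all_costs = value

# A company car: ticked and locked, short of someone editing the Registry.
company_car = OccupantPolicy(save_occupants_at_all_costs=True, locked_by_admin=True)
```

Whether anyone should ship such a switch is exactly the question; the point is only that there's nothing technically exotic about it.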
 
I'm suggesting that we're drifting into pure speculation for the sake of something to talk about.
Agreed. Trolley problems are interesting thought experiments, but they don't currently have real-world applications when it comes to autonomous vehicles (and won't for a long, long time).
 
Agreed. Trolley problems are interesting thought experiments, but they don't currently have real-world applications when it comes to autonomous vehicles (and won't for a long, long time).
So, this comment dated itself pretty quickly...

They're already rolling onto the streets.
 
So, this comment dated itself pretty quickly... They're already rolling onto the streets.
?? My comment was not "autonomous vehicles won't happen" - heck, I have one. It is that the "trolley problem" thought experiment does not really apply to them, any more than it applies to modern aircraft autopilots or modern autonomous train control systems.
 
?? My comment was not "autonomous vehicles won't happen" - heck, I have one. It is that the "trolley problem" thought experiment does not really apply to them, any more than it applies to modern aircraft autopilots or modern autonomous train control systems.
Ah, mea culpa.

In that case, yes. An interesting philosophical debate, but its practical application is dubious.
 
There's one semi-practical aspect - it might be in the interest of the rest of us to include a "kill the driver to save others" subroutine in these very expensive self-driving cars. So it's there, if we need it.
 
There's one semi-practical aspect - it might be in the interest of the rest of us to include a "kill the driver to save others" subroutine in these very expensive self-driving cars. So it's there, if we need it.
And if "kill the driver to save others" were ever hacked it would be a hack of a way to go

:)
 
There's one semi-practical aspect - it might be in the interest of the rest of us to include a "kill the driver to save others" subroutine in these very expensive self-driving cars. So it's there, if we need it.
What does the cost of the car have to do with it? Is it less important to do in cheaper autonomous vehicles?
 
What does the cost of the car have to do with it? Is it less important to do in cheaper autonomous vehicles?
Hint: the Republican tax bill kills Social Security and Medicare as we know them, and guarantees the transfer of another 15% or so of existing wealth (along with continuing the current sequestering of essentially 100% of new wealth) into the ownership of the very rich.
 
Hint: the Republican tax bill kills Social Security and Medicare as we know it, and guarantees the transfer of another 15% or so of existing wealth (along with continuing the current sequestering of essentially 100% of new wealth) into the ownership of the very rich.
Ah, so killing such people is OK.

Next up - a car that sacrifices the least productive group of people. For the good of the country, of course.
 