Jadak said:
In any case, for the people who claim to refuse to use a system that does not prioritize their personal survival, seriously? As opposed to what? Do you get to make that decision now? Being able to prioritize your own survival doesn't do shit for the dozens of other drivers within killing distance of you at any given moment on a busy road. Get rid of the drivers, everyone (including you) will be safer for it.
I had this discussion with my friend. What happens if the passenger in the car is the President of the United States, and the people on the road are members of the KKK or something who want Obama dead because he's black, and they jump in front of the car to force it to swerve off and kill him, with no liability to themselves?
As others have said, any reasonable automated car would be programmed to obey the road rules and keep its speed at a level where it could react to any reasonably foreseeable issue that arose.
The only times this situation would come up are: 1. an error, in which case the manufacturer is at fault and it doesn't help us prioritise who should live or die, or 2. people intentionally crossing the road right in front of a car.
If I hired you a bodyguard for walking through the streets of mid-riot Egypt or Istanbul, what would you expect him to do? Protect you, yeah?
Would you hire that bodyguard if, seeing two people coming to attack you and not wanting to deal with them both, he decided to shoot you instead to 'minimise the losses' for the greater good?
No. That'd be fucking stupid.
By any reasonable standard, the car should save its occupants. It prevents a LOT of accidental deaths because, as you've noted, humans are less reliable. Now, instead of drivers being the dangerous ones, it's pedestrians: they make a poor choice, and drivers die. That isn't right, and no society would advocate punishing one person for the mistakes of someone else.
I also bring it back to Asimov's laws of robotics: no robot should intentionally harm a human being, or take an action likely to harm one. This means your automated car should not decide to kill you. It should try to avoid all casualties as best it can.
Further discussion between my friend and me also hit on just how stupid it'd be for a car to kill you once we get to the point of reliable automated cars. There are far better options: sense which passengers are in the car; calculate the impact angle to minimise damage to those seats; deploy the airbags almost pre-emptively, right before the collision; at high speed, loosen the seatbelts slightly to prevent crushed ribs; sideways-ram into a wall and use it as additional friction braking.
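To make that concrete, here's a rough sketch of what mitigation logic along those lines could look like. Everything in it is invented for illustration (the function name, the thresholds, the `Collision` fields, the angle table); it's not a real vehicle API, just the shape of the idea: protect whoever is actually in the car, never "choose a victim".

```python
from dataclasses import dataclass

# All names and numbers below are hypothetical, for illustration only.

@dataclass
class Collision:
    speed_kmh: float          # closing speed at predicted impact
    seconds_to_impact: float  # time remaining to act
    wall_alongside: bool      # is there a barrier to scrub speed against?

# Hypothetical map of seat -> impact angle (degrees) that directs crash
# energy away from that seat. A real system would derive this from crash data.
SAFEST_IMPACT_ANGLE = {
    "driver": 30.0,
    "front_passenger": -30.0,
    "rear_left": 20.0,
    "rear_right": -20.0,
}

def plan_mitigation(occupied_seats: list[str], crash: Collision) -> list[str]:
    """Return an ordered list of mitigation actions for an unavoidable crash.

    The car never picks someone to kill; it only bleeds off energy and
    steers the damage away from the seats that are actually occupied.
    """
    actions = []

    # 1. Orient the impact to protect the occupied seats (here, crudely,
    #    by averaging their preferred angles).
    angles = [SAFEST_IMPACT_ANGLE[s] for s in occupied_seats]
    actions.append(f"steer to impact angle {sum(angles) / len(angles):.0f} deg")

    # 2. Scrub speed against a barrier if one is available.
    if crash.wall_alongside:
        actions.append("sideways contact with wall for friction braking")

    # 3. At high speed, slacken belts slightly to reduce rib loading.
    if crash.speed_kmh > 80:
        actions.append("loosen seatbelts slightly")

    # 4. Fire airbags just before impact rather than on contact.
    if crash.seconds_to_impact < 0.5:
        actions.append("pre-emptive airbag deployment")

    return actions

if __name__ == "__main__":
    crash = Collision(speed_kmh=95, seconds_to_impact=0.4, wall_alongside=True)
    for action in plan_mitigation(["driver", "rear_left"], crash):
        print(action)
```

Obviously a toy, but it shows the point: every branch is about reducing harm to everyone, and nowhere does the logic need a "kill the occupant" case.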
The idea that it would have to kill someone is ludicrous, but in any case, it should not kill its driver.
It's also a catch-22. As you mentioned, get the drivers off the road and accidents go down, but no driver is going to buy a car that will literally decide to kill them. Even society at large would find that notion ludicrous and would simply not adopt automated cars, not because they wouldn't be programmed for the greater good, but because they would be programmed to kill anybody at all. Whether you say it'll kill the driver or kill the pedestrians, the public is going to utterly hate it.
So how do you save the most lives? Have the car try to avoid collision with the pedestrians but prioritise the driver, and don't make a big deal of it: just sell it as prioritising driver safety, without mentioning the situations where it'd end up having to kill pedestrians to do so. Drivers will buy them, and thus the number of accidents killing anyone on the road will go down.
Overall, the greater good lies in systems designed to protect the occupants of the car from any collision, not in playing morality police and forcing onto people a set of morals that not everyone shares [the many are ALWAYS more important than the few, no matter what; you should be killed to save Hitler, Stalin, and Jack the Ripper], and that isn't even shared in law, all for a false assumption of 'the greater good' that simply does not exist here. Better yet, let the user prioritise the seats in the car for who to protect most, if occupied, in case of a collision, including 'external' parties; see the sketch below.
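As a toy illustration of that last idea (a user-set protection order, with people outside the car as one more entry in it), something like the following; again, every name and value here is made up, not any manufacturer's actual setting:

```python
from __future__ import annotations

# Hypothetical user-configurable protection priority. Lower number = protect
# first. "external" stands in for pedestrians and other road users.
DEFAULT_PRIORITY = {"driver": 0, "front_passenger": 1,
                    "rear_left": 2, "rear_right": 2, "external": 3}

def protection_order(occupied_seats: set[str],
                     user_priority: dict[str, int] | None = None) -> list[str]:
    """Rank the parties to protect, honouring the owner's settings.

    Empty seats are ignored; 'external' (people outside the car) is
    always ranked, since other road users are always potentially present.
    """
    priority = user_priority or DEFAULT_PRIORITY
    parties = {s for s in occupied_seats if s in priority} | {"external"}
    return sorted(parties, key=lambda p: priority[p])

# An owner who wants the rear seats (say, their kids) protected first:
custom = {"rear_left": 0, "rear_right": 0, "driver": 1,
          "front_passenger": 1, "external": 2}
print(protection_order({"driver", "rear_left"}, custom))
# -> ['rear_left', 'driver', 'external']
```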
Honestly, there isn't a moral dilemma here. There is a situation with no details, one that should not arise unless the so-called 'victim' pedestrians intentionally do something stupid, which already causes many accidents today. All this would do is shift the deaths from those accidents off the idiots who cause them and onto the innocents who just happened to be in the area at the time. THAT is something any society should see as wrong.