The Moral Dilemma of Self-Driving Cars - Save the Driver Or Others


FoolKiller

New member
Feb 8, 2008
2,409
0
0
Floppertje said:
I agree with the general sentiment here; a car should not prioritize others over its driver (or owner, since the driver would be the car itself).
I think that the more important point is that self-driving cars will not be as dangerous as people think. Yeah, at some point a computer (or rather, a programmer) might decide over life or death but it'll at least be able to make that decision (people don't really think that fast or clearly in that situation) and self-driving cars won't BE in that situation as often as human drivers are.
Robot cars don't have to be perfect. They just have to be better than human drivers, and that isn't really that hard to pull off...
I disagree. Any robot car driving for me has to be above perfect. I follow all the rules of the road (that matter), and on a daily basis I have to outdo that to avoid a collision, whether it's someone not yielding right of way, cutting me off, or turning when they don't have time. If all I did was drive perfectly according to the rules of the road I would most likely have an accident every day I drive more than 15 minutes. And that's most of them.

The only way for this to work correctly would be for all the vehicles on the roads to be automated and the software would be identical so they would not conflict with each other's logic. Until that day, I will drive myself because IF I die while driving, I want to have as much control as possible.
 

FirstNameLastName

Premium Fraud
Nov 6, 2014
1,080
0
0
How about you just give the driver the option of whether to prioritise their life, the life of another, or the maximum number of lives? I doubt very much that anyone will want to drive a car that's designed to kill them if some idiot walks out onto the road. As selfish as that might sound, people would rather have their own life prioritised. And if anyone thinks it would be immoral to give people the choice to save themselves rather than an entire group, I will remind you that people already have that choice with manual driving.
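That option could literally be a setting the owner picks once. A minimal sketch in Python of what such a policy switch might look like (all names and risk numbers here are invented for illustration, not from any real system):

```python
from enum import Enum


class CrashPriority(Enum):
    """Hypothetical owner-selected policy for unavoidable collisions."""
    OCCUPANTS = "occupants"        # protect the people inside the car
    PEDESTRIANS = "pedestrians"    # protect the people outside the car
    MINIMIZE_TOTAL = "minimize"    # minimize total expected casualties


def choose_action(policy, actions):
    """Pick the index of the candidate maneuver that best fits the policy.

    `actions` is a list of (occupant_risk, pedestrian_risk) tuples, one per
    candidate maneuver, where each risk is a rough probability of death.
    """
    if policy is CrashPriority.OCCUPANTS:
        key = lambda a: (a[0], a[1])       # occupant risk first, then others
    elif policy is CrashPriority.PEDESTRIANS:
        key = lambda a: (a[1], a[0])       # pedestrian risk first
    else:
        key = lambda a: a[0] + a[1]        # total expected deaths
    return min(range(len(actions)), key=lambda i: key(actions[i]))
```

For example, with two maneuvers `[(0.9, 0.0), (0.1, 0.4)]` (swerve into the wall vs. brake hard through the crossing), an occupant-first policy picks the second and a pedestrian-first policy picks the first, which is exactly the choice this whole thread is arguing about.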

Also ...
inu-kun said:
I've heard about this dilemma from a coworker and it makes absolutely no sense. The entire idea of a self-driving car is that it will behave as a responsible driver, meaning driving at a speed that will allow it to stop in any conceivable case, so nothing short of people teleporting in front of the car while it's on a highway on a rainy day will actually necessitate this kind of dilemma. In real life the worst case is a swerve that will scratch the paint job.
.. this. Why should the driver be killed for any number of pedestrians? Assuming the car is driving on the road and not the footpath, then the car should have right of way. Just program them to stop at pedestrian crossings and try to avoid hitting rogue pedestrians when it's safe to do so. I get that there will be scenarios where this sort of thing comes up, but outside of some error in the programming that's caused the car to be somewhere it shouldn't, the pedestrian is the one who shouldn't even be there. If people are too stupid to cross at the pedestrian crossings then why should their life be prioritised over the life of some innocent person in the car?
 

Jadak

New member
Nov 4, 2008
2,136
0
0
Meh, prioritize the saving of the largest number of people. Not really a dilemma, just something that will unnerve people regardless of which decision you make.

As plenty of people have mentioned, many people claim they wouldn't be comfortable in, and would not buy, a product knowing that it would not always prioritize their own life. On the other hand, there's no way in hell the public would keep accepting vehicles on the road knowing that the machines would choose to slaughter a crowd rather than risk their driver.

Thing is, people already die in accidents. A lot of them, both bystanders and drivers, driver's fault and not. Point is, fuck the dilemma. Automated traffic has the potential for such vast superiority in terms of safety overall that even should the car make the worst possible choice in these sorts of situations, it would still be fine.

In any case, for the people who claim to refuse to use a system that does not prioritize their personal survival, seriously? As opposed to what? Do you get to make that decision now? Being able to prioritize your own survival doesn't do shit for the dozens of other drivers within killing distance of you at any given moment on a busy road. Get rid of the drivers, everyone (including you) will be safer for it.
 

KoudelkaMorgan

New member
Jul 31, 2009
1,365
0
0
Always prioritize the customer's life if you want people to adopt your cars. Pedestrians can generally control where and when they have ANY potential traffic coming their way, barring egregious driver error or jaywalking. Also, cars today are designed to minimize injury to pedestrians, a side effect of lighter frames for fuel efficiency. Last time I checked, that hypothetical wall I'd be plowed into does not have anyone's safety in mind.

Also, fuck motorcycles. They don't belong on the freeway any more than bikes or pedestrians do. A long desert highway, sure, whatever. Weaving in and out of 6 lanes of rush hour traffic without a helmet because MERICA FUCK YEAH, nah, you can fuck right off.
 

dreng3

Elite Member
Aug 23, 2011
771
410
68
Country
Denmark
If this were a case of car-to-car collisions I would agree that the car should inherently try to protect the driver, but as we've already hashed out, the cars will communicate and so the risk of such collisions is negligible.
The great issue here is car-to-pedestrian versus car-to-environment collisions, and as I see it the car should almost always choose to endanger the driver. Now hear this out before you jump down my throat about who's responsible: in a collision between a car and a pedestrian, the pedestrian is 1.5 times more likely to sustain damage than the occupants of the vehicle, if I understand this ( http://www.cdc.gov/motorvehiclesafety/pedestrian_safety/ ) page correctly; furthermore, the driver will have a variety of protective measures in place, presumably not just limited to airbags.
This will, if I haven't gone completely unreasonable overnight, mean that the driver and passengers are actually more likely to survive than the hypothetical pedestrian, and even if the car collides with the environment the driver and passengers would most likely only sustain minor injuries, and only in the very worst cases fatal or debilitating ones.
 

11zxcvb11

New member
Apr 13, 2012
15
0
0
lacktheknack said:
Space Jawa said:
I'll go one further and say I have no intention of ever buying a self-driving car at all.
Nailed it.

If self-driven cars become standard, I'm swapping permanently to buses and city trains, the end.
what will you do when buses and trains become self-driving and planes become self-flying? i guess you could bike or walk if things are within reasonable distances. maybe buy a segway if you trust the electronics :p

i think the easiest way to 'force' autonomous vehicles on the population is for insurance companies to raise premiums for human drivers (this would make sense from their perspective, since humans are more prone to accidents than even a half-decent robotic car). sure, some people would be stubborn, but most would just choose the self-driving cars (the vast majority of people don't stick to principles when it actually hurts their wallets). and of course, all the really stubborn people would just die out and the new generations who grew up in a world of (mostly) driverless cars will have no trouble adopting it.

i am curious though, all the "i will never buy an autonomous car" people here, would you still stick to driving if it meant an extra $50 a month for insurance? what if it was $100?
 

Joccaren

Elite Member
Mar 29, 2011
2,601
3
43
Jadak said:
In any case, for the people who claim to refuse to use a system that does not prioritize their personal survival, seriously? As opposed to what? Do you get to make that decision now? Being able to prioritize your own survival doesn't do shit for the dozens of other drivers within killing distance of you at any given moment on a busy road. Get rid of the drivers, everyone (including you) will be safer for it.
I had this discussion with my friend. What happens if the passenger in the car is the president of the United States, and the people on the road are members of the KKK or something who want Obama dead because he's black, and so jump in front of the car to force it to swerve off and kill him, with no liability to themselves?

As others have said, any reasonable automated car would be programmed to obey the road rules, and to keep its speed at a level at which it could react to any reasonably foreseeable issue.
The only time this situation would come up would be 1. an error, in which case the manufacturer is at fault, and it doesn't help us prioritise who should live or die, and 2. when people intentionally cross the road right in front of a car.

If I hired you a bodyguard for walking through the streets of mid-riot Egypt, or Istanbul, what would you expect him to do? Protect you, yeah?
Would you hire a bodyguard who, seeing that there were two people coming to attack you and having to deal with them both, decided to shoot you instead to 'minimise the losses' for the greater good?
No. That'd be fucking stupid.

By any reasonable standard, the car should save its occupants. It prevents a LOT of accidental deaths, because as you've noted, humans are less reliable. Now, instead of drivers being the dangerous ones, though, it's pedestrians. They make a poor choice, and drivers die. That isn't right, and no society would advocate punishing one person for the mistakes of someone else.

I also bring it back to the laws of robotics. No robot should intentionally harm a human being, or take an action likely to harm one. This means your automated car should not decide to kill you. It should try to avoid all casualties as best it can.

Further discussion by my friend and I also hit on just how stupid it'd be for a car to kill you, when we get to the point of reliable automated cars. There are far better options. Sense which passengers are in the car. Calculate impact angle to minimise damage to those seats. Deploy airbags almost pre-emptively, right before collision. If at high speed, loosen seatbelts slightly to prevent crushing of ribs. Sideways ram into the wall and use it as additional friction braking.
The idea that it would have to kill someone is ludicrous, but in any case, it should not kill its driver.
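That list of mitigations is essentially a fixed priority sequence, and you could sketch it in a few lines. Rough illustration in Python (every function name, threshold and step label here is invented for the sake of the example, not taken from any real vehicle):

```python
def mitigate_collision(speed_kmh, occupied_sides, impact_side):
    """Return an ordered list of mitigation steps before an unavoidable impact.

    `occupied_sides` is the set of sides of the car with passengers in them
    (e.g. {"left"}); `impact_side` is where the hit is about to land.
    Purely illustrative: a real system would be vastly more complex.
    """
    steps = ["full braking"]                            # always shed speed first
    if impact_side in occupied_sides:
        steps.append("steer impact toward empty side")  # rotate the impact angle
    steps.append("pre-deploy airbags")                  # fire just before contact
    if speed_kmh > 60:
        steps.append("slacken seatbelts slightly")      # reduce rib loading at speed
    steps.append("scrub speed against barrier")         # use the wall as extra braking
    return steps
```

The point of writing it this way is that every step reduces harm to everyone involved; "kill the occupants" never needs to appear anywhere in the sequence.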

It's also a catch-22. As you mentioned, get the drivers off the road, and accidents go down. But no driver is going to buy a car that will literally decide to kill them. Even society at large would find that notion ludicrous, and would simply not adopt automated cars; not because they wouldn't be programmed for the greater good, but because they would be programmed to kill anybody at all. Whether you say it'll kill the driver, or kill the pedestrians, the public is going to utterly hate it.
So how do you save the most lives? Have it try to avoid collision with the pedestrians, but prioritise the driver, and don't make a big deal of it - just sell it as prioritising driver safety, without mentioning any situations where it'd end up having to kill pedestrians in order to do so. Drivers will acquire them, and thus the number of accidents killing anyone on the road will go down.

Overall, the greater good is in having systems designed to protect the occupants of the car from any collisions, not in trying to play moral police and force a set of morals that not everyone shares [the many are ALWAYS more important than the few, no matter what; you should be killed to save Hitler, Stalin and Jack the Ripper], and that isn't even legally shared, onto people for a false assumption of 'the greater good', which simply does not exist here. Better yet, let the user prioritise seating in the car for who to protect most, if occupied, in case of a collision, including 'external sources'.

Honestly, there isn't a moral dilemma here. There is a situation that has no details, and that should not arise without the so-called 'victim' pedestrians intentionally doing something stupid - which causes many accidents today. All it would do is shift the deaths from those accidents from the idiots who cause them to the innocents who just happened to be in the area at the time. THAT is something any society should see as wrong.
 

Floppertje

New member
Nov 9, 2009
1,056
0
0
FoolKiller said:
Floppertje said:
I disagree. Any robot car driving for me has to be above perfect. I follow all the rules of the road (that matter), and on a daily basis I have to outdo that to avoid a collision, whether it's someone not yielding right of way, cutting me off, or turning when they don't have time. If all I did was drive perfectly according to the rules of the road I would most likely have an accident every day I drive more than 15 minutes. And that's most of them.

The only way for this to work correctly would be for all the vehicles on the roads to be automated and the software would be identical so they would not conflict with each other's logic. Until that day, I will drive myself because IF I die while driving, I want to have as much control as possible.
Aside from the fact that there is no such thing as 'above perfect' by definition, you don't think they'll consider other people not following traffic rules while programming the AI for these things?
People get tired and angry, have crappy reaction times and muscles that cramp up, or get those little stinging bits of dust or hair in their eye - you get the point. When others have self-driving cars too (and I think it'll happen faster than you expect, because the first ones are already on the street), there will be fewer people driving shittily. Better for everyone.
 

chikusho

New member
Jun 14, 2011
873
0
0
An incredibly unlikely scenario. A self-driving car would follow traffic laws and keep to the speed limit.
These laws and limits are designed for human reaction times and the amount of distance and space needed to swerve or slow to a stop in the case of sudden obstructions.
A self-driving car would 1. have greater spatial awareness, and thus be able to spot potential obstructions (human or otherwise) more quickly, more readily, and probably at greater distance than any human driver, and 2. have a much quicker reaction time, thus being able to stop completely or calculate a safe swerving route without needing to risk either driver or potential pedestrian.
Any situation where a self-driving car would not be able to react safely and properly is one where a human would, at most, _just react_, and at worst, not register the obstruction until after it's been hit. In either scenario, the self-driving car is the safer option.
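The reaction-time point is easy to put numbers on: total stopping distance is reaction distance plus braking distance, d = v*t_r + v^2/(2a). A quick sketch in Python (the ~7 m/s^2 deceleration and the two reaction times are typical ballpark figures, not measurements from any particular car):

```python
def stopping_distance_m(speed_kmh, reaction_s, decel=7.0):
    """Total stopping distance in metres: reaction distance plus braking distance.

    decel ~7 m/s^2 is a rough dry-road figure; reaction_s is the delay
    before braking starts (~1.5 s for a human, ~0.1 s for a computer).
    """
    v = speed_kmh / 3.6                      # convert km/h to m/s
    return v * reaction_s + v ** 2 / (2 * decel)


# At 50 km/h the braking distance (~13.8 m) is identical for both,
# but the human covers ~20.8 m before even touching the brake:
human = stopping_distance_m(50, 1.5)     # ~34.6 m total
computer = stopping_distance_m(50, 0.1)  # ~15.2 m total
```

Under these assumptions the automated car stops in less than half the distance, which is exactly why the dilemma scenarios that assume "no way to stop in time" apply far more often to human drivers.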
 

Joccaren

Elite Member
Mar 29, 2011
2,601
3
43
shinyelf said:
If this were a case of car-to-car collisions I would agree that the car should inherently try to protect the driver, but as we've already hashed out, the cars will communicate and so the risk of such collisions is negligible.
The great issue here is car-to-pedestrian versus car-to-environment collisions, and as I see it the car should almost always choose to endanger the driver. Now hear this out before you jump down my throat about who's responsible: in a collision between a car and a pedestrian, the pedestrian is 1.5 times more likely to sustain damage than the occupants of the vehicle, if I understand this ( http://www.cdc.gov/motorvehiclesafety/pedestrian_safety/ ) page correctly; furthermore, the driver will have a variety of protective measures in place, presumably not just limited to airbags.
This will, if I haven't gone completely unreasonable overnight, mean that the driver and passengers are actually more likely to survive than the hypothetical pedestrian, and even if the car collides with the environment the driver and passengers would most likely only sustain minor injuries, and only in the very worst cases fatal or debilitating ones.
And again, here is the optimal answer: try to save everyone. Realistically a situation shouldn't arise where the car is unable to do anything to save everyone, or at least minimise the odds of death.

The whole situation this is based on, though, is one where any reasonable logic and behaviour does not apply. For some reason, the car HAS to kill one person or the other. It's driving down the road at 100 km/h, and a pedestrian jumps out right in front of it, and it either has to hit the wall and 100% kill the driver, or hit and 100% kill the pedestrian.
It's a bit of a stupid situation. Optimally, it should do everything it can to reduce the damage, but overall its priority should still be on its occupants. If it has to maim or kill an occupant to save a pedestrian, it should favour the occupants, and do everything within its power to save the pedestrian without endangering the occupants unreasonably. Give them some bruises and shit? Yeah, sure. Destroy the car itself? Yeppers. Kill or maim the occupants? Nope. Destroying the life of one person because of the actions of another is wrong.

The car should act as much as possible to save the pedestrian, but it should never choose to kill or seriously injure anyone. That should only happen as an unavoidable side effect in situations beyond its control - and the insides of the car are almost always within its control.
 

Dimitriov

The end is nigh.
May 24, 2010
1,215
0
0
Stupidity said:
1.2 million people die every year in automobile accidents. 1.2 MILLION.
This moral dilemma represents the tiniest fraction of lives that would be saved by driverless cars.

If it takes giving driverless cars a license to kill to get them on the road, then we should do it.
Counterpoint: there are more than 7 billion people on the planet. Which I would say is at least 6 billion too many. We should in no way be spending so much time trying to save lives that aren't currently in imminent danger.
 

Flames66

New member
Aug 22, 2009
2,311
0
0
Kross said:
There's no way I'm ever (voluntarily) driving in a machine that prioritizes killing me over anyone else.

There's also no way I'd ever want to use an automated vehicle unless every other vehicle on the road is also automated, as that swarm coordination is where the true benefit resides.
fix-the-spade said:
If they're all automated, they all have to communicate with each other, their operator, and some kind of central control. Under that kind of load even a nearly perfect system would fall on its face constantly, and I have yet to see a human being build a perfect system. I suppose it would keep you in a job.
If the car can't drive itself without being connected to a network, it can't drive itself. I will never use a vehicle that is part of a large network and could be so easily tracked. All hardware and software required must be within the vehicle with no external connections.
 

Flames66

New member
Aug 22, 2009
2,311
0
0
Dimitriov said:
Counterpoint: there are more than 7 billion people on the planet. Which I would say is at least 6 billion too many. We should in no way be spending so much time trying to save lives that aren't currently in imminent danger.
I agree. There are far too many people on the planet already.
 

fix-the-spade

New member
Feb 25, 2008
8,639
0
0
Flames66 said:
If the car can't drive itself without being connected to a network, it can't drive itself. I will never use a vehicle that is part of a large network and could be so easily tracked. All hardware and software required must be within the vehicle with no external connections.
That's impossible; there's no way a vehicle could track its own location without connecting to GPS at the bare minimum. You also need to connect to some kind of local traffic control, since on-board sensors need line of sight and that isn't possible at all junctions, especially in Europe, the continent of wiggly roads.
 

Flames66

New member
Aug 22, 2009
2,311
0
0
fix-the-spade said:
Flames66 said:
If the car can't drive itself without being connected to a network, it can't drive itself. I will never use a vehicle that is part of a large network and could be so easily tracked. All hardware and software required must be within the vehicle with no external connections.
That's impossible; there's no way a vehicle could track its own location without connecting to GPS at the bare minimum. You also need to connect to some kind of local traffic control, since on-board sensors need line of sight and that isn't possible at all junctions, especially in Europe, the continent of wiggly roads.
Then I won't be getting one or willingly riding in one.

There are certain exceptions I would allow (after doing a lot of research into how they work).
GPS, as long as it is impossible to track. The only external connection involved would be finding the vehicle's current location; all map data and route planning would have to be in the car.
A traffic light giving out a local-area signal stating stop or go. No feedback involved, just a simple broadcast.
Any more interconnectivity than that is too much.
 

Loonyyy

New member
Jul 10, 2009
1,292
0
0
That is a factor, but it's more of a factor with pedestrians and people not using self-driving vehicles.

One of the big advantages of self-driving vehicles is reducing collisions between vehicles, so realistically we're looking at a reduction in injury from those collisions, and in pedestrian-heavy zones speed limits are already reduced to prevent injury to pedestrians, and at those speeds you're looking at a much lower risk of the passengers being injured. So far, it looks like self-driving cars are less prone to error than human-driven ones, and the collisions they've been involved in have not been the result of error on the part of the car, but of humans driving poorly or disobeying the road rules; they work, and they tend to have fewer accidents than people. In the pictures in question, am I supposed to take it that an entire school is crossing the road? Because where I live, that means we're looking at stopping from 40 kph, and that's not exactly a challenge.

It's not so much a dilemma of self-driving cars as of being in a car at all, and it's interesting, in terms of the prisoner's dilemma, how people react to them. People would rather other people were in them; people would rather someone else made the change instead of them. We'd rather the car crashed itself, unless we were in it.

Self-driving cars are meant to a) drive as accurately as possible, to the limits and conditions, and b) drive as safely as possible. Under what scenario are we expecting one to approach a group of pedestrians at speed, and why would this not be more of a problem for a human driver? We're either asking why the car is going 80 in a pedestrian-heavy environment (which our self-driving car would not do, unlike many human drivers), or why a crowd of pedestrians decided to cross the highway, and why the car was unable to see them until a collision was inevitable. This is a very specific problem.

Eventually, it's likely that people won't be driving at all, and we'll all be in self-driving transport, and we'll be the safer for it.
 

Lightknight

Mugwamp Supreme
Nov 26, 2008
4,860
0
0
pacouranga said:
Where's option D where it just plows through the crowd while laying on the horn?
We already have that in current car models. It involves texting while driving. Drinking while driving was the original form of it, but the drunk driver at least had some time to slow down and swerve. The new model takes reaction time entirely out of the equation, and the texter doesn't know what they've done until a few seconds after the bodies roll over and under their vehicle.

What I'm saying is fuck human drivers that text while doing so.