The Moral Dilemma of Self-Driving Cars - Save the Driver Or Others


Thaluikhain

Elite Member
Legacy
Jan 16, 2010
19,538
4,128
118
As long as it's not enforced anywhere, people can buy the competition's cars instead.

One wonders, though, assuming it's enforced, whether they could stick other factors into the equation, such as the demographics of the people involved. Would make for a nice dystopia.
 

Piorn

New member
Dec 26, 2007
1,097
0
0
Simple, don't dodge at all, just make them brake as hard as they can.
The sensors reach far enough to predict that kind of thing, and if the pedestrian just suicidally leaps on the street, then tough luck.
 

fix-the-spade

New member
Feb 25, 2008
8,639
0
0
Flames66 said:
GPS, as long as it is impossible to track.
If it's on GPS, it can be tracked. The recent killing of three British Islamic State militants was done by tracking the GPS signals of their mobile phones and bombing them.
 

lacktheknack

Je suis joined jewels.
Jan 19, 2009
19,316
0
0
11zxcvb11 said:
lacktheknack said:
Space Jawa said:
I'll go one further and say I have no intention of ever buying a self-driving car at all.
Nailed it.

If self-driven cars become standard, I'm swapping permanently to buses and city trains, the end.
what will you do when buses and trains become self-driving and planes become self-flying? i guess you could bike or walk if things are within reasonable distances. maybe buy a Segway if you trust the electronics :p
The trains and planes here are already self-piloting. I don't have an issue with that, because planes have no real traffic to deal with and fewer "moral dilemmas" to consider, while trains have exactly one job: To stop and go. As for buses, I could live with that too, because if a bus is in a car accident, the bus wins. I've been in one, I didn't even notice we crashed.
 

Mortuorum

New member
Oct 20, 2010
381
0
0
lancar said:
I would NEVER buy a vehicle that would willingly sacrifice me to save others without even asking me first. And since there's no time for an autonomous car to ask me for my opinion in the matter once an accident is imminent, I'll have to go with the computer that thinks I'm more important than whoever decided to step out in front of me.
Exactly. How about we program the self-driving vehicles to obey the rules of the road? That removes almost all of the ethical considerations. If a pedestrian steps out in front of the car, it makes a best effort to brake, but ultimately that pedestrian (or their parents, if a child, or its owner, if it's a pet) needs to take responsibility for their actions. Same with a motorcycle or bicycle that swerves in front of you. Sucks to be the cyclist, but they should have been paying more attention to what they were doing.

That does raise a good point, though. In order for driverless vehicles to achieve acceptance, the owner and passengers of the vehicle will need to be indemnified from liability in the event of an accident. I certainly wouldn't want to own a driverless vehicle if I could go to jail or be sued if it injures or kills someone.
 

Aeshi

New member
Dec 22, 2009
2,640
0
0
11zxcvb11 said:
i am curious though, all the "i will never buy an autonomous car" people here, would you still stick to driving if it meant an extra $50 a month for insurance? what if it was $100?
Yes, I'd like to think not placing my life in the hands of a machine that thinks I'm worth less than the 2 "pranksters" who just jumped out into the middle of the road is worth $50-100 a month.
 

11zxcvb11

New member
Apr 13, 2012
15
0
0
Aeshi said:
11zxcvb11 said:
i am curious though, all the "i will never buy an autonomous car" people here, would you still stick to driving if it meant an extra $50 a month for insurance? what if it was $100?
Yes, I'd like to think not placing my life in the hands of a machine that thinks I'm worth less than the 2 "pranksters" who just jumped out into the middle of the road is worth $50-100 a month.
so you'd pay $1,000/year to avoid a situation that has an extremely, extremely low chance of happening? i bet you play the lottery, too.
 

Dimitriov

The end is nigh.
May 24, 2010
1,215
0
0
Bitter_Angel said:
Dimitriov said:
Stupidity said:
1.2 million people die every year in automobile accidents. 1.2 MILLION.
This moral dilemma represents the tiniest fraction of lives that would be saved by driverless cars.

If it takes giving driverless cars a license to kill to get them on the road, then we should do it.
Counterpoint: there are more than 7 billion people on the planet. Which I would say is at least 6 billion too many. We should in no way be spending so much time trying to save lives that aren't currently in imminent danger.
If you want to commit mass murder, have the stones to advocate for it actively, not through passive inaction.
If you can't understand the difference between actual people and statistical probability then there is really no point in talking to you. There is no moral imperative to try to save theoretical victims who might exist in the future.

Murder it is not, and trying to frame it that way is a lazy and dishonest strawman argument.
 

Aeshi

New member
Dec 22, 2009
2,640
0
0
11zxcvb11 said:
so you'd pay 1000$/year to avoid a situation that has extremely, extremely low chance of happening? i bet you play the lottery, too.
No, if that were the case I'd probably just join the ranks of the uninsured. Because hey, if the roads really are that safe I won't need it anyway, and if they're not that safe then at least I haven't been killed off "for the greater good."

And as for "extremely low chance of happening", I'd like to borrow the time-travel machine you must clearly have so that I too can travel into the self-driving future to see the precise chances of that happening.
 

1981

New member
May 28, 2015
217
0
0
Baffle said:
If the car is self-driving, do I really need to be in it? Can't I just send it to B&Q, where my (robot) personal shopper will load it with the items I've purchased online?
Right. Everyone talks about the driver, but what does a driver do in a self-driving car? Do we assume there's always an override? A mechanical switch would make it impossible for the AI to go rogue. Anyway, I understood this to be about smart cars that aren't yet fully autonomous but can take over in a tough spot or help the driver.

As several posters have suggested, pedestrians and vehicles rarely appear out of nowhere. A sophisticated computer should be able to spot erratic or otherwise suspicious behavior and act accordingly.
 

Lightknight

Mugwamp Supreme
Nov 26, 2008
4,860
0
0
Antigonius said:
I have a question
Why can't a car just...stop? I mean stopping in front of an obstacle - that shouldn't be hard.
Imagine this scenario:

Your car is driving through a tunnel at a legal but high speed. Right before the car exits the tunnel, a person just out of view trips into the line of traffic.

This is new information and is provided too late to properly decelerate the vehicle. Now the car has to decide: 1. Hit the person. 2. Hit oncoming traffic. 3. Hit the wall.

1. Would almost certainly cause the most damage to at least one individual. 2. Includes more momentum and another driver. 3. Includes a lot more mass but no momentum and no other individual.

The idea is that 3 would be the least likely to cause bodily harm to anyone. This doesn't have to involve a tunnel either. We're also talking about someone stepping out from behind a car, bush or building. We're talking about human drivers that do something completely erratic that prevents the car from safely stopping in time.

There are situations you can get into where there is no flawless solution, only a least-harm scenario.
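The tunnel scenario above boils down to scoring each option by expected harm and picking the minimum. A toy sketch of that idea, where the option names and harm weights are entirely made-up illustrations and not anything a real vehicle controller uses:

```python
# Toy "least harm" chooser for the tunnel scenario.
# Option names and harm weights are hypothetical illustrations only.

def least_harm(options):
    """Return the option with the lowest expected-harm score."""
    return min(options, key=lambda o: o["expected_harm"])

options = [
    # 1. Hit the pedestrian: high harm to one unprotected person.
    {"name": "hit_pedestrian", "expected_harm": 0.9},
    # 2. Swerve into oncoming traffic: combined momentum, another driver.
    {"name": "hit_oncoming", "expected_harm": 0.7},
    # 3. Hit the wall: no other party, crumple zones absorb the impact.
    {"name": "hit_wall", "expected_harm": 0.3},
]

print(least_harm(options)["name"])  # -> hit_wall
```

The entire ethical debate in this thread is about who gets to assign those weights, not about the minimisation step, which is trivial.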
 

FoolKiller

New member
Feb 8, 2008
2,409
0
0
Floppertje said:
FoolKiller said:
Floppertje said:
I disagree. Any robot car driving for me has to be above perfect. I follow all the rules of the road (that matter), and on a daily basis I have to outdo that to avoid a collision, whether it's someone not yielding right of way, cutting me off, or turning when they don't have time. If all I did was drive perfectly according to the rules of the road, I would most likely have an accident every day I drive more than 15 minutes. And that's most of them.

The only way for this to work correctly would be for all the vehicles on the roads to be automated and the software would be identical so they would not conflict with each other's logic. Until that day, I will drive myself because IF I die while driving, I want to have as much control as possible.
Aside from the fact that there is no such thing as 'above perfect' by definition, you don't think they'll consider other people not following traffic rules while programming the AI for these things?
People get tired and angry, have crappy reaction times, and get muscle cramps or those little stinging bits of dust or hair in the eye, and you get the point. When others have self-driving cars too (and I think it'll happen faster than you think, because the first ones are already on the street), there will be fewer people driving shittily. Better for everyone.
You brought the concept of perfection into the discussion. I defined it as following the rules of the road exactly. I indicated that I would want my car to drive beyond that level. But that's not really the point of the discussion.

You actually missed my point and brought up an interesting new one. The original point I was making was that all of the companies (that compete when making the cars) would need the same software running. Otherwise Audi's Aggressive AI (AAA) would save itself from a fatal accident caused by another cutting off BMW's Benevolence Brain (BBB) which would promptly put the owner into a pole killing him/her. If two cars make different decisions, they could end up with catastrophic consequences.

The other point you've illuminated is the idea that they would consider illegal actions by others in the design of the AI. But that notion is wrong. You can't program for the unexpected. And when the unexpected happens, the AI may do unexpected things.

Of course, this all skips over some of the other questions, like:
Do we need driver's licenses or a minimum driving age if we aren't driving?
Do you need insurance?
If there's an accident, does it go on the user's record?
Does the manufacturer pay for the damages?
 

ZCAB

Regular Member
Jan 15, 2013
81
0
11
Country
Netherlands
The problem here is that they're trying to find a specific reaction to an endless number of possible situations. The answer could be to add some sort of manual mode - perhaps some pedals that the driver could use to slow down or speed up the vehicle to suit the situation, or some sort of rotary device that could be used to change the vehicle's direction.
 

Imperioratorex Caprae

Henchgoat Emperor
May 15, 2010
5,499
0
0
Kross said:
There's no way I'm ever (voluntarily) driving in a machine that prioritizes killing me over anyone else.

There's also no way I'd ever want to use an automated vehicle unless every other vehicle on the road is also automated, as that swarm coordination is where the true benefit resides.
I can't honestly put myself in a position where a machine has all the control, not after 20 years of being a consistently safe driver and feeling like I lose more control over my car with every newer model I end up with. I grew up driving manual, and have developed a sort of sixth sense for what's normal with gear shifting and what's a bad sign; then automatics became much more standard, and it's been hard to buy a new(er) car with a stick. I've ended up driving, and hating, automatics and feeling like more of the car is out of my control. I'm doing my best to avoid getting any cars that have wireless functions like OnStar, because I don't want anyone to be able to control my car externally. I don't believe in trading my own control for "safety" reasons, because I believe that safe driving is something anyone can do if they are taught properly in the first place and better standards for licensing are in effect.

As it is, it's too easy and lenient the way licenses are given out. I still believe that owning and operating a car is not a right but a privilege, and people should be under heavy scrutiny when it comes to renewing licenses each year. Accidents should count against a driver to the point where licenses are yanked from the most dangerous ones. It's no coincidence that the more accidents a person has, the more likely they are to be in another.

Automated cars just scream a huge warning to me, because there are idiots out there who could very well decide that causing major accidents by hacking into the wireless network controlling the traffic would be "lols." I do not at all put it past the juvenile minds that create viruses and such to exploit such a potential for causing chaos.
 

MoltenSilver

New member
Feb 21, 2013
248
0
0
My thoughts on this have basically been echoed a lot already, but to metaphorically piss in the rain:
1. This should be mostly a non-issue from square one, as the car shouldn't be going faster than the speed at which it can respond and maneuver.
2. If it doesn't have time to maneuver, it means someone broke the law, and I'm going to hazard a guess that it's not the robot (or, in case the programmer did screw up, at least the occupants of the car shouldn't be held any more responsible than someone in the passenger seat of any other crash would be).
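Point 1 can be made concrete with basic kinematics: total stopping distance is the distance covered during sensor/actuator latency plus the braking distance v²/2a. A rough sketch, where the deceleration (8 m/s², roughly dry pavement) and latency (0.1 s) are assumed illustrative values, not figures from any real vehicle:

```python
# Rough check of point 1: how far a car travels before it can stop.
# Deceleration and latency values are assumptions for illustration.

def stopping_distance(speed_ms, decel_ms2=8.0, latency_s=0.1):
    """Reaction distance (travel during latency) plus the
    braking distance from v^2 / (2 * a)."""
    return speed_ms * latency_s + speed_ms**2 / (2 * decel_ms2)

v = 50 / 3.6  # 50 km/h converted to m/s
print(round(stopping_distance(v), 1))  # -> 13.4 (meters)
```

So at an urban 50 km/h the car needs on the order of 13 m of clear road; anything that appears closer than that is point 2's territory, no matter who is driving.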
 

RoguelikeMike

New member
Nov 10, 2014
5
0
0
I believe that people have no business owning vehicles, and shouldn't have the right to drive them. Why do I think this? Well, first of all, because I have been a pedestrian for 28 years. I'll tell you right now that everyone, everyone, everyone without fail, everyone that sits behind the wheel of a car has no respect for anyone else. Everyone, even you. You, the one reading right now, thinking you do. You don't. How do I know this? Because I cross a double crosswalk every day. This crosswalk tells the turning lane it can go at the same time as it tells the pedestrians they can cross. I stand there for 2-3 minutes, often waiting for the light to change; the people in the turning lane know I am there. They have 2-3 minutes to see me. BUT THE LIGHT IS GREEN! So they go. And I get to watch as my 15 seconds to cross tick by, and the turning lane keeps driving on as if I don't exist. I HAVE TO RUN INTO THE MIDDLE OF THE ROAD, and STOP cars with my body, in order to get to my job. EVERY DAY. WITHOUT FAIL.

Humanity has proven to me that it cannot handle the SIMPLE responsibility of driving a car. IT IS SAFER for me to run into the middle of the road than to use the pedestrian cross walks. By the way, I have tested this theory in Florida, and Utah, two very different parts of the world, and people are the same.

Transportation is based on a very obsolete idea:

The rich have it
Because the rich have it there are few cars
Because there are few cars it is safe to drive

That has changed. Now everyone has a car (except me, but I live a life harder than 99.9.9.9.9.9.9.9% of humanity, as this post should point out that just getting to work almost kills me daily).

You know, I often have people scream at me, curse me, flip me off, threaten me with their cars, for trying to use the pedestrian crossing (when it's my turn), as if they have no idea how to drive. YOU have the convenience of sitting on your ass and getting propelled at insane speeds to any destination you could possibly want, and you can't wait 5 seconds for me to cross the street. None of you can. None of you will. None of you have. Not if that light turns green and says "HEY BUDDY, YOU CAN GOOOOOOOOOOOOO!" Oh, a person? LOL I'll drive around them, flip them off, and tell them they're an idiot! It was -MY- turn after all... the light told me so! What? I couldn't be wrong, I've been driving for 30 years!

Transportation needs to be like flying, because people can't handle the EXTREMELY challenging responsibility of not being selfish, self centered, and self serving, at ALL times.

I am all for computers driving people, and I'm sure I'm in the minority. As soon as these cars exist, people will "jail break" them themselves and be "cool" by breaking the mold and causing havoc in traffic, running over people trying to cross the street because the 'man' is trying to take away their 'right' to control their death machine. No, they're too busy driving to Wal-Mart and revving their engine in the parking lot thinking they're impressing everyone. "Look! I have a loud car. Loud cars are cool. They make noise. I like banging rocks together. I am a cool person. I spent a lot of money to make this car loud. I am sure no one has done this befor-- Hey! Look! Several dozen other loud cars near me. Wow, those people are smart too. They make smart choices with their money. Loud cars achieve many things. Definitely not a way to compensate for cowardice or lack of personal strength. Definitely not a way to bully people or be needlessly intimidating. Definitely. I think I will sit here and rev my engine at the next person that walks by, they'll smile, wave and thank me! I am a cool person. I have a cool car."

Can you tell I hate cars? Probably because I've never benefited from one, and they cause me stress, anxiety and actual physical pain almost daily.