[RD] Self-Driving Uber Car Kills Arizona Pedestrian

This wasn't murder. With this being a self-driving car, this should be, at worst, classified as an industrial accident.

Unless, of course, you have access to evidence that someone at Uber programmed this car to kill people.

You could likely go after the employee on some form of negligence; I'm not sure it's plausible, as it depends on the conditions. I'm pretty sure the standards differ between hitting a pedestrian who has the right of way and hitting someone who dolphin-dives into traffic with less than a second of reaction time. I suspect this case falls somewhere between those extremes.
 
At some point negligence becomes criminal. Unfortunately it seems that in this case the local (state) authorities were in bed with Uber and will be unwilling to probe just how negligent it was.
 
At some point negligence becomes criminal. Unfortunately it seems that in this case the local (state) authorities were in bed with Uber and will be unwilling to probe just how negligent it was.

Negligence should apply to the (backup) driver and the pedestrian alike. I would evaluate the automated driving process differently if the pedestrian suddenly did something illegal or unexpected than if the pedestrian was simply crossing the street, and the same goes for the person behind the wheel.
 
At some point negligence becomes criminal. Unfortunately it seems that in this case the local (state) authorities were in bed with Uber and will be unwilling to probe just how negligent it was.

Your corporate paranoia is showing. They looked at the video and concluded the obvious. There's nothing more to "probe."
 
If the 'driver' was supposed to be watching the road and was doing something else like she was in a cab, I could see her being sued too. But Uber's got the bucks so why bother.
 
If the 'driver' was supposed to be watching the road and was doing something else like she was in a cab, I could see her being sued too. But Uber's got the bucks so why bother.

The key there is that whatever she was doing, she knew she was on camera doing it. That means that either she knew it was okay with Uber (bad policy), they'd never notice her doing it (poor supervision), or she was altogether unaware that she couldn't do it (no training). Uber was on that hook like they'd sucked the bait through their gills. That's almost certainly why they settled and were happy to do it rather than go to court and lose.
 
It takes far more willpower just to get your name on a selection list (much less actually get selected) than it does to apply to become a cashier or to apply to a college.
The analogy was to the promotion to manager, not to the application to be a cashier, but you are right, of course, that it's more difficult to become an astronaut. But then we're talking about scale. It's still a promotion. If it's more helpful scale-wise, take the folks who "volunteer" to be professional athletes as an example. But I'm way off topic, so please have the last word on this issue. The real point was @Commodore's correct observation that you can't have an omelette without breaking some eggs.
I'm as appreciative of the 'we are forced by society to work' insight as the next person, but I don't think that putting astronauts into that category is reasonable. You need massive privileges and massive drive to become an astronaut. It's very much what they wanted, and portraying them as victims rather than heroes doesn't create the kind of dissonance people appreciate (in retrospect); it just reads as incorrect. There are a few jobs in society where the holder of that job isn't so much a 'victim' as a '(lucky) winner', and American astronaut is one of them, for sure.
I'm not portraying them as victims or making any commentary on being forced to work. I agree that astronauts want the job and are the lucky few who are selected for it. But it's still a job. They aren't, strictly speaking, volunteers. See my point above about pro athletes.

But again, I've derailed enough about astronauts, so I invite you to take the last word. The real point here is that progress requires risks and sacrifices, and that we should not give up on driverless car technology based on this woman's death, despite the fact that she was not a willing participant in the car's test. I will say, though, that I remain leery of the tech, not because I think it won't be effective, but because I worry it will become so effective that it will eliminate a ton of driver jobs across many industries.
 
Progress absolutely requires a fine walk between risks and sacrifices, and it is very easy to derail. This is why I so commonly beg for help on the research charities, because it's absolutely sacrifice that's required to speed the momentum to where we need it to be.

I see no way to test technology without creating risks. I guess it's reasonable to have the 'risk threshold' be lower than 'business as usual'. A company whose cars kill at less than the normal rate should iterate to improve itself. But a car that kills at more than the normal rate should probably be recalled until a quantum leap in capability is discovered. That said, I am a little gun-shy (I'm not a switch-puller in the trolley problem); I would prefer to pay more money than risk others (to a certain degree). And guys like me normally slow progress, except that we fund other areas to compensate for our hand-wringing.
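The 'risk threshold' idea above can be made concrete by normalizing a fleet's fatalities to a per-mile rate and comparing against the human-driver baseline. A minimal sketch, where the fleet figures are entirely hypothetical and the human baseline is roughly the oft-cited US figure of about 1.16 fatalities per 100 million vehicle miles:

```python
# Rough US human-driver baseline for that era (approximate).
HUMAN_FATALITIES_PER_100M_MILES = 1.16

def fleet_rate_per_100m_miles(fatalities: int, miles_driven: float) -> float:
    """Normalize a fleet's fatality count to a per-100-million-mile rate."""
    return fatalities / miles_driven * 100_000_000

# Hypothetical fleet: 1 fatality over 3 million autonomous miles driven.
rate = fleet_rate_per_100m_miles(1, 3_000_000)
print(f"fleet rate: {rate:.1f} per 100M miles")  # ~33.3, far above baseline

if rate > HUMAN_FATALITIES_PER_100M_MILES:
    print("worse than the human baseline -> recall and fix")
else:
    print("better than the human baseline -> keep iterating")
```

The catch, of course, is that with so few autonomous miles driven, a single fatality dominates the estimate, so the comparison is statistically noisy.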
 
At some point negligence becomes criminal. Unfortunately it seems that in this case the local (state) authorities were in bed with Uber and will be unwilling to probe just how negligent it was.

This seems silly, my own negligence is the entire point of car insurance in the first place.

The only way I'm not covered is if I'm intentionally negligent.

The reason Uber pays out is because they're large enough in scale to self-insure, which I'm not.
 
The key there is that whatever she was doing, she knew she was on camera doing it. That means that either she knew it was okay with Uber (bad policy), they'd never notice her doing it (poor supervision), or she was altogether unaware that she couldn't do it (no training). Uber was on that hook like they'd sucked the bait through their gills. That's almost certainly why they settled and were happy to do it rather than go to court and lose.

I'm not quite sure what the actual purpose of these 'drivers' is, in this case. Obviously, in the first tests they were there to watch the self-driving software, but I've read comments that in current self-driving cars, the backup driver is only there to take over when the software gives up on a particular situation and disengages, either to safely park the car or to move past the difficult section. These were only comments on an online forum, though, so take this with a huge grain of salt. I'd be interested to read some decent in-depth analysis.

Regarding settling, on top of the usual calculus that settling replaces a probabilistic event by its expectation value while saving on lawyer expenses, there are two more factors here:
* A lawsuit is negative publicity. You can be sure every aspect receives coverage, and it just brings headlines containing the words 'Uber' and 'deadly accident'.
* A civil suit involves a discovery phase, and business-sensitive information about the current status and details of Uber's self-driving project inevitably leaks to competitors.
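The "replaces a probabilistic event by its expectation value" point can be shown with a toy calculation. All of the figures below are invented for illustration; none come from the actual case:

```python
def expected_trial_cost(p_lose: float, damages: float, legal_fees: float) -> float:
    """Expected cost of litigating: probability-weighted damages plus
    legal fees, which are paid win or lose."""
    return p_lose * damages + legal_fees

# Hypothetical numbers: 60% chance of losing a $10M judgment, $2M in fees.
trial = expected_trial_cost(p_lose=0.6, damages=10_000_000, legal_fees=2_000_000)
settlement = 7_000_000  # hypothetical settlement offer

print(f"expected trial cost: ${trial:,.0f}")       # $8,000,000
print(f"settlement offer:    ${settlement:,.0f}")  # $7,000,000
print("settle" if settlement < trial else "litigate")
```

Even before pricing in the publicity and discovery risks listed above, settling is already attractive whenever the offer comes in below the expectation; those two factors only push the break-even point higher.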
 
Related though indirectly...

Does two months after your self driving car kills someone seem like the best time to broach the subject of flying taxi services?
 
Related though indirectly...

Does two months after your self driving car kills someone seem like the best time to broach the subject of flying taxi services?
Make them self driving! Self Driving flying taxis! What could possibly go wrong!
 
So the problem wasn't that the car didn't detect the woman; it was just told not to think of pedestrians as obstacles...

Tuning sensitivity and specificity in any test is hard work, but it really should not be done on humans. And if there are too many false positives in a situation where any false negative may be lethal, that really should make the developers question their design, not simply tune down the sensitivity.
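The tradeoff described above can be sketched with a toy threshold classifier. The confidence scores below are invented for illustration; an obstacle detector emits a score, and raising the decision threshold suppresses false positives (phantom braking on bags and shadows) at the cost of missing real pedestrians:

```python
def classify(scores, threshold):
    """Return True (brake) for each detection score at or above the threshold."""
    return [s >= threshold for s in scores]

# Invented scores: real pedestrians mostly score high, but not always;
# clutter (plastic bags, shadows) mostly scores low, but not always.
pedestrians = [0.9, 0.8, 0.55]  # ground truth: all should trigger braking
clutter = [0.1, 0.3, 0.6]       # ground truth: none should

for threshold in (0.5, 0.7):
    misses = classify(pedestrians, threshold).count(False)  # false negatives
    phantoms = classify(clutter, threshold).count(True)     # false positives
    print(f"threshold={threshold}: {misses} missed pedestrians, "
          f"{phantoms} phantom brakes")
```

Raising the threshold from 0.5 to 0.7 eliminates the phantom brake but now misses a pedestrian, which is exactly the lethal false negative that cannot simply be tuned away.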
 
Related though indirectly...

Does two months after your self driving car kills someone seem like the best time to broach the subject of flying taxi services?

Well, it's a better time than two months from now after they've resumed autocar testing and have killed another pedestrian within the first week.
 
Well, it's a better time than two months from now after they've resumed autocar testing and have killed another pedestrian within the first week.

Well, that seems true enough.
 