[RD] Self-Driving Uber Car Kills Arizona Pedestrian

SS-18 ICBM

Title says it all, really. Further details on the first pedestrian death caused by self-driving vehicles below:
SAN FRANCISCO — A woman in Tempe, Ariz., has died after being hit by a self-driving car operated by Uber, in what appears to be the first known death of a pedestrian struck by an autonomous vehicle on a public road.

The Uber vehicle was in autonomous mode with a human safety driver at the wheel when it struck the woman, who was crossing the street outside of a crosswalk, the Tempe police said in a statement. The episode happened on Sunday around 10 p.m. The woman was not publicly identified.

Uber said it had suspended testing of its self-driving cars in Tempe, Pittsburgh, San Francisco and Toronto.

“Our hearts go out to the victim’s family. We are fully cooperating with local authorities in their investigation of this incident,” an Uber spokeswoman, Sarah Abboud, said in a statement.

The fatal crash will most likely raise questions about regulations for self-driving cars. Testing of self-driving cars is already underway for vehicles that have a human driver ready to take over if something goes wrong, but states are starting to allow companies to test cars without a person in the driver’s seat. This month, California said that, in April, it would start allowing companies to test autonomous vehicles without anyone behind the wheel.

Arizona already allows self-driving cars to operate without a driver behind the wheel. Since late last year, Waymo, the self-driving car unit from Google’s parent company Alphabet, has been using cars without a human in the driver’s seat to pick up and drop off passengers there. The state has largely taken a hands-off approach, promising that it would help keep the driverless car industry free from regulation. As a result, technology companies have flocked to Arizona to test their self-driving vehicles.

Autonomous cars are expected to ultimately be safer than human drivers, because they don’t get distracted and always observe traffic laws. However, researchers working on the technology have struggled with how to teach the autonomous systems to adjust for unpredictable human driving or behavior.

An Uber self-driving car was involved in another crash a year ago in Tempe. In that collision, one of Uber’s Volvo XC90 sport utility vehicles was hit when the driver of another car failed to yield, causing the Uber vehicle to roll over onto its side. The car was in self-driving mode with a safety driver behind the wheel, but police said the autonomous vehicle had not been at fault.

In 2016, a man driving his Tesla using Autopilot, the car company’s self-driving software, died on a state highway in Florida when it crashed into a tractor-trailer that was crossing the road in front of his car. Federal regulators later ruled there were no defects in the system to cause the accident.

The National Transportation Safety Board was sending a small team of investigators to Arizona to gather information about the Uber crash, said Eric Weiss, an N.T.S.B. spokesman.

What are your opinions on self-driving vehicles? What precedents do you think this incident will create? What steps do our societies need to take to prevent such incidents? How should an autonomous vehicle decide who to preserve in an emergency? Other thoughts?

RD tag since I am not in the mood for jokes about people getting killed by AI. Take your Skynet memes elsewhere.
 
I thought these things were supposed to be able to detect objects or people in front of them.
 
Glitches do happen. I expect we'll hear more details as the investigation gets underway.
 
As crass as it may sound, every incident is extremely beneficial for future progress in the industry.

This was bound to happen. Car deaths are common and self-driving cars will reduce that significantly once they've been fleshed out. I hope this gets used to improve the model instead of as fuel to ban or restrict it.

10 P.M. outside of a crosswalk zone is a really good case study on current feasibility of self-driving. It's unfortunate it cost someone their life. :sad:
 
What are your opinions on self-driving vehicles?

Inevitable in the long term. Impractical in the short term.

What precedents do you think this incident will create?

Shouldn't really create any precedent. This should have been thought of already. Any type of vehicle, autonomous or not, needs to plan for how to deal with some fatality rate of both passengers and non-passengers.

What steps do our societies need to take to prevent such incidents?

Better tech. Appropriate liability.

How should an autonomous vehicle decide who to preserve in an emergency?

Passenger, always. There's no other reasonable option.
 
Passenger, always. There's no other reasonable option.
I think that's the most likely option, since people won't buy vehicles that sacrifice them. I don't know if that's the most moral option though.
 
I think that's the most likely option, since people won't buy vehicles that sacrifice them. I don't know if that's the most moral option though.

Morality aside, it probably is the most accurate emulation of a human driver.
 
Yes, the choice between hitting a school bus vs. a group of children on the sidewalk? How to program it, how to litigate it?
 
What are your opinions on self-driving vehicles? What precedents do you think this incident will create? What steps do our societies need to take to prevent such incidents? How should an autonomous vehicle decide who to preserve in an emergency? Other thoughts?
I believe that self-driving cars are still safer than human drivers, even if this accident was very unfortunate. I wonder how this could happen, and whether the woman was behaving erratically. As far as self-driving cars go, the technology still seems great and I hope this won't halt development.
 
When a human driver makes an error you can sue them; when it's a program...
If your company programs 1 million cars, the liability is almost unlimited. We'll need some company-friendly court decisions before they take over.
 
When a human driver makes an error you can sue them; when it's a program...
If your company programs 1 million cars, the liability is almost unlimited. We'll need some company-friendly court decisions before they take over.

Manufacturer liability is already a thing, though.
 
When a human driver makes an error you can sue them; when it's a program...
If your company programs 1 million cars, the liability is almost unlimited. We'll need some company-friendly court decisions before they take over.

I am thinking it will be the first thing on the setup menu: press here for the car to save your life, while releasing the company from any liability for casualties caused by your choice.
 
As crass as it may sound, every incident is extremely beneficial for future progress in the industry.

This was bound to happen. Car deaths are common and self-driving cars will reduce that significantly once they've been fleshed out. I hope this gets used to improve the model instead of as fuel to ban or restrict it.

10 P.M. outside of a crosswalk zone is a really good case study on current feasibility of self-driving. It's unfortunate it cost someone their life. :sad:

I can sound even more crass... Pedestrians have the right of way, but pedestrians who assume that right are soon-to-be recipients of the Darwin Award. I grew up in a densely populated city (San Francisco); before we got cars, my friends and I were crisscrossing that city on 10-speeds. You learn how to jaywalk in cities like that... Frankly, I don't know how a pedestrian can get hit. I can see someone getting hit in a crosswalk by a stopped car that suddenly accelerates, but jaywalking?
 
If your company programs 1 million cars, the liability is almost unlimited.

It only makes a difference if you've got weird liability laws; from a broader economic perspective there's no real difference in liability.

Hell, you don't even need to have the company liable for anything. Put the entire onus on the car owner, make them get liability insurance, let insurers charge appropriately for the risk. No real difference from having the manufacturer liable other than shuffling some debit/credit columns around.
 
This is the first I’ve heard about a self-driving car killing someone, and I’m sure it’s big news any time it happens. They’ve been driving for a couple of years now. All things considered, I feel this is a pretty good track record given the relative number of human-driven cars that kill people.
 
Yeah, I prefer individual liability, but I think this one will be fought in the courts for a while first. A lot of deep pockets.
 
It only makes a difference if you've got weird liability laws; from a broader economic perspective there's no real difference in liability.

Hell, you don't even need to have the company liable for anything. Put the entire onus on the car owner, make them get liability insurance, let insurers charge appropriately for the risk. No real difference from having the manufacturer liable other than shuffling some debit/credit columns around.

There is a difference in the narrower perspective though. The insurance company counts on 'their' driver as an ally in any dispute over responsibility. That's lost in a driverless car. If you and I get in a crash and our narratives vary regarding who was responsible your insurance company might have to pay or it might not. If I get in a crash with a driverless car there is no countering narrative and their insurance will have to pay. I would expect rates for coverage of driverless cars to be high...maybe prohibitively high.
 
While that is my fear also, Tim, some history will be required for the actuaries to come up with real rates. And if incidence is drastically reduced, as we all anticipate, the rates may actually go down. Go Math.
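The "Go Math" point can be made concrete with a toy pure-premium calculation. Everything below is a sketch with invented numbers; real actuarial pricing also uses loss history, severity distributions, and regulatory expense loadings:

```python
# Hypothetical actuarial sketch: a pure premium is roughly
# expected claim frequency x expected claim severity (plus a loading).
# All figures here are made up purely for illustration.

def pure_premium(claims_per_car_year, avg_claim_cost, loading=0.25):
    """Expected annual claim cost per car, with an expense/profit loading."""
    return claims_per_car_year * avg_claim_cost * (1 + loading)

# Assumed figures: human drivers vs. a hoped-for driverless fleet
# whose accident frequency is one fifth of the human rate.
human = pure_premium(claims_per_car_year=0.05, avg_claim_cost=8_000)
driverless = pure_premium(claims_per_car_year=0.01, avg_claim_cost=8_000)

print(f"human-driven: ${human:,.0f}/yr")       # human-driven: $500/yr
print(f"driverless:   ${driverless:,.0f}/yr")  # driverless:   $100/yr
```

If the frequency drop materializes, the premium falls proportionally, which is the "rates may actually go down" argument; if severity per claim rises (sensor-laden cars are expensive to repair), that can eat the gain.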
 
While that is my fear also, Tim, some history will be required for the actuaries to come up with real rates. And if incidence is drastically reduced, as we all anticipate, the rates may actually go down. Go Math.

Big if. I am absolutely confident that if ALL cars were made driverless tonight, accident rates tomorrow would drop like a rock. Unfortunately, I think the one thing driverless cars cannot currently be equipped to manage around is human drivers, even if all they are driving is a pair of sneakers, as in this case. So I'm not really buying the idea that accident rates overall are going to fall, and I'm thinking that any statistical evidence that driverless cars are actually safer is going to be washed out by their shouldering unwarranted responsibility. If they are less likely to be responsible but more likely to take the blame when they aren't, statistics will be hard-pressed to help the situation.
 
If I get in a crash with a driverless car there is no countering narrative and their insurance will have to pay.
Their insurance sends in a techie who downloads all sensor data including video of the 5 minutes prior to the accident and uses it to beat the hell out of you in court.
 