Transportation Secretary: Google Self-Driving Car Crash No Big Deal

For the first time ever, the computer running one of Google's self-driving cars was blamed for a traffic accident. As you can see in the video, the self-driving car pulled out in front of a human-driven bus after the computer erroneously predicted that the bus would slow down.

Google didn't reject blame: "We clearly bear some responsibility," read a statement released by the company, "because if our car hadn't moved, there wouldn't have been a collision."

While this isn't the first time that one of these futuristic vehicles has been involved in a real world collision, it was the first time that the technology was at fault.

Will this setback slow the progress toward driverless roadways of tomorrow? Anthony Foxx, the United States Secretary of Transportation, does not think so.

"I think the question here isn't comparing the automated car against perfection," Foxx told the BBC. "I think it's a relative comparison to what we have now on the roads, which is you and I."

And while he did say liability remains one of many potential roadblocks self-driving cars face in coming years, the Transportation Secretary took this recent, much-publicized computational error in stride.

"It's not a surprise that at some point there would be a crash of any technology that's on the road," said Foxx. "But I would challenge one to look at the number of crashes that occurred on the same day that were the result of human behavior."

Source: BBC

Glad to see people aren't freaking out, or at least not the important people. It would be a real shame to set back self-driving cars because the tech isn't literally perfect. This is a technology that will be extremely valuable to everyone.

So long as they crash less often than people.

Wait a second.

a) There are self-driving cars?

b) That are allowed on public roads?

Zhukov:
Wait a second.

a) There are self-driving cars?

b) That are allowed on public roads?

Yep and yep. However, they're not out for general consumer use yet, and this is the first time in several years that an accident can be blamed on one of these self-driving cars. Before this, any time a self-driving car was in an accident, it was because a human driver messed up and caused it.

That's a surprisingly coolheaded response from everyone involved. Even if every crash Google's cars were involved in were their fault, that record would still far exceed humans', so yeah, keep on going there please.

Zhukov:
Wait a second.

a) There are self-driving cars?

b) That are allowed on public roads?

They have been driving around for a few years, but only in some areas of the US, so most people never get to see a live one.

Strazdas:
That's a surprisingly coolheaded response from everyone involved. Even if every crash Google's cars were involved in were their fault, that record would still far exceed humans', so yeah, keep on going there please.

Zhukov:
Wait a second.

a) There are self-driving cars?

b) That are allowed on public roads?

They have been driving around for a few years, but only in some areas of the US, so most people never get to see a live one.

So far.
Google has apparently mentioned trialling some in other places, one of which was apparently a city in Australia. (I think Melbourne, as I recall.)

I believe the reason has something to do with comparing their behaviour in various different environments.
(Australia probably has marginally different road rules, a very different climate, and of course, the drivers here might behave differently too. - and of course driving on the opposite side of the road.)

Zhukov:
Wait a second.

a) There are self-driving cars?

b) That are allowed on public roads?

Here in Europe we've actually been getting self-driving trucks. And not test trucks either, but proper ones doing their job. They still have people in 'em but they're more like co-pilots keeping an eye on things while the truck does its thing.

CrystalShadow:
So far.
Google has apparently mentioned trialling some in other places, one of which was apparently a city in Australia. (I think Melbourne, as I recall.)

Pretty sure it was Adelaide actually (my hometown, woo!). We also held a conference here back in November IIRC.

South Australia is one of the least populated states in the country, so it makes a bit of sense to host the trials here.

EbonBehelit:

CrystalShadow:
So far.
Google has apparently mentioned trialling some in other places, one of which was apparently a city in Australia. (I think Melbourne, as I recall.)

Pretty sure it was Adelaide actually (my hometown, woo!). We also held a conference here back in November IIRC.

South Australia is one of the least populated states in the country, so it makes a bit of sense to host the trials here.

Ah well. Shows how reliable my memory is.

I mean, sure, I could've fact-checked before saying it, but doing that every single time you say something, just in case you're wrong is... Tedious. XD

Though I'm not actually sure using areas that aren't very populated is necessarily an advantage for something like this, since understanding how other traffic behaves is the single biggest issue in getting a self-driving car to work...

Wasn't the reason it crashed because it was faced with a scenario and picked the one that it was "supposed" to? Or defaulted to rather.
Probably the best part about it is that unlike most human drivers, it actually learned from this experience, as did the fleet.

Strangely enough I was at a machine learning talk last week, and this is exactly the scenario they said the google car would screw up.

Well of course it's no big deal. Unless you're a Daily Mail reader, in which case: be very afraid!!

LegendaryGamer0:
Wasn't the reason it crashed because it was faced with a scenario and picked the one that it was "supposed" to? Or defaulted to rather.
Probably the best part about it is that unlike most human drivers, it actually learned from this experience, as did the fleet.

If I'm understanding right, it's a case of the car predicting what the bus would do incorrectly. It predicted a bus would slow down, allowing it to pass safely.

And honestly, it's very telling that in this and similar cases, that the crashes are the result of interacting with humans. Each of these incidents only further proves that self-driving cars should only be out driving with other self-driving cars if we're looking for complete safety. It'd honestly be better that way.

The patch they are rolling out for the car's OS, involves extending a prosthetic hand, flipping the bird while swearing and honking the horn.

Google Car - Because giving a power mad corporation any measure of control over where, and if, you're driving is a great idea.

These are the same guys who want you to wear spyglasses that record everything for them, and of course greatly influence what you will and won't notice about your surroundings. I feel conflicted enough about putting myself on their grid by using their search engine, and now they want people to be mere passengers in their traffic flow?

Safe or not, I'm too old to get hyped over their brand of Dystopia.

The source of the problem seems to be that the car had to interact with a vehicle driven by a human; had both vehicles been unmanned, they would more likely than not have been able to communicate and thus avoid an accident. That perfect, or near-perfect, communication between cars seems to be one of the best parts of the driverless car idea.

While this isn't the first time that one of these futuristic vehicles has been involved in a real world collision, it was the first time that the technology was at fault.

To clarify, this was the first time the technology was legally at fault. The technology was actually at fault in almost every other case; the problem is that the software drives the car like a computer, not a person. For example, it's technically legal to pull part way into an intersection and stop while the software evaluates its options. In the real world, though, if you do that someone will rear-end you, reasonably assuming you are committing to a course of action.

It's very much like my grandmother before they took away her car keys. She would get confused in traffic and just stop. While the car that hit her was technically at fault, her driving was a major contributor (arguably the major contributor) to the accident.

I just assumed that car companies would start to become car insurers at this point. Or they would get bulk insurance and charge consumers for it at a much lower rate than we can currently get since both the car is safer and the insurance was negotiated in large numbers.

Not sure why people think this would be such a big hurdle? Just different.

Zhukov:
Wait a second.

a) There are self-driving cars?

b) That are allowed on public roads?

Google's 5G drones are going to be streaming porn to your self-driving car as you beat off on your way to work.

Welcome to the future Zhukov.

My career of the future is being the co-pilot of a self-driving truck. I'll post up, drink tea and play my DSX-Treme or whatever the new gadget is while watching porn on my VR headset and live chatting with my wife in full holographic 4D!

Lightknight:
I just assumed that car companies would start to become car insurers at this point. Or they would get bulk insurance and charge consumers for it at a much lower rate than we can currently get since both the car is safer and the insurance was negotiated in large numbers.

Not sure why people think this would be such a big hurdle? Just different.

When I worked at an American car insurance company a few years ago, the CEOs of the big insurance companies were taken to a big meeting by Google (and a few other software and car companies) and basically told to sort out the insurance intricacies amongst themselves and have it all ready to accept people by 2018, because that was the date they expected to roll out the first civilian models. Google was hoping the insurance companies would do what they do best and thrash out all the puzzling details of insurance, so the Search Engine company could be left to do what it does best... which is designing self-driving cars, apparently.

StatusNil:
Google Car - Because giving a power mad corporation any measure of control over where, and if, you're driving is a great idea.

These are the same guys who want you to wear spyglasses that record everything for them, and of course greatly influence what you will and won't notice about your surroundings. I feel conflicted enough about putting myself on their grid by using their search engine, and now they want people to be mere passengers in their traffic flow?

Safe or not, I'm too old to get hyped over their brand of Dystopia.

Citation needed.

Seriously though they are a company like any other, not angels. But you're describing them as literally trying to take over the world... Which they might be.

I think it's unreasonable to condemn the cars for one crash, it's not like people don't get into car crashes all the time for similar reasons.

I'll condemn the cars because it hands over control of your car from you to a government and/or corporation, or at least leaves the option to do so way wide open, and fuck that.

So, just to be clear... The car was to the far right in a single (wide) lane in order to make a turn. It encountered an obstacle* and when attempting to go around it, the bus - while attempting to overtake in the same lane - made contact, and it's the car's fault?

Last I checked, a vehicle has right of way to the entire lane it occupies, and while moving over when slowing down for a turn is nice, it does not forfeit its right to return to the center of the lane once immediately adjacent vehicles have passed. It's not just that it was assumed the bus would yield; the bus should have yielded. The Google car signaled its intent well before the bus passed it and with plenty of room for the bus driver to respond safely.

* from another article there were sandbags on the side of the road

Dyspayr:
So, just to be clear... The car was to the far right in a single (wide) lane in order to make a turn. It encountered an obstacle* and when attempting to go around it, the bus - while attempting to overtake in the same lane - made contact, and it's the car's fault?

Last I checked, a vehicle has right of way to the entire lane it occupies, and while moving over when slowing down for a turn is nice, it does not forfeit its right to return to the center of the lane once immediately adjacent vehicles have passed. It's not just that it was assumed the bus would yield; the bus should have yielded. The Google car signaled its intent well before the bus passed it and with plenty of room for the bus driver to respond safely.

* from another article there were sandbags on the side of the road

It didn't follow law zero of driving: don't crash. Or more specifically, the bus should have stopped and let the car into the lane and the car had every right to get into the lane, but the rule is vague enough, especially in actual driving conditions, that the google car should have backed down and stayed where it was rather than continue moving.

It'd be similar to a (slow moving) car running a stop light in front of you. Sure, you have right of way and the other car isn't obeying the rules of the road, but things are happening slowly enough that you can easily not crash into them, and so are expected not to do so.

Seems to be an intersection area, and the Google car was way too late to change lanes at that spot. I would pin this 100% on Google.

I think they still have a much better track record than humans, though.

Xeorm:

Dyspayr:
So, just to be clear... The car was to the far right in a single (wide) lane in order to make a turn. It encountered an obstacle* and when attempting to go around it, the bus - while attempting to overtake in the same lane - made contact, and it's the car's fault?

Last I checked, a vehicle has right of way to the entire lane it occupies, and while moving over when slowing down for a turn is nice, it does not forfeit its right to return to the center of the lane once immediately adjacent vehicles have passed. It's not just that it was assumed the bus would yield; the bus should have yielded. The Google car signaled its intent well before the bus passed it and with plenty of room for the bus driver to respond safely.

* from another article there were sandbags on the side of the road

It didn't follow law zero of driving: don't crash. Or more specifically, the bus should have stopped and let the car into the lane and the car had every right to get into the lane, but the rule is vague enough, especially in actual driving conditions, that the google car should have backed down and stayed where it was rather than continue moving.

It'd be similar to a (slow moving) car running a stop light in front of you. Sure, you have right of way and the other car isn't obeying the rules of the road, but things are happening slowly enough that you can easily not crash into them, and so are expected not to do so.

The Google car never actually changed lanes; the bus was trying to go around it while in the same lane. And to your point, if I t-bone a car while I'm on the green light and they're on the red, I can almost guarantee (depending on local laws) that they will be held responsible, so long as I did not accelerate or steer towards them in a way that shows I willfully made contact. There are any number of legitimate reasons I may not have noticed the offending car running the light in time to stop: sun, checking for oncoming traffic from the opposite direction, or assuming they were creeping to get a better view for a potential turn-on-red. In the majority of places, the person that legally has to yield is responsible for avoiding those that have right-of-way.

You and LD3 would be perfectly correct -if- the car was making a lane change, indicator signal or not. But as long as the car was in front of the bus in the same lane, the bus hit the car, not the other way around. If you are in an area that assigns partial liability, there is an argument that it was technically avoidable, but I would guess a 75/25% liability split at most.

008Zulu:
The patch they are rolling out for the car's OS, involves extending a prosthetic hand, flipping the bird while swearing and honking the horn.

That patch is only available in the USA, however.

This is all such bullshit. Even if they perfect the system, one angry cyclist could lock up a driverless car for as long as he likes, and don't think they won't!

Not to mention the fact that many people simply won't trust a piece of machinery that could crash (in both senses of the word) or be hacked. Look at how many times GPS fucks up...
