MotorBeat Blog

MotorBeat covers the latest developments in car manufacturing and technological advancements in transportation as a whole. The blog helps automotive enthusiasts and drivers get a glimpse of the motoring future.

Automotive innovation is Jordan Perch's ultimate passion. He manages the resourceful DMV.com and is an active contributor to numerous consumer and automotive blogs.

13 comments

Google Admits One of Its Self-Driving Cars Has Caused an Accident

Posted March 15, 2016 10:52 AM by Jordan Perch
Pathfinder Tags: driverless cars google

Last year, Google started testing a fleet of its driverless cars on California's public roads, and so far they have been involved in only a dozen minor accidents, all of them caused by other vehicles.

The tech company claimed that its autonomous driving technology wasn't at fault in any of the incidents involving its vehicles, suggesting that it is reliable enough and does not pose a threat to public safety. But now, according to the latest report of a traffic accident involving an autonomous vehicle, which Google has submitted to the California Department of Motor Vehicles, the search engine giant claims responsibility and admits that one of its vehicles has caused a collision.

Collision with a Bus

In the report, Google provides a detailed description of how the accident occurred, stating that it happened while its vehicle was driving in autonomous mode on a street in Mountain View, near the company's global headquarters.

The vehicle in question was a modified Lexus SUV equipped with autonomous driving technology. According to the report, the crash occurred on February 14 as Google's car was approaching an intersection: it moved into the far right lane to make a right turn, but was forced to stop when it came across sandbags blocking its path.

The car had to drive around the sandbags, moving back to the center of its lane, but in the meantime, a public transit bus approached it from behind in the same lane, which was extra wide, with the test driver sitting in the self-driving vehicle thinking that the bus had enough time to stop or slow down to allow Google's car to move along.

But the bus didn't stop, and Google's autonomous vehicle struck its right-hand side. At the moment of the collision, Google's car was traveling at less than 2 mph, while the bus was moving at 15 mph, according to Google.

The report concludes by stating that there were no injuries at the scene, and that Google's car sustained body damage in several places.

"Some Responsibility"

As reported by Tech Insider, which has received Google's February report on accidents involving autonomous vehicles, the tech company admits that its car is responsible for the incident to some extent.

"In this case, we clearly bear some responsibility, because if our car hadn't moved there wouldn't have been a collision," the report notes.

Even though this accident did not result in injury or severe property damage, it might have serious consequences for the company's self-driving project. While fault for all previous incidents involving Google's autonomous vehicles was attributed to the other vehicles they were sharing the road with, this time the company itself admits that it bears responsibility.

This means that Google can no longer claim that its cars are flawless, which will affect the way the general public perceives Google's self-driving technology and raises the question of whether an autonomous driving system is truly better and safer than the average human driver.

Guru

Join Date: Jan 2008
Location: Tucson, AZ
Posts: 1331
Good Answers: 30
#1

Re: Google Admits One of Its Self-Driving Cars Has Caused an Accident

03/15/2016 1:17 PM

So, who PAYS for the accident? The software developer, the hardware developer, the vehicle integrator, or the PUBLIC?

I contend damages are the responsibility of EVERYONE associated with releasing the autonomous vehicle onto the public streets. If it ain't 100% tested safe, it should NOT be on the streets...ever!

__________________
...and the Devil said: "...yes, but it's a DRY heat..!"
The Engineer

Join Date: Feb 2005
Location: Albany, New York
Posts: 5060
Good Answers: 129
#3
In reply to #1

Re: Google Admits One of Its Self-Driving Cars Has Caused an Accident

03/16/2016 8:43 AM

Just to play devil's advocate, let's say it's 1900 and someone told you about a new mechanical device called an automobile that allowed you to travel far more easily but that there was a risk of accident or death. Would you demand that all automobiles be taken off the roads? What if someone came from the future and showed you this graph of automobile related fatalities per year?

Guru

Join Date: Oct 2013
Location: Illinois, 7 county region (The 'blue dot' that drags the rest of the 'red state' around during presidential elections.)
Posts: 3688
Good Answers: 89
#6
In reply to #1

Re: Google Admits One of Its Self-Driving Cars Has Caused an Accident

03/16/2016 12:35 PM

"If it ain't 100% tested safe, it should NOT be on the streets...ever!"

Okay, let's see what we need to get off the streets for Public Safety:

  • Autonomous vehicles.
  • Human-driven semi trucks.
  • Human-driven buses.
  • Human-driven delivery trucks.
  • Human-driven cars.
  • Human-driven motorcycles.
  • Human-driven bicycles.
  • Pedestrians.

(ED-209 voice) "Stay inside your house, Citizen Cuda. If you leave your house you will need to be terminated before you potentially harm another person. This is for Public Safety. Humans have a non-zero chance of hurting other humans, therefore humans are not '100% safe,' and therefore, humans cannot be allowed on the public streets...ever."

__________________
( The opinions expressed in this post may not reflect the true opinions of the poster, and may not reflect commonly accepted versions of reality. ) (If you are wondering: yes, I DO hope to live to be as old as my jokes.)
Guru

Join Date: Apr 2008
Location: Lexington, KY
Posts: 1639
Good Answers: 72
#2

Re: Google Admits One of Its Self-Driving Cars Has Caused an Accident

03/16/2016 8:04 AM

Maybe the solution is to have streets where only autonomous vehicles drive. Mixing humans with machines is always going to result in some surprises because humans are unpredictable.

I'd welcome autonomous driving if we could keep people away from these vehicles. There is no shortage of bad human drivers.

__________________
A great troubleshooting tip...."When you eliminate the impossible, whatever remains, however improbable, must be the truth." Sir Arthur Conan Doyle
Guru

Join Date: Oct 2013
Location: Illinois, 7 county region (The 'blue dot' that drags the rest of the 'red state' around during presidential elections.)
Posts: 3688
Good Answers: 89
#4
In reply to #2

Re: Google Admits One of Its Self-Driving Cars Has Caused an Accident

03/16/2016 10:11 AM

Or maybe the solution is to have the human drivers PAY ATTENTION to the vehicles ahead of them.

"The car had to drive around the sandbags, moving back to the center of its lane, but in the meantime, a public transit bus approached it from behind in the same lane, which was extra wide, with the test driver sitting in the self-driving vehicle thinking that the bus had enough time to stop or slow down to allow Google's car to move along." (Emphasis added.)

The bus driver either was not paying attention to the plight of the vehicle in front of it IN THE SAME LANE or he was CHOOSING to 'dance around' the other car AHEAD OF HIM in the SAME LANE. The bus driver CUT OFF the Google car, and is the one at fault.

If the bus was controlled by a Google system instead of a human, the Google system would have correctly identified a vehicle AHEAD OF IT in the SAME LANE and not tried to 'ignore the problem.'

Many of the 'incidents' where the driver had to take control from the Google system involve events such as the map not matching the street: the car stops and waits for the 'congestion' to clear before turning, when the 'congestion' is actually a building sitting where the map says a street should be. Or the car waits for a pedestrian who is waving it on. The Google system is programmed to give pedestrians the right of way, so when a pedestrian is in a crosswalk, the car will wait 'forever' for the pedestrian to finish crossing.

We are seeing things the programmers had not thought to look for in normal traffic, such as overly aggressive bus drivers, or lanes that are two cars wide and are treated as 'sometimes one lane, sometimes two lanes, depending on how much of a butthole the drivers are being.'

If we taught the Google system just how bad American drivers CAN be, the cars would never leave their garages out of fear.

__________________
( The opinions expressed in this post may not reflect the true opinions of the poster, and may not reflect commonly accepted versions of reality. ) (If you are wondering: yes, I DO hope to live to be as old as my jokes.)
Guru

Join Date: Apr 2008
Location: Lexington, KY
Posts: 1639
Good Answers: 72
#5
In reply to #4

Re: Google Admits One of Its Self-Driving Cars Has Caused an Accident

03/16/2016 12:13 PM

Or maybe the solution is to have the human drivers PAY ATTENTION to the vehicles ahead of them.
If only people would do this instead of playing on their little electronic gadgets, eating and smoking at the same time, applying makeup, beating their kids, etc., etc.

Perhaps the Civil Engineers could do something about this. Such as building intersections where priority was not automatically given to pedestrians. Come ON! Is it a street or a really big sidewalk?

Yes, I know it is not legal to run over pedestrians, but I am submitting legislation to make them fair game!

__________________
A great troubleshooting tip...."When you eliminate the impossible, whatever remains, however improbable, must be the truth." Sir Arthur Conan Doyle
Guru

Join Date: Dec 2007
Location: Haverhill, MA
Posts: 1149
Good Answers: 151
#7

Re: Google Admits One of Its Self-Driving Cars Has Caused an Accident

03/16/2016 1:33 PM

After passing Driver's Ed, I got my license at the age of 17. From there, it took me about 15 years to gain the experience (and maturity) to become a "good driver". Along the way, it took a few accidents, minor "fender benders", and a couple of close calls to learn my limits as a driver, the limits of my vehicle, and, most importantly, to realize that the other drivers I shared the road with would not necessarily behave as I assumed they would. I was lucky! During my learning years I never caused injury to anyone, myself included.

Just like any teenage driver, it will take some time and a lot of experience in real world driving situations for the computer systems of these cars to mature.

__________________
The older I am, the better I used to be
Guru

Join Date: Apr 2010
Location: About 4000 miles from the center of the earth (+/-100 mi)
Posts: 9198
Good Answers: 1046
#8

Re: Google Admits One of Its Self-Driving Cars Has Caused an Accident

03/16/2016 3:38 PM

Have they programmed in the city driving rule "The other driver will cede the right-of-way if you avoid eye contact". - Oh, right, driverless cars don't have eyes!

Power-User

Join Date: Nov 2015
Posts: 133
#9

Re: Google Admits One of Its Self-Driving Cars Has Caused an Accident

03/16/2016 8:12 PM

"autonomous driving system is truly better and safer than the average human driver." NOT TRUE. Human intelligence is way better than AI.

__________________
formerly Legolaz
Guru

Join Date: Oct 2013
Location: Illinois, 7 county region (The 'blue dot' that drags the rest of the 'red state' around during presidential elections.)
Posts: 3688
Good Answers: 89
#10
In reply to #9

Re: Google Admits One of Its Self-Driving Cars Has Caused an Accident

03/17/2016 9:13 AM

"Human intelligence is way better than AI."

But AI REFLEXES are better than human ones.

When was the last time you pumped your brakes for better stopping in less-than-perfect traction? You've probably even trained yourself OUT of that habit, since the ABS can 'read' the road better than you can, and it can pump the brakes faster, more efficiently, and more precisely for the conditions than you can.
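To see why an ABS controller out-pumps a human foot, consider the control idea behind it: the controller estimates wheel slip on every cycle and toggles brake pressure many times per second. Here is a minimal sketch in Python; the function name, the 20% slip threshold, and the simple release/apply (bang-bang) logic are illustrative assumptions, not how any production controller is actually implemented:

```python
def abs_brake_command(vehicle_speed, wheel_speed, target_slip=0.2):
    """Return 1.0 (apply brake) or 0.0 (release brake) based on wheel slip.

    Slip ratio = (vehicle_speed - wheel_speed) / vehicle_speed.
    Peak braking friction typically occurs at roughly 10-30% slip;
    above that, the wheel is locking up and pressure must be released.
    """
    if vehicle_speed <= 0:
        return 0.0  # vehicle stopped: nothing to modulate
    slip = (vehicle_speed - wheel_speed) / vehicle_speed
    # Release when slip exceeds the target (wheel locking up),
    # reapply when it drops back below -- this is the "pumping",
    # done on every sensor cycle, far faster than a human leg.
    return 0.0 if slip > target_slip else 1.0


# Wheel turning nearly as fast as the car: keep braking.
print(abs_brake_command(30.0, 27.0))  # slip = 0.1 -> 1.0
# Wheel lagging badly (about to lock): release.
print(abs_brake_command(30.0, 20.0))  # slip ~ 0.33 -> 0.0
```

A real controller modulates pressure per wheel with hydraulic valves and runs this decision loop on sensor data sampled hundreds of times per second, which is exactly the reflex advantage being described here.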

__________________
( The opinions expressed in this post may not reflect the true opinions of the poster, and may not reflect commonly accepted versions of reality. ) (If you are wondering: yes, I DO hope to live to be as old as my jokes.)
Power-User

Join Date: Nov 2015
Posts: 133
#11
In reply to #10

Re: Google Admits One of Its Self-Driving Cars Has Caused an Accident

03/17/2016 11:45 AM

I could not agree, because we are not design for braking stuff. And it's not even close to how fast our reflex I/O communicates with the brain, or to the other multi-tasking it does without our having to think. Example, having sex with your wife but with completely focused on different woman in mind.

__________________
formerly Legolaz
Guru

Join Date: Oct 2013
Location: Illinois, 7 county region (The 'blue dot' that drags the rest of the 'red state' around during presidential elections.)
Posts: 3688
Good Answers: 89
#12
In reply to #11

Re: Google Admits One of Its Self-Driving Cars Has Caused an Accident

03/17/2016 12:05 PM

"I could not agree, because we are not design for braking stuff."

Which is kind of my point: we are not optimized for controlling a car, not like the AIs will be once they're properly tuned. We will always be better accordionists than the self-driving AIs, but that's the core of the issue: humans are generalists, "Jack of all trades, master of none, but oft times better than a master of some." The AIs will be specialists, masters of one trade. The jack will never surpass the master in the master's trade, but the jack can do 'mediocre' in all trades, while the master is 'poor' in any trade not his own.

"Example, having sex with your wife but with completely focused on different woman in mind."

And how often do you do that while driving? On second thought, skip it, I don't want to know.

__________________
( The opinions expressed in this post may not reflect the true opinions of the poster, and may not reflect commonly accepted versions of reality. ) (If you are wondering: yes, I DO hope to live to be as old as my jokes.)
Guru

Join Date: Apr 2008
Location: Lexington, KY
Posts: 1639
Good Answers: 72
#13
In reply to #11

Re: Google Admits One of Its Self-Driving Cars Has Caused an Accident

03/18/2016 1:23 PM

I hate to burst your bubble, but automatic braking will be mandatory in all U.S.-made cars in 2020. Multi-tasking is a myth. You may be multiplexing your thoughts, but you only have one brain. It has been shown over and over that when humans multitask, they simply do each task less effectively and accurately than they would have if they had concentrated on a single task at a time.

Maybe quantity is more important than quality to you, but with the multi-core processors available on motherboards, I don't think you will be faster or more efficient than any future AI machine.

What tends to make humans faster than machines is our search algorithm. We think in contextual terms and store memories that way. The early AI machines were sequential and had to run through all kinds of rule testing to determine the correct response. Now, with contextual considerations and multiple processors, you don't stand a chance at being faster.

But it really doesn't matter anyway. With all the lawyers to relieve us of our own responsibility we can always blame the machine for our mistakes. Machines have not yet been taught to complain.

__________________
A great troubleshooting tip...."When you eliminate the impossible, whatever remains, however improbable, must be the truth." Sir Arthur Conan Doyle

Users who posted comments:

70AARCuda (1); adreasler (4); Bayes (1); gringogreg (1); NotUrOrdinaryJoe (3); Peterpipper (2); Rixter (1)
