Analysis

Google's robot cars are not reliable without humans

13th January 2016
Enaie Azambuja

Google's self-driving cars aren't perfect, but the company maintains they are still safer than human drivers. Google has revealed that its human drivers had to take control of its robotic vehicles 341 times in the last 14 months, over a total of 424,999 miles (roughly 684,000 km) of driving.

As part of the agreement to test self-driving cars in California, each company involved - so far Bosch, Delphi, Google, Nissan, Mercedes-Benz, Tesla and Volkswagen Group - was required to report "disengagement" incidents: occasions when a human had to take control of the car, either because the driver intervened or because the car told the driver it was switching to manual mode.

The cars experienced 272 failures, and Google has said that, had its drivers not taken over, the cars would have crashed on 13 occasions.

Self-driving cars aren't perfect yet, but the number of times humans are needed is decreasing. According to Ars Technica, the available reports show Google had the most disengagements with 341, although it also drove the most miles; Nissan's autonomous vehicles drove 1,485 miles and disengaged 106 times; Delphi drove 16,662 miles with 405 human takeovers; and Tesla didn't report any.

However, Google said it is only reporting the "significant" occasions when humans were forced to take over; in reality, it said, there are "many thousands" of instances where drivers take control.

"Safety is our highest priority and Google test drivers are trained to take manual control in a multitude of situations, not only when safe operation 'requires' that they do so," the company said in its report covering September 2014 to November 2015.

Why humans had to take control of Google's self-driving cars

The majority of Google's disengagements came after the car detected a technology failure within itself, ranging from sensor malfunctions to software problems. On 23 occasions humans had to take over because of "a recklessly behaving road user", and eight times they had to "disengage for incorrect behaviour prediction of other traffic participants".

"Disengagements are a critical part of the testing process that allows our engineers to expand the software’s capabilities and identify areas of improvement," Google said.

"Our objective is not to minimise disengagements; rather, it is to gather, while operating safely, as much data as possible to enable us to improve our self-driving system."

© Copyright 2024 Electronic Specifier