Google on Tuesday said that while its self-driving cars have safely driven more than a million miles, there have been times when humans have had to take over to avoid crashing.
System “anomalies” caused drivers to take the wheel 272 times in California test cars in the 14 months leading up to December, Google said in a report to the California Department of Motor Vehicles.
The test period saw cars travel more than 420,000 miles (676,000 kilometers) across the state.
There were an additional 69 occasions when drivers seized control from automated systems based on their own judgment calls, according to the report.
The most common reason for drivers to intervene was the technology failing to properly perceive a real-world situation, the report indicated.
– ‘Trend looks good’ –
Google then replays these situations in a simulator to determine whether the vehicle would have hit something had the human not taken control, according to Chris Urmson, head of the Internet giant’s self-driving car team.
Simulations determined that 13 of the 69 “driver-initiated disengagements” would have resulted in crashes had the car been left to drive itself, the report indicated.
Two of the incidents involved traffic cones and three were blamed on reckless driving by someone in another vehicle.
Eight of the near-misses took place over the 53,000 miles traveled in California in 2014, while only five happened as the cars logged a hefty 370,000 miles during the 2015 part of the trial, according to Urmson.
“This trend looks good,” he said.
Urmson cautioned, however, that the number could actually rise as Google’s self-driving cars are tested in trickier environments, such as foul weather or heavy traffic.
“On our test track, we run tests that are designed to give us extra practice with rare or wacky situations,” Urmson said.
Engineers also use a powerful simulator to generate scenarios and variations on circumstances.
“Thanks to all this testing, we can develop measurable confidence in our abilities in various environments,” Urmson said.
“This stands in contrast to the hazy variability we accept in experienced human drivers — never mind the 16-year-olds we send onto the streets to learn amidst the rest of us.”
Urmson was not ready to declare self-driving cars safer than those controlled by humans, but believed Google was making progress toward getting them to market.
– Drivers still needed –
California Department of Motor Vehicles officials last month proposed self-driving car regulations that would mandate that a person be able to take the wheel if needed.
California’s rules could set a precedent for self-driving cars, and the proposed regulations were seen as certain to slow the technology’s move into the mainstream.
“DMV got it exactly right and is putting our safety first,” said John Simpson, director of the nonprofit Consumer Watchdog’s Privacy Project.
“How can Google propose a car with no steering wheel, brakes or driver when its own tests” show so many failures, he asked.
Overall Google’s self-driving vehicles have logged more than 1.3 million miles (2 million kilometers), the company said.