More human than human —

Google’s cars need to drive less like robots and more like humans

It turns out that only robots strictly adhere to the rules of the road.

A Google self-driving car.

Google has spent several years teaching robots how to drive perfectly. The Wall Street Journal reports that the company now has to teach the cars to drive more like humans instead — breaking speed limits and aiming for corner apices. The company's self-driving fleet is reasonably safe, with just 16 accidents since 2009.

But the vast majority of those accidents were human drivers rear-ending the Google car, something that many in the field attribute to the cars' very conservative programming. The self-driving cars will slow down or stop based on cues from passengers or other road users — cues that human drivers would simply ignore.

A Google car was paralyzed by a cyclist doing a track stand — balancing in place on his fixed-gear bike — at a stop sign in Austin last month. The cyclist's rocking back and forth to stay upright fooled the car into thinking he was moving, so the car refused to proceed. And Google had to change the cars' programming with regard to crossing double yellow lines after discovering that the autonomous vehicles would otherwise sit permanently behind vehicles double-parked in the road.
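The track-stand standoff can be understood as a simple failure mode of overly cautious logic. The sketch below is purely illustrative — it is not Google's actual control code, and the threshold value is an invented assumption — but it shows how a rule of "stay stopped while any nearby road user shows detectable motion" deadlocks against a cyclist whose speed readings oscillate around zero without ever settling:

```python
# Illustrative sketch only — NOT Google's actual control logic.
# Hypothetical rule: proceed only when every nearby road user
# appears stationary within a small sensor-noise tolerance.

SPEED_NOISE_THRESHOLD = 0.05  # m/s; invented jitter tolerance

def may_proceed(detected_speeds):
    """Return True only if all nearby road users appear stationary."""
    return all(abs(v) <= SPEED_NOISE_THRESHOLD for v in detected_speeds)

# Genuinely stopped traffic: readings hug zero, so the car proceeds.
print(may_proceed([0.0, 0.01]))  # True

# A track-standing cyclist rocks back and forth: speed readings
# oscillate around zero but rarely fall inside the tolerance,
# so the car never decides it is safe to move.
track_stand_readings = [0.3, -0.25, 0.2, -0.3, 0.28]
print(may_proceed(track_stand_readings))  # False — the car stays put
```

A human driver resolves the same situation by inferring intent — the cyclist is holding position, not entering the intersection — which is exactly the kind of judgment the conservative rule above lacks.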
