Programmers and engineers at Google and the GM-Carnegie Mellon Autonomous Driving Collaborative Research Lab, among other institutions, are finding that driverless cars are too good at what they do, and that this contributes to a crash rate double that of cars with human drivers. A robot driver that obeys the law to the letter every time doesn't mesh well with human drivers who don't. A driverless car will travel at the speed limit on a busy highway, for example, while everyone else is going well above it, or wants to, which increases the probability of a crash. A driverless car's reflexes are also faster than a human's, and its sudden, precise reactions can catch human drivers off guard.
Though all crashes have been minor and none of them the fault of a driverless car, researchers are of course debating what to do about the situation. One possibility: programming the vehicles to behave more like humans and better fit into the "social game" (as Google describes it) that is driving, even if that means making them a little less lawful.
"It's a sticky area," says Brandon Schoettle, co-author of a study by the University of Michigan's Transportation Research Institute. "If you program them to not follow the law, how much do you let them break the law?"
Schoettle points out that while crashes may increase as more driverless cars are put on the road, injuries will decrease. Sergeant Saul Jaeger, head of the Mountain View Police Department's traffic enforcement unit, says he expects crashes to decrease as well, as the public becomes more accustomed to driverless cars and as programmers get better at programming them.