Tesla’s Autopilot feature allows Model S drivers to sit back and relax while their car maintains a safe distance from the vehicle ahead and even changes lanes by itself. It’s impressive technology, but recently a driver died after his Model S collided with a semitrailer. The semi apparently cut the driver off, and the car’s camera was unable to distinguish the white side of the trailer against the brightly lit sky. The car struck the semi, went under it, and continued on for some distance before stopping.
What Does This Mean for the Future of Self-Driving Cars?
Honestly, it could mean quite a lot. Tesla apparently tried to keep the incident quiet, and it only became public after a US government agency got involved in the investigation. This is, no doubt, bad for business and will probably change the driving habits of thousands of Model S owners who use the feature.
Tesla immediately released a lengthy statement titled “A Tragic Loss,” in which it tried to reassure the public of the car’s safety.
The whole situation raises an important, even slightly philosophical, question:
Are Self-Driving Cars Dangerous?
Well, if you look at the data in Tesla’s statement and apply some common sense, they’re actually safer. Somewhere between 50 and 250 people die in car accidents every day in the United States, and over 3,500 every day worldwide; one person has died in an automated car. Of course, there are far more conventional cars on the road, so on its own that comparison doesn’t prove anything.
However, Tesla claims that its cars have driven over 130 million miles on Autopilot with one death, while conventional driving averages one death for every 60 million miles. By that measure, self-driving cars are roughly TWICE as safe as normal ones. Or put another way, driving yourself is about twice as dangerous as being driven by your automated car. So, are you ready to flip on Autopilot? Probably not; in fact, an AAA poll found that 75% of Americans are not ready to trust self-driving cars.
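The “twice as safe” claim is easy to check with the article’s own figures. A minimal back-of-envelope sketch in Python, converting both numbers to the common road-safety unit of deaths per 100 million miles (the mileage figures are Tesla’s, not independently verified):

```python
# Figures quoted above: one death in ~130 million Autopilot miles
# vs. one death per ~60 million miles for conventional driving.
AUTOPILOT_MILES_PER_DEATH = 130_000_000
CONVENTIONAL_MILES_PER_DEATH = 60_000_000

# Normalize to deaths per 100 million miles driven.
autopilot_rate = 100_000_000 / AUTOPILOT_MILES_PER_DEATH
conventional_rate = 100_000_000 / CONVENTIONAL_MILES_PER_DEATH

ratio = conventional_rate / autopilot_rate
print(f"Autopilot:    {autopilot_rate:.2f} deaths per 100M miles")
print(f"Conventional: {conventional_rate:.2f} deaths per 100M miles")
print(f"Driving yourself is {ratio:.1f}x more dangerous")
```

The ratio works out to about 2.2, which is where the “twice as dangerous” framing comes from, though a single Autopilot fatality is far too small a sample for a statistically firm rate.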
So Why is Everyone So Scared?
Here’s the philosophical part: we would rather put our lives in our own hands than in a machine’s, even when we’re the more dangerous option. For some reason, we expect machines to be perfect 100% of the time, and when they’re not, we revert to trusting ourselves.
If a calculator gave you the wrong answer one day, would you trust that calculator again for something extremely important? Probably not; you’d bust out the old pen and paper and do it yourself.
This is the problem with self-driving cars – we have an alternative that puts us in control. We can’t fly our own planes or conduct our own trains, but we can drive our own cars. Sadly, the truth is that there will always be deaths and crashes where driverless cars are at fault, and somehow we need to get over this.
The question we need to ask ourselves is: would you rather crash because you ran a yellow light, or because of a typo in a computer algorithm?