A car without a driver can never be as safe as one with a human decision-maker calling the shots. After all, a machine could never have the experience that you have gained from years of driving. However, contrary to what you may think, humans are actually pretty terrible at not getting into accidents.
Are self-driving cars safer than humans?
Even if we can establish that self-driving cars are safer, i.e., that they get in fewer accidents, there's still the tricky question of who's to blame when they do get in accidents. The issues of autonomous cars' safety and the ethics behind driverless machines need to be addressed before the technology can become commonplace in today's society.
To address these concerns, let's start by looking at how safe autonomous cars really need to be to beat out humans. Research published by the U.S. Department of Transportation National Highway Traffic Safety Administration (USDOT NHTSA) found that 94% of all car crashes that occurred between 2005 and 2007 were the result of driver error.
RELATED: THE SIX LEVELS OF AUTONOMOUS DRIVING AND THE FUTURE OF AUTONOMOUS CARS IN CHINA
That is to say, nearly all car crashes across the US could be eliminated if the drivers involved had been paying more attention and reacting accordingly. So, it is beginning to look like autonomous vehicles don't have to be perfect; they just need to prevent more accidents than humans can.
Looking into some of the data gathered from the many self-driving car projects across the world, we can begin to get a grasp on how autonomous vehicles are faring. In February 2016, Google's autonomous car hit a bus, marking only one of a dozen crashes since the project began.
While Google's program is obviously still in development, any self-driving crash sparks concerns among the general public. Somewhat ironically, Google's spokeswoman said this about the self-driving car program back when the car had never once been at fault in a crash:
"We just got rear-ended again yesterday while stopped at a stoplight in Mountain View. That's two incidents just in the last week where a driver rear-ended us while we were completely stopped at a light! So that brings the tally to 13 minor fender-benders in more than 1.8 million miles of autonomous and manual driving — and still, not once was the self-driving car the cause of the accident.”
Even after these accidents, however, the track record of self-driving cars remains far better than that of human drivers. The difficulty is that there currently isn't much data to draw conclusions from.
Elon Musk, the CEO of Tesla, which is admittedly one of the leaders in the autonomous car industry, has said that eventually, self-driving cars will be so safe that regulators will have to determine whether to ban manual driving. Looking back to the Google self-driving car project, which is the most extensive study of self-driving technology, we can begin to look at some raw data.
Google's autonomous vehicles were involved in 13 fender benders over 1.8 million miles of driving for the project, and not a single one was the car's fault. From this data, we can see that self-driving cars are dramatically safer than their human-driven counterparts. However, because self-driving technology is still emerging, there unfortunately isn't much hard data to go on beyond these early results.
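The Google figures can be turned into per-mile rates with a quick back-of-envelope calculation. The incident count and mileage come from the project data quoted above; this is a sketch, not a rigorous study.

```python
# Back-of-envelope incident rates for the Google project (sketch only).
google_incidents = 13     # fender-benders reported by the project
google_at_fault = 0       # none were the self-driving car's fault
google_miles = 1.8e6      # total autonomous and manual miles driven

total_rate = google_incidents / (google_miles / 1e6)    # incidents per million miles
at_fault_rate = google_at_fault / (google_miles / 1e6)  # at-fault incidents per million miles

print(f"All incidents: {total_rate:.1f} per million miles")
print(f"At-fault incidents: {at_fault_rate:.1f} per million miles")
```

The figure that matters for the safety argument is the at-fault rate, which in this data set is zero; the overall rate mostly reflects other drivers rear-ending a stopped car.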
Looking at the industry as a whole, self-driving cars are involved in a far smaller share of accidents than their human-driven counterparts. It would appear that self-driving cars are safer than human-driven ones.
What regulation will be required for self-driving cars?
Now that we have seen that self-driving cars appear to be safe, how much testing would be needed to prove that they are? In order for the government and other regulating agencies to determine an acceptable standard of autonomous safety, there has to be some testing method.
NHTSA is working with road safety methods developed in the 1960s, whose creators never even considered humans being removed from the equation. According to some researchers, given the current safety determination methods, it would take many years of driving to determine whether even one version of a self-driving car was safe.
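A minimal sketch of why it would take so long uses the statistical "rule of three": if zero fatal events are observed over n miles, the 95% upper confidence bound on the per-mile fatality rate is roughly 3/n. The human fatality rate below is an assumed round figure for illustration, not an official statistic.

```python
# Rule-of-three sketch: miles needed to statistically bound fatality risk.
# With zero fatalities over n miles, the 95% upper confidence bound
# on the per-mile fatality rate is approximately 3 / n.
human_rate = 1.1 / 100e6  # ASSUMED: ~1.1 fatalities per 100 million miles

# To claim the autonomous rate beats the human rate, we need 3/n < human_rate,
# i.e. n > 3 / human_rate:
miles_needed = 3 / human_rate
print(f"~{miles_needed / 1e6:.0f} million fatality-free miles needed")
```

Hundreds of millions of fatality-free test miles is far beyond what any fleet has driven, which is why researchers argue the traditional drive-until-proven-safe approach doesn't scale to autonomous vehicles.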
This sparks further discussion about the need for regulating agencies to overhaul their safety determination methods in order to prepare for fleets of computer-driven cars.
RELATED: THIS GAME MAKES YOU TRY AND SOLVE THE TYPES OF SITUATIONS THAT AUTONOMOUS CARS WILL HAVE TO IN THE FUTURE
So far, all the data gathered suggests that self-driving cars are much safer than human-driven cars, but a lot of work remains to prove it. Self-driving cars also have an inherent advantage: their sensors and response times are electronic and mechanical, making them by nature faster than human eyes and reflexes.
Through a variety of sensing technology and creative programming, self-driving cars will be able to overcome virtually every obstacle they may face on the road. So, self-driving cars are safe because, frankly, it doesn't take much to be a better driver than humans.
All that said, the major hurdle isn't really the data proving that self-driving cars are safer than human drivers; we already have that. The major hurdle is establishing public trust in autonomous driving systems and developing regulation so that automakers aren't sued for millions every time an autonomous vehicle gets in a crash.
Automakers will be hesitant to roll out autonomous tech if there isn't government regulation giving them a clear path forward, largely for fear of litigation from the public. Public trust in autonomous driving is also essential: ultimately, people will only buy these cars if they are willing to trust them with their lives.
So, statistically speaking, autonomous cars are safe, and in fact already much safer than human drivers. We're still likely 5 to 10 years away from this technology being regulated in a way that makes it prominent on our roadways, though.