Safety Measures for the Future: Self-Driving Cars Have to Be Safer Than Human Drivers

You might think the big debate around self-driving cars would be about the technology, but you would be wrong. The story is really about philosophy: how safe is safe enough? With bigger players entering the market, the industry's global value is an estimated $54 billion and is expected to grow tenfold over the next seven years. Technical issues are not the only ones we have to deal with, because there are moral issues too.

In the US there is roughly one death for every 100 million miles driven, which means human driving is already statistically very safe. Self-driving cars would have to do better than that to answer the moral question.
Testing safety is a challenge too. Gathering the data to prove that they are safer than human drivers would require millions, or even billions, of miles to be driven. Waiting for that much real-world mileage to accumulate is prohibitively slow and expensive.
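
To put a number on that, here is a rough back-of-the-envelope sketch. It assumes fatalities follow a Poisson process and takes the article's one-death-per-100-million-miles figure as the human baseline; the function name and the 95% confidence level are illustrative choices, not anyone's official methodology.

```python
import math

# Human baseline from the article: roughly one death per
# 100 million miles driven in the US.
HUMAN_RATE = 1 / 100_000_000  # fatalities per mile

def miles_for_zero_fatality_demo(confidence: float = 0.95) -> float:
    """Miles a fleet must drive with zero fatalities before we can
    claim, at the given confidence, that its true fatality rate beats
    the human baseline. Under a Poisson model,
    P(0 fatalities in m miles) = exp(-rate * m),
    so we solve exp(-HUMAN_RATE * m) <= 1 - confidence for m.
    """
    return -math.log(1 - confidence) / HUMAN_RATE

if __name__ == "__main__":
    m = miles_for_zero_fatality_demo()
    print(f"~{m / 1e6:.0f} million fatality-free miles needed")
    # prints: ~300 million fatality-free miles needed
```

Even in this best case, the answer is about 300 million fatality-free miles, and demonstrating a modest improvement, rather than bounding the rate from zero events, pushes the requirement into the billions. That is why companies lean so heavily on fleet data and simulation.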

Tesla, for example, is already using the data that customer vehicles running its driver-assistance software gather on public roads. Waymo relies heavily on computer simulations to fast-track data collection and feed the results back into its cars.

Elon Musk may have promised a million fully self-driving cars by next year, but you should probably not hold your breath. Tesla is making large strides in this endeavor, and maybe it will work eventually, but the reality today is very different: a human will still need to be present until this question of safety is answered.

In human driving there is clearly room for improvement, given the deaths that occur, but when it comes to how driverless we want cars to be, the answer is elusive. Also, who will be responsible when something goes wrong? Whom do we hold accountable when a life is lost because one of these cars crashed?
Some studies from reputable researchers have found that the sooner automated cars are deployed, the safer we will ultimately be. This is based on the fact that human performance has a ceiling while machines can keep improving every year. The question of responsibility has already proven hard to answer: Tesla's fatal crashes have led to lawsuits, and the matter of who is liable has been debated with no real resolution.

There is also the question of what choice a car should make when faced with a tough decision. If an accident is unavoidable, should a self-driving car veer onto a pedestrian-filled sidewalk, or run into a pole, which might pose more danger to the people in the vehicle? How do we harm fewer people in scenarios like this one? That is the question posed by the Moral Machine, an MIT Media Lab project.
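
The purely utilitarian answer is easy enough to write down, which is exactly what makes it unsatisfying. Below is a deliberately toy sketch of the "minimize expected harm" rule; the option names and harm scores are invented for illustration, and nothing here reflects any vendor's actual decision logic.

```python
from dataclasses import dataclass

@dataclass
class Option:
    name: str
    expected_harm: float  # invented, illustrative units

def least_harm(options: list[Option]) -> Option:
    # The utilitarian rule: pick whichever action minimizes expected
    # harm. The one-liner is trivial; agreeing on the harm scores,
    # and on who gets to set them, is the contested part.
    return min(options, key=lambda o: o.expected_harm)

choice = least_harm([
    Option("veer onto the crowded sidewalk", expected_harm=5.0),
    Option("hit the pole, endangering the passengers", expected_harm=2.0),
])
print(choice.name)  # -> "hit the pole, endangering the passengers"
```

The decision rule is not the hard part; assigning those harm numbers, and deciding whose lives they weigh, is precisely the debate the Moral Machine surfaces.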
Until these questions can be answered, self-driving cars will most likely face challenges from all directions when it comes to what is moral and what is not.