Self-driving cars seem like something straight out of a movie set in the future, but they are very much a reality in 2016. These vehicles are supposed to create a world with no car accidents and no injuries, keeping everyone on the road safe. In reality, however, these self-driving vehicles have been involved in more than double the number of accidents of human drivers.
The purpose of self-driving vehicles was to ensure the cars on the road obeyed the law at all times – with no exceptions. This may seem like the correct way to avoid accidents until you consider the reality factor. On a jam-packed highway full of chaotic drivers, you cannot always obey the rules of the road; sometimes you need to react in a way that technically breaks a law in order to avoid a collision with another car. This is the issue these self-driving vehicles are having, and the programmers working on them now face a dilemma: should they train a vehicle to break the law when it is necessary – and how do they train vehicles to break the law only when it is necessary to do so?
So far, programmers have decided it is not worth the risk to allow vehicles to break the law. Instead, their vehicles remain at the posted speed limit regardless of whether other vehicles are driving faster.
Another issue with these self-driving vehicles is merging onto a highway full of traffic. While the vehicles have a 360-degree view of their surroundings, they cannot judge whether other drivers will give them enough room to merge safely; as a result, the vehicle slows down out of caution, and the human driver inside must take over to get the vehicle into traffic.
Accident rates for driverless vehicles are twice as high as those for regular cars, according to a news report in the Insurance Journal. The driverless vehicles, however, were never at fault in those accidents. They are typically struck from behind in slow-speed crashes by drivers who are aggressive or too inattentive to notice that the vehicle ahead is actually driving the speed limit. So, in truth, these vehicles may have more accidents, but only because other drivers who disobey the law lack the patience to share the road with them.
One thing programmers have yet to address is how to program their vehicles to handle life-or-death situations on the highway – something a human driver can assess and react to. While such reactions can cause accidents themselves, they can also prevent deadly accidents from occurring.
Google has been thoroughly testing its self-driving cars for some time – and one has even been pulled over by police. A Mountain View police officer noticed a long line of cars stuck behind a self-driving vehicle and pulled it over. He did not issue a ticket, since the vehicle – not the human occupant – was driving. But the fact that the vehicle was traveling 24 miles per hour in a 35-mile-per-hour zone created a hazard of its own. Just like a human driver, the self-driving vehicle should have pulled over and allowed other vehicles to pass. Holding up traffic and creating a stacked line of cars only increases the hazards on the road.
Right now, self-driving vehicles are in their test phase, and with all of these issues, no one can say for certain whether there will be more or fewer accidents once self-driving cars are sold to the public.
If you or a loved one was injured in a car accident, contact the attorneys at Malman Law today. When drivers act negligently, you may be entitled to compensation for your injuries. Call us now at (888) 625-6265 to schedule a free consultation, or fill out an online contact form with your questions.