Google recently published data gathered from experiments on the self-driving cars it is developing. The data included 11 minor crashes over the past six years, leading many to question whether the autonomous vehicle is a flawed idea.
According to an interview with one of the researchers at the Partners for Advanced Transportation Technology at the University of California, Berkeley, most of the crashes were linked to errors committed by human drivers and could have been prevented.
Despite this, experts are confident that they will be able to work within the time frame they originally set. They also added that while some cars today already have some degree of automation in their controls, much work remains to develop trustworthy cars that do not need human drivers.
Shladover told Live Science: “If you want to get to the level where you could put the elementary school kid into the car and it would take the kid to school with no parent there, or the one that’s going to take a blind person to their medical appointment, that’s many decades away.”
According to experts, several problems prevent developers from producing a successful model that is completely free from road mishaps.
The first issue is the need for better software. The United States is perhaps one of the safest countries to drive in. Even so, autonomous cars must drive much more safely than humans do, according to Shladover. The software these cars currently use is simply not as powerful as developers would like, making it hard to ensure that safety is sustainable.
The next issue is the need for better maps.
While the test location, in the mountains of California, is considerably difficult terrain to drive in, the environment is still very controlled. The cars' success there is due to a detailed map loaded into their systems, which makes it easier for the computer to follow driving patterns. The developers have yet to test these cars on major highways, where traffic is denser and demands heavier programming.
Another problem they are encountering is the inability of their systems to imitate human ethics. For example, in a life-or-death situation, a driver may be forced to decide whether to turn right and kill pedestrians or turn left and cause substantial but non-deadly damage to property. These are situations that developers have yet to resolve.