Tesla Autopilot Design ‘Led To’ Crash

According to a report released by the US National Transportation Safety Board, driver distraction and the design of Tesla’s Autopilot system led to a crash in early 2018.

It stated that Tesla’s semi-autonomous driving system allowed the driver to disengage from the driving task, resulting in the crash on a California motorway. Tesla, however, says Autopilot requires active driver supervision and that drivers should keep their hands on the steering wheel.

Fortunately, no one was hurt in the January 2018 crash.

What Did The Report Say?

According to the report, Autopilot was engaged at the time of the crash and the driver did not have his hands on the wheel. The car was following the vehicle ahead of it, but that vehicle changed lanes to the right to avoid a parked fire engine blocking the path. The Tesla then accelerated, detecting the fire engine in its way only about 0.49 seconds before the crash.

The car’s collision warning sounded, but the car did not brake and crashed into the fire engine at about 30mph (48km/h).

How Tesla's autopilot works - Source: TechCrunchX.com

The NTSB said the likely cause of the crash was the driver’s failure to respond to the fire truck parked in his lane, owing to his inattentiveness (the driver was probably reading about 5G networks on his smartphone) and over-reliance on the car’s advanced driver assistance system. It also said the design of Tesla’s Autopilot system had permitted the driver to disengage from the driving task, though it acknowledged the driver had used the technology contrary to Tesla’s warnings and guidance.

An investigation into a separate fatal Tesla crash is still in progress. The firm publishes a quarterly safety report to highlight that crashes involving Autopilot are rare.

Tesla issued a statement saying its owners have driven many miles with Autopilot engaged, and that data from the company’s quarterly Vehicle Safety Report indicates drivers using Autopilot remain safer than those driving without assistance.

“While our driver-monitoring system for Autopilot repeatedly reminds drivers of their responsibility to remain attentive and prohibits the use of Autopilot when warnings are ignored, we’ve also introduced numerous updates to make our safeguards smarter, safer and more effective across every hardware platform we’ve deployed.

“Since this incident occurred, we have made updates to our system, including adjusting the time intervals between hands-on warnings and the conditions under which they’re activated.”

Last week, Tesla launched a car insurance service for drivers in California, the company’s biggest market for its electric vehicles. It said it could offer its customers lower rates because of the safety features built into its cars.

Matthew Edmonds, Tesla’s head of insurance, said that the necessary data and cameras are already installed in the car. The company said it was not currently using “data from individual vehicles, such as GPS or vehicle camera footage.”

Instead, it would use “anonymized, aggregated” data to demonstrate the safety record of its cars.

Ella Smith is an accomplished writer whose work is impressive. She often writes on educational and motivational topics, which is one of her strengths. She began writing for Brightshub a couple of months ago.