Lack of 'safeguards' in Tesla's Autopilot contributed to fatal crash

A Tesla logo hangs on a building outside a Tesla dealership in New York, U.S.

The driver's "overreliance" on the Tesla system - designed as a semi-autonomous driving system to be used with a human operator - permitted "prolonged disengagement" that led to the collision with the freight trailer, the National Transportation Safety Board report said.

Reuters reported Monday that the NTSB is expected to find that the system was a contributing factor.

The fatal incident raised questions about the safety of systems that can perform driving tasks for long stretches with little or no human intervention, but which cannot completely replace human drivers. On some roads, drivers could use Autopilot at speeds of up to 90 miles (145 km) per hour, the report said.

The board recommended that automakers incorporate safeguards that keep drivers' attention engaged and that limit the use of automated systems to the conditions for which they were designed.

"We will also continue to be extremely clear with current and potential customers that Autopilot is not a fully self-driving technology and drivers need to remain attentive at all times", Tesla said.

In the May 7, 2016 crash that killed 40-year-old Joshua Brown in Florida, the NTSB determined Autopilot worked as designed before Brown's Model S plowed into a truck.

Tesla in September 2016 unveiled improvements to Autopilot, putting new limits on hands-off driving and adding other features that its chief executive officer said likely would have prevented the fatal crash.

The board's other recommendations centered on data collection and on designs for determining whether drivers are actually paying attention behind the wheel.

An NTSB investigator testified Tuesday that "collision mitigation systems" do not reliably detect cross traffic.

Monitoring driver attention by measuring contact with the steering wheel "was a poor surrogate for monitored driving engagement", the board said.

Brown had his hands on the sedan's steering wheel for only 25 seconds out of the 37.5 minutes the vehicle's cruise control and lane-keeping systems were in use prior to the crash, investigators found.

A statement from Brown's family also praised Tesla for improving its Autopilot software after the accident, describing the changes as a direct result of the crash.

"Nobody wants tragedy to touch their family, but expecting to identify all limitations of an emerging technology and expecting perfection is not feasible either", the statement said.

As NHTSA found, the Automatic Emergency Braking feature on the Tesla - in common with the AEB systems fitted to just about every other make of car - was not designed in a way that could have saved Brown's life.

"People die every day in auto accidents", the statement said.

A spokeswoman for Tesla and a lawyer for the family, Jack Landskroner, have declined to say whether the automaker has reached a legal settlement with the Brown family.

In a report set to be released in the next few days, the NTSB concludes that the accident was the fault of both drivers and has issued a series of recommendations to the Department of Transportation (DOT), the National Highway Traffic Safety Administration (NHTSA), manufacturers of Level 2 automated driving systems, the Alliance of Automobile Manufacturers and Global Automakers.

The NTSB report did find that a vehicle-to-vehicle communication system (V2V) could have alerted both vehicles to the potential danger, but as we have discussed ad nauseam, V2V is still absent from new cars even though the spec is nearly 20 years old.

Two minutes before the crash, according to reports, Brown had set the cruise control at nearly 10 miles per hour above the posted speed limit.
