Fatal Tesla crash involved autopilot, driver inattention


Reports of a fatal 2018 Tesla crash show the limits of autonomous vehicle technology in detecting certain types of deadly obstacles.


Media outlets continue to report findings by the U.S. National Transportation Safety Board showing that an Apple employee, distracted by a smartphone app, allowed his vehicle to strike a barrier in a highway "gore area" (the triangular zone where an exit ramp diverges from the main lanes), a hazard that was not recognized by Tesla Autopilot's collision-avoidance system.


Investigations into the incident also show that this was not the first time Tesla Autopilot had made the same error; on previous occasions, the driver had caught it in time.


Chilling stories like these, rare as they may be, reveal the practical limitations of self-driving vehicles: although they are smart enough to avoid many types of accidents, they are not a replacement for human vision and intuition.

“The use of Tesla’s Autopilot software has been implicated in several crashes,” the BBC reported. “The system lets the car operate semi-autonomously, changing lanes and adjusting its speed. But critics say the ‘Autopilot’ branding makes some drivers think the car is driving fully autonomously.”

Tesla has repeatedly warned drivers to keep their hands on the wheel and stay alert while using Autopilot, but sadly, those warnings have not fully curtailed this type of human error.

Then there’s the issue of regulation and who shoulders the blame for these types of tragic accidents.


“Industry keeps implementing technology in such a way that people can get injured or killed,” NTSB chair Robert L. Sumwalt III said in a press statement. “If you own a car with partial automation, you do not own a self-driving car. Don’t pretend that you do.”

What the NTSB cites as “system limitations” involves the highest of stakes: human life.


What we must take away from the most dangerous applications of automation is the idea that AI does not replace human supervision. It can be a powerful helper to human decision-making, but in the end, humans must remain in the driver’s seat.
