On April 19, Elon Musk spoke up about the recent Tesla Autopilot crash that claimed the lives of two men in Texas on Saturday night, with initial findings pinning the accident on the driverless feature. The CEO and another Twitter user explained how Autopilot works and the safety features that prevent it from engaging with no one in the driver's seat.
The Tesla driverless accident in Spring, Texas has been one of the biggest stories this week, especially as the electric vehicle caught fire and both of its passengers died. The crash came shortly after the release of Tesla's Q1 safety report, which claimed that the current version of Autopilot is 10 times safer than the versions that came before it.
It also comes ahead of the wider release of Tesla's Full Self-Driving (FSD) public beta, with Musk revealing that the "FSD Beta Button" would arrive sometime this summer. Additionally, the FSD subscription service is set to begin in May, less than two weeks away, slowly introducing the feature to the public.
Elon Musk Denies Tesla Autopilot Caused the Crash
While the news was spreading after local authorities in Texas made the discovery, the electric vehicle manufacturer and clean energy company did not release any statement regarding the accident. However, it appears the company investigated the crash on its end, looking for signs of whether Autopilot was engaged or whether the owners were subscribed to FSD.
The Tesla CEO replied to a tweet by a Tesla fan that laid out the data and parameters governing when Autopilot can and cannot be engaged in a "driverless" situation. On Twitter (@elonmusk), Musk agreed with the tweet by a user named "Ahmad A Dalhat" (@Amart15416132) explaining how Autopilot functions.
Additionally, Musk condemned the Wall Street Journal report that pointed to Tesla Autopilot as the cause of the recent crash that claimed the lives of the two passengers. The CEO even went as far as saying that the individual's research was better than the WSJ's, as Dalhat gave a detailed look at how Autopilot functions and engages.
How Does Autopilot Work, and Why Could It Not Be the Cause?
Tesla Autopilot is said to be more than a self-driving feature for roads and highways; it is also a safety feature that the company includes on all of its electric vehicles. Elon Musk revealed that the company retrieved the data logs from the crashed EV, which showed that Autopilot was not engaged at the time of the crash and that the owner had not purchased FSD.
According to Dalhat's explanation (which Musk agreed with), Tesla Autopilot cannot turn on if its weight sensors detect that no one is in the driver's seat, and it requires hands on the wheel roughly every 10 seconds to stay engaged. Additionally, Autopilot cannot exceed speed limits, which strongly argues against the claim that the feature was at fault for how the accident happened. The sketch below illustrates these conditions.
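To make the described conditions concrete, here is a minimal, hypothetical Python sketch of the checks Dalhat outlined: a driver's-seat weight sensor, a hands-on-wheel timer, and a speed-limit cap. The names and structure are purely illustrative assumptions and do not represent Tesla's actual software.

```python
# Illustrative sketch only: a simplified model of the engagement checks
# described in Dalhat's tweet. All names and values here are hypothetical
# and are not taken from Tesla's software.
from dataclasses import dataclass

HANDS_ON_WHEEL_TIMEOUT_S = 10  # per the tweet: hands required roughly every 10 seconds


@dataclass
class VehicleState:
    driver_seat_occupied: bool          # weight sensor in the driver's seat
    seconds_since_hands_on_wheel: float  # time since the wheel was last touched
    requested_speed_kph: float
    posted_speed_limit_kph: float


def autopilot_may_engage(state: VehicleState) -> bool:
    """Return True only if the preconditions described in the tweet are met."""
    if not state.driver_seat_occupied:
        return False  # no weight detected in the driver's seat
    if state.seconds_since_hands_on_wheel > HANDS_ON_WHEEL_TIMEOUT_S:
        return False  # driver has not touched the wheel recently enough
    return True


def allowed_speed_kph(state: VehicleState) -> float:
    """Autopilot is described as never exceeding the posted speed limit."""
    return min(state.requested_speed_kph, state.posted_speed_limit_kph)


if __name__ == "__main__":
    empty_seat = VehicleState(
        driver_seat_occupied=False,
        seconds_since_hands_on_wheel=0.0,
        requested_speed_kph=120.0,
        posted_speed_limit_kph=100.0,
    )
    print(autopilot_may_engage(empty_seat))  # False: nobody in the driver's seat
    print(allowed_speed_kph(empty_seat))     # 100.0: capped at the posted limit
```

Under these assumptions, an empty driver's seat alone would be enough to keep the feature from engaging, which is the core of the argument Musk endorsed.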
Related Article: Elon Musk Says Tesla FSD is Ready for Beta v. 9.0 to Improve on Cornering and Bad Weather, Self-Driving Coming to Boring Loop Tunnel EVs
This article is owned by Tech Times
Written by Isaiah Alonzo