[WATCH] Tesla's FSD Is Still Buggy Ahead of Next Month's Wider Beta Release: EV Owners Encounter Problems While Testing

Tesla took driving to the next level with its "Full Self-Driving" (FSD) technology, which Elon Musk's company introduced in October 2020. However, while the system's automation is impressive, many of its features still need improvement.

Videos posted online show electric vehicle drivers running into a range of problems while testing their Teslas. Some reported that their cars steered toward barricades, while others said the FSD system failed to follow simple road rules.

Tesla's FSD Problems Posted Online

Tesla FSD (Screenshot from YouTube/Tesla)

The FSD beta is currently available to Tesla owners who paid for access. By next month, it is set to become accessible to every Tesla owner.

However, videos online have led many people to question whether the technology is safe for users.

On Mar. 12, AI Addict, a YouTube content creator who specializes in reviewing AI products, tested FSD beta version 8.2 on a trip through downtown San Jose. While he was riding in his Tesla Model 3, the vehicle nearly collided with stationary objects along the railroad tracks.

The car also mistakenly turned into a parking garage and even attempted to cross into the wrong lane. If it were not for the driver's quick responses, he could have ended up in an accident.

Another video, uploaded by Chuck Cook, who owns a Tesla Model Y, displayed other flaws in the FSD technology. For his experiment, Cook wanted to check whether the beta could safely make left turns.

However, there were multiple instances when Cook was forced to take over and brake so that he would not crash into other cars on the road.

Last year, Center for Auto Safety executive director Jason Levine said that Tesla was simply using car owners as "guinea pigs" to test its product, the organization noted.

Tesla's FSD is a more advanced version of the company's Autopilot feature. The electric cars use several sensors and cameras to track road lanes, and FSD extends Autopilot's capabilities with additional features such as stop sign recognition, traffic light recognition, and self-parking, to name a few.

Full self-driving relies on complex technology intended to let the car operate automatically without human interaction. However, as the earlier videos show, handling obstacles and other routine road tasks still appears difficult for the system.

Tesla's Issue With Autopilot Abuse

According to a report by Business Insider, Tesla was criticized last fall over abuse of its Autopilot feature. Earlier this March, the National Highway Traffic Safety Administration (NHTSA) confirmed it was investigating multiple crashes reportedly involving Tesla's Autopilot.

In some of those cases, the cars reportedly ran into parked vehicles. There are also reports of Tesla drivers abusing Autopilot by sleeping inside the vehicle while it drove.

Because of this, Levine recommended that Tesla cars have built-in cameras to monitor the driver's activity inside the vehicle. Currently, Tesla only monitors whether the driver is keeping control of the steering wheel.

He added that terms such as Full Self-Driving and Autopilot could mislead users, since neither feature is truly autonomous. Both still require the driver to stay alert so that accidents can be avoided.

In light of this, Tesla has encouraged owners to use the technology responsibly by driving safely and paying attention to road signs and the other rules of the road.

This article is owned by Tech Times.

Written by Joen Coronel

ⓒ 2024 TECHTIMES.com All rights reserved. Do not reproduce without permission.