Safety advocacy group The Dawn Project aired an ad campaign against Tesla's Full Self-Driving (FSD) system during one of the largest television events of the year, the Super Bowl, as first reported by TechCrunch on Sunday, Feb. 12.

The 30-second ad was viewed by millions of football fans in several markets, including Washington, D.C., Austin, Tallahassee, Atlanta, and Sacramento.

(Photo : ODD ANDERSEN/AFP via Getty Images)
Tesla CEO Elon Musk talks to media as he arrives to visit the construction site of the future US electric car giant Tesla, on September 03, 2020 in Gruenheide near Berlin.

Super Bowl Commercial

The ad lists various alleged safety flaws in Tesla's FSD, which serves as the company's advanced driver assistance system (ADAS).

A Dawn Project representative told CNN that the ad cost $598,000.

In a series of tests conducted by the Dawn Project, the video depicts a Tesla Model 3, allegedly operating in Full Self-Driving mode, striking a child-sized dummy at a school crossing and then a fake infant in a stroller.

The voiceover claims that FSD will "run down a child in a school crosswalk, swerve into oncoming traffic, hit a baby in a stroller, go straight past stopped school buses, ignore 'do not enter' signs, and even drive on the wrong side of the road."

The advocacy group urged the National Highway Traffic Safety Administration (NHTSA) and the Department of Motor Vehicles to disable FSD until all safety issues are resolved. The group also claimed that Tesla's "deceptive marketing" and "woefully inept engineering" pose risks to the public.

The Dawn Project, which produces its own videos as tests of Tesla's purported design problems, claims it seeks to make computer-controlled technologies safer for humans.

In 2022, Tesla responded to these videos, calling them defamatory and saying they could mislead the public in harmful ways.

Read Also: Tesla's Full Self-Driving Beta v11 Update is Delayed Again, Elon Musk Says It Will Be Limited Beta

NHTSA Investigation

The NHTSA announced last December that it was conducting an investigation into reported Tesla crashes.

The agency is currently looking into 41 cases that reportedly involved Tesla's autonomous driving system. One of these incidents allegedly involved an eight-car pileup on the San Francisco Bay Bridge that included a 2021 Tesla Model S.

According to a recent report by the California Highway Patrol, the driver claimed to have been using the vehicle's FSD technology when the crash happened.

Additionally, 14 of the 41 reported crashes resulted in fatalities.

This NHTSA inquiry aims to examine the safety features of Tesla EVs, including automatic emergency braking as well as the Autopilot and FSD beta functions.

Even though many contend that Tesla's self-driving technologies are risky and still in the early stages of development, others are defending the company.

One Tesla FSD user demonstrated his trust in the company's autonomous system in a video by using his own children as stand-in obstacles, confident that the car would stop before causing them harm or injury.

Related Article: NTSB Report Claims Autopilot is Not the Culprit in 2021 Tesla Crash Despite No People Found in Driver's Seat

ⓒ 2024 TECHTIMES.com All rights reserved. Do not reproduce without permission.