Self-driving cars are not perfect. Just like human drivers, they need to be aware of their surroundings to avoid road hazards and travel safely.
The most advanced autonomous vehicles frequently employ lidar, a spinning, radar-like sensor that serves as the vehicle's eyes. The lidar continuously reports the car's distance from surrounding objects so the car can decide which actions are safe to perform.
However, researchers found that these eyes can be tricked by lasers.
Laser Tricking Autonomous Cars
According to a new study, precisely timed laser beams directed toward an approaching lidar system can produce a blind spot in front of the car that is sufficiently large to obscure moving pedestrians and other impediments.
Because the deleted data puts everything inside the attack's blind spot in peril, the attack could give self-driving cars a false sense that the road ahead is clear.
The researchers said that this marks the first time that lidar sensors have been deceived into deleting data about obstacles.
Researchers from the University of Florida, the University of Michigan, and Japan's University of Electro-Communications discovered the vulnerability. They also proposed improvements that could eliminate the flaw and shield people from malicious attacks.
Similar to how a bat utilizes sound echoes for echolocation, lidar uses laser light to compute distances and then records the reflections. The attack confuses the sensor by producing phony reflections.
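The basic principle the sensor relies on is time of flight: a laser pulse travels to an object and back, and the distance is half the round-trip distance of light. A minimal sketch of that arithmetic (an illustrative example, not code from the study):

```python
# Time-of-flight distance estimation, the principle lidar is built on:
# distance = (speed of light * round-trip time) / 2
C = 299_792_458.0  # speed of light in m/s

def distance_from_echo(round_trip_seconds: float) -> float:
    """Distance to an object given a laser pulse's round-trip time."""
    return C * round_trip_seconds / 2.0

# A reflection arriving ~66.7 nanoseconds after the pulse implies an
# object roughly 10 meters away.
print(round(distance_from_echo(66.7e-9), 2))
```

A spoofing attack exploits exactly this dependence on timing: a precisely timed fake reflection is indistinguishable, to the sensor, from a genuine echo at the corresponding distance.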
"We mimic the lidar reflections with our laser to make the sensor discount other reflections that are coming in from genuine obstacles," Sara Rampazzi, a UF professor of computer and information science and engineering who led the study, said in a press release statement.
"The lidar is still receiving genuine data from the obstacle, but the data are automatically discarded because our fake reflections are the only ones perceived by the sensor."
Demonstrating the Attack
The attacker was positioned about 15 feet away at the side of the road as the scientists demonstrated the attack on moving cars and robots. In theory, however, it could be carried out from a greater distance with better equipment.
The required technology is all quite simple. However, to keep the laser properly aimed while vehicles are in motion, the laser must be precisely synchronized with the lidar sensor, according to the research team.
Using this method, the researchers were able to erase data for moving pedestrians and stationary obstructions. They also showed in practical tests that the attack could monitor a stationary car using simple camera-tracking technology.
In simulations of autonomous-vehicle decision-making, this erasure of data caused a car to keep accelerating toward a pedestrian it could no longer see.
Addressing the Vulnerability
The researchers said this vulnerability could be fixed with updates to the lidar sensors or to the software that analyzes the raw data. For instance, manufacturers could train the software to look for the distinctive signatures that the laser attack's fake reflections introduce.
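One software-level check along these lines might flag frames in which a region of the point cloud that was full of returns suddenly becomes empty, since the attack works by causing genuine returns to be discarded. The sketch below is a hypothetical heuristic for illustration, not the defense the researchers propose; the function name and thresholds are assumptions:

```python
# Hypothetical defense heuristic: an obstacle's lidar returns should not
# vanish between consecutive frames without explanation. A sudden dropout
# in a previously populated region is treated as suspicious.

def suspicious_dropout(prev_points, curr_points, region, min_prev=5):
    """Return True if a region with at least `min_prev` points in the
    previous frame is completely empty in the current frame.

    Points are (x, y) tuples; region is (x_min, x_max, y_min, y_max)."""
    def in_region(p):
        x_min, x_max, y_min, y_max = region
        return x_min <= p[0] <= x_max and y_min <= p[1] <= y_max

    prev_count = sum(1 for p in prev_points if in_region(p))
    curr_count = sum(1 for p in curr_points if in_region(p))
    return prev_count >= min_prev and curr_count == 0

# Example: six returns in a 2 m x 2 m region, then none at all.
prev = [(1.0, 1.0), (1.1, 0.9), (0.9, 1.2), (1.2, 1.1), (0.8, 0.8), (1.0, 1.3)]
print(suspicious_dropout(prev, [], (0.0, 2.0, 0.0, 2.0)))
```

A real defense would of course need to account for legitimate causes of dropout, such as occlusion or the obstacle leaving the field of view, which is why the researchers point to sensor- and software-level signature detection instead.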
The study's lead author, Yulong Cao, said that disclosing this vulnerability will help automakers build a more reliable system for autonomous cars. Cao said the study shows that earlier defense strategies are inadequate and that changes must be made to address the flaw.
The results, which are now available on arXiv, will be presented at the USENIX Security Symposium in 2023.
This article is owned by Tech Times
Written by Jace Dela Cruz